Spaces: Runtime error
domain_investigation_tools_rod #2 by RodDoSanz - opened
- README.md +1 -76
- app.py +14 -123
- packages.txt +0 -1
- pyproject.toml +2 -14
- requirements-dev.txt +13 -57
- requirements.txt +24 -98
- subdomains/subdomains.txt +0 -999
- tdagent/constants.py +0 -8
- tdagent/tools/get_domain_information.py +0 -368
- tdagent/tools/get_url_content.py +36 -95
- tdagent/tools/internal_company_user_search.py +11 -15
- tdagent/tools/lookup_company_cloud_account_information.py +32 -37
- tdagent/tools/query_abuse_ip_db.py +42 -54
- tdagent/tools/rdap.py +0 -110
- tdagent/tools/retrieve_from_mitre_attack.py +0 -59
- tdagent/tools/send_email.py +32 -30
- tdagent/tools/virus_total.py +37 -32
- tdagent/tools/whois.py +0 -49
- tdagent/utils/__init__.py +0 -0
- tdagent/utils/json_utils.py +0 -14
- uv.lock +0 -0
README.md
CHANGED
@@ -8,84 +8,9 @@ sdk_version: 5.32.1
 app_file: app.py
 pinned: false
 license: apache-2.0
-
-  - mcp-server-track
-short_description: Cybersecurity MCP tools to enhance threat insights
+short_description: tdb
 ---
 
-
-# TDAgentTools & TDAgent: Empowering Cybersecurity with Agentic AI
-
-Welcome to TDAgentTools & TDAgent, our innovative proof of concept (PoC) crafted for the Agents-MCP Hackathon. Our initiatives focus on leveraging Agentic AI to enhance cybersecurity threat analysis, providing robust tools for data enrichment and strategic advice for incident handling.
-
-## Team Introduction
-
-We are an AI-focused team within a company, dedicated to empowering other teams by implementing AI solutions. Our expertise lies in automating processes to enhance productivity and tackle complex tasks that AI excels in. Our hackathon team members include:
-
-- Pedro Completo Bento
-- Josep Pon Farreny
-- Sofia Jeronimo dos Santos
-- Rodrigo Dominguez Sanz
-- Miguel Rodin
-
-## Project Overview
-
-### Track 1: MCP Tool - TDAgentTools
-
-TDAgentTools serves as an MCP server built using Gradio, offering a wide array of cybersecurity intelligence tools. These tools enable users to augment their LLMs' capabilities by integrating with various publicly available cybersecurity intel resources. Our TDAgentTools are accessible via the following link: [TDAgentTools Space](https://huggingface.co/spaces/Agents-MCP-Hackathon/TDAgentTools).
-
-#### Available Tools:
-1. **TDAgentTools_get_url_http_content**: Retrieve URL content through an HTTP GET request.
-2. **TDAgentTools_query_abuseipdb**: Query AbuseIPDB to check if an IP is reported for abusive behavior.
-3. **TDAgentTools_query_rdap**: Gather information about internet resources such as domain names and IP addresses.
-4. **TDAgentTools_get_virus_total_url_info**: Fetch URL information using VirusTotal URL Scanner.
-5. **TDAgentTools_get_geolocation**: Obtain location details from an IP address.
-6. **TDAgentTools_enumerate_dns**: Access DNS configuration details for a given domain.
-7. **TDAgentTools_scrap_subdomains_for_domain**: Retrieve subdomains related to a domain.
-8. **TDAgentTools_retrieve_ioc_from_threatfox**: Get potential IoC information from ThreatFox.
-9. **TDAgentTools_get_stix_object_of_attack_id**: Access a STIX object using an ATT&CK ID.
-10. **TDAgentTools_lookup_user**: Seek user details from the Company User Lookup System.
-11. **TDAgentTools_lookup_cloud_account**: Investigate cloud account information.
-12. **TDAgentTools_send_email**: Simulate emailing from [email protected].
-
-> **Note:** TDAgentTools rely on publicly provided APIs and some of which require API keys. If any of these API keys are revoked, certain tools may not function as intended.
-
-[Track1 Demo link](https://youtu.be/c7Yg_jOD6J0)
-
-
-### Track 3: Agentic Demo Showcase - TDAgent
-
-TDAgent is an adaptive and interactive AI agent. This agent facilitates a dynamic AI experience, allowing users to switch the LLM used and adjust the system prompt to refine the agent’s behavior and objectives. It uses TDAgentTools to enrich threat data. Explore it here: [TDAgent Space](https://huggingface.co/spaces/Agents-MCP-Hackathon/TDAgent).
-
-#### Key Features:
-- **Intelligent API Interactions**: The agent autonomously interacts with APIs for data enrichment and analysis without explicit user guidance.
-- **Enhanced Data Enrichment**: Automatically enriches initial incident data, providing deeper insights.
-- **Actionable Intelligence**: Suggests actions based on enriched data and analysis, displaying concise outputs for clearer communication.
-- **Versatile Adaptability**: Capable of switching LLMs for varied results and enhanced debugging.
-
-## Motivation and Goals
-
-Our primary motivation is to explore Agentic AI applications in the cybersecurity realm, focusing on AI agent support for:
-1. Enriching reported threat data.
-2. Assisting analysts in threat analysis.
-
-We aimed to:
-- Explore Agentic AI technologies like Gradio and MCP.
-- Enhance AI agent data enrichment with custom tools.
-- Enable agent autonomy in API interaction and threat assessment.
-- Equip the agent to propose specific incident response actions.
-
-[Track3 Demo link](https://youtu.be/C6Z9EOW-3lE)
-
-## Insights & Conclusions
-
-- **Agent's Autonomy**: Demonstrated autonomous API interactions and data enrichment capabilities.
-- **Enhanced Decision-Making**: The agent suggests data-driven insights beyond API outputs.
-- **Future Improvements**: Plan to fine-tune threat escalation logic and introduce additional decision layers for enhanced threat management.
-
-Our projects successfully demonstrated rapid prototyping with Gradio and Hugging Face Spaces, achieving all intended objectives while providing an engaging and rewarding experience for our team. This PoC shows the potential for future expansions and refinements in the realm of cybersecurity AI support!
-
-
 # TDA Agent Tools
 
 # Development setup
app.py
CHANGED
@@ -1,133 +1,24 @@
-from pathlib import Path
-from typing import NamedTuple
-
 import gradio as gr
-import gradio.themes as gr_themes
-import markdown
 
-from tdagent.tools.
-    dns_enumeration_tool,
-    extractor_of_ioc_from_threatfox_tool,
-    geo_location_tool,
-    scrap_subdomains_tool,
-)
-from tdagent.tools.get_url_content import gr_make_http_request
+from tdagent.tools.get_url_content import gr_get_url_http_content
 from tdagent.tools.internal_company_user_search import gr_internal_company
-from tdagent.tools.
-
-)
+from tdagent.tools.letter_counter import gr_letter_counter
+from tdagent.tools.lookup_company_cloud_account_information import gr_lookup_company_cloud_account_information
 from tdagent.tools.query_abuse_ip_db import gr_query_abuseipdb
-from tdagent.tools.rdap import gr_query_rdap
-from tdagent.tools.retrieve_from_mitre_attack import gr_get_stix_of_attack_id
 from tdagent.tools.send_email import gr_send_email
-from tdagent.tools.virus_total import
-
-
-
-
-
-
-
-
-def _read_markdown_body_as_html(path: str = "README.md") -> str:
-    with Path(path).open(encoding="utf-8") as f:  # Default mode is "r"
-        lines = f.readlines()
-
-    # Skip YAML front matter if present
-    if lines and lines[0].strip() == "---":
-        for i in range(1, len(lines)):
-            if lines[i].strip() == "---":
-                lines = lines[i + 1:]  # skip metadata block
-                break
-
-    markdown_body = "".join(lines).strip()
-    return markdown.markdown(markdown_body)
-
-
-class ToolInfo(NamedTuple):
-    """Gradio MCP tool info."""
-
-    name: str
-    interface: gr.Interface
-
-
-TOOLS = (
-    ToolInfo("Make an HTTP request to a URL with specified method and parameters", gr_make_http_request),
-    ToolInfo("Query AbuseIPDB", gr_query_abuseipdb),
-    # Whois does not work from Spaces (port 43 blocked)
-    # ToolInfo("Query WHOIS", gr_query_whois),
-    ToolInfo("Query RDAP", gr_query_rdap),
-    ToolInfo("Virus Total URL info", gr_virus_total_url_info),
-    ToolInfo("Get IP's Location", geo_location_tool),
-    ToolInfo("DNS Enumerator", dns_enumeration_tool),
-    ToolInfo("Subdomain Retriever", scrap_subdomains_tool),
-    ToolInfo("Extractor of IoCs", extractor_of_ioc_from_threatfox_tool),
-    ToolInfo("ATT&CK STIX information", gr_get_stix_of_attack_id),
-    ## Fake tools
-    ToolInfo("Fake company directory", gr_internal_company),
-    ToolInfo(
-        "Fake company cloud accounts",
+from tdagent.tools.virus_total import gr_virus_total
+
+gr_app = gr.TabbedInterface(
+    [
+        gr_get_url_http_content,
+        gr_letter_counter,
+        gr_query_abuseipdb,
+        gr_virus_total,
+        gr_internal_company,
         gr_lookup_company_cloud_account_information,
-
-
+        gr_send_email,
+    ],
 )
 
-## Application Interface ##
-
-custom_css = """
-.main-header {
-    background: linear-gradient(135deg, #00a388 0%, #ffae00 100%);
-    padding: 30px;
-    border-radius: 5px;
-    margin-bottom: 20px;
-    text-align: center;
-}
-"""
-with (
-    gr.Blocks(
-        theme=gr_themes.Origin(
-            primary_hue="teal",
-            spacing_size="sm",
-            font="sans-serif",
-        ),
-        title="TDAgent",
-        fill_height=True,
-        fill_width=True,
-        css=custom_css,
-    ) as gr_app,
-):
-    gr.HTML(
-        """
-        <div class="main-header">
-            <h1>👩💻 TDAgentTools & TDAgent 👨💻</h1>
-            <p style="font-size: 1.2em; margin: 10px 0 0 0;">
-                Empowering Cybersecurity with Agentic AI
-            </p>
-        </div>
-        """,
-    )
-    with gr.Tabs():
-        with gr.TabItem("About"):
-            html_content = _read_markdown_body_as_html("README.md")
-            gr.Markdown(html_content)
-        with gr.TabItem("TDAgentTools"):
-            gr.TabbedInterface(
-                interface_list=[t_info.interface for t_info in TOOLS],
-                tab_names=[t_info.name for t_info in TOOLS],
-                title="TDAgentTools",
-            )
-        with gr.TabItem("Demo"):
-            gr.Markdown(
-                """
-                This is a demo of TDAgentTools, a simple MCP server.
-                Be carefull with using well-known urls for malware distribution
-                when using the url content extractor tool.
-                """,
-            )
-            gr.HTML(
-                """<iframe width="560" height="315" src="https://youtube.com/embed/c7Yg_jOD6J0" frameborder="0" allowfullscreen></iframe>""",
-                # noqa: E501
-            )
-
 if __name__ == "__main__":
     gr_app.launch(mcp_server=True)
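For context on the removed "About" tab: the deleted `_read_markdown_body_as_html` helper skipped the README's YAML front matter before rendering. A minimal stdlib-only sketch of that skipping logic (the function name here is ours, and the original additionally ran the result through `markdown.markdown`):

```python
def strip_front_matter(lines):
    """Drop a leading YAML front-matter block delimited by '---' lines.

    Mirrors the skipping step of the removed _read_markdown_body_as_html
    helper; the markdown-to-HTML conversion is omitted in this sketch.
    """
    if lines and lines[0].strip() == "---":
        for i in range(1, len(lines)):
            if lines[i].strip() == "---":
                # Keep only the body after the closing '---' delimiter
                return lines[i + 1:]
    return lines


readme = ["---\n", "title: TDAgentTools\n", "---\n", "\n", "# TDA Agent Tools\n"]
print("".join(strip_front_matter(readme)).strip())
```

Note that a file starting with anything other than `---` passes through unchanged, so plain READMEs without front matter render as-is.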
packages.txt
DELETED
@@ -1 +0,0 @@
-whois
pyproject.toml
CHANGED
@@ -12,18 +12,7 @@ authors = [
 requires-python = ">=3.10,<4"
 readme = "README.md"
 license = ""
-dependencies = [
-    "attackcti>=0.5.4",
-    "audioop-lts>=0.2.1 ; python_full_version >= '3.13'",
-    "black>=25.1.0",
-    "cachetools>=6.0.0",
-    "dnspython>=2.7.0",
-    "gradio[mcp]>=5.32.1",
-    "markdown>=3.8",
-    "python-whois>=0.9.5",
-    "requests>=2.32.3",
-    "vt-py~=0.21.0",
-]
+dependencies = ["gradio[mcp]>=5.32.1", "requests>=2.32.3", "vt-py~=0.21.0"]
 
 [project.scripts]
@@ -109,7 +98,7 @@ line-length = 88
 
 [tool.ruff.lint]
 select = ["ALL"]
-ignore = ["D100", "D104", "D107", "D401", "EM102", "ERA001", "TRY003"
+ignore = ["D100", "D104", "D107", "D401", "EM102", "ERA001", "TRY003"]
 
 [tool.ruff.lint.flake8-quotes]
 inline-quotes = "double"
@@ -133,5 +122,4 @@ convention = "google"
 [tool.ruff.lint.per-file-ignores]
 "*/__init__.py" = ["F401"]
 "tdagent/cli/**/*.py" = ["D103", "T201"]
-"tdagent/tools/rdap.py" = ["PLR2004"]
 "tests/*.py" = ["D103", "PLR2004", "S101"]
requirements-dev.txt
CHANGED
@@ -6,14 +6,12 @@ aiofiles==24.1.0
     #   vt-py
 aiohappyeyeballs==2.6.1
     # via aiohttp
-aiohttp==3.12.
+aiohttp==3.12.8
     # via vt-py
 aiosignal==1.3.2
     # via aiohttp
 annotated-types==0.7.0
     # via pydantic
-antlr4-python3-runtime==4.9.3
-    # via stix2-patterns
 anyio==4.9.0
     # via
     #   gradio
@@ -23,22 +21,14 @@ anyio==4.9.0
     #   starlette
 async-timeout==5.0.1 ; python_full_version < '3.11'
     # via aiohttp
-attackcti==0.5.4
-    # via tdagent
 attrs==25.3.0
     # via aiohttp
 audioop-lts==0.2.1 ; python_full_version >= '3.13'
-    # via
-    #   gradio
-    #   tdagent
-black==25.1.0
-    # via tdagent
+    # via gradio
 boolean-py==5.0
     # via license-expression
 cachecontrol==0.14.3
     # via pip-audit
-cachetools==6.0.0
-    # via tdagent
 certifi==2025.4.26
     # via
     #   httpcore
@@ -48,9 +38,8 @@ cfgv==3.4.0
     # via pre-commit
 charset-normalizer==3.4.2
     # via requests
-click==8.2.1
+click==8.2.1 ; sys_platform != 'emscripten'
     # via
-    #   black
     #   typer
     #   uvicorn
 colorama==0.4.6 ; sys_platform == 'win32'
@@ -66,8 +55,6 @@ defusedxml==0.7.1
     # via py-serializable
 distlib==0.3.9
     # via virtualenv
-dnspython==2.7.0
-    # via tdagent
 exceptiongroup==1.3.0 ; python_full_version < '3.11'
     # via
     #   anyio
@@ -89,7 +76,7 @@ fsspec==2025.5.1
     # via
     #   gradio-client
     #   huggingface-hub
-gradio==5.
+gradio==5.32.1
     # via tdagent
 gradio-client==1.10.2
     # via gradio
@@ -99,7 +86,7 @@ h11==0.16.0
     # via
     #   httpcore
     #   uvicorn
-hf-xet==1.1.
+hf-xet==1.1.2 ; platform_machine == 'aarch64' or platform_machine == 'amd64' or platform_machine == 'arm64' or platform_machine == 'x86_64'
     # via huggingface-hub
 httpcore==1.0.9
     # via httpx
@@ -129,8 +116,6 @@ jinja2==3.1.6
     # via gradio
 license-expression==30.4.1
     # via cyclonedx-python-lib
-markdown==3.8
-    # via tdagent
 markdown-it-py==3.0.0
     # via rich
 markupsafe==3.0.2
@@ -149,9 +134,7 @@ multidict==6.4.4
     #   yarl
 mypy==1.16.0
 mypy-extensions==1.1.0
-    # via
-    #   black
-    #   mypy
+    # via mypy
 nodeenv==1.9.1
     # via pre-commit
 numpy==2.2.6
@@ -160,23 +143,20 @@ numpy==2.2.6
     #   pandas
 orjson==3.10.18
     # via gradio
-packageurl-python==0.
+packageurl-python==0.16.0
     # via cyclonedx-python-lib
 packaging==25.0
     # via
-    #   black
     #   gradio
     #   gradio-client
     #   huggingface-hub
     #   pip-audit
     #   pip-requirements-parser
     #   pytest
-pandas==2.3
+pandas==2.2.3
     # via gradio
 pathspec==0.12.1
-    # via
-    #   black
-    #   mypy
+    # via mypy
 pillow==11.2.1
     # via gradio
 pip==25.1.1
@@ -188,7 +168,6 @@ pip-requirements-parser==32.0.1
     # via pip-audit
 platformdirs==4.3.8
     # via
-    #   black
     #   pip-audit
     #   virtualenv
 pluggy==1.6.0
@@ -202,7 +181,6 @@ py-serializable==2.0.0
     # via cyclonedx-python-lib
 pydantic==2.11.5
     # via
-    #   attackcti
     #   fastapi
     #   gradio
     #   mcp
@@ -224,22 +202,15 @@ pytest==7.4.4
 pytest-cov==4.1.0
 pytest-randomly==3.16.0
 python-dateutil==2.9.0.post0
-    # via
-    #   pandas
-    #   python-whois
+    # via pandas
 python-dotenv==1.1.0
     # via pydantic-settings
 python-multipart==0.0.20
     # via
     #   gradio
     #   mcp
-python-whois==0.9.5
-    # via tdagent
 pytz==2025.2
-    # via
-    #   pandas
-    #   stix2
-    #   taxii2-client
+    # via pandas
 pyyaml==6.0.2
     # via
     #   gradio
@@ -250,8 +221,6 @@ requests==2.32.3
     #   cachecontrol
     #   huggingface-hub
     #   pip-audit
-    #   stix2
-    #   taxii2-client
     #   tdagent
 rich==14.0.0
     # via
@@ -265,13 +234,8 @@ semantic-version==2.10.0
     # via gradio
 shellingham==1.5.4 ; sys_platform != 'emscripten'
     # via typer
-simplejson==3.20.1
-    # via stix2
 six==1.17.0
-    # via
-    #   python-dateutil
-    #   stix2-patterns
-    #   taxii2-client
+    # via python-dateutil
 sniffio==1.3.1
     # via anyio
 sortedcontainers==2.4.0
@@ -283,21 +247,14 @@ starlette==0.46.2
     #   fastapi
     #   gradio
     #   mcp
-stix2==3.0.1
-    # via attackcti
-stix2-patterns==2.0.0
-    # via stix2
-taxii2-client==2.3.0
-    # via attackcti
 toml==0.10.2
     # via pip-audit
 tomli==2.2.1 ; python_full_version <= '3.11'
     # via
-    #   black
     #   coverage
     #   mypy
     #   pytest
-tomlkit==0.13.
+tomlkit==0.13.2
     # via gradio
 tqdm==4.67.1
     # via huggingface-hub
@@ -306,7 +263,6 @@ typer==0.16.0 ; sys_platform != 'emscripten'
 typing-extensions==4.14.0
     # via
     #   anyio
-    #   black
     #   exceptiongroup
     #   fastapi
     #   gradio
requirements.txt
CHANGED
@@ -1,19 +1,17 @@
 # This file was autogenerated by uv via the following command:
-#    uv
+#    uv pip compile pyproject.toml -o requirements.txt
 aiofiles==24.1.0
     # via
    #   gradio
     #   vt-py
 aiohappyeyeballs==2.6.1
     # via aiohttp
-aiohttp==3.12.
+aiohttp==3.12.8
     # via vt-py
 aiosignal==1.3.2
     # via aiohttp
 annotated-types==0.7.0
     # via pydantic
-antlr4-python3-runtime==4.9.3
-    # via stix2-patterns
 anyio==4.9.0
     # via
     #   gradio
@@ -21,20 +19,10 @@ anyio==4.9.0
     #   mcp
     #   sse-starlette
     #   starlette
-async-timeout==5.0.1
+async-timeout==5.0.1
     # via aiohttp
-attackcti==0.5.4
-    # via tdagent
 attrs==25.3.0
     # via aiohttp
-audioop-lts==0.2.1 ; python_full_version >= '3.13'
-    # via
-    #   gradio
-    #   tdagent
-black==25.1.0
-    # via tdagent
-cachetools==6.0.0
-    # via tdagent
 certifi==2025.4.26
     # via
     #   httpcore
@@ -44,22 +32,10 @@ charset-normalizer==3.4.2
     # via requests
 click==8.2.1
     # via
-    #   black
     #   typer
     #   uvicorn
-
-    # via
-    #   click
-    #   pytest
-    #   tqdm
-coverage==7.8.2
-    # via pytest-cov
-dnspython==2.7.0
-    # via tdagent
-exceptiongroup==1.3.0 ; python_full_version < '3.11'
-    # via
-    #   anyio
-    #   pytest
+exceptiongroup==1.3.0
+    # via anyio
 fastapi==0.115.12
     # via gradio
 ffmpy==0.6.0
@@ -74,8 +50,8 @@ fsspec==2025.5.1
     # via
     #   gradio-client
     #   huggingface-hub
-gradio==5.
-    # via tdagent
+gradio==5.32.1
+    # via tdagent (pyproject.toml)
 gradio-client==1.10.2
     # via gradio
 groovy==0.1.2
@@ -84,7 +60,7 @@ h11==0.16.0
     # via
     #   httpcore
     #   uvicorn
-hf-xet==1.1.
+hf-xet==1.1.2
     # via huggingface-hub
 httpcore==1.0.9
     # via httpx
@@ -106,13 +82,9 @@ idna==3.10
     #   httpx
     #   requests
     #   yarl
-iniconfig==2.1.0
-    # via pytest
 jinja2==3.1.6
     # via gradio
-markdown==3.
-    # via tdagent
-markdown-it-py==3.0.0 ; sys_platform != 'emscripten'
+markdown-it-py==3.0.0
     # via rich
 markupsafe==3.0.2
     # via
@@ -120,14 +92,12 @@ markupsafe==3.0.2
     #   jinja2
 mcp==1.9.0
     # via gradio
-mdurl==0.1.2
+mdurl==0.1.2
     # via markdown-it-py
 multidict==6.4.4
     # via
     #   aiohttp
     #   yarl
-mypy-extensions==1.1.0
-    # via black
 numpy==2.2.6
     # via
     #   gradio
@@ -136,28 +106,19 @@ orjson==3.10.18
     # via gradio
 packaging==25.0
     # via
-    #   black
     #   gradio
     #   gradio-client
     #   huggingface-hub
-
-pandas==2.3.0
+pandas==2.2.3
     # via gradio
-pathspec==0.12.1
-    # via black
 pillow==11.2.1
     # via gradio
-platformdirs==4.3.8
-    # via black
-pluggy==1.6.0
-    # via pytest
 propcache==0.3.1
     # via
     #   aiohttp
     #   yarl
 pydantic==2.11.5
     # via
-    #   attackcti
     #   fastapi
     #   gradio
     #   mcp
@@ -168,58 +129,38 @@ pydantic-settings==2.9.1
     # via mcp
 pydub==0.25.1
     # via gradio
-pygments==2.19.1
+pygments==2.19.1
     # via rich
-pytest==7.4.4
-    # via
-    #   pytest-cov
-    #   pytest-randomly
-pytest-cov==4.1.0
-pytest-randomly==3.16.0
 python-dateutil==2.9.0.post0
-    # via
-    #   pandas
-    #   python-whois
+    # via pandas
 python-dotenv==1.1.0
     # via pydantic-settings
 python-multipart==0.0.20
     # via
     #   gradio
     #   mcp
-python-whois==0.9.5
-    # via tdagent
 pytz==2025.2
-    # via
-    #   pandas
-    #   stix2
-    #   taxii2-client
+    # via pandas
 pyyaml==6.0.2
     # via
     #   gradio
     #   huggingface-hub
 requests==2.32.3
     # via
+    #   tdagent (pyproject.toml)
     #   huggingface-hub
-
-    #   taxii2-client
-
-rich==14.0.0 ; sys_platform != 'emscripten'
+rich==14.0.0
     # via typer
-ruff==0.11.12
+ruff==0.11.12
     # via gradio
 safehttpx==0.1.6
     # via gradio
 semantic-version==2.10.0
     # via gradio
-shellingham==1.5.4
+shellingham==1.5.4
     # via typer
-simplejson==3.20.1
-    # via stix2
 six==1.17.0
-    # via
-    #   python-dateutil
-    #   stix2-patterns
-    #   taxii2-client
+    # via python-dateutil
 sniffio==1.3.1
     # via anyio
 sse-starlette==2.3.6
@@ -229,27 +170,15 @@ starlette==0.46.2
     #   fastapi
     #   gradio
     #   mcp
-
-    # via attackcti
-stix2-patterns==2.0.0
-    # via stix2
-taxii2-client==2.3.0
-    # via attackcti
-tomli==2.2.1 ; python_full_version <= '3.11'
-    # via
-    #   black
-    #   coverage
-    #   pytest
-tomlkit==0.13.3
+tomlkit==0.13.2
     # via gradio
 tqdm==4.67.1
     # via huggingface-hub
-typer==0.16.0
+typer==0.16.0
     # via gradio
 typing-extensions==4.14.0
     # via
     #   anyio
-    #   black
     #   exceptiongroup
     #   fastapi
     #   gradio
@@ -269,17 +198,14 @@ typing-inspection==0.4.1
 tzdata==2025.2
     # via pandas
 urllib3==2.4.0
-    # via
-
-    #   requests
-uvicorn==0.34.3 ; sys_platform != 'emscripten'
+    # via requests
+uvicorn==0.34.3
     # via
     #   gradio
     #   mcp
 vt-py==0.21.0
-    # via tdagent
+    # via tdagent (pyproject.toml)
 websockets==15.0.1
     # via gradio-client
-xdoctest==1.2.0
 yarl==1.20.0
     # via aiohttp
subdomains/subdomains.txt
DELETED
@@ -1,999 +0,0 @@

The removed file was a plain-text wordlist of 999 common subdomain prefixes, one per line (www, mail, ftp, localhost, webmail, smtp, pop, ns1, webdisk, ns2, cpanel, whm, autodiscover, autoconfig, …, foro, gaia, vpn3), used by `scrap_subdomains_for_domain` for subdomain enumeration.
tdagent/constants.py
DELETED
@@ -1,8 +0,0 @@

import enum


class HttpContentType(str, enum.Enum):
    """Http content type values."""

    HTML = "text/html"
    JSON = "application/json"
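The deleted enum mixes `str` into `enum.Enum`, which lets members be used directly wherever a plain string is expected; a minimal sketch of the pattern:

```python
import enum


class HttpContentType(str, enum.Enum):
    """HTTP content type values."""

    HTML = "text/html"
    JSON = "application/json"


# Because of the str mixin, members compare equal to their string values
# and can be dropped straight into an HTTP headers dict.
headers = {"Content-Type": HttpContentType.JSON}
print(HttpContentType.JSON == "application/json")  # True
```

This is why the tools could pass enum members straight into request headers without calling `.value`.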
tdagent/tools/get_domain_information.py
DELETED
@@ -1,368 +0,0 @@

import json
import os
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path
from typing import Any

import cachetools
import gradio as gr
import requests
import urllib3
from dns import message


_DNS_SERVER = "https://dns.google/dns-query"  # can use others
_DNS_RECORD_TYPES = [
    "A",
    "AAAA",
    "CNAME",
    "MX",
    "NS",
    "SOA",
    "TXT",
    "RP",
    "LOC",
    "CAA",
    "SPF",
    "SRV",
    "NSEC",
    "RRSIG",
]

_COMMON_SUBDOMAINS_TXT_PATH = Path("./subdomains/subdomains.txt")

_CACHE_MAX_SIZE = 4096
_CACHE_TTL_SECONDS = 3600


@cachetools.cached(
    cache=cachetools.TTLCache(maxsize=_CACHE_MAX_SIZE, ttl=_CACHE_TTL_SECONDS),
)
def get_geolocation(ip: str) -> dict[str, Any] | str:
    """Get location information from an IP address.

    Returns the following information on an IP address:
    1. IPv4
    2. city
    3. country_code
    4. country_name
    5. latitude
    6. longitude
    7. postal
    8. state

    Example:
        >>> from pprint import pprint
        >>> pprint(get_geolocation("103.100.104.0"))
        ... {'IPv4': '103.100.104.0',
             'city': None,
             'country_code': 'NZ',
             'country_name': 'New Zealand',
             'latitude': -41,
             'longitude': 174,
             'postal': None,
             'state': None}

    Args:
        ip: IP address.

    Returns:
        Location information on the IP address.
    """
    try:
        return requests.get(
            f"https://geolocation-db.com/json/{ip.strip()}",
            timeout=1,
        ).json()
    except Exception as e:  # noqa: BLE001
        return str(e)


def _request_dns_record(  # noqa: D417
    domain: str,
    record_type: str,
    timeout: float = 0.5,
) -> list[str]:
    """Utility to build DNS resolve requests that do not use port 53.

    Args:
        domain: domain to investigate.
        record_type: record type.

    Returns:
        Information about the DNS record type for the domain.
    """
    q = message.make_query(domain, record_type)
    response = requests.post(
        _DNS_SERVER,
        headers={
            "Content-Type": "application/dns-message",
            "Accept": "application/dns-message",
        },
        data=q.to_wire(),
        verify=True,
        timeout=timeout,
    )
    dns_message = message.from_wire(response.content)
    return [str(rdata) for rdata in dns_message.answer[0]] if dns_message.answer else []


# see: https://thepythoncode.com/article/dns-enumeration-with-python
# https://dnspython.readthedocs.io
@cachetools.cached(
    cache=cachetools.TTLCache(maxsize=_CACHE_MAX_SIZE, ttl=_CACHE_TTL_SECONDS),
)
def enumerate_dns(domain_name: str) -> dict[str, Any] | None:
    r"""Enumerates information about a specific domain's DNS configuration.

    Information collected about the domain name:
    1. A records: the IPv4 associated with the domain.
    2. AAAA records: the IPv6 associated with the domain.
    3. CAA records: used by owners to specify which Certificate Authorities
       are authorized to issue SSL/TLS certificates for their domains.
    4. CNAME records: alias of one name to another - the DNS lookup will
       continue by retrying the lookup with the new name.
    5. LOC records: geographic location associated with a domain name.
    6. MX records: email servers associated with the domain.
    7. NS records: DNS servers that are authoritative for a particular domain.
       These may be used to inquire further information about the domain.
    8. SOA records: defines authoritative information about a DNS zone,
       including zone transfers and cache expiration.
    9. TXT records: used for domain verification and email security.
    10. RP records: the responsible person for a domain.
    11. SPF records: defines authorized email servers.
    12. SRV records: specifies the location of specific services
        (port and host) for the domain.
    13. NSEC records: proves non-existence of DNS records
        and prevents zone enumeration.
    14. RRSIG records: contains cryptographic signatures for DNSSEC-signed
        records, providing authentication and integrity.

    Example:
        >>> from pprint import pprint
        >>> pprint(enumerate_dns("youtube.com"))
        ... {'A': 'youtube.com. 300 IN A 142.250.200.142',
             'AAAA': 'youtube.com. 286 IN AAAA 2a00:1450:4003:80f::200e',
             'CAA': 'youtube.com. 14352 IN CAA 0 issue "pki.goog"',
             'CNAME': None,
             'LOC': None,
             'MX': 'youtube.com. 300 IN MX 0 smtp.google.com.',
             'NS': 'youtube.com. 21600 IN NS ns4.google.com.\n'
                   'youtube.com. 21600 IN NS ns1.google.com.\n'
                   'youtube.com. 21600 IN NS ns2.google.com.\n'
                   'youtube.com. 21600 IN NS ns3.google.com.',
             'NSEC': None,
             'RP': None,
             'RRSIG': None,
             'SOA': 'youtube.com. 60 IN SOA ns1.google.com. dns-admin.google.com. '
                    '766113658 900 900 1800 60',
             'SPF': None,
             'SRV': None,
             'TXT': 'youtube.com. 3586 IN TXT "v=spf1 include:google.com mx -all"\n'
                    'youtube.com. 3586 IN TXT '
                    '"facebook-domain-verification=64jdes7le4h7e7lfpi22rijygx58j1"\n'
                    'youtube.com. 3586 IN TXT '
                    '"google-site-verification=QtQWEwHWM8tHiJ4s-jJWzEQrD_fF3luPnpzNDH-Nw-w"'}

    Args:
        domain_name: domain name for which to enumerate
            the DNS configuration.

    Returns:
        The domain's DNS configuration.
    """
    enumeration = {}
    for record_type in _DNS_RECORD_TYPES:
        try:
            record = _request_dns_record(domain_name.strip(), record_type, timeout=1)
            if record:
                enumeration[record_type] = record
        except Exception as e:  # noqa: BLE001, PERF203
            enumeration[record_type] = [str(e)]
    return enumeration if enumeration else None


def resolve_subdomain_ipv4(domain: str) -> str | None:
    """Resolve the IPv4 address of a domain.

    Args:
        domain: domain name.

    Returns:
        The domain is returned provided it was resolved.
        Otherwise nothing is returned.
    """
    try:
        ipv4 = _request_dns_record(domain, "A", timeout=0.6)
        if ipv4:
            return domain
        msg = "Cannot resolve it: it is likely non-existing"
        raise Exception(msg)  # noqa: TRY002, TRY301
    except Exception:  # noqa: BLE001
        return None


@cachetools.cached(
    cache=cachetools.TTLCache(maxsize=_CACHE_MAX_SIZE, ttl=_CACHE_TTL_SECONDS),
)
def scrap_subdomains_for_domain(domain_name: str) -> list[str]:
    """Retrieves subdomains associated with a domain, if any.

    The information retrieved from a domain is its subdomains,
    provided they are among the top 1000 subdomain prefixes as
    indicated by https://github.com/rbsec/dnscan/tree/master

    Importantly, it finds subdomains only if their prefixes
    are among the top 1000 most common. Hence, it may not
    yield all the subdomains associated with the domain.

    Example:
        >>> scrap_subdomains_for_domain("github.com")
        ... ['www.github.com', 'smtp.github.com', 'ns1.github.com',
             'ns2.github.com', 'autodiscover.github.com', 'test.github.com',
             'blog.github.com', 'admin.github.com', 'support.github.com',
             'docs.github.com', 'shop.github.com', 'wiki.github.com',
             'api.github.com', 'live.github.com', 'help.github.com',
             'jobs.github.com', 'services.github.com', 'de.github.com',
             'cs.github.com', 'fr.github.com', 'ssh.github.com',
             'partner.github.com', 'community.github.com',
             'mailer.github.com', 'training.github.com', ...]

    Args:
        domain_name: domain name for which to retrieve a
            list of subdomains.

    Returns:
        List of subdomains, if any.
    """
    try:
        with open(_COMMON_SUBDOMAINS_TXT_PATH) as file:  # noqa: PTH123
            subdomains = [line.strip() for line in file if line.strip()]
    except FileNotFoundError:
        return []

    potential_subdomains = [
        f"{subdomain}.{domain_name.strip()}" for subdomain in subdomains
    ]
    with ThreadPoolExecutor(max_workers=None) as executor:
        results = executor.map(resolve_subdomain_ipv4, potential_subdomains)
        return [domain for domain in results if domain]


@cachetools.cached(
    cache=cachetools.TTLCache(maxsize=_CACHE_MAX_SIZE, ttl=_CACHE_TTL_SECONDS),
)
def retrieve_ioc_from_threatfox(potentially_ioc: str) -> str:
    r"""Retrieves information about a potential IoC from ThreatFox.

    It may be used to retrieve information on indicators of compromise
    (IOCs) associated with malware, shared with the infosec community,
    AV vendors and cyber threat intelligence providers.

    Examples:
        >>> retrieve_ioc_from_threatfox("139.180.203.104")
        ... {
            "query_status": "ok",
            "data": [
                {
                    "id": "12",
                    "ioc": "139.180.203.104:443",
                    "threat_type": "botnet_cc",
                    "threat_type_desc": "Indicator that identifies a botnet command&control...",
                    "ioc_type": "ip:port",
                    "ioc_type_desc": "ip:port combination that is used for botnet Command&...",
                    "malware": "win.cobalt_strike",
                    "malware_printable": "Cobalt Strike",
                    "malware_alias": "Agentemis,BEACON,CobaltStrike",
                    "malware_malpedia": "https:\/\/malpedia.caad.fkie.fraunhofer.de\/...",
                    "confidence_level": 75,
                    "first_seen": "2020-12-06 09:10:23 UTC",
                    "last_seen": null,
                    "reference": null,
                    "reporter": "abuse_ch",
                    "tags": null,
                    "malware_samples": [
                        {
                            "time_stamp": "2021-03-23 08:18:06 UTC",
                            "md5_hash": "5b7e82e051ade4b14d163eea2a17bf8b",
                            "sha256_hash": "b325c92fa540edeb89b95dbfd4400c1cb33599c66859....",
                            "malware_bazaar": "https:\/\/bazaar.abuse.ch\/sample\/b325c...\/"
                        },
                    ]
                }
            ]
        }

    Args:
        potentially_ioc: this can be a URL, a domain, a hash,
            or any other type of IoC.

    Returns:
        Information on the input as an IoC: threat type, malware type and samples,
        confidence level, first/last seen dates, and more IoC information.
    """
    headers = {"Auth-Key": os.environ["THREATFOX_APIKEY"]}
    pool = urllib3.HTTPSConnectionPool(
        "threatfox-api.abuse.ch",
        port=443,
        maxsize=50,
        headers=headers,
        timeout=5,
    )
    data = {
        "query": "search_ioc",
        "search_term": potentially_ioc.strip(),
    }
    json_data = json.dumps(data)
    try:
        response = pool.request("POST", "/api/v1/", body=json_data)
        return response.data.decode("utf-8", "ignore")
    except Exception as e:  # noqa: BLE001
        return str(e)


geo_location_tool = gr.Interface(
    fn=get_geolocation,
    inputs=gr.Textbox(label="ip"),
    outputs=gr.JSON(label="Geolocation of IP"),
    title="Domain Associated Geolocation Finder",
    description="Retrieves the geolocation associated to an input ip address",
    theme="default",
    examples=["1.0.3.255", "59.34.7.3"],
)

dns_enumeration_tool = gr.Interface(
    fn=enumerate_dns,
    inputs=gr.Textbox(label="domain"),
    outputs=gr.JSON(label="DNS records"),
-
title="DNS record enumerator of domains",
|
341 |
-
description="Retrieves several dns record types for the input domain names",
|
342 |
-
theme="default",
|
343 |
-
examples=["owasp.org", "nist.gov"],
|
344 |
-
)
|
345 |
-
|
346 |
-
scrap_subdomains_tool = gr.Interface(
|
347 |
-
fn=scrap_subdomains_for_domain,
|
348 |
-
inputs=gr.Textbox(label="domain"),
|
349 |
-
outputs=gr.JSON(label="Subdomains managed by domain"),
|
350 |
-
title="Subdomains Extractor of domains",
|
351 |
-
description="Retrieves the subdomains for the input domain if they are common",
|
352 |
-
theme="default",
|
353 |
-
examples=["github.com", "netacea.com"],
|
354 |
-
)
|
355 |
-
|
356 |
-
extractor_of_ioc_from_threatfox_tool = gr.Interface(
|
357 |
-
fn=retrieve_ioc_from_threatfox,
|
358 |
-
inputs=gr.Textbox(label="IoC - url, domains or hash"),
|
359 |
-
outputs=gr.Text(label="Entity information as an IoC"),
|
360 |
-
title="IoC information extractor associated to particular entities",
|
361 |
-
description=(
|
362 |
-
"If information as an Indicator of Compromise (IoC) exists "
|
363 |
-
"for the input url, domain or hash, it retrieves it"
|
364 |
-
),
|
365 |
-
theme="default",
|
366 |
-
examples=["advertipros.com", "dev.couplesparks.com"],
|
367 |
-
example_labels=["👾 IoC 1", "👾 IoC 2"],
|
368 |
-
)
|
tdagent/tools/get_url_content.py
CHANGED

```diff
@@ -1,124 +1,65 @@
+import enum
 from collections.abc import Sequence
-from typing import Literal, Optional, Dict, Any
-import json
 
 import gradio as gr
 import requests
 
-from tdagent.constants import HttpContentType
-
-
+
+class HttpContentType(str, enum.Enum):
+    """Http content type values."""
 
-
+    HTML = "text/html"
+    JSON = "application/json"
+
+
+def get_url_http_content(
     url: str,
-
-
-    body: str = "",
-    timeout: float = 30,
-    custom_headers: str = ""
+    content_type: Sequence[HttpContentType] | None = None,
+    timeout: int = 30,
 ) -> tuple[str, str]:
-    """
+    """Get the content of a URL using an HTTP GET request.
 
     Args:
-        url: The URL to
-
-
-
+        url: The URL to fetch the content from.
+        content_type: If given it should contain the expected
+            content types in the response body. The server may
+            not honor the requested content types.
         timeout: Request timeout in seconds. Defaults to 30.
-        custom_headers: JSON string of additional headers.
 
     Returns:
-        A pair of strings (content, error_message).
+        A pair of strings (content, error_message). If there is an
+        error getting content from the URL the `content` will be
+        empty and `error_message` will, usually, contain the error
+        cause. Otherwise, `error_message` will be empty and the
+        content will be filled with data fetched from the URL.
     """
-    # Initialize headers dictionary
     headers = {}
 
-    # Parse content type
     if content_type:
-        headers["Accept"] = content_type
-
-
-
-
-
-            headers.update(custom_headers_dict)
-        except json.JSONDecodeError:
-            return "", "Invalid JSON format in custom headers"
-
-    # Prepare request parameters
-    request_params: Dict[str, Any] = {
-        "url": url,
-        "headers": headers,
-        "timeout": timeout,
-    }
-
-    # Add body for methods that support it
-    if method in ["POST", "PUT", "PATCH"] and body:
-        request_params["data"] = body
-
-    try:
-        response = requests.request(method, **request_params)
-    except requests.exceptions.MissingSchema as err:
-        return "", str(err)
-    except requests.exceptions.RequestException as err:
-        return "", str(err)
+        headers["Accept"] = ",".join(content_type)
+    response = requests.get(
+        url,
+        headers=headers,
+        timeout=timeout,
+    )
 
     try:
         response.raise_for_status()
     except requests.HTTPError as err:
         return "", str(err)
 
-    # For HEAD requests, return headers as content
-    if method == "HEAD":
-        return str(dict(response.headers)), ""
-
     return response.text, ""
 
-
-
-    fn=
-    inputs=[
-
-
-            choices=["GET", "POST", "PUT", "DELETE", "PATCH", "HEAD"],
-            label="HTTP Method",
-            value="GET"
-        ),
-        gr.Textbox(
-            label="Content Type",
-            placeholder="text/html,application/json"
-        ),
-        gr.Textbox(
-            label="Request Body (for POST/PUT/PATCH)",
-            lines=3,
-            placeholder='{"key": "value"}'
-        ),
-        gr.Number(
-            label="Timeout (seconds)",
-            value=30,
-            minimum=1,
-            maximum=300
-        ),
-        gr.Textbox(
-            label="Custom Headers (JSON format)",
-            placeholder='{"Authorization": "Bearer token"}'
-        )
-    ],
-    outputs=gr.Text(label="Response"),
-    title="Make HTTP Requests",
+
+gr_get_url_http_content = gr.Interface(
+    fn=get_url_http_content,
+    inputs=["text", "text"],
+    outputs="text",
+    title="Get the content of a URL using an HTTP GET request.",
     description=(
-        "
-        "
-        "
-        "
-        "Be cautious when accessing unknown URLs."
+        "Get the content of a URL in one of the specified content types."
+        " The server may not honor the content type and if it fails the"
+        " reason should also be returned with the corresponding HTTP"
+        " error code."
    ),
-    examples=[
-        ["https://google.com", "GET", "text/html", "", 30, ""],
-        ["https://api.example.com/data", "POST", "application/json", '{"key": "value"}', 30, '{"Authorization": "Bearer token"}'],
-    ],
 )
-
-if __name__ == "__main__":
-    gr_make_http_request.launch()
```
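Because `HttpContentType` subclasses `str`, its members can be joined directly into an `Accept` header value, which is exactly what the new `get_url_http_content` does with `",".join(content_type)`. A small self-contained check of that behavior (the enum mirrors the one in `get_url_content.py`):

```python
import enum


class HttpContentType(str, enum.Enum):
    """Http content type values (mirrors tdagent/tools/get_url_content.py)."""

    HTML = "text/html"
    JSON = "application/json"


# str.join accepts the enum members because each member *is* a str,
# so the header value is built from the underlying string values.
accept = ",".join([HttpContentType.HTML, HttpContentType.JSON])
print(accept)
# → text/html,application/json
```

This is why the function can pass the sequence straight through without calling `.value` on each member.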
tdagent/tools/internal_company_user_search.py
CHANGED

```diff
@@ -1,6 +1,5 @@
 import gradio as gr
 
-
 # Fake user database
 users_db = {
     "jsmith": {
@@ -10,7 +9,7 @@ users_db = {
         "user_id": "US789456",
         "jobtitle": "Software Engineer",
         "department": "Engineering",
-        "country": "United States"
+        "country": "United States"
     },
     "mhacker": {
         "username": "mhacker",
@@ -19,19 +18,16 @@ users_db = {
         "user_id": "US123789",
         "jobtitle": "Security Specialist",
         "department": "Pentests",
-        "country": "Germany"
-    }
+        "country": "Germany"
+    }
 }
 
 
-def lookup_user(username
-    """
-
+def lookup_user(username):
+    """
+    Function to lookup user information.
     Company User Lookup System. Enter a username to get user details.
-
-    Returns:
-        A formatted string with user details if found, otherwise an
-        error message.
+    Returns a formatted string with user details if found, otherwise returns error message
     """
     if username in users_db:
         user = users_db[username]
@@ -43,8 +39,8 @@ User ID: {user['user_id']}
 Job Title: {user['jobtitle']}
 Department: {user['department']}
 Country: {user['country']}"""
-
-
+    else:
+        return """User Not Found
 Username: Not found
 Email: N/A
 Name: N/A
@@ -60,6 +56,6 @@ gr_internal_company = gr.Interface(
     inputs=["text"],
     outputs=["text"],
     title="Company User Lookup System",
-    description="Company User Lookup System.",
-    theme="default"
+    description="Company User Lookup System. Enter a username to get user details",
+    theme="default"
 )
```
tdagent/tools/lookup_company_cloud_account_information.py
CHANGED

```diff
@@ -1,6 +1,5 @@
 import gradio as gr
 
-
 # Fake cloud accounts database
 cloud_accounts = {
     ("AWS", "123456789012"): {
@@ -9,7 +8,7 @@ cloud_accounts = {
         "cloud_account_name": "Production-Main",
         "owner_user_id": "AWS001",
         "owner_email": "[email protected]",
-        "deployed_applications": ["ERP System", "Customer Portal", "Data Lake"]
+        "deployed_applications": ["ERP System", "Customer Portal", "Data Lake"]
     },
     ("AWS", "098765432109"): {
         "public_cloud_provider": "AWS",
@@ -17,7 +16,7 @@ cloud_accounts = {
         "cloud_account_name": "Development-Team1",
         "owner_user_id": "AWS002",
         "owner_email": "[email protected]",
-        "deployed_applications": ["Test Environment", "CI/CD Pipeline"]
+        "deployed_applications": ["Test Environment", "CI/CD Pipeline"]
     },
     ("Azure", "sub-abc-123-def-456"): {
         "public_cloud_provider": "Azure",
@@ -25,7 +24,7 @@ cloud_accounts = {
         "cloud_account_name": "Enterprise-Solutions",
         "owner_user_id": "AZ001",
         "owner_email": "[email protected]",
-        "deployed_applications": ["Microsoft 365 Integration", "Azure AD Connect"]
+        "deployed_applications": ["Microsoft 365 Integration", "Azure AD Connect"]
     },
     ("Azure", "sub-xyz-789-uvw-321"): {
         "public_cloud_provider": "Azure",
@@ -33,7 +32,7 @@ cloud_accounts = {
         "cloud_account_name": "Research-Division",
         "owner_user_id": "AZ002",
         "owner_email": "[email protected]",
-        "deployed_applications": ["ML Platform", "Research Portal"]
+        "deployed_applications": ["ML Platform", "Research Portal"]
     },
     ("GCP", "project-id-123456"): {
         "public_cloud_provider": "GCP",
@@ -41,7 +40,7 @@ cloud_accounts = {
         "cloud_account_name": "Analytics-Platform",
         "owner_user_id": "GCP001",
         "owner_email": "[email protected]",
-        "deployed_applications": ["BigQuery Data Warehouse", "Kubernetes Cluster"]
+        "deployed_applications": ["BigQuery Data Warehouse", "Kubernetes Cluster"]
     },
     ("GCP", "project-id-789012"): {
         "public_cloud_provider": "GCP",
@@ -49,49 +48,45 @@ cloud_accounts = {
         "cloud_account_name": "ML-Operations",
         "owner_user_id": "GCP002",
         "owner_email": "[email protected]",
-        "deployed_applications": ["TensorFlow Training", "Model Serving Platform"]
-    }
+        "deployed_applications": ["TensorFlow Training", "Model Serving Platform"]
+    }
 }
 
 
-def lookup_cloud_account(public_cloud_provider
-    """
-
-    Returns a formatted string with account details if
-    found, otherwise returns error message.
+def lookup_cloud_account(public_cloud_provider, account_id):
+    """
+    Function to lookup cloud account information
+    Returns a formatted string with account details if found, otherwise returns error message
     """
     account_key = (public_cloud_provider, account_id)
 
     if account_key in cloud_accounts:
         account = cloud_accounts[account_key]
-        return {
-
-
-
-
-
-
-
-
-
-
-
-
-
+        return f"""{{
+    "public_cloud_provider": "{account['public_cloud_provider']}",
+    "account_id": "{account['account_id']}",
+    "cloud_account_name": "{account['cloud_account_name']}",
+    "owner_user_id": "{account['owner_user_id']}",
+    "owner_email": "{account['owner_email']}",
+    "deployed_applications": {str(account['deployed_applications'])}
+}}"""
+    else:
+        return f"""{{
+    "error": "Account not found",
+    "public_cloud_provider": "{public_cloud_provider}",
+    "account_id": "{account_id}",
+    "message": "No cloud account information found"
+}}"""
 
 
 # Create Gradio Interface
 gr_lookup_company_cloud_account_information = gr.Interface(
     fn=lookup_cloud_account,
-    inputs=[
-
-
-
-
-    ),
-        "text",
-    ],
+    inputs=[gr.Dropdown(
+        choices=["AWS", "Azure", "GCP"],
+        label="Cloud Provider",
+        info="Select the cloud provider"
+    ), "text"],
     outputs="text",
     title="Company Cloud Account Lookup System",
     description="""
@@ -102,5 +97,5 @@ gr_lookup_company_cloud_account_information = gr.Interface(
 Azure: sub-abc-123-def-456, sub-xyz-789-uvw-321
 GCP: project-id-123456, project-id-789012
 """,
-    theme="default"
+    theme="default"
 )
```
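The new `lookup_cloud_account` hand-builds its JSON reply with f-strings, which breaks if any value ever contains a quote, and serializes the applications list with `str()`, producing single-quoted, non-JSON output. A sketch of a safer alternative using `json.dumps` (the sample record below is illustrative, trimmed from the fake database):

```python
import json

# Illustrative record shaped like an entry in the fake cloud_accounts DB.
account = {
    "public_cloud_provider": "AWS",
    "account_id": "123456789012",
    "cloud_account_name": "Production-Main",
    "deployed_applications": ["ERP System", "Customer Portal"],
}

# json.dumps handles quoting, escaping, and list serialization correctly,
# unlike str(list), which emits single quotes that json.loads rejects.
payload = json.dumps(account, indent=4)
print(payload)

# The output round-trips, which the f-string version does not guarantee.
assert json.loads(payload) == account
```

For an error reply, `json.dumps({"error": "Account not found", ...})` gives the same guarantee with less code than the f-string template.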
tdagent/tools/query_abuse_ip_db.py
CHANGED

```diff
@@ -1,19 +1,16 @@
-from __future__ import annotations
-
 import os
-from dataclasses import dataclass
-from datetime import datetime
-from typing import Any
 
-import gradio as gr
 import requests
+from datetime import datetime
+from dataclasses import dataclass
+from typing import List, Optional, Dict, Any, Tuple, Union
+import gradio as gr
 
 
 # API docs: https://docs.abuseipdb.com/#check-endpoint
 
-
 @dataclass
-class AbuseReport:
+class AbuseReport:
     date: str
     categories: str
     comment: str
@@ -21,7 +18,7 @@ class AbuseReport:  # noqa: D101
 
 
 @dataclass
-class IPCheckResult:
+class IPCheckResult:
     ip_address: str
     abuse_confidence_score: int
     total_reports: int
@@ -29,10 +26,10 @@ class IPCheckResult:  # noqa: D101
     domain: str
     isp: str
     last_reported: str
-    reports:
+    reports: List[AbuseReport]
 
     def format_summary(self) -> str:
-        """Format a summary of the IP check result
+        """Format a summary of the IP check result"""
         return f"""
 IP Address: {self.ip_address}
 Abuse Confidence Score: {self.abuse_confidence_score}%
@@ -44,8 +41,9 @@ class IPCheckResult:  # noqa: D101
 """
 
 
-def check_ip(ip_address: str, api_key: str, days: str = "30") ->
-    """
+def check_ip(ip_address: str, api_key: str, days: str = "30") -> Dict[str, Any]:
+    """
+    Query the AbuseIPDB API to check if an IP address has been reported.
 
     Args:
         ip_address: The IP address to check
@@ -55,33 +53,30 @@ def check_ip(ip_address: str, api_key: str, days: str = "30") -> dict[str, Any]:
     Returns:
         API response data as a dictionary
     """
-    url =
+    url = 'https://api.abuseipdb.com/api/v2/check'
 
-    headers = {
+    headers = {
+        'Accept': 'application/json',
+        'Key': api_key
+    }
 
     params = {
-
-
-
+        'ipAddress': ip_address,
+        'maxAgeInDays': days,
+        'verbose': True
     }
 
     try:
-        response = requests.get(
-            url,
-            headers=headers,
-            params=params,
-            timeout=30,
-        )
+        response = requests.get(url, headers=headers, params=params)
         response.raise_for_status()  # Raise exception for HTTP errors
         return response.json()
     except requests.exceptions.RequestException as e:
         return {"error": str(e)}
 
 
-def parse_response(
-
-
-    """Parse the API response into a dataclass.
+def parse_response(response: Dict[str, Any]) -> Tuple[Optional[IPCheckResult], Optional[str]]:
+    """
+    Parse the API response into a dataclass
 
     Args:
         response: The API response dictionary
@@ -98,21 +93,18 @@ def parse_response(
 
     # Create a list of AbuseReport objects
     reports = []
-    if data
+    if "reports" in data and data["reports"]:
         for report in data["reports"]:
-            reported_at = datetime.fromisoformat(
-
-            ).strftime("%Y-%m-%d %H:%M:%S")
+            reported_at = datetime.fromisoformat(report["reportedAt"].replace("Z", "+00:00")).strftime(
+                "%Y-%m-%d %H:%M:%S")
             categories = ", ".join([str(cat) for cat in report.get("categories", [])])
 
-            reports.append(
-
-
-
-
-                ),
-            )
+            reports.append(AbuseReport(
+                date=reported_at,
+                categories=categories,
+                comment=report.get("comment", ""),
+                reporter=str(report.get("reporterId", "Anonymous"))
+            ))
 
     # Create the main result object
     result = IPCheckResult(
@@ -123,14 +115,15 @@ def parse_response(
         domain=data.get("domain", "N/A"),
         isp=data.get("isp", "N/A"),
         last_reported=data.get("lastReportedAt", "Never"),
-        reports=reports
+        reports=reports
     )
 
-    return result,
+    return result, None
 
 
 def query_abuseipdb(ip_address: str, days: int = 30) -> str:
-    """
+    """
+    Main function to query AbuseIPDB and format the response for Gradio
 
     Args:
         ip_address: The IP address to check
@@ -148,21 +141,16 @@ def query_abuseipdb(ip_address: str, days: int = 30) -> str:
     response = check_ip(ip_address, api_key, str(days))
     result, error = parse_response(response)
 
-    if
-        return
+    if error:
+        return error
 
-    return
+    return result.format_summary()
 
 
 gr_query_abuseipdb = gr.Interface(
     fn=query_abuseipdb,
-    inputs=
-    outputs=
+    inputs=["text"],
+    outputs="text",
     title="AbuseIPDB IP Checker",
-    description=
-        "Check if an IP address has been reported for abusive behavior"
-        " using AbuseIP DB API"
-    ),
-    examples=["5.252.155.14", "77.239.99.248"],
-    example_labels=["👾 Malicious IP 1", "👾 Malicious IP 2"],
+    description="Check if an IP address has been reported for abusive behavior using AbuseIP DB API",
 )
```
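The `reportedAt` parsing in `parse_response` replaces the trailing `Z` with an explicit `+00:00` offset before calling `datetime.fromisoformat`, because `fromisoformat` does not accept the `Z` suffix on Python versions before 3.11. The pattern in isolation (the timestamp below is an illustrative value, not real report data):

```python
from datetime import datetime

# AbuseIPDB returns ISO-8601 timestamps with a 'Z' (UTC) suffix, e.g.:
raw = "2024-01-15T10:30:00Z"  # illustrative value

# Normalize 'Z' to '+00:00' so fromisoformat accepts it on Python < 3.11.
reported_at = datetime.fromisoformat(raw.replace("Z", "+00:00")).strftime(
    "%Y-%m-%d %H:%M:%S"
)
print(reported_at)
# → 2024-01-15 10:30:00
```

On Python 3.11+ the `replace` is redundant but harmless, so the pattern stays portable across versions.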
tdagent/tools/rdap.py
DELETED

```diff
@@ -1,110 +0,0 @@
-import enum
-
-import cachetools
-import gradio as gr
-import requests
-import whois
-
-from tdagent.constants import HttpContentType
-
-
-# one of domain, ip, autnum, entity etc
-_RDAP_URL_TEMPLATE = r"https://rdap.org/{rdap_type}/{rdap_object}"
-_CACHE_MAX_SIZE = 4096
-_CACHE_TTL_SECONDS = 3600
-
-
-class RdapTypes(str, enum.Enum):
-    """RDAP object types."""
-
-    DOMAIN = "domain"
-    IP = "ip"
-    AUTNUM = "autnum"
-    ENTITY = "entity"
-
-
-@cachetools.cached(
-    cache=cachetools.TTLCache(maxsize=_CACHE_MAX_SIZE, ttl=_CACHE_TTL_SECONDS),
-)
-def query_rdap(  # noqa: PLR0911
-    url_or_ip: str,
-    timeout: int = 30,
-) -> dict[str, str | int | float]:
-    """Query RDAP to get information about Internet resources.
-
-    The Registration Data Access Protocol (RDAP) is the successor to WHOIS.
-    Like WHOIS, RDAP provides access to information about Internet resources
-    (domain names, autonomous system numbers, and IP addresses).
-
-    Args:
-        url_or_ip: URL, domain or IP to query for RDAP information.
-        timeout: Request timeout in seconds. Defaults to 30.
-
-    Returns:
-        A JSON formatted string with RDAP information. If there is
-        an error, the JSON will contain the key "error" with an
-        error message.
-    """
-    rdap_type = RdapTypes.DOMAIN
-    rdap_object = url_or_ip
-    if whois.IPV4_OR_V6.match(url_or_ip):
-        rdap_type = RdapTypes.IP
-    else:
-        rdap_object = whois.extract_domain(url_or_ip)
-
-    query_url = _RDAP_URL_TEMPLATE.format(rdap_type=rdap_type, rdap_object=rdap_object)
-    response = requests.get(
-        query_url,
-        timeout=timeout,
-        headers={"Accept": HttpContentType.JSON},
-    )
-
-    try:
-        response.raise_for_status()
-    except requests.HTTPError as err:
-        if err.response.status_code == 302:
-            if "Location" in err.response.headers:
-                return {
-                    "message": "Follow the location to find RDAP information",
-                    "location": err.response.headers["Location"],
-                }
-            return {
-                "error": (
-                    "Information not found in RDAP.org but it knows of"
-                    " a service which is authoritative for the requested resource."
-                ),
-            }
-        if err.response.status_code == 400:
-            return {
-                "error": (
-                    "Invalid request (malformed path, unsupported object"
-                    " type, invalid IP address, etc)"
-                ),
-            }
-        if err.response.status_code == 403:
-            return {
-                "error": "You've been blocked due to abuse or other misbehavior",
-            }
-        if err.response.status_code == 404:
-            return {
-                "error": (
-                    "RDAP.org doesn't know of an RDAP service which is"
-                    " authoritative for the requested resource. RDAP.org"
-                    " only knows about servers that are registered with IANA"
-                ),
-            }
-        return {
-            "error": str(err),
-        }
-
-    return response.json()
-
-
-gr_query_rdap = gr.Interface(
-    fn=query_rdap,
-    inputs=gr.Textbox(label="url or ip"),
-    outputs=gr.JSON(label="Report from RDAP"),
-    title="Get RDAP information for a given URL.",
-    description="Query a RDAP database to gather information about a url or domain.",
-    examples=["8.8.8.8", "pastebin.com"],
-)
```
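The deleted `query_rdap` first classifies the input (IP vs. domain) and then formats the rdap.org URL from a template and a `RdapTypes` member. A minimal reproduction of that URL construction; the enum mirrors the deleted module, and `.value` is used explicitly because how a `str`-mixin enum renders under `str.format` changed across Python versions:

```python
import enum

_RDAP_URL_TEMPLATE = r"https://rdap.org/{rdap_type}/{rdap_object}"


class RdapTypes(str, enum.Enum):
    """RDAP object types (mirrors the deleted rdap.py)."""

    DOMAIN = "domain"
    IP = "ip"


def build_rdap_url(rdap_type: RdapTypes, rdap_object: str) -> str:
    # Use .value explicitly rather than relying on implicit enum
    # formatting, which is version-dependent for str-mixin enums.
    return _RDAP_URL_TEMPLATE.format(
        rdap_type=rdap_type.value, rdap_object=rdap_object
    )


print(build_rdap_url(RdapTypes.IP, "8.8.8.8"))
# → https://rdap.org/ip/8.8.8.8
print(build_rdap_url(RdapTypes.DOMAIN, "pastebin.com"))
# → https://rdap.org/domain/pastebin.com
```

The real tool additionally resolved 302/400/403/404 responses from rdap.org into structured error dictionaries before returning.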
tdagent/tools/retrieve_from_mitre_attack.py
DELETED
@@ -1,59 +0,0 @@
-from typing import Any
-
-import cachetools
-import gradio as gr
-from attackcti import attack_client
-
-
-_CACHE_MAX_SIZE = 4096
-_CACHE_TTL_SECONDS = 3600
-
-
-@cachetools.cached(
-    cache=cachetools.TTLCache(maxsize=_CACHE_MAX_SIZE, ttl=_CACHE_TTL_SECONDS),
-)
-def get_stix_object_of_attack_id(
-    attack_id: str,
-    object_type: str = "attack-pattern",
-) -> dict[str, Any]:
-    """Retrieves a STIX object identified by an ATT&CK ID in all ATT&CK matrices.
-
-    Args:
-        attack_id (str): The ATT&CK ID (e.g., 'T1234') of the STIX object to retrieve.
-        object_type (str): The type of STIX object to retrieve, such as
-            'attack-pattern', 'course-of-action', 'intrusion-set',
-            'malware', 'tool', or 'x-mitre-data-component'. Default is 'attack-pattern'
-
-    Returns:
-        A list containing the matched STIX object, either in its raw STIX format
-        or as a custom dictionary following the structure defined by the relevant
-        Pydantic model, depending on the 'stix_format' flag.
-    """
-    try:
-        lift = attack_client()
-        return lift.get_object_by_attack_id(
-            object_type=object_type.strip(),
-            attack_id=attack_id.strip(),
-            stix_format=False,
-        )[0]
-    except Exception as e:  # noqa: BLE001
-        return {"Exception": str(e)}
-
-
-gr_get_stix_of_attack_id = gr.Interface(
-    fn=get_stix_object_of_attack_id,
-    inputs=[
-        gr.Textbox(label="Mitre technique ID"),
-        gr.Textbox(label="Mitre object type"),
-    ],
-    outputs=gr.JSON(label="Mitre report"),
-    title="MITRE ATT&CK STIX information",
-    description=(
-        "Retrieves a specific STIX object identified by an ATT&CK ID across all ATT&CK"
-        " matrices"
-    ),
-    examples=[
-        ["T1568.002", "attack-pattern"],
-        ["M1042", "course-of-action"],
-    ],
-)
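The deleted module memoized ATT&CK lookups with `cachetools.TTLCache` (maxsize 4096, TTL one hour). For readers without cachetools installed, a stdlib-only sketch of the same decorator pattern, with hypothetical names:

```python
import time
from functools import wraps


def ttl_cache(ttl_seconds: float):
    """Cache results per argument tuple, expiring entries after ttl_seconds."""
    def decorator(func):
        store: dict = {}

        @wraps(func)
        def wrapper(*args):
            now = time.monotonic()
            hit = store.get(args)
            if hit is not None and now - hit[1] < ttl_seconds:
                return hit[0]  # fresh enough: serve cached value
            value = func(*args)
            store[args] = (value, now)
            return value

        return wrapper
    return decorator


calls = []  # records every underlying (non-cached) lookup


@ttl_cache(ttl_seconds=3600)
def lookup(attack_id: str) -> dict:
    calls.append(attack_id)
    return {"attack_id": attack_id}


lookup("T1568.002")
lookup("T1568.002")  # served from cache; the underlying lookup ran once
```

Unlike `cachetools.TTLCache`, this sketch never evicts stale entries from memory and has no maxsize bound; it only illustrates the decorator shape the deleted code used.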
tdagent/tools/send_email.py
CHANGED
@@ -1,48 +1,53 @@
+import gradio as gr
 import datetime
 import random
 
-import gradio as gr
-
 
-def send_email(recipient
-    """
+def send_email(recipient, subject, message):
+    """
+    Simulates sending an email from cert@company.com and prints it to the console in a well-formatted way.
 
     This function takes email details, formats them into a standard email structure
     with headers and body, prints the formatted email to the console, and returns
    a success message. The sender is always set to cert@company.com.
 
+    Parameters:
+    -----------
+    recipient : str
+        The email address of the recipient (To field)
+    subject : str
+        The subject line of the email
+    message : str
+        The main body content of the email
 
     Returns:
+    --------
+    str
         A success message indicating that the email was sent, including
         the recipient address and the current time
 
     Example:
+    --------
+    >>> send_email("[email protected]", "Security Alert", "Please update your password.")
+    ------ EMAIL SENT ------
+    From: cert@company.com
+    To: [email protected]
+    Subject: Security Alert
+    Date: Wed, 04 Jun 2025 16:40:58 +0000
+    Message-ID: <123456.7890123456@mail-server>
+
+    Please update your password.
+    ------------------------
+    'Email successfully sent to [email protected] at 16:40:58'
+    """
     # Fixed sender email
     sender = "cert@company.com"
 
     # Generate a random message ID
     message_id = f"<{random.randint(100000, 999999)}.{random.randint(1000000000, 9999999999)}@mail-server>"
 
     # Get current timestamp
-    timestamp = datetime.datetime.now(
-        "%a, %d %b %Y %H:%M:%S +0000",
-    )
+    timestamp = datetime.datetime.now().strftime("%a, %d %b %Y %H:%M:%S +0000")
 
     # Format the email
     email_format = f"""
@@ -58,13 +63,10 @@ Message-ID: {message_id}
 """
 
     # Print the email to console
     print(email_format)
 
     # Return a success message
-    return (
-        f"Email successfully sent from {sender} to {recipient} at "
-        + datetime.datetime.now(datetime.timezone.utc).strftime("%H:%M:%S")
-    )
+    return f"Email successfully sent from {sender} to {recipient} at {datetime.datetime.now().strftime('%H:%M:%S')}"
 
 
 # Create Gradio Interface
@@ -73,5 +75,5 @@ gr_send_email = gr.Interface(
     inputs=["text", "text", "text"],
     outputs="text",
     title="Email Sender Simulator",
-    description="This tool simulates sending an email."
+    description="This tool simulates sending an email by formatting and printing it to the console."
 )
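The rewrite of `send_email` formats naive `datetime.datetime.now()` with a hard-coded `+0000` suffix in the Date header; that header is only truthful when the host clock runs UTC (typical for containers, but an assumption). A timezone-aware sketch that derives the offset from the datetime itself via `%z`:

```python
import datetime


def rfc2822_utc_timestamp(when=None):
    """Format an RFC 2822-style Date header in UTC; %z emits the real offset."""
    if when is None:
        when = datetime.datetime.now(datetime.timezone.utc)
    return when.strftime("%a, %d %b %Y %H:%M:%S %z")


# The fixed instant used in the docstring example of the diff above:
stamp = rfc2822_utc_timestamp(
    datetime.datetime(2025, 6, 4, 16, 40, 58, tzinfo=datetime.timezone.utc)
)
```

Because the datetime is timezone-aware, `%z` renders `+0000` only when the offset really is UTC, instead of asserting it unconditionally.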
tdagent/tools/virus_total.py
CHANGED
@@ -1,26 +1,21 @@
-import os
-from datetime import datetime, timezone
-
-import cachetools
 import gradio as gr
 import vt
+import os
+from datetime import datetime
+from functools import lru_cache
+from typing import Optional
+import time
+import asyncio
 
 # Get API key from environment variable
-API_KEY = os.getenv(
+API_KEY = os.getenv('VT_API_KEY')
 
-@cachetools.cached(
-    cache=cachetools.TTLCache(maxsize=_CACHE_MAX_SIZE, ttl=_CACHE_TTL_SECONDS),
-)
-def get_virus_total_url_info(url: str) -> str:
-    """Get URL Info from VirusTotal URL Scanner. Scan URL is not available."""
-    if not API_KEY:
-        return "Error: Virus total API key not configured."
+@lru_cache(maxsize=100)
+def get_url_info_cached(url: str, timestamp: Optional[int] = None) -> str:
+    """
+    Get URL Info from VirusTotal URL Scanner. Scan URL is not available
+    """
     try:
         # Create a new client for each request (thread-safe)
         with vt.Client(API_KEY) as client:
@@ -38,15 +33,12 @@ def get_virus_total_url_info(url: str) -> str:
         if last_analysis_date:
             # Convert to datetime if it's a timestamp
             if isinstance(last_analysis_date, (int, float)):
-                last_analysis_date = datetime.
-                    timezone.utc,
-                )
-            date_str = last_analysis_date.strftime("%Y-%m-%d %H:%M:%S UTC")
+                last_analysis_date = datetime.utcfromtimestamp(last_analysis_date)
+            date_str = last_analysis_date.strftime('%Y-%m-%d %H:%M:%S UTC')
         else:
             date_str = "Not available"
 
-
+        result = f"""
 URL: {url}
 Last Analysis Date: {date_str}
 
@@ -61,18 +53,31 @@ Reputation Score: {url_analysis.reputation}
 Times Submitted: {url_analysis.times_submitted}
 
 Cache Status: Hit
 """
+
+        return result
+
+    except Exception as e:
+        return f"Error: {str(e)}"
-
-        return f"Error: {err}"
 
+
+def get_url_info(url: str) -> str:
+    """
+    Wrapper function to handle the cached URL info retrieval
+    """
+    # Clean the URL to ensure consistent caching
+    url = url.strip().lower()
+
+    # Use current timestamp rounded to nearest hour to maintain cache for an hour
+    timestamp = int(time.time()) // 3600
+
+    return get_url_info_cached(url, timestamp)
+
+gr_virus_total = gr.Interface(
+    fn=get_url_info,
-    fn=
-    inputs=
-    outputs=
+    inputs=["text"],
+    outputs="text",
     title="VirusTotal URL Scanner",
     description="Get URL Info from VirusTotal URL Scanner. Scan URL is not available",
-    examples=["https://advertipros.com//?u=script", "https://google.com"],
-    example_labels=["👾 Malicious URL", "🧑‍💻 Benign URL"],
 )
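The rewritten VirusTotal tool fakes a TTL on top of `functools.lru_cache` by folding an hour bucket into the cache key: while `int(time.time()) // 3600` is unchanged the key matches and the cached report is returned, and when the hour rolls over every key changes and entries are re-fetched. The trick in isolation, with a stub fetch instead of real VirusTotal calls:

```python
import time
from functools import lru_cache

lookups = []  # records every real (non-cached) fetch


@lru_cache(maxsize=100)
def fetch_report(url, hour_bucket):
    lookups.append(url)
    return f"report for {url}"


def get_report(url):
    normalized = url.strip().lower()  # consistent cache keys
    # Same bucket within the hour -> cache hit; new hour -> new key, fresh fetch.
    return fetch_report(normalized, int(time.time()) // 3600)


get_report("https://example.com")
get_report("  HTTPS://EXAMPLE.COM")  # normalizes to the same cache key
```

One caveat of this scheme: entries expire at the top of the hour rather than a fixed interval after insertion, so a lookup made at minute 59 is re-fetched a minute later, and stale entries linger in the LRU until evicted.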
tdagent/tools/whois.py
DELETED
@@ -1,49 +0,0 @@
-import json
-import shutil
-
-import cachetools
-import gradio as gr
-import whois
-
-from tdagent.utils.json_utils import TDAgentJsonEncoder
-
-
-_WHOIS_BINARY = "whois"
-_CACHE_MAX_SIZE = 4096
-_CACHE_TTL_SECONDS = 3600
-
-
-@cachetools.cached(
-    cache=cachetools.TTLCache(maxsize=_CACHE_MAX_SIZE, ttl=_CACHE_TTL_SECONDS),
-)
-def query_whois(url: str) -> str:
-    """Query a WHOIS database to gather information about a url or domain.
-
-    WHOIS information includes: domain names, IP address blocks and autonomous
-    systems, but it is also used for a wider range of other information.
-
-    Args:
-        url: URL to query for WHOIS information.
-
-    Returns:
-        A JSON formatted string with the gathered information
-    """
-    try:
-        whois_result = whois.whois(
-            url,
-            command=shutil.which(_WHOIS_BINARY) is not None,
-            executable=_WHOIS_BINARY,
-        )
-    except whois.parser.PywhoisError as err:
-        return json.dumps({"error": str(err)})
-
-    return json.dumps(whois_result, cls=TDAgentJsonEncoder)
-
-
-gr_query_whois = gr.Interface(
-    fn=query_whois,
-    inputs=["text"],
-    outputs="text",
-    title="Get WHOIS information for a given URL.",
-    description="Query a WHOIS database to gather information about a url or domain.",
-)
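The deleted `query_whois` chose its backend by probing PATH: it passed `command=shutil.which("whois") is not None`, so a host with a system `whois` binary shells out to it and one without falls back to the library's pure-Python path. The probe in isolation (backend labels are ours, for illustration):

```python
import shutil


def resolve_whois_backend(binary="whois"):
    """Report which backend a lookup would use on this host."""
    return "system-binary" if shutil.which(binary) is not None else "python-parser"


backend = resolve_whois_backend()
```

This is why the Space listed `whois` in `packages.txt`: installing the binary flips the probe to the system backend at runtime without any code change.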
tdagent/utils/__init__.py
DELETED
File without changes
tdagent/utils/json_utils.py
DELETED
@@ -1,14 +0,0 @@
-import datetime
-import json
-
-
-class TDAgentJsonEncoder(json.JSONEncoder):
-    """Extend JSON encoder with known types."""
-
-    def default(self, o: object) -> object:  # noqa: D102
-        if isinstance(o, datetime.datetime):
-            return {"__type__": "datetime", "value": o.isoformat()}
-        if isinstance(o, datetime.date):
-            return {"__type__": "date", "value": o.isoformat()}
-
-        return super().default(o)
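The deleted encoder only covers the `json.dumps` direction, tagging datetimes with a `__type__` marker. A round-trip sketch pairing it with an `object_hook` decoder; the decoder is our addition, not part of the repo (note the `datetime` check must precede the `date` check, since `datetime.datetime` subclasses `datetime.date`):

```python
import datetime
import json


class TDAgentJsonEncoder(json.JSONEncoder):
    """Extend JSON encoder with known types."""

    def default(self, o):
        if isinstance(o, datetime.datetime):
            return {"__type__": "datetime", "value": o.isoformat()}
        if isinstance(o, datetime.date):
            return {"__type__": "date", "value": o.isoformat()}
        return super().default(o)


def decode_hook(obj):
    """Inverse of the encoder: rebuild datetime/date from tagged dicts."""
    if obj.get("__type__") == "datetime":
        return datetime.datetime.fromisoformat(obj["value"])
    if obj.get("__type__") == "date":
        return datetime.date.fromisoformat(obj["value"])
    return obj


payload = json.dumps({"updated": datetime.date(2025, 6, 4)}, cls=TDAgentJsonEncoder)
restored = json.loads(payload, object_hook=decode_hook)
```

`object_hook` runs on every decoded dict from the inside out, so tagged leaves are rebuilt first and untagged containers pass through unchanged.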
uv.lock
CHANGED
The diff for this file is too large to render. See raw diff.