A trust and safety agent that works with Osprey for investigation, real-time analysis, and abuse prevention

add tools, update readme

+447 -23
+46 -10
README.md
··· 8 8 9 9 This allows it to: 10 10 11 - - **Rule Management** - Writes, validate, and deploys rules for Osprey 12 - - **Data Analysis** - Queries via Clickhouse to analyze what is happening on your network 11 + - **Rule Management** - Write, validate, and deploy rules for Osprey 12 + - **Data Analysis** - Query ClickHouse to analyze what is happening on your network 13 + - **Investigation** - Look up domains, IPs, URLs, and WHOIS records to investigate threats 14 + - **Content Detection** - Find similar posts to detect coordinated spam and templated abuse 13 15 - **Moderation** - Apply labels and take moderation actions via Ozone (not actually implemented yet...) 14 16 15 17 ## How It Works ··· 17 19 Phoebe uses a model API as its reasoning backend. The agent writes and executes TypeScript code in a sandboxed Deno runtime to interact with its tools — querying event data, creating safety rules, and managing moderation actions. 18 20 19 21 ``` 20 - ┌──────────────────────────────────────┐ 21 - │ Model API │ 22 - ├──────────────────────────────────────┤ 23 - │ Tool Execution (Deno Sandbox) │ 24 - ├──────────┬───────────┬───────────────┤ 25 - │ Osprey │ ClickHouse│ Ozone │ 26 - │ (Rules) │ (Queries) │ (Moderation) │ 27 - └──────────┴───────────┴───────────────┘ 22 + ┌─────────────────────────────────────────────────────────┐ 23 + │ Model API │ 24 + ├─────────────────────────────────────────────────────────┤ 25 + │ Tool Execution (Deno Sandbox) │ 26 + ├──────────┬───────────┬──────────────┬───────────────────┤ 27 + │ Osprey │ ClickHouse│ Ozone │ Investigation │ 28 + │ (Rules) │ (Queries) │ (Moderation) │ (Domain/IP/WHOIS) │ 29 + └──────────┴───────────┴──────────────┴───────────────────┘ 28 30 ``` 29 31 30 32 #### Why not traditional tool calling? ··· 37 39 When executing code inside of Deno, Deno is run with the bare minimum of permissions. For example, it cannot access the file system, the network (local or remote), or use NPM packages.
Both execution time and memory limits are applied. All network requests are done in Python, 38 40 in code that _you_ write, not the agent. 39 41 42 + | Limit | Value | 43 + |-------|-------| 44 + | Max code size | 50,000 characters | 45 + | Max tool calls per execution | 25 | 46 + | Max output size | 1 MB | 47 + | Execution timeout | 60 seconds | 48 + | V8 heap memory | 256 MB | 49 + 50 + ## Tools 51 + 52 + Phoebe has access to the following tools, organized by namespace: 53 + 54 + | Namespace | Tool | Description | 55 + |-----------|------|-------------| 56 + | `clickhouse` | `query(sql)` | Execute SQL queries against ClickHouse | 57 + | `clickhouse` | `getSchema()` | Get the table schema and column info | 58 + | `osprey` | `getConfig()` | Get available features, labels, rules, and actions | 59 + | `osprey` | `getUdfs()` | Get available UDFs for rule writing | 60 + | `osprey` | `listRuleFiles(directory?)` | List existing `.sml` rule files | 61 + | `osprey` | `readRuleFile(file_path)` | Read an existing rule file | 62 + | `osprey` | `saveRule(file_path, content)` | Save or create a rule file | 63 + | `osprey` | `validateRules()` | Validate the ruleset | 64 + | `content` | `similarity(text, threshold?, limit?)` | Find similar posts using n-gram distance | 65 + | `domain` | `checkDomain(domain)` | DNS lookups and HTTP status checks | 66 + | `ip` | `lookup(ip)` | GeoIP and ASN lookups | 67 + | `url` | `expand(url)` | Follow redirect chains and detect shorteners | 68 + | `whois` | `lookup(domain)` | Domain registration and WHOIS info | 69 + | `ozone` | `applyLabel(subject, label)` | Apply a moderation label (not yet implemented) | 70 + | `ozone` | `removeLabel(subject, label)` | Remove a moderation label (not yet implemented) | 71 + 40 72 ## Prerequisites 41 73 42 74 - [Deno](https://deno.com/) runtime ··· 58 90 # Required 59 91 MODEL_API_KEY="sk-ant-api03-..."
60 92 MODEL_NAME="claude-sonnet-4-5-20250929" 93 + 94 + # Optional - Model API backend (default: anthropic) 95 + # MODEL_API="anthropic" # or "openai", "openapi" 96 + # MODEL_ENDPOINT="" # required for openapi, e.g. https://api.moonshot.ai/v1/completions 61 97 62 98 # Osprey 63 99 OSPREY_BASE_URL="http://localhost:5004"
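One way the sandbox limits in the README table can be enforced is a host-side pre-flight check before any code reaches Deno. A minimal sketch; the constant and function names here are hypothetical, not the repo's actual API:

```python
# Illustrative budgets matching the README table; names are hypothetical.
MAX_CODE_SIZE = 50_000   # characters
MAX_TOOL_CALLS = 25

def check_limits(code: str, tool_calls_so_far: int) -> None:
    """Reject work that would exceed the sandbox budget before running it."""
    if len(code) > MAX_CODE_SIZE:
        raise ValueError(f"code exceeds {MAX_CODE_SIZE} characters")
    if tool_calls_so_far >= MAX_TOOL_CALLS:
        raise ValueError(f"tool call budget of {MAX_TOOL_CALLS} exhausted")
```

Checking on the Python side keeps the Deno process itself simple: the runtime only needs to enforce the timeout and V8 heap cap.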
+2
pyproject.toml
··· 11 11 "click>=8.3.1", 12 12 "clickhouse-connect>=0.10.0", 13 13 "dnspython>=2.8.0", 14 + "httpx>=0.28.0", 14 15 "pydantic>=2.12.5", 15 16 "pydantic-settings>=2.12.0", 17 + "python-whois>=0.9.4", 16 18 ] 17 19 18 20 [tool.uv.sources]
+5 -1
src/tools/__init__.py
··· 1 1 # Import tool definitions so they register themselves with TOOL_REGISTRY 2 2 import src.tools.definitions.clickhouse # noqa: F401 3 - import src.tools.definitions.domain 3 + import src.tools.definitions.content # noqa: F401 4 + import src.tools.definitions.domain # noqa: F401 5 + import src.tools.definitions.ip # noqa: F401 4 6 import src.tools.definitions.osprey # noqa: F401 5 7 import src.tools.definitions.ozone # noqa: F401 8 + import src.tools.definitions.url # noqa: F401 9 + import src.tools.definitions.whois # noqa: F401 6 10 from src.tools.executor import ToolExecutor 7 11 from src.tools.registry import ( 8 12 TOOL_REGISTRY,
+77
src/tools/definitions/content.py
··· 1 + from typing import Any 2 + 3 + from src.tools.registry import TOOL_REGISTRY, ToolContext, ToolParameter 4 + 5 + 6 + @TOOL_REGISTRY.tool( 7 + name="content.similarity", 8 + description="Find similar posts in the network using ClickHouse's ngramDistance function. Useful for detecting coordinated spam, copypasta, or templated abuse content. Returns posts ordered by similarity score.", 9 + parameters=[ 10 + ToolParameter( 11 + name="text", 12 + type="string", 13 + description="The input text to find similar posts for", 14 + ), 15 + ToolParameter( 16 + name="threshold", 17 + type="number", 18 + description="Similarity threshold (0.0 = identical, 1.0 = completely different). Lower values return more similar results.", 19 + required=False, 20 + default=0.4, 21 + ), 22 + ToolParameter( 23 + name="limit", 24 + type="number", 25 + description="Maximum number of results to return (max 100)", 26 + required=False, 27 + default=20, 28 + ), 29 + ], 30 + ) 31 + async def content_similarity( 32 + ctx: ToolContext, 33 + text: str, 34 + threshold: float = 0.4, 35 + limit: int = 20, 36 + ) -> dict[str, Any]: 37 + limit = min(max(1, int(limit)), 100) 38 + threshold = max(0.0, min(1.0, float(threshold))) 39 + 40 + escaped_text = text.replace("\\", "\\\\").replace("'", "\\'") 41 + 42 + sql = f""" 43 + SELECT 44 + UserId AS user_id, 45 + UserHandle AS handle, 46 + PostText AS post_text, 47 + ngramDistance(PostText, '{escaped_text}') AS distance, 48 + __timestamp AS timestamp 49 + FROM default.osprey_execution_results 50 + WHERE PostText != '' 51 + AND ngramDistance(PostText, '{escaped_text}') < {threshold} 52 + ORDER BY distance ASC 53 + LIMIT {limit} 54 + """ 55 + 56 + resp = await ctx.clickhouse.query(sql) 57 + 58 + rows = [] 59 + for row in resp.result_rows: # type: ignore 60 + rows.append( # type: ignore 61 + { 62 + "user_id": row[0], 63 + "handle": row[1], 64 + "post_text": row[2], 65 + "similarity_score": round(1.0 - row[3], 4), # type: ignore 66 + "distance": round(row[3], 4), # type: ignore 67
+ "timestamp": str(row[4]), # type: ignore 68 + } 69 + ) 70 + 71 + return { 72 + "success": True, 73 + "input_text": text[:200], 74 + "threshold": threshold, 75 + "result_count": len(rows), # type: ignore 76 + "results": rows, 77 + }
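`content.similarity` leans on ClickHouse's `ngramDistance`, which scores 0.0 for identical n-gram profiles and 1.0 for disjoint ones. A rough pure-Python analogue of the idea (ClickHouse's real implementation works on 4-grams with a different normalization, so this trigram version is only illustrative):

```python
def ngram_distance(a: str, b: str, n: int = 3) -> float:
    """0.0 = identical n-gram sets, 1.0 = no shared n-grams."""
    def grams(s: str) -> set[str]:
        s = s.lower()
        # at least one slice even for strings shorter than n
        return {s[i : i + n] for i in range(max(len(s) - n + 1, 1))}

    ga, gb = grams(a), grams(b)
    return 1.0 - len(ga & gb) / max(len(ga | gb), 1)
```

Copypasta variants share most of their n-grams, so they land well below the default 0.4 threshold, while unrelated text scores near 1.0.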
+9 -9
src/tools/definitions/domain.py
··· 40 40 41 41 if record_type == "SOA": 42 42 # soa returns a single answer 43 - return str(answers[0]) if answers else None 43 + return str(answers[0]) if answers else None # type: ignore 44 44 elif record_type == "MX": 45 45 # mx have priority 46 46 return [f"{answer.preference} {answer.exchange}" for answer in answers] ··· 55 55 ] 56 56 else: 57 57 return [str(answer) for answer in answers] 58 - except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN, dns.resolver.NoNameservers): 58 + except (resolver.NoAnswer, resolver.NXDOMAIN, resolver.NoNameservers): # type: ignore 59 59 return [] if record_type != "SOA" else None 60 60 except Exception: 61 61 return [] if record_type != "SOA" else None 62 62 63 63 64 64 @TOOL_REGISTRY.tool( 65 - name="clickhouse.query", 65 + name="domain.checkDomain", 66 66 description="Lookup A, AAAA, NS, MX, TXT, CNAME, and SOA for a given input domain", 67 67 parameters=[ 68 68 ToolParameter( ··· 93 93 dns_results = await asyncio.gather(*dns_tasks.values(), return_exceptions=True) 94 94 dns_data = dict(zip(dns_tasks.keys(), dns_results)) 95 95 96 - a_records = ( 96 + a_records = ( # type: ignore 97 97 dns_data.get("a", []) 98 98 if not isinstance(dns_data.get("a"), Exception) 99 99 else [] 100 100 ) 101 - aaaa_records = ( 101 + aaaa_records = ( # type: ignore 102 102 dns_data.get("aaaa", []) 103 103 if not isinstance(dns_data.get("aaaa"), Exception) 104 104 else [] 105 105 ) 106 - cname_records = ( 106 + cname_records = ( # type: ignore 107 107 dns_data.get("cname", []) 108 108 if not isinstance(dns_data.get("cname"), Exception) 109 109 else [] ··· 114 114 result: dict[str, Any] = { 115 115 "success": True, 116 116 "domain": domain, 117 - "resolves": len(a_records) > 0 118 - or len(aaaa_records) > 0 119 - or len(cname_records) > 0, 117 + "resolves": len(a_records) > 0 # type: ignore 118 + or len(aaaa_records) > 0 # type: ignore 119 + or len(cname_records) > 0, # type: ignore 120 120 "dns": { 121 121 "a": a_records, 122 122 "aaaa": 
aaaa_records,
+79
src/tools/definitions/ip.py
··· 1 + import re 2 + from typing import Any 3 + 4 + import httpx 5 + 6 + from src.tools.registry import TOOL_REGISTRY, ToolContext, ToolParameter 7 + 8 + _IP_REGEX = re.compile( 9 + r"^((25[0-5]|2[0-4]\d|[01]?\d\d?)\.){3}(25[0-5]|2[0-4]\d|[01]?\d\d?)$" 10 + r"|^([0-9a-fA-F]{1,4}:){7}[0-9a-fA-F]{1,4}$" 11 + r"|^::$" 12 + r"|^([0-9a-fA-F]{1,4}:){1,7}:$" 13 + r"|^::[0-9a-fA-F]{1,4}(:[0-9a-fA-F]{1,4}){0,5}$" 14 + ) 15 + 16 + 17 + @TOOL_REGISTRY.tool( 18 + name="ip.lookup", 19 + description="GeoIP and ASN lookup for an IP address. Returns geographic location (country, region, city, coordinates, timezone), network information (ISP, org, ASN), and flags for mobile, proxy, and hosting IPs.", 20 + parameters=[ 21 + ToolParameter( 22 + name="ip", 23 + type="string", 24 + description="The IP address to look up (IPv4 or IPv6)", 25 + ), 26 + ], 27 + ) 28 + async def ip_lookup(ctx: ToolContext, ip: str) -> dict[str, Any]: 29 + ip = ip.strip() 30 + if not _IP_REGEX.match(ip): 31 + return {"success": False, "ip": ip, "error": "Invalid IP address format"} 32 + 33 + try: 34 + # ip-api.com free tier requires HTTP, not HTTPS 35 + async with httpx.AsyncClient(timeout=10.0) as client: 36 + response = await client.get( 37 + f"http://ip-api.com/json/{ip}", 38 + params={ 39 + "fields": "status,message,country,countryCode,region,regionName,city,zip,lat,lon,timezone,isp,org,as,asname,mobile,proxy,hosting,query" 40 + }, 41 + ) 42 + data = response.json() 43 + 44 + if data.get("status") == "fail": 45 + return { 46 + "success": False, 47 + "ip": ip, 48 + "error": data.get("message", "Lookup failed"), 49 + } 50 + 51 + return { 52 + "success": True, 53 + "ip": data.get("query", ip), 54 + "geo": { 55 + "country": data.get("country"), 56 + "country_code": data.get("countryCode"), 57 + "region": data.get("regionName"), 58 + "region_code": data.get("region"), 59 + "city": data.get("city"), 60 + "zip": data.get("zip"), 61 + "lat": data.get("lat"), 62 + "lon": data.get("lon"), 63 + "timezone": 
data.get("timezone"), 64 + }, 65 + "network": { 66 + "isp": data.get("isp"), 67 + "org": data.get("org"), 68 + "asn": data.get("as"), 69 + "asn_name": data.get("asname"), 70 + }, 71 + "flags": { 72 + "is_mobile": data.get("mobile", False), 73 + "is_proxy": data.get("proxy", False), 74 + "is_hosting": data.get("hosting", False), 75 + }, 76 + } 77 + 78 + except Exception as e: 79 + return {"success": False, "ip": ip, "error": str(e)}
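The hand-rolled `_IP_REGEX` above misses some valid compressed IPv6 forms, e.g. `2001:db8::1`, where the `::` sits mid-address. The stdlib `ipaddress` module validates both address families without a regex; a possible alternative sketch:

```python
import ipaddress

def is_valid_ip(ip: str) -> bool:
    """Validate IPv4 or IPv6, including compressed forms, via the stdlib."""
    try:
        ipaddress.ip_address(ip.strip())
        return True
    except ValueError:
        return False
```

This accepts every spelling `ipaddress` understands, so it is strictly more permissive than the regex for IPv6 while rejecting out-of-range IPv4 octets the same way.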
+105
src/tools/definitions/url.py
··· 1 + from urllib.parse import urljoin, urlparse 2 + 3 + import httpx 4 + from typing import Any 5 + 6 + from src.tools.registry import TOOL_REGISTRY, ToolContext, ToolParameter 7 + 8 + _KNOWN_SHORTENERS = { 9 + "bit.ly", 10 + "tinyurl.com", 11 + "t.co", 12 + "goo.gl", 13 + "ow.ly", 14 + "is.gd", 15 + "buff.ly", 16 + "j.mp", 17 + "rb.gy", 18 + "shorturl.at", 19 + "tiny.cc", 20 + "bl.ink", 21 + "short.io", 22 + "cutt.ly", 23 + "rebrand.ly", 24 + } 25 + 26 + 27 + @TOOL_REGISTRY.tool( 28 + name="url.expand", 29 + description="Follow a URL through its redirect chain (up to 10 hops), recording each hop's URL and HTTP status code. Flags known URL shorteners. Useful for investigating obfuscated or shortened links in spam/phishing content.", 30 + parameters=[ 31 + ToolParameter( 32 + name="url", 33 + type="string", 34 + description="The URL to expand and follow through redirects", 35 + ), 36 + ], 37 + ) 38 + async def url_expand(ctx: ToolContext, url: str) -> dict[str, Any]: 39 + hops: list[dict[str, Any]] = [] 40 + current_url = url 41 + max_hops = 10 42 + visited: set[str] = set() 43 + 44 + async with httpx.AsyncClient(timeout=10.0, follow_redirects=False) as client: 45 + for i in range(max_hops): 46 + if current_url in visited: 47 + break 48 + visited.add(current_url) 49 + 50 + try: 51 + # try HEAD first to avoid downloading large bodies 52 + try: 53 + response = await client.head(current_url) 54 + except httpx.HTTPError: 55 + response = await client.get( 56 + current_url, 57 + headers={"Range": "bytes=0-0"}, 58 + ) 59 + 60 + hop = { 61 + "hop": i + 1, 62 + "url": current_url, 63 + "status_code": response.status_code, 64 + } 65 + 66 + parsed = urlparse(current_url) 67 + if parsed.hostname and parsed.hostname.lower() in _KNOWN_SHORTENERS: 68 + hop["is_shortener"] = True 69 + 70 + hops.append(hop) 71 + 72 + if response.status_code in (301, 302, 303, 307, 308): 73 + location = response.headers.get("Location") 74 + if not location: 75 + break 76 + # handle relative
redirect URLs 77 + current_url = urljoin(current_url, location) 78 + else: 79 + break 80 + 81 + except Exception as e: 82 + hops.append( 83 + { 84 + "hop": i + 1, 85 + "url": current_url, 86 + "error": str(e), 87 + } 88 + ) 89 + break 90 + 91 + final_url = hops[-1]["url"] if hops else url 92 + parsed_input = urlparse(url) 93 + is_shortener = ( 94 + parsed_input.hostname is not None 95 + and parsed_input.hostname.lower() in _KNOWN_SHORTENERS 96 + ) 97 + 98 + return { 99 + "success": True, 100 + "input_url": url, 101 + "final_url": final_url, 102 + "is_shortener": is_shortener, 103 + "total_hops": len(hops), 104 + "hops": hops, 105 + }
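The `urljoin(current_url, location)` call in `url.expand` is what makes relative `Location` headers resolve correctly, while absolute targets pass through unchanged:

```python
from urllib.parse import urljoin

# An absolute Location header replaces the current URL outright.
assert urljoin("https://bit.ly/abc", "https://example.com/page") == "https://example.com/page"
# Root-relative and path-relative headers resolve against the current URL.
assert urljoin("https://example.com/a/b", "/login") == "https://example.com/login"
assert urljoin("https://example.com/a/b", "c") == "https://example.com/a/c"
```

Without this, a shortener that answers `Location: /interstitial` would terminate the chain with an unresolvable URL.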
+79
src/tools/definitions/whois.py
··· 1 + import asyncio 2 + from datetime import datetime, timezone 3 + from typing import Any 4 + 5 + import whois 6 + 7 + from src.tools.registry import TOOL_REGISTRY, ToolContext, ToolParameter 8 + 9 + 10 + def _normalize_date(value: Any) -> str | None: 11 + """normalize python-whois date values, which can be a single datetime, a list, or None.""" 12 + if value is None: 13 + return None 14 + if isinstance(value, list): 15 + value = value[0] if value else None 16 + if isinstance(value, datetime): 17 + return value.isoformat() 18 + if isinstance(value, str): 19 + return value 20 + return str(value) if value else None 21 + 22 + 23 + @TOOL_REGISTRY.tool( 24 + name="whois.lookup", 25 + description="Look up WHOIS registration data for a domain. Returns registrar, creation/expiration dates, name servers, registrant info, and domain age in days. Domain age is a key T&S signal — newly registered domains are heavily used for spam and phishing.", 26 + parameters=[ 27 + ToolParameter( 28 + name="domain", 29 + type="string", 30 + description="The domain name to look up (e.g. 
example.com)", 31 + ), 32 + ], 33 + ) 34 + async def whois_lookup(ctx: ToolContext, domain: str) -> dict[str, Any]: 35 + try: 36 + w = await asyncio.to_thread(whois.whois, domain) 37 + except Exception as e: 38 + return {"success": False, "domain": domain, "error": str(e)} 39 + 40 + creation_date = _normalize_date(w.creation_date) 41 + expiration_date = _normalize_date(w.expiration_date) 42 + updated_date = _normalize_date(w.updated_date) 43 + 44 + # compute domain age 45 + domain_age_days: int | None = None 46 + if creation_date: 47 + try: 48 + raw = w.creation_date 49 + if isinstance(raw, list): 50 + raw = raw[0] 51 + if isinstance(raw, datetime): 52 + delta = datetime.now(timezone.utc) - raw.replace(tzinfo=timezone.utc) 53 + domain_age_days = delta.days 54 + except Exception: 55 + pass 56 + 57 + name_servers = w.name_servers 58 + if isinstance(name_servers, set): 59 + name_servers = sorted(name_servers) 60 + 61 + return { 62 + "success": True, 63 + "domain": domain, 64 + "registrar": w.registrar, 65 + "creation_date": creation_date, 66 + "expiration_date": expiration_date, 67 + "updated_date": updated_date, 68 + "domain_age_days": domain_age_days, 69 + "name_servers": name_servers, 70 + "dnssec": w.dnssec if hasattr(w, "dnssec") else None, 71 + "registrant": { 72 + "name": w.name if hasattr(w, "name") else None, 73 + "org": w.org if hasattr(w, "org") else None, 74 + "country": w.country if hasattr(w, "country") else None, 75 + "state": w.state if hasattr(w, "state") else None, 76 + "city": w.city if hasattr(w, "city") else None, 77 + "emails": w.emails if hasattr(w, "emails") else None, 78 + }, 79 + }
+6 -1
src/tools/deno/tools.ts
··· 5 5 /** Get Osprey/network table schema information including tables and their columns. Schema is for the table default.osprey_execution_results */ 6 6 getSchema: (): Promise<unknown> => callTool("clickhouse.getSchema", {}), 7 7 8 + /** Execute a SQL query against ClickHouse and return the results. All queries must include a LIMIT, and all queries must be executed on default.osprey_execution_results. */ 9 + query: (sql: string): Promise<unknown> => callTool("clickhouse.query", { sql }), 10 + }; 11 + 12 + export const domain = { 8 13 /** Lookup A, AAAA, NS, MX, TXT, CNAME, and SOA for a given input domain */ 9 - query: (domain: string): Promise<unknown> => callTool("clickhouse.query", { domain }), 14 + checkDomain: (domain: string): Promise<unknown> => callTool("domain.checkDomain", { domain }), 10 15 }; 11 16 12 17 export const osprey = {
+2 -2
src/tools/registry.py
··· 106 106 if len(params) == 1: 107 107 param_names = {p.name for p in tool.parameters} 108 108 val = next(iter(params.values())) 109 - if isinstance(val, dict) and set(val.keys()) <= param_names: 110 - params = val 109 + if isinstance(val, dict) and set(val.keys()) <= param_names: # type: ignore 110 + params = val # type: ignore 111 111 112 112 return await tool.handler(ctx, **params) 113 113
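The registry change above tolerates callers that wrap their arguments in a single positional dict. The unwrap rule (accept the inner dict only when every key matches a declared parameter) can be sketched standalone:

```python
def unwrap_params(params: dict, param_names: set[str]) -> dict:
    """Mirror of the registry heuristic: if exactly one value was passed
    and it is a dict whose keys are all declared parameters, treat that
    inner dict as the real kwargs."""
    if len(params) == 1:
        val = next(iter(params.values()))
        if isinstance(val, dict) and set(val.keys()) <= param_names:
            return val
    return params
```

The subset check is what keeps this safe: a dict with any unrecognized key is passed through untouched rather than silently unwrapped.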
+37
uv.lock
··· 604 604 { name = "click" }, 605 605 { name = "clickhouse-connect" }, 606 606 { name = "dnspython" }, 607 + { name = "httpx" }, 607 608 { name = "pydantic" }, 608 609 { name = "pydantic-settings" }, 610 + { name = "python-whois" }, 609 611 ] 610 612 611 613 [package.metadata] ··· 616 618 { name = "click", specifier = ">=8.3.1" }, 617 619 { name = "clickhouse-connect", specifier = ">=0.10.0" }, 618 620 { name = "dnspython", specifier = ">=2.8.0" }, 621 + { name = "httpx", specifier = ">=0.28.0" }, 619 622 { name = "pydantic", specifier = ">=2.12.5" }, 620 623 { name = "pydantic-settings", specifier = ">=2.12.0" }, 624 + { name = "python-whois", specifier = ">=0.9.4" }, 621 625 ] 622 626 623 627 [[package]] ··· 739 743 ] 740 744 741 745 [[package]] 746 + name = "python-dateutil" 747 + version = "2.9.0.post0" 748 + source = { registry = "https://pypi.org/simple" } 749 + dependencies = [ 750 + { name = "six" }, 751 + ] 752 + sdist = { url = "https://files.pythonhosted.org/packages/66/c0/0c8b6ad9f17a802ee498c46e004a0eb49bc148f2fd230864601a86dcf6db/python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3", size = 342432, upload-time = "2024-03-01T18:36:20.211Z" } 753 + wheels = [ 754 + { url = "https://files.pythonhosted.org/packages/ec/57/56b9bcc3c9c6a792fcbaf139543cee77261f3651ca9da0c93f5c1221264b/python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427", size = 229892, upload-time = "2024-03-01T18:36:18.57Z" }, 755 + ] 756 + 757 + [[package]] 742 758 name = "python-dotenv" 743 759 version = "1.2.1" 744 760 source = { registry = "https://pypi.org/simple" } ··· 760 776 ] 761 777 762 778 [[package]] 779 + name = "python-whois" 780 + version = "0.9.6" 781 + source = { registry = "https://pypi.org/simple" } 782 + dependencies = [ 783 + { name = "python-dateutil" }, 784 + ] 785 + sdist = { url = 
"https://files.pythonhosted.org/packages/f1/0c/537914eca91ee5ff281309a5ca71da23c0c975cd6658668a44d3fdcf1cc4/python_whois-0.9.6.tar.gz", hash = "sha256:2e6de7b6d70e305a85f4859cd17781ee3f0da3a02a8e94f23cb4cdcd2e400bfa", size = 125107, upload-time = "2025-10-07T04:36:14.913Z" } 786 + wheels = [ 787 + { url = "https://files.pythonhosted.org/packages/46/53/d0ceb3ae30da8e8ec2d9af11050178f3b4114d5aa6a7f7074199db3c806f/python_whois-0.9.6-py3-none-any.whl", hash = "sha256:153261941a4d238b1278a4ca9b5b5e0590ed3b4d0c534ba111c4434d5d339410", size = 116976, upload-time = "2025-10-07T04:36:12.328Z" }, 788 + ] 789 + 790 + [[package]] 763 791 name = "pytz" 764 792 version = "2025.2" 765 793 source = { registry = "https://pypi.org/simple" } 766 794 sdist = { url = "https://files.pythonhosted.org/packages/f8/bf/abbd3cdfb8fbc7fb3d4d38d320f2441b1e7cbe29be4f23797b4a2b5d8aac/pytz-2025.2.tar.gz", hash = "sha256:360b9e3dbb49a209c21ad61809c7fb453643e048b38924c765813546746e81c3", size = 320884, upload-time = "2025-03-25T02:25:00.538Z" } 767 795 wheels = [ 768 796 { url = "https://files.pythonhosted.org/packages/81/c4/34e93fe5f5429d7570ec1fa436f1986fb1f00c3e0f43a589fe2bbcd22c3f/pytz-2025.2-py2.py3-none-any.whl", hash = "sha256:5ddf76296dd8c44c26eb8f4b6f35488f3ccbf6fbbd7adee0b7262d43f0ec2f00", size = 509225, upload-time = "2025-03-25T02:24:58.468Z" }, 797 + ] 798 + 799 + [[package]] 800 + name = "six" 801 + version = "1.17.0" 802 + source = { registry = "https://pypi.org/simple" } 803 + sdist = { url = "https://files.pythonhosted.org/packages/94/e7/b2c673351809dca68a0e064b6af791aa332cf192da575fd474ed7d6f16a2/six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81", size = 34031, upload-time = "2024-12-04T17:35:28.174Z" } 804 + wheels = [ 805 + { url = "https://files.pythonhosted.org/packages/b7/ce/149a00dd41f10bc29e5921b496af8b574d8413afcd5e30dfa0ed46c2cc5e/six-1.17.0-py2.py3-none-any.whl", hash = 
"sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274", size = 11050, upload-time = "2024-12-04T17:35:26.475Z" }, 769 806 ] 770 807 771 808 [[package]]