almost there

adri, 1 month ago
Commit 3d705f790b
10 changed files with 1442 additions and 0 deletions
  1. .gitignore (+2 -0)
  2. .python-version (+1 -0)
  3. Dockerfile (+15 -0)
  4. README.md (+413 -0)
  5. app.py (+668 -0)
  6. docker-compose.yml (+20 -0)
  7. env.example (+20 -0)
  8. main.py (+6 -0)
  9. pyproject.toml (+10 -0)
  10. uv.lock (+287 -0)

+ 2 - 0
.gitignore

@@ -0,0 +1,2 @@
+.DS_Store
+.venv

+ 1 - 0
.python-version

@@ -0,0 +1 @@
+3.11

+ 15 - 0
Dockerfile

@@ -0,0 +1,15 @@
+FROM python:3.11-slim
+
+WORKDIR /app
+
+# Install Flask and requests
+RUN pip install --no-cache-dir flask requests
+
+# Copy application
+COPY app.py .
+
+# Expose port
+EXPOSE 5000
+
+# Run the application
+CMD ["python", "app.py"]

+ 413 - 0
README.md

@@ -0,0 +1,413 @@
+# POST Request Monitor & OpenAI Proxy 📬
+
+A Docker app that proxies Google Gemini API requests to OpenAI-compatible endpoints with full multimodal support.
+
+## What It Does
+
+1. **Monitors POST requests** - Real-time web UI at `http://localhost:5005`
+2. **Gemini → OpenAI Conversion** - Automatically converts API formats
+3. **Multimodal Support** - Handles text, images, and video
+4. **Request Forwarding** - Sends to your OpenAI-compatible endpoint (vLLM, Ollama, OpenAI, etc.)
+5. **Response Conversion** - Converts OpenAI responses back to Gemini format
+
+**Perfect for:** Using Gemini-format applications with local vision models (InternVL3, Qwen2-VL) or OpenAI API.
+
+## Multimodal Capabilities
+
+- ✅ **Images**: All formats (JPEG, PNG, WebP, etc.) - Universally supported
+- ✅ **Video**: MP4, WebM, etc. - Supported by video-capable models (InternVL3, Qwen2-VL, GPT-4o)
+- ✅ **Multiple media**: Send multiple images/videos in single request
+- ✅ **Mixed content**: Text + images + video together
+
+## Quick Start
+
+### 1. Create the Docker Network
+
+If you don't already have the `llm_internal` network:
+```bash
+docker network create llm_internal
+```
+
+### 2. Configure Environment Variables
+
+Create a `.env` file (copy it from `env.example`):
+```bash
+cp env.example .env
+```
+
+Edit `.env` to set your OpenAI-compatible endpoint:
+```bash
+# Your OpenAI-compatible endpoint
+OPENAI_ENDPOINT=http://host.docker.internal:8000/v1/chat/completions
+
+# API key if required
+OPENAI_API_KEY=none
+
+# Model name - IMPORTANT: Use the exact model name
+# For InternVL3-8B-AWQ with vLLM/Ollama:
+OPENAI_MODEL=OpenGVLab/InternVL3-8B-AWQ
+
+# Video format - Use 'openai' for video-capable models
+# (InternVL3, Qwen2-VL, GPT-4o, etc.)
+VIDEO_FORMAT=openai
+```
+
+**For InternVL3-8B-AWQ**: This model **DOES support video** - use `VIDEO_FORMAT=openai`
+
+**Note**: If your other services are on the same `llm_internal` Docker network, you can use their container names instead of `host.docker.internal`. For example: `http://vllm-server:8000/v1/chat/completions`
+
+### 3. Start the Service
+
+#### Using Docker Compose (Recommended)
+```bash
+docker-compose up --build
+```
+
+#### Using Docker
+```bash
+docker build -t post-monitor .
+docker run -p 5005:5000 --network llm_internal post-monitor
+```
+
+## Usage
+
+1. **View the Web UI**: Open http://localhost:5005 in your browser
+   - See all incoming requests
+   - View converted OpenAI format  
+   - See responses from your endpoint
+
+2. **Configure your application** to send Gemini-format requests to:
+   - `http://localhost:5005/webhook/models/model:generateContent?key=none`
+   - Or any path under `/webhook/`
+
+3. **The proxy will**:
+   - Extract text, images, and videos from Gemini format
+   - Convert to OpenAI format
+   - Forward to your OpenAI-compatible endpoint
+   - Convert OpenAI response → Gemini format
+   - Return to your application
+
+The web interface will automatically refresh every 3 seconds to show new requests.
+
+## Example Requests
+
+### Text + Image + Video (Gemini Format)
+```bash
+# Text-only request
+curl -X POST "http://localhost:5005/webhook/models/model:generateContent?key=none" \
+  -H "Content-Type: application/json" \
+  -d '{
+    "contents": [{
+      "parts": [{
+        "text": "Explain quantum computing in simple terms"
+      }]
+    }],
+    "generationConfig": {
+      "maxOutputTokens": 4096
+    }
+  }'
+
+# With image (base64)
+curl -X POST "http://localhost:5005/webhook/models/model:generateContent?key=none" \
+  -H "Content-Type: application/json" \
+  -d '{
+    "contents": [{
+      "parts": [
+        {"text": "Describe this image"},
+        {
+          "inline_data": {
+            "mime_type": "image/jpeg",
+            "data": "'$(base64 -w 0 image.jpg)'"
+          }
+        }
+      ]
+    }]
+  }'
+
+# With video (base64) - Requires VIDEO_FORMAT=openai
+curl -X POST "http://localhost:5005/webhook/models/model:generateContent?key=none" \
+  -H "Content-Type: application/json" \
+  -d '{
+    "contents": [{
+      "parts": [
+        {"text": "Describe this video"},
+        {
+          "inline_data": {
+            "mime_type": "video/mp4",
+            "data": "'$(base64 -w 0 video.mp4)'"
+          }
+        }
+      ]
+    }],
+    "generationConfig": {
+      "maxOutputTokens": 4096
+    }
+  }'
+
+# Multiple images in one request
+curl -X POST "http://localhost:5005/webhook/models/model:generateContent?key=none" \
+  -H "Content-Type: application/json" \
+  -d '{
+    "contents": [{
+      "parts": [
+        {"text": "Compare these two images"},
+        {
+          "inline_data": {
+            "mime_type": "image/jpeg",
+            "data": "'$(base64 -w 0 image1.jpg)'"
+          }
+        },
+        {
+          "inline_data": {
+            "mime_type": "image/jpeg",
+            "data": "'$(base64 -w 0 image2.jpg)'"
+          }
+        }
+      ]
+    }]
+  }'
+```
+
+## Features
+
+- ✅ **Gemini → OpenAI Format Conversion**: Automatically converts API formats
+- ✅ **Image Support**: Full support for images (JPEG, PNG, WebP, etc.)
+- ✅ **Video Support**: Configurable video handling for video-capable models
+- ✅ **Request Forwarding**: Proxies to your OpenAI-compatible endpoint
+- ✅ **Response Conversion**: Converts OpenAI responses back to Gemini format
+- ✅ **Real-time Monitoring**: Web UI shows all requests, conversions, and responses
+- ✅ **Multiple Media**: Handle multiple images/videos in single request
+- ✅ **Catch-all Routes**: Accepts any path under `/webhook/`
+- ✅ **Auto-refresh UI**: Updates every 3 seconds
+- ✅ **Error Handling**: Shows detailed errors for debugging
+- ✅ **Request History**: Stores last 50 requests in memory
+- ✅ **Docker Network**: Uses `llm_internal` network for container communication
+
+## Format Conversion Details
+
+### Gemini → OpenAI
+
+**Text:**
+- `contents[].parts[].text` → `messages[].content` (text type)
+
+**Images (all formats supported):**
+- `contents[].parts[].inline_data` (image/jpeg, image/png, etc.)
+  - → `messages[].content` (image_url type)
+  - Format: `data:image/jpeg;base64,{base64_data}`
+- ✅ Universally supported by all vision models
+
+**Videos (format depends on VIDEO_FORMAT):**
+- `contents[].parts[].inline_data` (video/mp4, etc.)
+  - When `VIDEO_FORMAT=openai`: → `messages[].content` (image_url type with video MIME)
+  - When `VIDEO_FORMAT=vllm`: → currently sent the same way as `openai` (image_url with the video MIME type); kept as a separate, experimental setting
+  - When `VIDEO_FORMAT=skip`: → Replaced with text note
+- ⚠️ Only supported by video-capable models (InternVL3, Qwen2-VL, GPT-4o, etc.)
+
+**Generation Config:**
+- `generationConfig.maxOutputTokens` → `max_tokens`
+- `generationConfig.temperature` → `temperature`
+
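+To make the mapping concrete, here is a rough sketch of one image part on each side of the conversion (field names follow the mapping above; the values are made up, and the `detail` field mirrors `app.py`):
+
+```python
+# Gemini-style part as it arrives at /webhook (illustrative values)
+gemini_part = {
+    "inline_data": {"mime_type": "image/jpeg", "data": "<base64 bytes>"}
+}
+
+# Equivalent OpenAI-style content item, per the mapping above
+mime = gemini_part["inline_data"]["mime_type"]
+data = gemini_part["inline_data"]["data"]
+openai_item = {
+    "type": "image_url",
+    "image_url": {"url": f"data:{mime};base64,{data}", "detail": "auto"},
+}
+```
+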
+### OpenAI → Gemini
+
+**Response:**
+- `choices[0].message.content` → `candidates[0].content.parts[0].text`
+
+**Usage:**
+- `usage` → `usageMetadata`
+
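+A minimal sketch of the response-side mapping, mirroring `convert_openai_to_gemini` in `app.py` (values are illustrative):
+
+```python
+openai_response = {
+    "choices": [{"message": {"content": "Hello!"}}],
+    "usage": {"prompt_tokens": 10, "completion_tokens": 2, "total_tokens": 12},
+}
+
+usage = openai_response["usage"]
+gemini_response = {
+    "candidates": [{
+        "content": {
+            "parts": [{"text": openai_response["choices"][0]["message"]["content"]}],
+            "role": "model",
+        },
+        "finishReason": "STOP",
+        "index": 0,
+    }],
+    "usageMetadata": {
+        "promptTokenCount": usage["prompt_tokens"],
+        "candidatesTokenCount": usage["completion_tokens"],
+        "totalTokenCount": usage["total_tokens"],
+    },
+}
+```
+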
+## Compatible Endpoints
+
+This proxy works with any OpenAI-compatible endpoint:
+- **OpenAI API** (api.openai.com)
+- **Local LLMs** (LM Studio, Ollama with OpenAI compatibility, Jan)
+- **vLLM** deployments
+- **Text Generation WebUI** (with OpenAI extension)
+- **LocalAI**
+- **Any other OpenAI-compatible API**
+
+## Endpoints
+
+- `GET /` - Web UI to view all requests, conversions, and responses
+- `POST /webhook/*` - Main proxy endpoint (converts Gemini → OpenAI → Gemini)
+- `POST /clear` - Clear all stored requests
+
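+For example, the request history can be cleared from a script as well as from the UI button (a minimal sketch; assumes the proxy is reachable on `localhost:5005`):
+
+```python
+import requests
+
+# Wipes the in-memory history shown in the web UI
+resp = requests.post("http://localhost:5005/clear")
+print(resp.json())  # {"status": "cleared"}
+```
+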
+## Configuration
+
+### Environment Variables
+- `OPENAI_ENDPOINT` - Your OpenAI-compatible endpoint URL
+- `OPENAI_API_KEY` - API key if required (use 'none' if not needed)
+- `OPENAI_MODEL` - Model name to use (check your endpoint's available models)
+  - For vLLM/Ollama with InternVL3: `OpenGVLab/InternVL3-8B-AWQ`
+  - For OpenAI API: `gpt-4o`, `gpt-4-turbo`, etc.
+  - For SGLang: Use exact model name from deployment
+- `VIDEO_FORMAT` - How to send video content (default: 'openai')
+  - `openai` - Standard format (use for InternVL3, Qwen2-VL, GPT-4o, vLLM, Ollama)
+  - `vllm` - Experimental vLLM-specific format (try if 'openai' fails)
+  - `skip` - Don't send video (use for endpoints without video support)
+  - `error` - Fail if video is present
+- `ENDPOINT_TYPE` - Target API style: 'openai', 'ollama', or 'auto' (default; auto-detected from the endpoint URL)
+
+### Docker Networking
+
+The proxy uses the `llm_internal` Docker network for communication with other services:
+
+**Communication between containers on same network:**
+```bash
+# If your vLLM/Ollama server is also on llm_internal network
+OPENAI_ENDPOINT=http://vllm-container-name:8000/v1/chat/completions
+```
+
+**Communication with host services:**
+```bash
+# For services running on your host machine (not in Docker)
+OPENAI_ENDPOINT=http://host.docker.internal:8000/v1/chat/completions
+```
+
+**Port mapping:**
+- Host: `localhost:5005`
+- Container: port `5000`
+- Access web UI: `http://localhost:5005`
+- Send requests to: `http://localhost:5005/webhook/*`
+
+## Troubleshooting
+
+### SGLang Image Processing Errors
+If you see errors like "cannot identify image file" with SGLang:
+
+1. **Check the model name**: SGLang requires the exact model name from your deployment
+   ```bash
+   # Check available models
+   curl http://localhost:8000/v1/models
+   
+   # Set in .env
+   OPENAI_MODEL=your-actual-model-name
+   ```
+
+2. **Verify base64 encoding**: Make sure the base64 payload your application sends actually decodes; the proxy forwards it as-is (a standalone check is sketched after this list)
+   - If it does not decode cleanly, the base64 data from your app is probably corrupted
+
+3. **Test with simple image**: Try with a small test image first
+   ```bash
+   # Write a known-good base64-encoded 1x1 PNG to a file
+   echo "iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAADUlEQVR42mNk+M9QDwADhgGAWjR9awAAAABJRU5ErkJggg==" > test.b64
+   ```
+
+4. **Check SGLang logs**: Look for more detailed errors in your SGLang server logs
+
+5. **Model compatibility**: Ensure your SGLang model supports vision/multimodal inputs
+   - Not all models work with images
+   - Check your model's documentation
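+
+A quick, stdlib-only way to sanity-check a payload locally (for example the `test.b64` file from step 3). This is not part of the repo, just a throwaway helper:
+
+```python
+import base64
+import binascii
+
+
+def inspect_b64(b64_string: str) -> str:
+    """Decode a base64 payload and guess the media type from its magic bytes."""
+    try:
+        raw = base64.b64decode(b64_string, validate=True)
+    except (binascii.Error, ValueError):
+        return "invalid base64"
+    if raw.startswith(b"\x89PNG"):
+        return f"PNG image, {len(raw)} bytes"
+    if raw.startswith(b"\xff\xd8\xff"):
+        return f"JPEG image, {len(raw)} bytes"
+    if raw[4:8] == b"ftyp":
+        return f"MP4-style container, {len(raw)} bytes"
+    return f"decodes OK ({len(raw)} bytes), unrecognized type"
+
+
+with open("test.b64") as f:
+    print(inspect_b64(f.read().strip()))
+```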
+
+### Connection Issues
+If the proxy can't reach your endpoint:
+1. Check if your endpoint is running: `curl http://localhost:8000/v1/models`
+2. Verify the endpoint URL in `.env`
+3. Make sure you're using `host.docker.internal` for host services
+4. Check Docker logs: `docker-compose logs -f`
+
+## Supported Media Types
+
+### Images (✅ Always Supported)
+All vision models support images. The proxy handles these formats:
+- `image/jpeg`
+- `image/png`  
+- `image/webp`
+- `image/gif`
+- Any image MIME type
+
+**Console output when processing images:**
+```
+🖼️  Adding image: image/jpeg
+📊 Media summary:
+   Images: 2 (image/jpeg, image/png)
+```
+
+### Videos (⚠️ Model-Dependent)
+Video support depends on your model and `VIDEO_FORMAT` setting:
+
+**Supported formats:**
+- `video/mp4`
+- `video/mpeg`
+- `video/webm`
+- Any video MIME type
+
+**Console output when processing video:**
+```
+# When VIDEO_FORMAT=openai (sending video)
+📹 Adding video (openai format): video/mp4
+📊 Media summary:
+   Videos: format: openai (video/mp4)
+
+# When VIDEO_FORMAT=skip (skipping video)
+⏭️  Skipping video: video/mp4 (VIDEO_FORMAT=skip)
+📊 Media summary:
+   Videos: skipped (video/mp4)
+```
+
+### Mixed Media Requests
+You can send text + multiple images + video in a single request:
+```json
+{
+  "contents": [{
+    "parts": [
+      {"text": "Analyze these media files"},
+      {"inline_data": {"mime_type": "image/jpeg", "data": "..."}},
+      {"inline_data": {"mime_type": "image/png", "data": "..."}},
+      {"inline_data": {"mime_type": "video/mp4", "data": "..."}}
+    ]
+  }]
+}
+```
+
+## Video Support by Runner/Model
+
+**✅ VIDEO SUPPORTED** (use `VIDEO_FORMAT=openai`):
+- **vLLM** with video-capable models:
+  - InternVL3-8B-AWQ ✅
+  - Qwen2-VL series ✅
+  - LLaVA-Video ✅
+- **Ollama** with video models:
+  - InternVL ✅
+  - Qwen2-VL ✅  
+- **OpenAI API**:
+  - gpt-4o ✅
+  - gpt-4-turbo ✅
+
+**❌ VIDEO NOT SUPPORTED** (use `VIDEO_FORMAT=skip`):
+- **SGLang** - Does not support video
+- **Most text-only models** - Even if served via vLLM/Ollama
+- **Image-only vision models** - Can only process images, not video
+
+### InternVL3-8B-AWQ Configuration
+
+If you're using InternVL3-8B-AWQ (which **does support video**), your `.env` should be:
+
+```bash
+OPENAI_ENDPOINT=http://host.docker.internal:8000/v1/chat/completions
+OPENAI_MODEL=OpenGVLab/InternVL3-8B-AWQ
+VIDEO_FORMAT=openai
+```
+
+**Troubleshooting InternVL3 video**:
+1. Verify your vLLM/Ollama server is serving InternVL3: `curl http://localhost:8000/v1/models`
+2. Check the model name exactly matches what the server reports
+3. If you get "cannot identify image" errors:
+   - Your runner might not have video support enabled
+   - Try `VIDEO_FORMAT=vllm` as an alternative
+   - Check your vLLM/Ollama version supports video
+
+**How to test if your setup supports video** (a minimal test script is sketched after this list):
+1. Set `VIDEO_FORMAT=openai`
+2. Send a test request with video
+3. Check the logs - if you see errors about "cannot identify image", try `VIDEO_FORMAT=skip`
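+
+A throwaway script for step 2, assuming a small `clip.mp4` in the working directory and the proxy listening on `localhost:5005` (both are assumptions, not files or defaults shipped with this repo):
+
+```python
+import base64
+import json
+
+import requests
+
+with open("clip.mp4", "rb") as f:
+    video_b64 = base64.b64encode(f.read()).decode()
+
+payload = {
+    "contents": [{
+        "parts": [
+            {"text": "Describe this video"},
+            {"inline_data": {"mime_type": "video/mp4", "data": video_b64}},
+        ]
+    }],
+    "generationConfig": {"maxOutputTokens": 256},
+}
+
+resp = requests.post(
+    "http://localhost:5005/webhook/models/model:generateContent",
+    params={"key": "none"},
+    json=payload,
+    timeout=180,
+)
+print(resp.status_code)
+print(json.dumps(resp.json(), indent=2)[:1000])
+```
+
+If the request fails, the same error also shows up in the web UI at `http://localhost:5005`.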
+
+**Why video might fail**:
+1. Model doesn't support video (only images)
+2. Runner doesn't support video
+3. Wrong VIDEO_FORMAT for your runner
+
+## Notes
+
+- Requests and responses are stored in memory only (not persisted)
+- Maximum 50 requests are kept in memory
+- The app runs on port 5000 by default
+- Base64 data is truncated in the web UI for readability but fully sent to the endpoint

+ 668 - 0
app.py

@@ -0,0 +1,668 @@
+from flask import Flask, request, jsonify, render_template_string
+from datetime import datetime
+import json
+import os
+import requests
+
+app = Flask(__name__)
+
+# Store recent requests in memory
+recent_requests = []
+MAX_REQUESTS = 50
+
+# OpenAI endpoint configuration
+OPENAI_ENDPOINT = os.getenv(
+    "OPENAI_ENDPOINT", "http://localhost:11434/v1/chat/completions"
+)
+OPENAI_API_KEY = os.getenv("OPENAI_API_KEY", "none")
+OPENAI_MODEL = os.getenv("OPENAI_MODEL", "InternVL3_5-14B")
+# Video format: 'openai' (data URL), 'vllm' (try vLLM format), 'skip', or 'error'
+VIDEO_FORMAT = os.getenv("VIDEO_FORMAT", "openai")
+
+# NEW: Endpoint type detection
+ENDPOINT_TYPE = os.getenv("ENDPOINT_TYPE", "auto")  # 'openai', 'ollama', or 'auto'
+
+
+def detect_endpoint_type(endpoint_url):
+    """Auto-detect if endpoint is OpenAI-compatible or Ollama native"""
+    if "/v1/chat/completions" in endpoint_url:
+        return "openai"
+    elif "/api/generate" in endpoint_url or "/api/chat" in endpoint_url:
+        return "ollama"
+    elif "localhost:11434" in endpoint_url or "ollama" in endpoint_url.lower():
+        return "openai"  # Assume OpenAI-compatible for Ollama
+    else:
+        return "openai"  # Default to OpenAI format
+
+
+def convert_gemini_to_openai(gemini_request):
+    """Convert Gemini API format to OpenAI API format"""
+    try:
+        contents = gemini_request.get("contents", [])
+        messages = []
+        media_info = {"images": [], "videos": []}
+
+        for content in contents:
+            parts = content.get("parts", [])
+            message_content = []
+
+            for part in parts:
+                # Handle text parts
+                if "text" in part:
+                    message_content.append({"type": "text", "text": part["text"]})
+
+                # Handle inline_data (images/video)
+                elif "inline_data" in part:
+                    inline = part["inline_data"]
+                    mime_type = inline.get("mime_type", "")
+                    data = inline.get("data", "")
+
+                    if mime_type.startswith("image/"):
+                        # Images: Universally supported across all runners
+                        media_info["images"].append(mime_type)
+                        print(f"🖼️  Adding image: {mime_type}")
+                        message_content.append(
+                            {
+                                "type": "image_url",
+                                "image_url": {
+                                    "url": f"data:{mime_type};base64,{data}",
+                                    "detail": "auto",
+                                },
+                            }
+                        )
+
+                    elif mime_type.startswith("video/"):
+                        # Videos: Format depends on VIDEO_FORMAT setting
+                        if VIDEO_FORMAT == "skip":
+                            media_info["videos"].append(f"skipped ({mime_type})")
+                            print(f"⏭️  Skipping video: {mime_type} (VIDEO_FORMAT=skip)")
+                            message_content.append(
+                                {
+                                    "type": "text",
+                                    "text": f"[Video content ({mime_type}) was present but skipped]",
+                                }
+                            )
+
+                        elif VIDEO_FORMAT == "error":
+                            raise ValueError(
+                                f"Video content detected ({mime_type}) but VIDEO_FORMAT=error"
+                            )
+
+                        else:  # 'openai', 'vllm', or any other value
+                            media_info["videos"].append(
+                                f"format: {VIDEO_FORMAT} ({mime_type})"
+                            )
+                            print(
+                                f"📹 Adding video ({VIDEO_FORMAT} format): {mime_type}"
+                            )
+                            message_content.append(
+                                {
+                                    "type": "image_url",
+                                    "image_url": {
+                                        "url": f"data:{mime_type};base64,{data}",
+                                        "detail": "auto",
+                                    },
+                                }
+                            )
+
+            # Add as user message
+            # If only one content item and it's text, send as string for better compatibility
+            if len(message_content) == 1 and message_content[0].get("type") == "text":
+                messages.append({"role": "user", "content": message_content[0]["text"]})
+            else:
+                messages.append({"role": "user", "content": message_content})
+
+        # Build OpenAI request
+        openai_request = {"model": OPENAI_MODEL, "messages": messages}
+
+        # Add generation config as OpenAI parameters
+        gen_config = gemini_request.get("generationConfig", {})
+        if "maxOutputTokens" in gen_config:
+            openai_request["max_tokens"] = gen_config["maxOutputTokens"]
+        if "temperature" in gen_config:
+            openai_request["temperature"] = gen_config["temperature"]
+
+        # Log media summary
+        if media_info["images"] or media_info["videos"]:
+            print(f"📊 Media summary:")
+            if media_info["images"]:
+                print(
+                    f"   Images: {len(media_info['images'])} ({', '.join(media_info['images'])})"
+                )
+            if media_info["videos"]:
+                print(f"   Videos: {', '.join(media_info['videos'])}")
+
+        return openai_request
+    except Exception as e:
+        print(f"❌ Error converting request: {e}")
+        raise
+
+
+def convert_gemini_to_ollama(gemini_request):
+    """Convert Gemini API format to Ollama native format"""
+    try:
+        contents = gemini_request.get("contents", [])
+
+        # Extract text and combine into a single prompt
+        prompt_parts = []
+        images = []
+
+        for content in contents:
+            parts = content.get("parts", [])
+
+            for part in parts:
+                if "text" in part:
+                    prompt_parts.append(part["text"])
+                elif "inline_data" in part:
+                    inline = part["inline_data"]
+                    mime_type = inline.get("mime_type", "")
+                    data = inline.get("data", "")
+
+                    if mime_type.startswith("image/") or mime_type.startswith("video/"):
+                        # Ollama expects images in a different format
+                        images.append(data)  # Just the base64 data
+                        print(f"🖼️ Adding media for Ollama: {mime_type}")
+
+        # Build Ollama request
+        ollama_request = {
+            "model": OPENAI_MODEL,
+            "prompt": " ".join(prompt_parts),
+            "stream": False,
+        }
+
+        # Add images if present
+        if images:
+            ollama_request["images"] = images
+
+        # Add generation config
+        gen_config = gemini_request.get("generationConfig", {})
+        if "temperature" in gen_config:
+            ollama_request["options"] = {"temperature": gen_config["temperature"]}
+
+        return ollama_request
+    except Exception as e:
+        print(f"❌ Error converting to Ollama format: {e}")
+        raise
+
+
+def convert_openai_to_gemini(openai_response):
+    """Convert OpenAI API response to Gemini API format"""
+    try:
+        # Extract the message content
+        choices = openai_response.get("choices", [])
+        if not choices:
+            print(f"❌ No choices in OpenAI response: {openai_response}")
+            return {"error": "No response generated"}
+
+        message = choices[0].get("message", {})
+        content = message.get("content", "")
+
+        if not content:
+            print(f"❌ No content in message: {message}")
+            return {"error": "No response generated"}
+
+        # Convert to Gemini format
+        gemini_response = {
+            "candidates": [
+                {
+                    "content": {"parts": [{"text": content}], "role": "model"},
+                    "finishReason": "STOP",
+                    "index": 0,
+                }
+            ],
+            "usageMetadata": {
+                "promptTokenCount": openai_response.get("usage", {}).get(
+                    "prompt_tokens", 0
+                ),
+                "candidatesTokenCount": openai_response.get("usage", {}).get(
+                    "completion_tokens", 0
+                ),
+                "totalTokenCount": openai_response.get("usage", {}).get(
+                    "total_tokens", 0
+                ),
+            },
+        }
+
+        return gemini_response
+    except Exception as e:
+        print(f"❌ Error converting OpenAI response: {e}")
+        raise
+
+
+def convert_ollama_to_gemini(ollama_response):
+    """Convert Ollama native response to Gemini API format"""
+    try:
+        # Ollama /api/generate returns: {"response": "text", "done": true, ...}
+        response_text = ollama_response.get("response", "")
+
+        if not response_text:
+            print(f"❌ No response text in Ollama response: {ollama_response}")
+            return {"error": "No response generated"}
+
+        # Convert to Gemini format
+        gemini_response = {
+            "candidates": [
+                {
+                    "content": {"parts": [{"text": response_text}], "role": "model"},
+                    "finishReason": "STOP",
+                    "index": 0,
+                }
+            ],
+            "usageMetadata": {
+                "promptTokenCount": ollama_response.get("prompt_eval_count", 0),
+                "candidatesTokenCount": ollama_response.get("eval_count", 0),
+                "totalTokenCount": ollama_response.get("prompt_eval_count", 0)
+                + ollama_response.get("eval_count", 0),
+            },
+        }
+
+        return gemini_response
+    except Exception as e:
+        print(f"❌ Error converting Ollama response: {e}")
+        raise
+
+
+HTML_TEMPLATE = """
+<!DOCTYPE html>
+<html>
+<head>
+    <title>POST Request Monitor</title>
+    <style>
+        body { font-family: Arial, sans-serif; margin: 20px; background: #f5f5f5; }
+        h1 { color: #333; }
+        .config { background: #e3f2fd; padding: 10px; margin: 10px 0; border-radius: 5px; }
+        .request { 
+            background: white; 
+            padding: 15px; 
+            margin: 10px 0; 
+            border-radius: 5px;
+            box-shadow: 0 2px 4px rgba(0,0,0,0.1);
+        }
+        .timestamp { color: #666; font-size: 0.9em; }
+        .method { 
+            display: inline-block;
+            padding: 3px 8px;
+            background: #4CAF50;
+            color: white;
+            border-radius: 3px;
+            font-weight: bold;
+        }
+        .forwarded { 
+            display: inline-block;
+            padding: 3px 8px;
+            background: #2196F3;
+            color: white;
+            border-radius: 3px;
+            font-size: 0.8em;
+            margin-left: 10px;
+        }
+        .error-badge { 
+            display: inline-block;
+            padding: 3px 8px;
+            background: #f44336;
+            color: white;
+            border-radius: 3px;
+            font-size: 0.8em;
+            margin-left: 10px;
+        }
+        pre { 
+            background: #f4f4f4; 
+            padding: 10px; 
+            border-radius: 3px;
+            overflow-x: auto;
+            max-height: 300px;
+            overflow-y: auto;
+        }
+        .clear-btn {
+            background: #f44336;
+            color: white;
+            border: none;
+            padding: 10px 20px;
+            border-radius: 5px;
+            cursor: pointer;
+            margin: 10px 0;
+        }
+        .clear-btn:hover { background: #d32f2f; }
+    </style>
+    <script>
+        function clearRequests() {
+            fetch('/clear', { method: 'POST' })
+                .then(() => location.reload());
+        }
+        // Auto-refresh every 3 seconds
+        setTimeout(() => location.reload(), 3000);
+    </script>
+</head>
+<body>
+    <h1>📬 POST Request Monitor & AI Proxy</h1>
+    <div class="config">
+        <strong>Configuration:</strong><br>
+        Endpoint: <strong>{{ endpoint }}</strong><br>
+        Type: <strong>{{ endpoint_type }}</strong><br>
+        Model: <strong>{{ model }}</strong><br>
+        Video Format: <strong>{{ video_format }}</strong>
+    </div>
+    <p>Send POST requests to <strong>http://localhost:5005/webhook</strong></p>
+    <button class="clear-btn" onclick="clearRequests()">Clear All</button>
+    <div id="requests">
+        {% for req in requests %}
+        <div class="request">
+            <div>
+                <span class="method">{{ req.method }}</span>
+                <span class="timestamp">{{ req.timestamp }}</span>
+                {% if req.forwarded %}
+                <span class="forwarded">FORWARDED ({{ req.endpoint_type }})</span>
+                {% endif %}
+                {% if req.error %}
+                <span class="error-badge">ERROR</span>
+                {% endif %}
+            </div>
+            <div><strong>Path:</strong> {{ req.path }}</div>
+            {% if req.query_params %}
+            <div><strong>Query Parameters:</strong></div>
+            <pre>{{ req.query_params }}</pre>
+            {% endif %}
+            {% if req.body %}
+            <div><strong>Incoming Body (Gemini Format):</strong></div>
+            <pre>{{ req.body }}</pre>
+            {% endif %}
+            {% if req.converted_request %}
+            <div><strong>Converted Request:</strong></div>
+            <pre>{{ req.converted_request }}</pre>
+            {% endif %}
+            {% if req.raw_response %}
+            <div><strong>Raw Response:</strong></div>
+            <pre>{{ req.raw_response }}</pre>
+            {% endif %}
+            {% if req.response %}
+            <div><strong>Final Response (Gemini Format):</strong></div>
+            <pre>{{ req.response }}</pre>
+            {% endif %}
+            {% if req.error %}
+            <div><strong>Error:</strong></div>
+            <pre style="color: red;">{{ req.error }}</pre>
+            {% endif %}
+        </div>
+        {% endfor %}
+    </div>
+</body>
+</html>
+"""
+
+
+@app.route("/")
+def index():
+    endpoint_type = (
+        ENDPOINT_TYPE
+        if ENDPOINT_TYPE != "auto"
+        else detect_endpoint_type(OPENAI_ENDPOINT)
+    )
+    return render_template_string(
+        HTML_TEMPLATE,
+        requests=reversed(recent_requests),
+        endpoint=OPENAI_ENDPOINT,
+        endpoint_type=endpoint_type,
+        model=OPENAI_MODEL,
+        video_format=VIDEO_FORMAT,
+    )
+
+
+@app.route("/webhook", methods=["POST", "PUT", "PATCH"], defaults={"subpath": ""})
+@app.route("/webhook/<path:subpath>", methods=["POST", "PUT", "PATCH"])
+def webhook(subpath):
+    """Accept POST/PUT/PATCH requests, forward to AI endpoint, and return response"""
+    timestamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
+
+    # Get full path with query parameters
+    full_path = request.full_path if request.query_string else request.path
+
+    print(f"\n{'='*60}")
+    print(f"[{timestamp}] INCOMING {request.method} {full_path}")
+    print(f"Matched route with subpath: '{subpath}'")
+    print(f"{'='*60}")
+
+    # Detect endpoint type
+    endpoint_type = (
+        ENDPOINT_TYPE
+        if ENDPOINT_TYPE != "auto"
+        else detect_endpoint_type(OPENAI_ENDPOINT)
+    )
+    print(f"Detected endpoint type: {endpoint_type}")
+
+    # Get request data
+    try:
+        body = request.get_data(as_text=True)
+        gemini_request = request.get_json() if request.is_json else {}
+        body_display = json.dumps(gemini_request, indent=2) if gemini_request else body
+    except Exception as e:
+        body_display = str(request.get_data())
+        gemini_request = {}
+
+    # Print request details
+    if request.args:
+        print("Query Parameters:")
+        for key, value in request.args.items():
+            print(f"  {key}: {value}")
+    print(
+        f"Body preview:\n{body_display[:500]}{'...' if len(body_display) > 500 else ''}"
+    )
+
+    # Store request info for monitoring
+    req_info = {
+        "timestamp": timestamp,
+        "method": request.method,
+        "path": full_path,
+        "query_params": dict(request.args),
+        "body": body_display,
+        "forwarded": False,
+        "response": None,
+        "error": None,
+        "endpoint_type": endpoint_type,
+        "converted_request": None,
+        "raw_response": None,
+    }
+
+    # Try to forward to AI endpoint
+    try:
+        if gemini_request:
+            print(f"\n{'='*60}")
+            print(f"CONVERTING AND FORWARDING TO {endpoint_type.upper()} ENDPOINT")
+            print(f"Target: {OPENAI_ENDPOINT}")
+            print(f"{'='*60}")
+
+            # Convert based on endpoint type
+            if endpoint_type == "ollama":
+                converted_request = convert_gemini_to_ollama(gemini_request)
+            else:  # openai
+                converted_request = convert_gemini_to_openai(gemini_request)
+
+            # Log the converted request (truncate base64 for readability)
+            logged_request = json.loads(json.dumps(converted_request))
+            if endpoint_type == "openai":
+                for msg in logged_request.get("messages", []):
+                    if isinstance(msg.get("content"), list):
+                        for item in msg["content"]:
+                            if item.get("type") == "image_url":
+                                url = item["image_url"]["url"]
+                                if len(url) > 100:
+                                    item["image_url"]["url"] = (
+                                        url[:50] + "...[base64 data]..." + url[-20:]
+                                    )
+            elif "images" in logged_request:
+                # Truncate Ollama images
+                for i, img in enumerate(logged_request["images"]):
+                    if len(img) > 100:
+                        logged_request["images"][i] = (
+                            img[:50] + "...[base64 data]..." + img[-20:]
+                        )
+
+            print(f"Converted request:\n{json.dumps(logged_request, indent=2)}")
+
+            # Forward to endpoint
+            headers = {
+                "Content-Type": "application/json",
+            }
+            if (
+                OPENAI_API_KEY
+                and OPENAI_API_KEY != "none"
+                and endpoint_type == "openai"
+            ):
+                headers["Authorization"] = f"Bearer {OPENAI_API_KEY}"
+
+            print(f"Sending request to {OPENAI_ENDPOINT}...")
+            response = requests.post(
+                OPENAI_ENDPOINT, json=converted_request, headers=headers, timeout=120
+            )
+
+            print(f"\nResponse Status: {response.status_code}")
+            print(f"Response Headers: {dict(response.headers)}")
+
+            if response.status_code == 200:
+                raw_response = response.json()
+                print(f"Raw Response:\n{json.dumps(raw_response, indent=2)[:1000]}...")
+
+                # Convert back to Gemini format based on endpoint type
+                if endpoint_type == "ollama":
+                    gemini_response = convert_ollama_to_gemini(raw_response)
+                else:  # openai
+                    gemini_response = convert_openai_to_gemini(raw_response)
+
+                print(
+                    f"\nConverted Gemini Response:\n{json.dumps(gemini_response, indent=2)[:1000]}..."
+                )
+
+                req_info["forwarded"] = True
+                req_info["response"] = json.dumps(gemini_response, indent=2)
+                req_info["converted_request"] = json.dumps(logged_request, indent=2)
+                req_info["raw_response"] = json.dumps(raw_response, indent=2)[:2000] + (
+                    "..." if len(json.dumps(raw_response, indent=2)) > 2000 else ""
+                )
+
+                recent_requests.append(req_info)
+                if len(recent_requests) > MAX_REQUESTS:
+                    recent_requests.pop(0)
+
+                print(f"{'='*60}\n")
+                return jsonify(gemini_response), 200
+            else:
+                # Get detailed error
+                try:
+                    error_data = response.json()
+                    error_msg = json.dumps(error_data, indent=2)
+                except:
+                    error_msg = response.text
+
+                full_error = f"{endpoint_type.upper()} endpoint returned {response.status_code}:\n{error_msg}"
+                print(f"ERROR: {full_error}")
+                req_info["error"] = full_error
+                req_info["forwarded"] = True
+                req_info["converted_request"] = json.dumps(logged_request, indent=2)
+                req_info["raw_response"] = error_msg
+
+                recent_requests.append(req_info)
+                if len(recent_requests) > MAX_REQUESTS:
+                    recent_requests.pop(0)
+
+                print(f"{'='*60}\n")
+                return (
+                    jsonify(
+                        {
+                            "error": {
+                                "message": error_msg,
+                                "status": response.status_code,
+                            }
+                        }
+                    ),
+                    response.status_code,
+                )
+        else:
+            # No JSON body, just acknowledge
+            req_info["error"] = "No JSON body to forward"
+            recent_requests.append(req_info)
+            if len(recent_requests) > MAX_REQUESTS:
+                recent_requests.pop(0)
+
+            print(f"{'='*60}\n")
+            return (
+                jsonify(
+                    {
+                        "status": "success",
+                        "message": "Request received but not forwarded (no JSON body)",
+                        "timestamp": timestamp,
+                    }
+                ),
+                200,
+            )
+
+    except Exception as e:
+        error_msg = f"Error processing request: {str(e)}"
+        print(f"ERROR: {error_msg}")
+        import traceback
+
+        traceback.print_exc()
+
+        req_info["error"] = error_msg
+        recent_requests.append(req_info)
+        if len(recent_requests) > MAX_REQUESTS:
+            recent_requests.pop(0)
+
+        print(f"{'='*60}\n")
+        return jsonify({"error": {"message": error_msg}}), 500
+
+
+@app.route("/clear", methods=["POST"])
+def clear():
+    """Clear all stored requests"""
+    recent_requests.clear()
+    return jsonify({"status": "cleared"}), 200
+
+
+@app.errorhandler(404)
+def not_found(e):
+    """Handle 404 errors with helpful message"""
+    print(f"\n❌ 404 ERROR: {request.method} {request.path}")
+    print(f"   Query string: {request.query_string.decode()}")
+    print(f"   Full path: {request.full_path}")
+    print(f"   Available routes:")
+    for rule in app.url_map.iter_rules():
+        print(f"   - {rule.methods} {rule.rule}")
+    return (
+        jsonify(
+            {
+                "error": "Not Found",
+                "message": f"The path {request.path} was not found",
+                "hint": "POST requests should go to /webhook or /webhook/<path>",
+            }
+        ),
+        404,
+    )
+
+
+if __name__ == "__main__":
+    print("🚀 POST Request Monitor & AI Proxy starting...")
+    print("📍 Web UI: http://localhost:5005")
+    print("📮 Webhook endpoint: http://localhost:5005/webhook")
+    print(
+        "📮 Example: http://localhost:5005/webhook/models/model:generateContent?key=none"
+    )
+    print(f"🔗 Forwarding to: {OPENAI_ENDPOINT}")
+    print(f"🤖 Model: {OPENAI_MODEL}")
+    print(f"📹 Video format: {VIDEO_FORMAT}")
+    endpoint_type = (
+        ENDPOINT_TYPE
+        if ENDPOINT_TYPE != "auto"
+        else detect_endpoint_type(OPENAI_ENDPOINT)
+    )
+    print(f"🔧 Endpoint type: {endpoint_type}")
+    print("\n" + "=" * 60)
+    print("CONFIGURATION OPTIONS:")
+    print("Set these environment variables to configure:")
+    print("  OPENAI_ENDPOINT - Target endpoint URL")
+    print("  ENDPOINT_TYPE - 'openai', 'ollama', or 'auto' (default)")
+    print("  OPENAI_MODEL - Model name")
+    print("  VIDEO_FORMAT - 'openai', 'vllm', 'skip', or 'error'")
+    print("\nFor Ollama:")
+    print("  OpenAI-compatible: http://localhost:11434/v1/chat/completions")
+    print("  Native format: http://localhost:11434/api/generate")
+    print("=" * 60)
+    app.run(host="0.0.0.0", port=5000, debug=True)

+ 20 - 0
docker-compose.yml

@@ -0,0 +1,20 @@
+version: '3.8'
+
+services:
+  post-monitor:
+    build: .
+    ports:
+      - "5005:5000"
+    container_name: post-request-monitor
+    restart: unless-stopped
+    environment:
+      - OPENAI_ENDPOINT=${OPENAI_ENDPOINT:-http://host.docker.internal:8000/v1/chat/completions}
+      - OPENAI_API_KEY=${OPENAI_API_KEY:-none}
+      - OPENAI_MODEL=${OPENAI_MODEL:-gpt-4-vision-preview}
+      - VIDEO_FORMAT=${VIDEO_FORMAT:-openai}
+    networks:
+      - llm_internal
+
+networks:
+  llm_internal:
+    external: true

+ 20 - 0
env.example

@@ -0,0 +1,20 @@
+# OpenAI-compatible endpoint URL
+# If on same Docker network (llm_internal): use container name
+#   Example: http://vllm-server:8000/v1/chat/completions
+# If on host machine: use host.docker.internal
+#   Example: http://host.docker.internal:8000/v1/chat/completions
+OPENAI_ENDPOINT=http://host.docker.internal:8000/v1/chat/completions
+
+# OpenAI API Key (if required by your endpoint)
+OPENAI_API_KEY=none
+
+# Model name to use (check your endpoint's /v1/models for available models)
+# For InternVL3-8B-AWQ: Use the exact model name from your deployment
+OPENAI_MODEL=OpenGVLab/InternVL3-8B-AWQ
+
+# Video format - How to send video to the endpoint
+# 'openai' - Standard format (works with OpenAI gpt-4o, vLLM, Ollama with vision models)
+# 'vllm'   - Experimental vLLM-specific format  
+# 'skip'   - Don't send video (for endpoints without video support like SGLang)
+# 'error'  - Fail if video is present
+VIDEO_FORMAT=openai

+ 6 - 0
main.py

@@ -0,0 +1,6 @@
+def main():
+    print("Hello from grelay!")
+
+
+if __name__ == "__main__":
+    main()

+ 10 - 0
pyproject.toml

@@ -0,0 +1,10 @@
+[project]
+name = "grelay"
+version = "0.1.0"
+description = "Add your description here"
+readme = "README.md"
+requires-python = ">=3.11"
+dependencies = [
+    "flask>=3.1.2",
+    "requests>=2.32.5",
+]

+ 287 - 0
uv.lock

@@ -0,0 +1,287 @@
+version = 1
+revision = 3
+requires-python = ">=3.11"
+
+[[package]]
+name = "blinker"
+version = "1.9.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/21/28/9b3f50ce0e048515135495f198351908d99540d69bfdc8c1d15b73dc55ce/blinker-1.9.0.tar.gz", hash = "sha256:b4ce2265a7abece45e7cc896e98dbebe6cead56bcf805a3d23136d145f5445bf", size = 22460, upload-time = "2024-11-08T17:25:47.436Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/10/cb/f2ad4230dc2eb1a74edf38f1a38b9b52277f75bef262d8908e60d957e13c/blinker-1.9.0-py3-none-any.whl", hash = "sha256:ba0efaa9080b619ff2f3459d1d500c57bddea4a6b424b60a91141db6fd2f08bc", size = 8458, upload-time = "2024-11-08T17:25:46.184Z" },
+]
+
+[[package]]
+name = "certifi"
+version = "2026.1.4"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/e0/2d/a891ca51311197f6ad14a7ef42e2399f36cf2f9bd44752b3dc4eab60fdc5/certifi-2026.1.4.tar.gz", hash = "sha256:ac726dd470482006e014ad384921ed6438c457018f4b3d204aea4281258b2120", size = 154268, upload-time = "2026-01-04T02:42:41.825Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/e6/ad/3cc14f097111b4de0040c83a525973216457bbeeb63739ef1ed275c1c021/certifi-2026.1.4-py3-none-any.whl", hash = "sha256:9943707519e4add1115f44c2bc244f782c0249876bf51b6599fee1ffbedd685c", size = 152900, upload-time = "2026-01-04T02:42:40.15Z" },
+]
+
+[[package]]
+name = "charset-normalizer"
+version = "3.4.4"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/13/69/33ddede1939fdd074bce5434295f38fae7136463422fe4fd3e0e89b98062/charset_normalizer-3.4.4.tar.gz", hash = "sha256:94537985111c35f28720e43603b8e7b43a6ecfb2ce1d3058bbe955b73404e21a", size = 129418, upload-time = "2025-10-14T04:42:32.879Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/ed/27/c6491ff4954e58a10f69ad90aca8a1b6fe9c5d3c6f380907af3c37435b59/charset_normalizer-3.4.4-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:6e1fcf0720908f200cd21aa4e6750a48ff6ce4afe7ff5a79a90d5ed8a08296f8", size = 206988, upload-time = "2025-10-14T04:40:33.79Z" },
+    { url = "https://files.pythonhosted.org/packages/94/59/2e87300fe67ab820b5428580a53cad894272dbb97f38a7a814a2a1ac1011/charset_normalizer-3.4.4-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5f819d5fe9234f9f82d75bdfa9aef3a3d72c4d24a6e57aeaebba32a704553aa0", size = 147324, upload-time = "2025-10-14T04:40:34.961Z" },
+    { url = "https://files.pythonhosted.org/packages/07/fb/0cf61dc84b2b088391830f6274cb57c82e4da8bbc2efeac8c025edb88772/charset_normalizer-3.4.4-cp311-cp311-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:a59cb51917aa591b1c4e6a43c132f0cdc3c76dbad6155df4e28ee626cc77a0a3", size = 142742, upload-time = "2025-10-14T04:40:36.105Z" },
+    { url = "https://files.pythonhosted.org/packages/62/8b/171935adf2312cd745d290ed93cf16cf0dfe320863ab7cbeeae1dcd6535f/charset_normalizer-3.4.4-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:8ef3c867360f88ac904fd3f5e1f902f13307af9052646963ee08ff4f131adafc", size = 160863, upload-time = "2025-10-14T04:40:37.188Z" },
+    { url = "https://files.pythonhosted.org/packages/09/73/ad875b192bda14f2173bfc1bc9a55e009808484a4b256748d931b6948442/charset_normalizer-3.4.4-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d9e45d7faa48ee908174d8fe84854479ef838fc6a705c9315372eacbc2f02897", size = 157837, upload-time = "2025-10-14T04:40:38.435Z" },
+    { url = "https://files.pythonhosted.org/packages/6d/fc/de9cce525b2c5b94b47c70a4b4fb19f871b24995c728e957ee68ab1671ea/charset_normalizer-3.4.4-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:840c25fb618a231545cbab0564a799f101b63b9901f2569faecd6b222ac72381", size = 151550, upload-time = "2025-10-14T04:40:40.053Z" },
+    { url = "https://files.pythonhosted.org/packages/55/c2/43edd615fdfba8c6f2dfbd459b25a6b3b551f24ea21981e23fb768503ce1/charset_normalizer-3.4.4-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:ca5862d5b3928c4940729dacc329aa9102900382fea192fc5e52eb69d6093815", size = 149162, upload-time = "2025-10-14T04:40:41.163Z" },
+    { url = "https://files.pythonhosted.org/packages/03/86/bde4ad8b4d0e9429a4e82c1e8f5c659993a9a863ad62c7df05cf7b678d75/charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d9c7f57c3d666a53421049053eaacdd14bbd0a528e2186fcb2e672effd053bb0", size = 150019, upload-time = "2025-10-14T04:40:42.276Z" },
+    { url = "https://files.pythonhosted.org/packages/1f/86/a151eb2af293a7e7bac3a739b81072585ce36ccfb4493039f49f1d3cae8c/charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:277e970e750505ed74c832b4bf75dac7476262ee2a013f5574dd49075879e161", size = 143310, upload-time = "2025-10-14T04:40:43.439Z" },
+    { url = "https://files.pythonhosted.org/packages/b5/fe/43dae6144a7e07b87478fdfc4dbe9efd5defb0e7ec29f5f58a55aeef7bf7/charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:31fd66405eaf47bb62e8cd575dc621c56c668f27d46a61d975a249930dd5e2a4", size = 162022, upload-time = "2025-10-14T04:40:44.547Z" },
+    { url = "https://files.pythonhosted.org/packages/80/e6/7aab83774f5d2bca81f42ac58d04caf44f0cc2b65fc6db2b3b2e8a05f3b3/charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:0d3d8f15c07f86e9ff82319b3d9ef6f4bf907608f53fe9d92b28ea9ae3d1fd89", size = 149383, upload-time = "2025-10-14T04:40:46.018Z" },
+    { url = "https://files.pythonhosted.org/packages/4f/e8/b289173b4edae05c0dde07f69f8db476a0b511eac556dfe0d6bda3c43384/charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:9f7fcd74d410a36883701fafa2482a6af2ff5ba96b9a620e9e0721e28ead5569", size = 159098, upload-time = "2025-10-14T04:40:47.081Z" },
+    { url = "https://files.pythonhosted.org/packages/d8/df/fe699727754cae3f8478493c7f45f777b17c3ef0600e28abfec8619eb49c/charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:ebf3e58c7ec8a8bed6d66a75d7fb37b55e5015b03ceae72a8e7c74495551e224", size = 152991, upload-time = "2025-10-14T04:40:48.246Z" },
+    { url = "https://files.pythonhosted.org/packages/1a/86/584869fe4ddb6ffa3bd9f491b87a01568797fb9bd8933f557dba9771beaf/charset_normalizer-3.4.4-cp311-cp311-win32.whl", hash = "sha256:eecbc200c7fd5ddb9a7f16c7decb07b566c29fa2161a16cf67b8d068bd21690a", size = 99456, upload-time = "2025-10-14T04:40:49.376Z" },
+    { url = "https://files.pythonhosted.org/packages/65/f6/62fdd5feb60530f50f7e38b4f6a1d5203f4d16ff4f9f0952962c044e919a/charset_normalizer-3.4.4-cp311-cp311-win_amd64.whl", hash = "sha256:5ae497466c7901d54b639cf42d5b8c1b6a4fead55215500d2f486d34db48d016", size = 106978, upload-time = "2025-10-14T04:40:50.844Z" },
+    { url = "https://files.pythonhosted.org/packages/7a/9d/0710916e6c82948b3be62d9d398cb4fcf4e97b56d6a6aeccd66c4b2f2bd5/charset_normalizer-3.4.4-cp311-cp311-win_arm64.whl", hash = "sha256:65e2befcd84bc6f37095f5961e68a6f077bf44946771354a28ad434c2cce0ae1", size = 99969, upload-time = "2025-10-14T04:40:52.272Z" },
+    { url = "https://files.pythonhosted.org/packages/f3/85/1637cd4af66fa687396e757dec650f28025f2a2f5a5531a3208dc0ec43f2/charset_normalizer-3.4.4-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:0a98e6759f854bd25a58a73fa88833fba3b7c491169f86ce1180c948ab3fd394", size = 208425, upload-time = "2025-10-14T04:40:53.353Z" },
+    { url = "https://files.pythonhosted.org/packages/9d/6a/04130023fef2a0d9c62d0bae2649b69f7b7d8d24ea5536feef50551029df/charset_normalizer-3.4.4-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b5b290ccc2a263e8d185130284f8501e3e36c5e02750fc6b6bdeb2e9e96f1e25", size = 148162, upload-time = "2025-10-14T04:40:54.558Z" },
+    { url = "https://files.pythonhosted.org/packages/78/29/62328d79aa60da22c9e0b9a66539feae06ca0f5a4171ac4f7dc285b83688/charset_normalizer-3.4.4-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:74bb723680f9f7a6234dcf67aea57e708ec1fbdf5699fb91dfd6f511b0a320ef", size = 144558, upload-time = "2025-10-14T04:40:55.677Z" },
+    { url = "https://files.pythonhosted.org/packages/86/bb/b32194a4bf15b88403537c2e120b817c61cd4ecffa9b6876e941c3ee38fe/charset_normalizer-3.4.4-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:f1e34719c6ed0b92f418c7c780480b26b5d9c50349e9a9af7d76bf757530350d", size = 161497, upload-time = "2025-10-14T04:40:57.217Z" },
+    { url = "https://files.pythonhosted.org/packages/19/89/a54c82b253d5b9b111dc74aca196ba5ccfcca8242d0fb64146d4d3183ff1/charset_normalizer-3.4.4-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2437418e20515acec67d86e12bf70056a33abdacb5cb1655042f6538d6b085a8", size = 159240, upload-time = "2025-10-14T04:40:58.358Z" },
+    { url = "https://files.pythonhosted.org/packages/c0/10/d20b513afe03acc89ec33948320a5544d31f21b05368436d580dec4e234d/charset_normalizer-3.4.4-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:11d694519d7f29d6cd09f6ac70028dba10f92f6cdd059096db198c283794ac86", size = 153471, upload-time = "2025-10-14T04:40:59.468Z" },
+    { url = "https://files.pythonhosted.org/packages/61/fa/fbf177b55bdd727010f9c0a3c49eefa1d10f960e5f09d1d887bf93c2e698/charset_normalizer-3.4.4-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:ac1c4a689edcc530fc9d9aa11f5774b9e2f33f9a0c6a57864e90908f5208d30a", size = 150864, upload-time = "2025-10-14T04:41:00.623Z" },
+    { url = "https://files.pythonhosted.org/packages/05/12/9fbc6a4d39c0198adeebbde20b619790e9236557ca59fc40e0e3cebe6f40/charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:21d142cc6c0ec30d2efee5068ca36c128a30b0f2c53c1c07bd78cb6bc1d3be5f", size = 150647, upload-time = "2025-10-14T04:41:01.754Z" },
+    { url = "https://files.pythonhosted.org/packages/ad/1f/6a9a593d52e3e8c5d2b167daf8c6b968808efb57ef4c210acb907c365bc4/charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:5dbe56a36425d26d6cfb40ce79c314a2e4dd6211d51d6d2191c00bed34f354cc", size = 145110, upload-time = "2025-10-14T04:41:03.231Z" },
+    { url = "https://files.pythonhosted.org/packages/30/42/9a52c609e72471b0fc54386dc63c3781a387bb4fe61c20231a4ebcd58bdd/charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:5bfbb1b9acf3334612667b61bd3002196fe2a1eb4dd74d247e0f2a4d50ec9bbf", size = 162839, upload-time = "2025-10-14T04:41:04.715Z" },
+    { url = "https://files.pythonhosted.org/packages/c4/5b/c0682bbf9f11597073052628ddd38344a3d673fda35a36773f7d19344b23/charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:d055ec1e26e441f6187acf818b73564e6e6282709e9bcb5b63f5b23068356a15", size = 150667, upload-time = "2025-10-14T04:41:05.827Z" },
+    { url = "https://files.pythonhosted.org/packages/e4/24/a41afeab6f990cf2daf6cb8c67419b63b48cf518e4f56022230840c9bfb2/charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:af2d8c67d8e573d6de5bc30cdb27e9b95e49115cd9baad5ddbd1a6207aaa82a9", size = 160535, upload-time = "2025-10-14T04:41:06.938Z" },
+    { url = "https://files.pythonhosted.org/packages/2a/e5/6a4ce77ed243c4a50a1fecca6aaaab419628c818a49434be428fe24c9957/charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:780236ac706e66881f3b7f2f32dfe90507a09e67d1d454c762cf642e6e1586e0", size = 154816, upload-time = "2025-10-14T04:41:08.101Z" },
+    { url = "https://files.pythonhosted.org/packages/a8/ef/89297262b8092b312d29cdb2517cb1237e51db8ecef2e9af5edbe7b683b1/charset_normalizer-3.4.4-cp312-cp312-win32.whl", hash = "sha256:5833d2c39d8896e4e19b689ffc198f08ea58116bee26dea51e362ecc7cd3ed26", size = 99694, upload-time = "2025-10-14T04:41:09.23Z" },
+    { url = "https://files.pythonhosted.org/packages/3d/2d/1e5ed9dd3b3803994c155cd9aacb60c82c331bad84daf75bcb9c91b3295e/charset_normalizer-3.4.4-cp312-cp312-win_amd64.whl", hash = "sha256:a79cfe37875f822425b89a82333404539ae63dbdddf97f84dcbc3d339aae9525", size = 107131, upload-time = "2025-10-14T04:41:10.467Z" },
+    { url = "https://files.pythonhosted.org/packages/d0/d9/0ed4c7098a861482a7b6a95603edce4c0d9db2311af23da1fb2b75ec26fc/charset_normalizer-3.4.4-cp312-cp312-win_arm64.whl", hash = "sha256:376bec83a63b8021bb5c8ea75e21c4ccb86e7e45ca4eb81146091b56599b80c3", size = 100390, upload-time = "2025-10-14T04:41:11.915Z" },
+    { url = "https://files.pythonhosted.org/packages/97/45/4b3a1239bbacd321068ea6e7ac28875b03ab8bc0aa0966452db17cd36714/charset_normalizer-3.4.4-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:e1f185f86a6f3403aa2420e815904c67b2f9ebc443f045edd0de921108345794", size = 208091, upload-time = "2025-10-14T04:41:13.346Z" },
+    { url = "https://files.pythonhosted.org/packages/7d/62/73a6d7450829655a35bb88a88fca7d736f9882a27eacdca2c6d505b57e2e/charset_normalizer-3.4.4-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6b39f987ae8ccdf0d2642338faf2abb1862340facc796048b604ef14919e55ed", size = 147936, upload-time = "2025-10-14T04:41:14.461Z" },
+    { url = "https://files.pythonhosted.org/packages/89/c5/adb8c8b3d6625bef6d88b251bbb0d95f8205831b987631ab0c8bb5d937c2/charset_normalizer-3.4.4-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:3162d5d8ce1bb98dd51af660f2121c55d0fa541b46dff7bb9b9f86ea1d87de72", size = 144180, upload-time = "2025-10-14T04:41:15.588Z" },
+    { url = "https://files.pythonhosted.org/packages/91/ed/9706e4070682d1cc219050b6048bfd293ccf67b3d4f5a4f39207453d4b99/charset_normalizer-3.4.4-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:81d5eb2a312700f4ecaa977a8235b634ce853200e828fbadf3a9c50bab278328", size = 161346, upload-time = "2025-10-14T04:41:16.738Z" },
+    { url = "https://files.pythonhosted.org/packages/d5/0d/031f0d95e4972901a2f6f09ef055751805ff541511dc1252ba3ca1f80cf5/charset_normalizer-3.4.4-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5bd2293095d766545ec1a8f612559f6b40abc0eb18bb2f5d1171872d34036ede", size = 158874, upload-time = "2025-10-14T04:41:17.923Z" },
+    { url = "https://files.pythonhosted.org/packages/f5/83/6ab5883f57c9c801ce5e5677242328aa45592be8a00644310a008d04f922/charset_normalizer-3.4.4-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a8a8b89589086a25749f471e6a900d3f662d1d3b6e2e59dcecf787b1cc3a1894", size = 153076, upload-time = "2025-10-14T04:41:19.106Z" },
+    { url = "https://files.pythonhosted.org/packages/75/1e/5ff781ddf5260e387d6419959ee89ef13878229732732ee73cdae01800f2/charset_normalizer-3.4.4-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:bc7637e2f80d8530ee4a78e878bce464f70087ce73cf7c1caf142416923b98f1", size = 150601, upload-time = "2025-10-14T04:41:20.245Z" },
+    { url = "https://files.pythonhosted.org/packages/d7/57/71be810965493d3510a6ca79b90c19e48696fb1ff964da319334b12677f0/charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f8bf04158c6b607d747e93949aa60618b61312fe647a6369f88ce2ff16043490", size = 150376, upload-time = "2025-10-14T04:41:21.398Z" },
+    { url = "https://files.pythonhosted.org/packages/e5/d5/c3d057a78c181d007014feb7e9f2e65905a6c4ef182c0ddf0de2924edd65/charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:554af85e960429cf30784dd47447d5125aaa3b99a6f0683589dbd27e2f45da44", size = 144825, upload-time = "2025-10-14T04:41:22.583Z" },
+    { url = "https://files.pythonhosted.org/packages/e6/8c/d0406294828d4976f275ffbe66f00266c4b3136b7506941d87c00cab5272/charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:74018750915ee7ad843a774364e13a3db91682f26142baddf775342c3f5b1133", size = 162583, upload-time = "2025-10-14T04:41:23.754Z" },
+    { url = "https://files.pythonhosted.org/packages/d7/24/e2aa1f18c8f15c4c0e932d9287b8609dd30ad56dbe41d926bd846e22fb8d/charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:c0463276121fdee9c49b98908b3a89c39be45d86d1dbaa22957e38f6321d4ce3", size = 150366, upload-time = "2025-10-14T04:41:25.27Z" },
+    { url = "https://files.pythonhosted.org/packages/e4/5b/1e6160c7739aad1e2df054300cc618b06bf784a7a164b0f238360721ab86/charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:362d61fd13843997c1c446760ef36f240cf81d3ebf74ac62652aebaf7838561e", size = 160300, upload-time = "2025-10-14T04:41:26.725Z" },
+    { url = "https://files.pythonhosted.org/packages/7a/10/f882167cd207fbdd743e55534d5d9620e095089d176d55cb22d5322f2afd/charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9a26f18905b8dd5d685d6d07b0cdf98a79f3c7a918906af7cc143ea2e164c8bc", size = 154465, upload-time = "2025-10-14T04:41:28.322Z" },
+    { url = "https://files.pythonhosted.org/packages/89/66/c7a9e1b7429be72123441bfdbaf2bc13faab3f90b933f664db506dea5915/charset_normalizer-3.4.4-cp313-cp313-win32.whl", hash = "sha256:9b35f4c90079ff2e2edc5b26c0c77925e5d2d255c42c74fdb70fb49b172726ac", size = 99404, upload-time = "2025-10-14T04:41:29.95Z" },
+    { url = "https://files.pythonhosted.org/packages/c4/26/b9924fa27db384bdcd97ab83b4f0a8058d96ad9626ead570674d5e737d90/charset_normalizer-3.4.4-cp313-cp313-win_amd64.whl", hash = "sha256:b435cba5f4f750aa6c0a0d92c541fb79f69a387c91e61f1795227e4ed9cece14", size = 107092, upload-time = "2025-10-14T04:41:31.188Z" },
+    { url = "https://files.pythonhosted.org/packages/af/8f/3ed4bfa0c0c72a7ca17f0380cd9e4dd842b09f664e780c13cff1dcf2ef1b/charset_normalizer-3.4.4-cp313-cp313-win_arm64.whl", hash = "sha256:542d2cee80be6f80247095cc36c418f7bddd14f4a6de45af91dfad36d817bba2", size = 100408, upload-time = "2025-10-14T04:41:32.624Z" },
+    { url = "https://files.pythonhosted.org/packages/2a/35/7051599bd493e62411d6ede36fd5af83a38f37c4767b92884df7301db25d/charset_normalizer-3.4.4-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:da3326d9e65ef63a817ecbcc0df6e94463713b754fe293eaa03da99befb9a5bd", size = 207746, upload-time = "2025-10-14T04:41:33.773Z" },
+    { url = "https://files.pythonhosted.org/packages/10/9a/97c8d48ef10d6cd4fcead2415523221624bf58bcf68a802721a6bc807c8f/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8af65f14dc14a79b924524b1e7fffe304517b2bff5a58bf64f30b98bbc5079eb", size = 147889, upload-time = "2025-10-14T04:41:34.897Z" },
+    { url = "https://files.pythonhosted.org/packages/10/bf/979224a919a1b606c82bd2c5fa49b5c6d5727aa47b4312bb27b1734f53cd/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:74664978bb272435107de04e36db5a9735e78232b85b77d45cfb38f758efd33e", size = 143641, upload-time = "2025-10-14T04:41:36.116Z" },
+    { url = "https://files.pythonhosted.org/packages/ba/33/0ad65587441fc730dc7bd90e9716b30b4702dc7b617e6ba4997dc8651495/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:752944c7ffbfdd10c074dc58ec2d5a8a4cd9493b314d367c14d24c17684ddd14", size = 160779, upload-time = "2025-10-14T04:41:37.229Z" },
+    { url = "https://files.pythonhosted.org/packages/67/ed/331d6b249259ee71ddea93f6f2f0a56cfebd46938bde6fcc6f7b9a3d0e09/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d1f13550535ad8cff21b8d757a3257963e951d96e20ec82ab44bc64aeb62a191", size = 159035, upload-time = "2025-10-14T04:41:38.368Z" },
+    { url = "https://files.pythonhosted.org/packages/67/ff/f6b948ca32e4f2a4576aa129d8bed61f2e0543bf9f5f2b7fc3758ed005c9/charset_normalizer-3.4.4-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ecaae4149d99b1c9e7b88bb03e3221956f68fd6d50be2ef061b2381b61d20838", size = 152542, upload-time = "2025-10-14T04:41:39.862Z" },
+    { url = "https://files.pythonhosted.org/packages/16/85/276033dcbcc369eb176594de22728541a925b2632f9716428c851b149e83/charset_normalizer-3.4.4-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:cb6254dc36b47a990e59e1068afacdcd02958bdcce30bb50cc1700a8b9d624a6", size = 149524, upload-time = "2025-10-14T04:41:41.319Z" },
+    { url = "https://files.pythonhosted.org/packages/9e/f2/6a2a1f722b6aba37050e626530a46a68f74e63683947a8acff92569f979a/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:c8ae8a0f02f57a6e61203a31428fa1d677cbe50c93622b4149d5c0f319c1d19e", size = 150395, upload-time = "2025-10-14T04:41:42.539Z" },
+    { url = "https://files.pythonhosted.org/packages/60/bb/2186cb2f2bbaea6338cad15ce23a67f9b0672929744381e28b0592676824/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:47cc91b2f4dd2833fddaedd2893006b0106129d4b94fdb6af1f4ce5a9965577c", size = 143680, upload-time = "2025-10-14T04:41:43.661Z" },
+    { url = "https://files.pythonhosted.org/packages/7d/a5/bf6f13b772fbb2a90360eb620d52ed8f796f3c5caee8398c3b2eb7b1c60d/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:82004af6c302b5d3ab2cfc4cc5f29db16123b1a8417f2e25f9066f91d4411090", size = 162045, upload-time = "2025-10-14T04:41:44.821Z" },
+    { url = "https://files.pythonhosted.org/packages/df/c5/d1be898bf0dc3ef9030c3825e5d3b83f2c528d207d246cbabe245966808d/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:2b7d8f6c26245217bd2ad053761201e9f9680f8ce52f0fcd8d0755aeae5b2152", size = 149687, upload-time = "2025-10-14T04:41:46.442Z" },
+    { url = "https://files.pythonhosted.org/packages/a5/42/90c1f7b9341eef50c8a1cb3f098ac43b0508413f33affd762855f67a410e/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:799a7a5e4fb2d5898c60b640fd4981d6a25f1c11790935a44ce38c54e985f828", size = 160014, upload-time = "2025-10-14T04:41:47.631Z" },
+    { url = "https://files.pythonhosted.org/packages/76/be/4d3ee471e8145d12795ab655ece37baed0929462a86e72372fd25859047c/charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:99ae2cffebb06e6c22bdc25801d7b30f503cc87dbd283479e7b606f70aff57ec", size = 154044, upload-time = "2025-10-14T04:41:48.81Z" },
+    { url = "https://files.pythonhosted.org/packages/b0/6f/8f7af07237c34a1defe7defc565a9bc1807762f672c0fde711a4b22bf9c0/charset_normalizer-3.4.4-cp314-cp314-win32.whl", hash = "sha256:f9d332f8c2a2fcbffe1378594431458ddbef721c1769d78e2cbc06280d8155f9", size = 99940, upload-time = "2025-10-14T04:41:49.946Z" },
+    { url = "https://files.pythonhosted.org/packages/4b/51/8ade005e5ca5b0d80fb4aff72a3775b325bdc3d27408c8113811a7cbe640/charset_normalizer-3.4.4-cp314-cp314-win_amd64.whl", hash = "sha256:8a6562c3700cce886c5be75ade4a5db4214fda19fede41d9792d100288d8f94c", size = 107104, upload-time = "2025-10-14T04:41:51.051Z" },
+    { url = "https://files.pythonhosted.org/packages/da/5f/6b8f83a55bb8278772c5ae54a577f3099025f9ade59d0136ac24a0df4bde/charset_normalizer-3.4.4-cp314-cp314-win_arm64.whl", hash = "sha256:de00632ca48df9daf77a2c65a484531649261ec9f25489917f09e455cb09ddb2", size = 100743, upload-time = "2025-10-14T04:41:52.122Z" },
+    { url = "https://files.pythonhosted.org/packages/0a/4c/925909008ed5a988ccbb72dcc897407e5d6d3bd72410d69e051fc0c14647/charset_normalizer-3.4.4-py3-none-any.whl", hash = "sha256:7a32c560861a02ff789ad905a2fe94e3f840803362c84fecf1851cb4cf3dc37f", size = 53402, upload-time = "2025-10-14T04:42:31.76Z" },
+]
+
+[[package]]
+name = "click"
+version = "8.3.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "colorama", marker = "sys_platform == 'win32'" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/3d/fa/656b739db8587d7b5dfa22e22ed02566950fbfbcdc20311993483657a5c0/click-8.3.1.tar.gz", hash = "sha256:12ff4785d337a1bb490bb7e9c2b1ee5da3112e94a8622f26a6c77f5d2fc6842a", size = 295065, upload-time = "2025-11-15T20:45:42.706Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/98/78/01c019cdb5d6498122777c1a43056ebb3ebfeef2076d9d026bfe15583b2b/click-8.3.1-py3-none-any.whl", hash = "sha256:981153a64e25f12d547d3426c367a4857371575ee7ad18df2a6183ab0545b2a6", size = 108274, upload-time = "2025-11-15T20:45:41.139Z" },
+]
+
+[[package]]
+name = "colorama"
+version = "0.4.6"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697, upload-time = "2022-10-25T02:36:22.414Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" },
+]
+
+[[package]]
+name = "flask"
+version = "3.1.2"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "blinker" },
+    { name = "click" },
+    { name = "itsdangerous" },
+    { name = "jinja2" },
+    { name = "markupsafe" },
+    { name = "werkzeug" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/dc/6d/cfe3c0fcc5e477df242b98bfe186a4c34357b4847e87ecaef04507332dab/flask-3.1.2.tar.gz", hash = "sha256:bf656c15c80190ed628ad08cdfd3aaa35beb087855e2f494910aa3774cc4fd87", size = 720160, upload-time = "2025-08-19T21:03:21.205Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/ec/f9/7f9263c5695f4bd0023734af91bedb2ff8209e8de6ead162f35d8dc762fd/flask-3.1.2-py3-none-any.whl", hash = "sha256:ca1d8112ec8a6158cc29ea4858963350011b5c846a414cdb7a954aa9e967d03c", size = 103308, upload-time = "2025-08-19T21:03:19.499Z" },
+]
+
+[[package]]
+name = "grelay"
+version = "0.1.0"
+source = { virtual = "." }
+dependencies = [
+    { name = "flask" },
+    { name = "requests" },
+]
+
+[package.metadata]
+requires-dist = [
+    { name = "flask", specifier = ">=3.1.2" },
+    { name = "requests", specifier = ">=2.32.5" },
+]
+
+[[package]]
+name = "idna"
+version = "3.11"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/6f/6d/0703ccc57f3a7233505399edb88de3cbd678da106337b9fcde432b65ed60/idna-3.11.tar.gz", hash = "sha256:795dafcc9c04ed0c1fb032c2aa73654d8e8c5023a7df64a53f39190ada629902", size = 194582, upload-time = "2025-10-12T14:55:20.501Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/0e/61/66938bbb5fc52dbdf84594873d5b51fb1f7c7794e9c0f5bd885f30bc507b/idna-3.11-py3-none-any.whl", hash = "sha256:771a87f49d9defaf64091e6e6fe9c18d4833f140bd19464795bc32d966ca37ea", size = 71008, upload-time = "2025-10-12T14:55:18.883Z" },
+]
+
+[[package]]
+name = "itsdangerous"
+version = "2.2.0"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/9c/cb/8ac0172223afbccb63986cc25049b154ecfb5e85932587206f42317be31d/itsdangerous-2.2.0.tar.gz", hash = "sha256:e0050c0b7da1eea53ffaf149c0cfbb5c6e2e2b69c4bef22c81fa6eb73e5f6173", size = 54410, upload-time = "2024-04-16T21:28:15.614Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/04/96/92447566d16df59b2a776c0fb82dbc4d9e07cd95062562af01e408583fc4/itsdangerous-2.2.0-py3-none-any.whl", hash = "sha256:c6242fc49e35958c8b15141343aa660db5fc54d4f13a1db01a3f5891b98700ef", size = 16234, upload-time = "2024-04-16T21:28:14.499Z" },
+]
+
+[[package]]
+name = "jinja2"
+version = "3.1.6"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "markupsafe" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/df/bf/f7da0350254c0ed7c72f3e33cef02e048281fec7ecec5f032d4aac52226b/jinja2-3.1.6.tar.gz", hash = "sha256:0137fb05990d35f1275a587e9aee6d56da821fc83491a0fb838183be43f66d6d", size = 245115, upload-time = "2025-03-05T20:05:02.478Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/62/a1/3d680cbfd5f4b8f15abc1d571870c5fc3e594bb582bc3b64ea099db13e56/jinja2-3.1.6-py3-none-any.whl", hash = "sha256:85ece4451f492d0c13c5dd7c13a64681a86afae63a5f347908daf103ce6d2f67", size = 134899, upload-time = "2025-03-05T20:05:00.369Z" },
+]
+
+[[package]]
+name = "markupsafe"
+version = "3.0.3"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/7e/99/7690b6d4034fffd95959cbe0c02de8deb3098cc577c67bb6a24fe5d7caa7/markupsafe-3.0.3.tar.gz", hash = "sha256:722695808f4b6457b320fdc131280796bdceb04ab50fe1795cd540799ebe1698", size = 80313, upload-time = "2025-09-27T18:37:40.426Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/08/db/fefacb2136439fc8dd20e797950e749aa1f4997ed584c62cfb8ef7c2be0e/markupsafe-3.0.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:1cc7ea17a6824959616c525620e387f6dd30fec8cb44f649e31712db02123dad", size = 11631, upload-time = "2025-09-27T18:36:18.185Z" },
+    { url = "https://files.pythonhosted.org/packages/e1/2e/5898933336b61975ce9dc04decbc0a7f2fee78c30353c5efba7f2d6ff27a/markupsafe-3.0.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:4bd4cd07944443f5a265608cc6aab442e4f74dff8088b0dfc8238647b8f6ae9a", size = 12058, upload-time = "2025-09-27T18:36:19.444Z" },
+    { url = "https://files.pythonhosted.org/packages/1d/09/adf2df3699d87d1d8184038df46a9c80d78c0148492323f4693df54e17bb/markupsafe-3.0.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6b5420a1d9450023228968e7e6a9ce57f65d148ab56d2313fcd589eee96a7a50", size = 24287, upload-time = "2025-09-27T18:36:20.768Z" },
+    { url = "https://files.pythonhosted.org/packages/30/ac/0273f6fcb5f42e314c6d8cd99effae6a5354604d461b8d392b5ec9530a54/markupsafe-3.0.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0bf2a864d67e76e5c9a34dc26ec616a66b9888e25e7b9460e1c76d3293bd9dbf", size = 22940, upload-time = "2025-09-27T18:36:22.249Z" },
+    { url = "https://files.pythonhosted.org/packages/19/ae/31c1be199ef767124c042c6c3e904da327a2f7f0cd63a0337e1eca2967a8/markupsafe-3.0.3-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:bc51efed119bc9cfdf792cdeaa4d67e8f6fcccab66ed4bfdd6bde3e59bfcbb2f", size = 21887, upload-time = "2025-09-27T18:36:23.535Z" },
+    { url = "https://files.pythonhosted.org/packages/b2/76/7edcab99d5349a4532a459e1fe64f0b0467a3365056ae550d3bcf3f79e1e/markupsafe-3.0.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:068f375c472b3e7acbe2d5318dea141359e6900156b5b2ba06a30b169086b91a", size = 23692, upload-time = "2025-09-27T18:36:24.823Z" },
+    { url = "https://files.pythonhosted.org/packages/a4/28/6e74cdd26d7514849143d69f0bf2399f929c37dc2b31e6829fd2045b2765/markupsafe-3.0.3-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:7be7b61bb172e1ed687f1754f8e7484f1c8019780f6f6b0786e76bb01c2ae115", size = 21471, upload-time = "2025-09-27T18:36:25.95Z" },
+    { url = "https://files.pythonhosted.org/packages/62/7e/a145f36a5c2945673e590850a6f8014318d5577ed7e5920a4b3448e0865d/markupsafe-3.0.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:f9e130248f4462aaa8e2552d547f36ddadbeaa573879158d721bbd33dfe4743a", size = 22923, upload-time = "2025-09-27T18:36:27.109Z" },
+    { url = "https://files.pythonhosted.org/packages/0f/62/d9c46a7f5c9adbeeeda52f5b8d802e1094e9717705a645efc71b0913a0a8/markupsafe-3.0.3-cp311-cp311-win32.whl", hash = "sha256:0db14f5dafddbb6d9208827849fad01f1a2609380add406671a26386cdf15a19", size = 14572, upload-time = "2025-09-27T18:36:28.045Z" },
+    { url = "https://files.pythonhosted.org/packages/83/8a/4414c03d3f891739326e1783338e48fb49781cc915b2e0ee052aa490d586/markupsafe-3.0.3-cp311-cp311-win_amd64.whl", hash = "sha256:de8a88e63464af587c950061a5e6a67d3632e36df62b986892331d4620a35c01", size = 15077, upload-time = "2025-09-27T18:36:29.025Z" },
+    { url = "https://files.pythonhosted.org/packages/35/73/893072b42e6862f319b5207adc9ae06070f095b358655f077f69a35601f0/markupsafe-3.0.3-cp311-cp311-win_arm64.whl", hash = "sha256:3b562dd9e9ea93f13d53989d23a7e775fdfd1066c33494ff43f5418bc8c58a5c", size = 13876, upload-time = "2025-09-27T18:36:29.954Z" },
+    { url = "https://files.pythonhosted.org/packages/5a/72/147da192e38635ada20e0a2e1a51cf8823d2119ce8883f7053879c2199b5/markupsafe-3.0.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:d53197da72cc091b024dd97249dfc7794d6a56530370992a5e1a08983ad9230e", size = 11615, upload-time = "2025-09-27T18:36:30.854Z" },
+    { url = "https://files.pythonhosted.org/packages/9a/81/7e4e08678a1f98521201c3079f77db69fb552acd56067661f8c2f534a718/markupsafe-3.0.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1872df69a4de6aead3491198eaf13810b565bdbeec3ae2dc8780f14458ec73ce", size = 12020, upload-time = "2025-09-27T18:36:31.971Z" },
+    { url = "https://files.pythonhosted.org/packages/1e/2c/799f4742efc39633a1b54a92eec4082e4f815314869865d876824c257c1e/markupsafe-3.0.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3a7e8ae81ae39e62a41ec302f972ba6ae23a5c5396c8e60113e9066ef893da0d", size = 24332, upload-time = "2025-09-27T18:36:32.813Z" },
+    { url = "https://files.pythonhosted.org/packages/3c/2e/8d0c2ab90a8c1d9a24f0399058ab8519a3279d1bd4289511d74e909f060e/markupsafe-3.0.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d6dd0be5b5b189d31db7cda48b91d7e0a9795f31430b7f271219ab30f1d3ac9d", size = 22947, upload-time = "2025-09-27T18:36:33.86Z" },
+    { url = "https://files.pythonhosted.org/packages/2c/54/887f3092a85238093a0b2154bd629c89444f395618842e8b0c41783898ea/markupsafe-3.0.3-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:94c6f0bb423f739146aec64595853541634bde58b2135f27f61c1ffd1cd4d16a", size = 21962, upload-time = "2025-09-27T18:36:35.099Z" },
+    { url = "https://files.pythonhosted.org/packages/c9/2f/336b8c7b6f4a4d95e91119dc8521402461b74a485558d8f238a68312f11c/markupsafe-3.0.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:be8813b57049a7dc738189df53d69395eba14fb99345e0a5994914a3864c8a4b", size = 23760, upload-time = "2025-09-27T18:36:36.001Z" },
+    { url = "https://files.pythonhosted.org/packages/32/43/67935f2b7e4982ffb50a4d169b724d74b62a3964bc1a9a527f5ac4f1ee2b/markupsafe-3.0.3-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:83891d0e9fb81a825d9a6d61e3f07550ca70a076484292a70fde82c4b807286f", size = 21529, upload-time = "2025-09-27T18:36:36.906Z" },
+    { url = "https://files.pythonhosted.org/packages/89/e0/4486f11e51bbba8b0c041098859e869e304d1c261e59244baa3d295d47b7/markupsafe-3.0.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:77f0643abe7495da77fb436f50f8dab76dbc6e5fd25d39589a0f1fe6548bfa2b", size = 23015, upload-time = "2025-09-27T18:36:37.868Z" },
+    { url = "https://files.pythonhosted.org/packages/2f/e1/78ee7a023dac597a5825441ebd17170785a9dab23de95d2c7508ade94e0e/markupsafe-3.0.3-cp312-cp312-win32.whl", hash = "sha256:d88b440e37a16e651bda4c7c2b930eb586fd15ca7406cb39e211fcff3bf3017d", size = 14540, upload-time = "2025-09-27T18:36:38.761Z" },
+    { url = "https://files.pythonhosted.org/packages/aa/5b/bec5aa9bbbb2c946ca2733ef9c4ca91c91b6a24580193e891b5f7dbe8e1e/markupsafe-3.0.3-cp312-cp312-win_amd64.whl", hash = "sha256:26a5784ded40c9e318cfc2bdb30fe164bdb8665ded9cd64d500a34fb42067b1c", size = 15105, upload-time = "2025-09-27T18:36:39.701Z" },
+    { url = "https://files.pythonhosted.org/packages/e5/f1/216fc1bbfd74011693a4fd837e7026152e89c4bcf3e77b6692fba9923123/markupsafe-3.0.3-cp312-cp312-win_arm64.whl", hash = "sha256:35add3b638a5d900e807944a078b51922212fb3dedb01633a8defc4b01a3c85f", size = 13906, upload-time = "2025-09-27T18:36:40.689Z" },
+    { url = "https://files.pythonhosted.org/packages/38/2f/907b9c7bbba283e68f20259574b13d005c121a0fa4c175f9bed27c4597ff/markupsafe-3.0.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:e1cf1972137e83c5d4c136c43ced9ac51d0e124706ee1c8aa8532c1287fa8795", size = 11622, upload-time = "2025-09-27T18:36:41.777Z" },
+    { url = "https://files.pythonhosted.org/packages/9c/d9/5f7756922cdd676869eca1c4e3c0cd0df60ed30199ffd775e319089cb3ed/markupsafe-3.0.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:116bb52f642a37c115f517494ea5feb03889e04df47eeff5b130b1808ce7c219", size = 12029, upload-time = "2025-09-27T18:36:43.257Z" },
+    { url = "https://files.pythonhosted.org/packages/00/07/575a68c754943058c78f30db02ee03a64b3c638586fba6a6dd56830b30a3/markupsafe-3.0.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:133a43e73a802c5562be9bbcd03d090aa5a1fe899db609c29e8c8d815c5f6de6", size = 24374, upload-time = "2025-09-27T18:36:44.508Z" },
+    { url = "https://files.pythonhosted.org/packages/a9/21/9b05698b46f218fc0e118e1f8168395c65c8a2c750ae2bab54fc4bd4e0e8/markupsafe-3.0.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ccfcd093f13f0f0b7fdd0f198b90053bf7b2f02a3927a30e63f3ccc9df56b676", size = 22980, upload-time = "2025-09-27T18:36:45.385Z" },
+    { url = "https://files.pythonhosted.org/packages/7f/71/544260864f893f18b6827315b988c146b559391e6e7e8f7252839b1b846a/markupsafe-3.0.3-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:509fa21c6deb7a7a273d629cf5ec029bc209d1a51178615ddf718f5918992ab9", size = 21990, upload-time = "2025-09-27T18:36:46.916Z" },
+    { url = "https://files.pythonhosted.org/packages/c2/28/b50fc2f74d1ad761af2f5dcce7492648b983d00a65b8c0e0cb457c82ebbe/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a4afe79fb3de0b7097d81da19090f4df4f8d3a2b3adaa8764138aac2e44f3af1", size = 23784, upload-time = "2025-09-27T18:36:47.884Z" },
+    { url = "https://files.pythonhosted.org/packages/ed/76/104b2aa106a208da8b17a2fb72e033a5a9d7073c68f7e508b94916ed47a9/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:795e7751525cae078558e679d646ae45574b47ed6e7771863fcc079a6171a0fc", size = 21588, upload-time = "2025-09-27T18:36:48.82Z" },
+    { url = "https://files.pythonhosted.org/packages/b5/99/16a5eb2d140087ebd97180d95249b00a03aa87e29cc224056274f2e45fd6/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8485f406a96febb5140bfeca44a73e3ce5116b2501ac54fe953e488fb1d03b12", size = 23041, upload-time = "2025-09-27T18:36:49.797Z" },
+    { url = "https://files.pythonhosted.org/packages/19/bc/e7140ed90c5d61d77cea142eed9f9c303f4c4806f60a1044c13e3f1471d0/markupsafe-3.0.3-cp313-cp313-win32.whl", hash = "sha256:bdd37121970bfd8be76c5fb069c7751683bdf373db1ed6c010162b2a130248ed", size = 14543, upload-time = "2025-09-27T18:36:51.584Z" },
+    { url = "https://files.pythonhosted.org/packages/05/73/c4abe620b841b6b791f2edc248f556900667a5a1cf023a6646967ae98335/markupsafe-3.0.3-cp313-cp313-win_amd64.whl", hash = "sha256:9a1abfdc021a164803f4d485104931fb8f8c1efd55bc6b748d2f5774e78b62c5", size = 15113, upload-time = "2025-09-27T18:36:52.537Z" },
+    { url = "https://files.pythonhosted.org/packages/f0/3a/fa34a0f7cfef23cf9500d68cb7c32dd64ffd58a12b09225fb03dd37d5b80/markupsafe-3.0.3-cp313-cp313-win_arm64.whl", hash = "sha256:7e68f88e5b8799aa49c85cd116c932a1ac15caaa3f5db09087854d218359e485", size = 13911, upload-time = "2025-09-27T18:36:53.513Z" },
+    { url = "https://files.pythonhosted.org/packages/e4/d7/e05cd7efe43a88a17a37b3ae96e79a19e846f3f456fe79c57ca61356ef01/markupsafe-3.0.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:218551f6df4868a8d527e3062d0fb968682fe92054e89978594c28e642c43a73", size = 11658, upload-time = "2025-09-27T18:36:54.819Z" },
+    { url = "https://files.pythonhosted.org/packages/99/9e/e412117548182ce2148bdeacdda3bb494260c0b0184360fe0d56389b523b/markupsafe-3.0.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:3524b778fe5cfb3452a09d31e7b5adefeea8c5be1d43c4f810ba09f2ceb29d37", size = 12066, upload-time = "2025-09-27T18:36:55.714Z" },
+    { url = "https://files.pythonhosted.org/packages/bc/e6/fa0ffcda717ef64a5108eaa7b4f5ed28d56122c9a6d70ab8b72f9f715c80/markupsafe-3.0.3-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4e885a3d1efa2eadc93c894a21770e4bc67899e3543680313b09f139e149ab19", size = 25639, upload-time = "2025-09-27T18:36:56.908Z" },
+    { url = "https://files.pythonhosted.org/packages/96/ec/2102e881fe9d25fc16cb4b25d5f5cde50970967ffa5dddafdb771237062d/markupsafe-3.0.3-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8709b08f4a89aa7586de0aadc8da56180242ee0ada3999749b183aa23df95025", size = 23569, upload-time = "2025-09-27T18:36:57.913Z" },
+    { url = "https://files.pythonhosted.org/packages/4b/30/6f2fce1f1f205fc9323255b216ca8a235b15860c34b6798f810f05828e32/markupsafe-3.0.3-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:b8512a91625c9b3da6f127803b166b629725e68af71f8184ae7e7d54686a56d6", size = 23284, upload-time = "2025-09-27T18:36:58.833Z" },
+    { url = "https://files.pythonhosted.org/packages/58/47/4a0ccea4ab9f5dcb6f79c0236d954acb382202721e704223a8aafa38b5c8/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:9b79b7a16f7fedff2495d684f2b59b0457c3b493778c9eed31111be64d58279f", size = 24801, upload-time = "2025-09-27T18:36:59.739Z" },
+    { url = "https://files.pythonhosted.org/packages/6a/70/3780e9b72180b6fecb83a4814d84c3bf4b4ae4bf0b19c27196104149734c/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:12c63dfb4a98206f045aa9563db46507995f7ef6d83b2f68eda65c307c6829eb", size = 22769, upload-time = "2025-09-27T18:37:00.719Z" },
+    { url = "https://files.pythonhosted.org/packages/98/c5/c03c7f4125180fc215220c035beac6b9cb684bc7a067c84fc69414d315f5/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:8f71bc33915be5186016f675cd83a1e08523649b0e33efdb898db577ef5bb009", size = 23642, upload-time = "2025-09-27T18:37:01.673Z" },
+    { url = "https://files.pythonhosted.org/packages/80/d6/2d1b89f6ca4bff1036499b1e29a1d02d282259f3681540e16563f27ebc23/markupsafe-3.0.3-cp313-cp313t-win32.whl", hash = "sha256:69c0b73548bc525c8cb9a251cddf1931d1db4d2258e9599c28c07ef3580ef354", size = 14612, upload-time = "2025-09-27T18:37:02.639Z" },
+    { url = "https://files.pythonhosted.org/packages/2b/98/e48a4bfba0a0ffcf9925fe2d69240bfaa19c6f7507b8cd09c70684a53c1e/markupsafe-3.0.3-cp313-cp313t-win_amd64.whl", hash = "sha256:1b4b79e8ebf6b55351f0d91fe80f893b4743f104bff22e90697db1590e47a218", size = 15200, upload-time = "2025-09-27T18:37:03.582Z" },
+    { url = "https://files.pythonhosted.org/packages/0e/72/e3cc540f351f316e9ed0f092757459afbc595824ca724cbc5a5d4263713f/markupsafe-3.0.3-cp313-cp313t-win_arm64.whl", hash = "sha256:ad2cf8aa28b8c020ab2fc8287b0f823d0a7d8630784c31e9ee5edea20f406287", size = 13973, upload-time = "2025-09-27T18:37:04.929Z" },
+    { url = "https://files.pythonhosted.org/packages/33/8a/8e42d4838cd89b7dde187011e97fe6c3af66d8c044997d2183fbd6d31352/markupsafe-3.0.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:eaa9599de571d72e2daf60164784109f19978b327a3910d3e9de8c97b5b70cfe", size = 11619, upload-time = "2025-09-27T18:37:06.342Z" },
+    { url = "https://files.pythonhosted.org/packages/b5/64/7660f8a4a8e53c924d0fa05dc3a55c9cee10bbd82b11c5afb27d44b096ce/markupsafe-3.0.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c47a551199eb8eb2121d4f0f15ae0f923d31350ab9280078d1e5f12b249e0026", size = 12029, upload-time = "2025-09-27T18:37:07.213Z" },
+    { url = "https://files.pythonhosted.org/packages/da/ef/e648bfd021127bef5fa12e1720ffed0c6cbb8310c8d9bea7266337ff06de/markupsafe-3.0.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f34c41761022dd093b4b6896d4810782ffbabe30f2d443ff5f083e0cbbb8c737", size = 24408, upload-time = "2025-09-27T18:37:09.572Z" },
+    { url = "https://files.pythonhosted.org/packages/41/3c/a36c2450754618e62008bf7435ccb0f88053e07592e6028a34776213d877/markupsafe-3.0.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:457a69a9577064c05a97c41f4e65148652db078a3a509039e64d3467b9e7ef97", size = 23005, upload-time = "2025-09-27T18:37:10.58Z" },
+    { url = "https://files.pythonhosted.org/packages/bc/20/b7fdf89a8456b099837cd1dc21974632a02a999ec9bf7ca3e490aacd98e7/markupsafe-3.0.3-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:e8afc3f2ccfa24215f8cb28dcf43f0113ac3c37c2f0f0806d8c70e4228c5cf4d", size = 22048, upload-time = "2025-09-27T18:37:11.547Z" },
+    { url = "https://files.pythonhosted.org/packages/9a/a7/591f592afdc734f47db08a75793a55d7fbcc6902a723ae4cfbab61010cc5/markupsafe-3.0.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:ec15a59cf5af7be74194f7ab02d0f59a62bdcf1a537677ce67a2537c9b87fcda", size = 23821, upload-time = "2025-09-27T18:37:12.48Z" },
+    { url = "https://files.pythonhosted.org/packages/7d/33/45b24e4f44195b26521bc6f1a82197118f74df348556594bd2262bda1038/markupsafe-3.0.3-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:0eb9ff8191e8498cca014656ae6b8d61f39da5f95b488805da4bb029cccbfbaf", size = 21606, upload-time = "2025-09-27T18:37:13.485Z" },
+    { url = "https://files.pythonhosted.org/packages/ff/0e/53dfaca23a69fbfbbf17a4b64072090e70717344c52eaaaa9c5ddff1e5f0/markupsafe-3.0.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:2713baf880df847f2bece4230d4d094280f4e67b1e813eec43b4c0e144a34ffe", size = 23043, upload-time = "2025-09-27T18:37:14.408Z" },
+    { url = "https://files.pythonhosted.org/packages/46/11/f333a06fc16236d5238bfe74daccbca41459dcd8d1fa952e8fbd5dccfb70/markupsafe-3.0.3-cp314-cp314-win32.whl", hash = "sha256:729586769a26dbceff69f7a7dbbf59ab6572b99d94576a5592625d5b411576b9", size = 14747, upload-time = "2025-09-27T18:37:15.36Z" },
+    { url = "https://files.pythonhosted.org/packages/28/52/182836104b33b444e400b14f797212f720cbc9ed6ba34c800639d154e821/markupsafe-3.0.3-cp314-cp314-win_amd64.whl", hash = "sha256:bdc919ead48f234740ad807933cdf545180bfbe9342c2bb451556db2ed958581", size = 15341, upload-time = "2025-09-27T18:37:16.496Z" },
+    { url = "https://files.pythonhosted.org/packages/6f/18/acf23e91bd94fd7b3031558b1f013adfa21a8e407a3fdb32745538730382/markupsafe-3.0.3-cp314-cp314-win_arm64.whl", hash = "sha256:5a7d5dc5140555cf21a6fefbdbf8723f06fcd2f63ef108f2854de715e4422cb4", size = 14073, upload-time = "2025-09-27T18:37:17.476Z" },
+    { url = "https://files.pythonhosted.org/packages/3c/f0/57689aa4076e1b43b15fdfa646b04653969d50cf30c32a102762be2485da/markupsafe-3.0.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:1353ef0c1b138e1907ae78e2f6c63ff67501122006b0f9abad68fda5f4ffc6ab", size = 11661, upload-time = "2025-09-27T18:37:18.453Z" },
+    { url = "https://files.pythonhosted.org/packages/89/c3/2e67a7ca217c6912985ec766c6393b636fb0c2344443ff9d91404dc4c79f/markupsafe-3.0.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:1085e7fbddd3be5f89cc898938f42c0b3c711fdcb37d75221de2666af647c175", size = 12069, upload-time = "2025-09-27T18:37:19.332Z" },
+    { url = "https://files.pythonhosted.org/packages/f0/00/be561dce4e6ca66b15276e184ce4b8aec61fe83662cce2f7d72bd3249d28/markupsafe-3.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1b52b4fb9df4eb9ae465f8d0c228a00624de2334f216f178a995ccdcf82c4634", size = 25670, upload-time = "2025-09-27T18:37:20.245Z" },
+    { url = "https://files.pythonhosted.org/packages/50/09/c419f6f5a92e5fadde27efd190eca90f05e1261b10dbd8cbcb39cd8ea1dc/markupsafe-3.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:fed51ac40f757d41b7c48425901843666a6677e3e8eb0abcff09e4ba6e664f50", size = 23598, upload-time = "2025-09-27T18:37:21.177Z" },
+    { url = "https://files.pythonhosted.org/packages/22/44/a0681611106e0b2921b3033fc19bc53323e0b50bc70cffdd19f7d679bb66/markupsafe-3.0.3-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:f190daf01f13c72eac4efd5c430a8de82489d9cff23c364c3ea822545032993e", size = 23261, upload-time = "2025-09-27T18:37:22.167Z" },
+    { url = "https://files.pythonhosted.org/packages/5f/57/1b0b3f100259dc9fffe780cfb60d4be71375510e435efec3d116b6436d43/markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:e56b7d45a839a697b5eb268c82a71bd8c7f6c94d6fd50c3d577fa39a9f1409f5", size = 24835, upload-time = "2025-09-27T18:37:23.296Z" },
+    { url = "https://files.pythonhosted.org/packages/26/6a/4bf6d0c97c4920f1597cc14dd720705eca0bf7c787aebc6bb4d1bead5388/markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:f3e98bb3798ead92273dc0e5fd0f31ade220f59a266ffd8a4f6065e0a3ce0523", size = 22733, upload-time = "2025-09-27T18:37:24.237Z" },
+    { url = "https://files.pythonhosted.org/packages/14/c7/ca723101509b518797fedc2fdf79ba57f886b4aca8a7d31857ba3ee8281f/markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:5678211cb9333a6468fb8d8be0305520aa073f50d17f089b5b4b477ea6e67fdc", size = 23672, upload-time = "2025-09-27T18:37:25.271Z" },
+    { url = "https://files.pythonhosted.org/packages/fb/df/5bd7a48c256faecd1d36edc13133e51397e41b73bb77e1a69deab746ebac/markupsafe-3.0.3-cp314-cp314t-win32.whl", hash = "sha256:915c04ba3851909ce68ccc2b8e2cd691618c4dc4c4232fb7982bca3f41fd8c3d", size = 14819, upload-time = "2025-09-27T18:37:26.285Z" },
+    { url = "https://files.pythonhosted.org/packages/1a/8a/0402ba61a2f16038b48b39bccca271134be00c5c9f0f623208399333c448/markupsafe-3.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:4faffd047e07c38848ce017e8725090413cd80cbc23d86e55c587bf979e579c9", size = 15426, upload-time = "2025-09-27T18:37:27.316Z" },
+    { url = "https://files.pythonhosted.org/packages/70/bc/6f1c2f612465f5fa89b95bead1f44dcb607670fd42891d8fdcd5d039f4f4/markupsafe-3.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:32001d6a8fc98c8cb5c947787c5d08b0a50663d139f1305bac5885d98d9b40fa", size = 14146, upload-time = "2025-09-27T18:37:28.327Z" },
+]
+
+[[package]]
+name = "requests"
+version = "2.32.5"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "certifi" },
+    { name = "charset-normalizer" },
+    { name = "idna" },
+    { name = "urllib3" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/c9/74/b3ff8e6c8446842c3f5c837e9c3dfcfe2018ea6ecef224c710c85ef728f4/requests-2.32.5.tar.gz", hash = "sha256:dbba0bac56e100853db0ea71b82b4dfd5fe2bf6d3754a8893c3af500cec7d7cf", size = 134517, upload-time = "2025-08-18T20:46:02.573Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/1e/db/4254e3eabe8020b458f1a747140d32277ec7a271daf1d235b70dc0b4e6e3/requests-2.32.5-py3-none-any.whl", hash = "sha256:2462f94637a34fd532264295e186976db0f5d453d1cdd31473c85a6a161affb6", size = 64738, upload-time = "2025-08-18T20:46:00.542Z" },
+]
+
+[[package]]
+name = "urllib3"
+version = "2.6.2"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/1e/24/a2a2ed9addd907787d7aa0355ba36a6cadf1768b934c652ea78acbd59dcd/urllib3-2.6.2.tar.gz", hash = "sha256:016f9c98bb7e98085cb2b4b17b87d2c702975664e4f060c6532e64d1c1a5e797", size = 432930, upload-time = "2025-12-11T15:56:40.252Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/6d/b9/4095b668ea3678bf6a0af005527f39de12fb026516fb3df17495a733b7f8/urllib3-2.6.2-py3-none-any.whl", hash = "sha256:ec21cddfe7724fc7cb4ba4bea7aa8e2ef36f607a4bab81aa6ce42a13dc3f03dd", size = 131182, upload-time = "2025-12-11T15:56:38.584Z" },
+]
+
+[[package]]
+name = "werkzeug"
+version = "3.1.4"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "markupsafe" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/45/ea/b0f8eeb287f8df9066e56e831c7824ac6bab645dd6c7a8f4b2d767944f9b/werkzeug-3.1.4.tar.gz", hash = "sha256:cd3cd98b1b92dc3b7b3995038826c68097dcb16f9baa63abe35f20eafeb9fe5e", size = 864687, upload-time = "2025-11-29T02:15:22.841Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/2f/f9/9e082990c2585c744734f85bec79b5dae5df9c974ffee58fe421652c8e91/werkzeug-3.1.4-py3-none-any.whl", hash = "sha256:2ad50fb9ed09cc3af22c54698351027ace879a0b60a3b5edf5730b2f7d876905", size = 224960, upload-time = "2025-11-29T02:15:21.13Z" },
+]