diff --git a/.gitignore b/.gitignore
index 783381a..390bd5d 100644
--- a/.gitignore
+++ b/.gitignore
@@ -154,4 +154,10 @@ migrations
*.sql
*.json
celerybeat-schedule*
-*.db
\ No newline at end of file
+*.db
+
+# Ruff cache
+.ruff_cache
+
+# JS dependencies
+node_modules
\ No newline at end of file
diff --git a/README.md b/README.md
index e7ebdeb..b8d17b5 100644
--- a/README.md
+++ b/README.md
@@ -7,7 +7,7 @@
# PYGENTIC-AI
-Empowering AI breakthroughs with Pygentic-AI innovation.
+Empowering AI innovation with seamless asynchronous processing.
@@ -46,16 +46,15 @@
## Overview
-**Pygentic-AI**
+Pygentic-AI simplifies project setup and backend operations for developers.
**Why Pygentic-AI?**
-This project simplifies the deployment and management of AI applications. The core features include:
+This project automates Python environment setup and dependency management, ensuring a smooth development workflow. The custom logger enhances backend logging efficiency, while the RESTful API server streamlines backend operations.
-- **Orchestration:** Define deployment configurations with compose.yaml for seamless scalability.
-- **Automation:** Build Docker images effortlessly using Dockerfile for efficient environment setup.
-- **Streamlined Setup:** Manage project dependencies and setup with provided build scripts.
-- **Backend Functionality:** Implement essential backend features like RESTful APIs and database operations.
+- **Automated Python Setup:** Simplify environment configuration and dependency management.
+- **Custom Logger:** Efficient logging with customizable features for backend operations.
+- **RESTful API Server:** Facilitate seamless backend operations with a robust API server.
---
@@ -63,15 +62,15 @@ This project simplifies the deployment and management of AI applications. The co
|      | Component        | Details                               |
| :--- | :--------------- | :------------------------------------ |
-|      | **Architecture** | `core_requirements.in` specifies the dependencies necessary for running the codebase, including libraries for async operations, database connectivity, API development, task queuing, and more. |
@@ -220,11 +196,11 @@ This project simplifies the deployment and management of AI applications. The co
| File Name | Summary |
| :--- | :--- |
-| `Untitled.ipynb` | Create necessary project directories and files if they do not exist within the specified structure<br>This code segment ensures the presence of essential backend folders and files for the project to function correctly. |
+| `Untitled.ipynb` | Launches the application using Gunicorn with specified configurations, such as the number of workers, timeout, and port<br>The script activates the virtual environment and starts the server to handle incoming requests. |
-| `app.py` | Implement a RESTful API endpoint in src\app.py to handle user authentication for the project. |
+| `app.py` | Define exception handlers and mount static files for the FastAPI app using the provided code<br>Handle validation errors and custom exceptions, logging details and returning appropriate responses<br>Serve static files from the specified directory for the frontend. |
@@ -242,11 +218,11 @@ This project simplifies the deployment and management of AI applications. The co
-| `cworker.py` | Implement a concurrent worker system to enhance performance and scalability<br>This code file in src\cworker.py manages worker threads efficiently within the project structure. |
+| `cworker.py` | Improve concurrency by managing worker threads efficiently<br>This code file in src\cworker.py orchestrates thread creation and execution within the projects architecture. |
-| `logger.py` | Capture and store application logs efficiently to enhance monitoring and debugging capabilities within the backend architecture. |
+| `logger.py` | Implement a custom logger using Loguru for efficient logging in the backend<br>The logger supports various log levels and customization options, enhancing the logging experience<br>It includes features like log rotation, retention, and different log formats<br>The code ensures robust logging functionality for the project. |
@@ -264,19 +240,23 @@ This project simplifies the deployment and management of AI applications. The co
-| `utils.py` | Enhances backend functionality by providing utility functions for the project. |
+| `utils.py` | Define utility functions to retrieve database URLs based on environment and fetch values from environment variables or configuration files<br>The `get_db_url` function constructs a database URL for different environments, while `get_val` retrieves values with fallback options<br>These functions enhance flexibility and maintainability in managing configurations and environment variables. |
-| `consts.py` | Define and centralize core constants for the backend architecture. |
+| `consts.py` | Define the AI_MODEL and default_system_prompt for the GPT-4o AI assistant in the consts.py file<br>The AI_MODEL specifies the model used, while the default_system_prompt outlines the AIs function of generating SWOT analyses. |
-| `core.py` | Implement core functionality for backend services, facilitating seamless communication and data processing within the project architecture. |
+| `core.py` | Define a SQLModel and Agent Dependencies for SWOT Analysis, creating a SwotAgent with specified parameters for AI model, system prompt, and retries<br>This code file in the core module plays a crucial role in structuring and managing SWOT analysis responses within the projects architecture. |
-| `main.py` | Implement core functionality for backend operations in the main.py file within the src\backend\core directory<br>This code serves as the foundation for the entire projects backend architecture, handling essential operations and logic to ensure smooth functioning of the system. |
+| `main.py` | Implement core functionality for backend services in the main.py file. |
+| `tools.py` | Describe how the `tools.py` file in `src\backend\core` facilitates fetching website content, analyzing competition using the Gemini model, and obtaining insights from Reddit<br>The functions within this file leverage various libraries and APIs to perform these tasks efficiently. |
@@ -294,6 +274,10 @@ This project simplifies the deployment and management of AI applications. The co
-| `utils.py` | Enhance backend functionality by providing essential utility functions<br>This code file in the core directory plays a crucial role in supporting various backend operations<br>It offers a collection of utility functions that streamline common tasks and enhance the overall efficiency of the backend system. |
+| `utils.py` | Define a function to report tool usage and results within the projects core architecture<br>The function checks and updates tool usage history, notifying an update function if available<br>This contributes to tracking tool usage and ensuring status updates are communicated effectively. |
+| `base.py` | Generate base SQLModel class for project, addressing Pydantic V2.5 issue with `__pydantic_extra__` attribute<br>Inherits from SQLModel and sets necessary attributes<br>Integrated with project metadata and configured for eager defaults. |
| `consts.py` | Define database constants for backend operations in the project structure. |
@@ -302,13 +286,17 @@ This project simplifies the deployment and management of AI applications. The co
| `core.py` | Manage core database operations for the backend system, ensuring efficient data handling and storage. |
+| `db.py` | Create and manage async SQLA sessions, handle database operations, and ensure database schema creation<br>Includes functions for checking database existence, creating databases, and setting the current schema<br>Provides utilities for creating database engines and managing database connections<br>Mainly focuses on database setup and interaction for the project. |
| `main.py` | Implement database connection and query functions to manage data persistence for the backend services. |
@@ -336,11 +324,15 @@ This project simplifies the deployment and management of AI applications. The co
-| `utils.py` | Enhances database operations by providing utility functions<br>Facilitates efficient data management within the backend architecture. |
+| `utils.py` | Enhances database operations by providing utility functions<br>Facilitates seamless interaction with the database layer. |
-| `main.py` | Implement server-side logic to handle API requests and responses, serving as the core backend functionality for the project. |
+| `main.py` | Implement a RESTful API server in Python to handle backend operations for the project<br>The main.py file serves as the entry point for the server, orchestrating requests and responses<br>It plays a crucial role in managing the communication between the frontend and backend components of the application. |
+| `router.py` | Define API routes and request handling logic for the backend server. |
@@ -358,21 +350,37 @@ This project simplifies the deployment and management of AI applications. The co
-| `utils.py` | Enhances server functionality by providing utility functions for backend operations<br>This code file in the project architecture streamlines server-side tasks, optimizing performance and facilitating seamless data processing. |
+| `utils.py` | Enhances server functionality by providing utility functions<br>Improves codebase architecture by centralizing common operations<br>Facilitates streamlined development and maintenance processes<br>Promotes code reusability and efficiency within the backend server module. |
+| `backend_options.py` | Generate SQL database connection URLs based on provided configurations for different environments using accepted dialects, including PostgreSQL and SQLite<br>Import these configurations into the appropriate settings file to integrate with the Django Database settings. |
+| `base.py` | Define project settings including backend and frontend directories, database URL, and debug mode in the base settings file. |
-| `consts.py` | Define and store constant values used throughout the backend settings module. |
+| `consts.py` | Define database dialects and set secret key for project configuration. |
-| `core.py` | Manage core settings for the backend system, ensuring seamless configuration across modules<br>This file serves as the central hub for defining and organizing key parameters that drive the functionality of the entire codebase<br>It plays a crucial role in maintaining consistency and coherence in the projects architecture. |
+| `core.py` | Enhances core settings functionality by providing centralized configuration management<br>Facilitates seamless access and modification of key system parameters<br>Improves maintainability and scalability of the project by consolidating settings logic. |
+| `dev.py` | Define development settings for backend with local database configuration and debugging enabled. |
-| `main.py` | Enhances backend settings functionality by managing configuration data<br>Facilitates dynamic adjustments to settings without code changes<br>Improves system flexibility and scalability. |
+| `main.py` | Enhances backend settings functionality by managing configurations efficiently<br>Facilitates seamless customization and optimization of settings across the codebase architecture. |
+| `prod.py` | Define production settings for the backend, including database configuration and debugging options<br>Extends base settings to inherit common configurations<br>Centralizes cloud database settings for easy management and maintenance. |
@@ -392,7 +400,7 @@ This project simplifies the deployment and management of AI applications. The co
-| `utils.py` | Enhances backend functionality by providing utility functions for settings management<br>Facilitates seamless configuration handling within the project architecture. |
+| `utils.py` | Enhance backend settings functionality by providing utility functions for the codebase architecture<br>The `utils.py` file in the `src\backend\settings` directory plays a crucial role in enabling streamlined operations and improved performance within the project structure. |
-| `consts.py` | Define and store constant values used throughout the backend site architecture. |
+| `consts.py` | Define constants and data structures for tracking analysis progress and results in the backend of the site architecture<br>Includes messages for analysis status, sets for running tasks, and dictionaries for storing status and results. |
@@ -400,11 +408,15 @@ This project simplifies the deployment and management of AI applications. The co
-| `main.py` | Implement a RESTful API endpoint for handling site data in the backend architecture<br>This code file serves as the entry point for site-related functionalities, facilitating communication between the frontend and backend systems. |
+| `main.py` | Improve site performance by caching API responses in the main.py file. |
+| `router.py` | Implement a backend API router for a SWOT analysis tool<br>The router handles URL analysis requests, updates status messages, and provides analysis results<br>It manages session IDs, progress tracking, and result storage<br>The code integrates with FastAPI, Starlette, and Jinja2 for web functionality. |
@@ -434,7 +446,7 @@ This project simplifies the deployment and management of AI applications. The co
-| `utils.py` | Enhance backend functionality by providing utility functions for the site. |
+| `utils.py` | Enhances backend functionality by providing utility functions for the site<br>This code file in the backend architecture aids in streamlining operations and improving overall performance. |
@@ -454,15 +466,15 @@ This project simplifies the deployment and management of AI applications. The co
-| `start.sh` | Initiate and manage Celery workers, beat scheduler, and Flower for the projects asynchronous task processing<br>Handles worker availability checks and starts necessary services for efficient task execution. |
+| `start.sh` | Initiates and manages Celery workers, beat scheduler, and Flower for the projects asynchronous task processing<br>Handles worker availability checks and starts necessary services within the specified project directory. |
-| `build.sh` | Install necessary dependencies for the ranked jobs microservice in the Docker container<br>The script sets up essential tools like Python, PostgreSQL, and Git, ensuring a smooth environment for the microservice to run effectively within the architecture. |
+| `build.sh` | Install necessary dependencies for the ranked jobs microservice in the Docker container<br>The script sets up essential tools like Python, PostgreSQL, and Git, ensuring a robust environment for the microservice to run smoothly. |
-| `python_build.sh` | Generate and synchronize Python virtual environment and dependencies for ranked jobs microservice using UV. |
+| `python_build.sh` | Automates Python environment setup and dependency management for the ranked jobs microservice in the Docker architecture<br>Sets up virtual environment, compiles requirements, and syncs dependencies for smooth execution. |
-| `python_start.sh` | Launches the Python application using Gunicorn with specified configurations for workers, timeouts, and ports<br>The script activates the virtual environment and starts the server, ensuring optimal performance and accessibility for the ranked jobs microservice within the project architecture. |
+| `python_start.sh` | Launches the Python application using Gunicorn with specified configurations like the number of workers, timeout, and port<br>The script activates the virtual environment and starts the server to handle incoming requests. |
@@ -508,6 +520,7 @@ Build Pygentic-AI from the source and intsall dependencies:
**Using [docker](https://www.docker.com/):**
+
```sh
❯ docker build -t fsecada01/Pygentic-AI .
```
@@ -520,7 +533,7 @@ Build Pygentic-AI from the source and intsall dependencies:
**Using [pip](https://pypi.org/project/pip/):**
```sh
- ❯ pip install -r core_requirements.in dev_requirements.in
+ ❯ pip install -r core_requirements.in, core_requirements.txt, dev_requirements.in, dev_requirements.txt
```
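The README table above describes `get_db_url` and `get_val` in the backend `utils.py` as building environment-specific database URLs and reading values with fallbacks, and the `src/app.py` changes later in this diff call `get_val("SECRET_KEY")` and `get_val("MAX_AGE", 3600)`. A minimal sketch of that lookup-with-fallback pattern follows; the helper bodies, environment names, and default values are illustrative assumptions, not the repository's actual implementation.

```python
import os
from typing import Any


def get_val(key: str, default: Any = None) -> Any:
    """Return a configuration value from the environment, falling back to a default.

    Sketch only: the real backend get_val may also consult configuration files.
    """
    return os.environ.get(key, default)


def get_db_url(environment: str = "dev") -> str:
    """Build a database URL for the given environment (hypothetical names and defaults)."""
    if environment == "prod":
        user = get_val("DB_USER", "app")
        password = get_val("DB_PASSWORD", "")
        host = get_val("DB_HOST", "localhost")
        name = get_val("DB_NAME", "pygentic")
        return f"postgresql+asyncpg://{user}:{password}@{host}/{name}"
    return "sqlite+aiosqlite:///./dev.db"
```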
diff --git a/core_requirements.txt b/core_requirements.txt
index 66a644d..57b9ff7 100644
--- a/core_requirements.txt
+++ b/core_requirements.txt
@@ -166,9 +166,9 @@ jsonpath-python==1.0.6
    # via mistralai
kombu==5.4.2
    # via celery
-logfire==3.1.1
+logfire==3.2.0
    # via pydantic-ai-examples
-logfire-api==3.1.1
+logfire-api==3.2.0
    # via pydantic-ai-slim
loguru==0.7.3
    # via
diff --git a/dev_requirements.txt b/dev_requirements.txt
index 2b9c831..1e2355d 100644
--- a/dev_requirements.txt
+++ b/dev_requirements.txt
@@ -300,7 +300,7 @@ pyzmq==26.2.0
    # via
    # ipykernel
    # jupyter-client
    # jupyter-server
-referencing==0.36.0
+referencing==0.36.1
    # via
    # jsonschema
    # jsonschema-specifications
diff --git a/pyproject.toml b/pyproject.toml
index 72b4fff..1483bd4 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -129,8 +129,8 @@ dependencies = [
    "jiter==0.8.2",
    "jsonpath-python==1.0.6",
    "kombu==5.4.2",
-    "logfire-api==3.1.1",
-    "logfire==3.1.1",
+    "logfire-api==3.2.0",
+    "logfire==3.2.0",
    "loguru==0.7.3",
    "lxml-html-clean==0.4.1",
    "lxml==5.3.0",
@@ -309,7 +309,7 @@ dev = [
    "pywinpty==2.0.14",
    "pyyaml==6.0.2",
    "pyzmq==26.2.0",
-    "referencing==0.36.0",
+    "referencing==0.36.1",
    "requests==2.32.3",
    "rfc3339-validator==0.1.4",
    "rfc3986-validator==0.1.1",
diff --git a/src/app.py b/src/app.py
index c6180a5..a32e05b 100644
--- a/src/app.py
+++ b/src/app.py
@@ -3,12 +3,15 @@ import os
from fastapi import Request
from fastapi.exceptions import RequestValidationError
from starlette import status
-from starlette.responses import JSONResponse
+from starlette.middleware.sessions import SessionMiddleware
+from starlette.responses import HTMLResponse, JSONResponse
from starlette.staticfiles import StaticFiles

from backend import create_app
from backend.logger import logger
from backend.settings import app_settings, debug_arg
+from backend.site.router import templates, user_frontend
+from backend.utils import get_val

app = create_app(debug=debug_arg, settings_obj=app_settings)

@@ -19,6 +22,14 @@ app = create_app(debug=debug_arg, settings_obj=app_settings)
async def validation_exception_handler(
    request: Request, exc: RequestValidationError
):
+    """
+    Custom validation error messaging for end-users. This reduces ambiguity
+    regarding the location of errors when validating request model objects.
+
+    :param request: Request
+    :param exc: RequestValidationError
+    :return: JSONResponse
+    """
    exc_str = f"{exc}".replace("\n", "; ").replace("   ", " ")
    logger.error(f"{request}: {exc_str}")
    content = {"status_code": 10422, "message": exc_str, "data": None}
@@ -29,12 +40,25 @@ async def validation_exception_handler(


class UnicornException(Exception):
+    """
+    Inherited from Exception to provide the proper name for the error being
+    raised.
+    """
+
    def __init__(self, name: str):
        self.name = name


@app.exception_handler(UnicornException)
async def unicorn_exception_handler(request: Request, exc: UnicornException):
+    """
+    Returns a JSON response with a 418 response code instead of a default
+    500. This provides some greater information for APIs/end-users.
+
+    :param request: Request
+    :param exc: UnicornException
+    :return: JSONResponse
+    """
    return JSONResponse(
        status_code=418,
        content={
@@ -49,3 +73,66 @@ app.mount(
    StaticFiles(directory=os.path.join(app_settings.frontend_dir, "static")),
    name="static",
)
+
+app.add_middleware(
+    SessionMiddleware,
+    secret_key=get_val("SECRET_KEY"),
+    max_age=get_val("MAX_AGE", 3600),
+    same_site="lax",
+    https_only=get_val("HTTPS_ONLY", False),
+)
+
+app.include_router(user_frontend)
+
+
+@app.get("/", response_class=HTMLResponse)
+async def home_page(request: Request) -> HTMLResponse:
+    """
+    default homepage for the web application
+    :param request:
+    :return: HTMLResponse
+    """
+    return templates.TemplateResponse("home.html", {"request": request})
+
+
+if app_settings.DEBUG in (True, "True"):
+    from debug_toolbar.middleware import DebugToolbarMiddleware
+    from debug_toolbar.panels.sqlalchemy import SQLAlchemyPanel
+
+    from backend.db.db import engine
+
+    logger.debug(f"App Debug settings flag is {app_settings.DEBUG}")
+
+    class SQLAModelPanel(SQLAlchemyPanel):
+        """
+        Inheriting from SQLAlchemyPanel to include the sync engine object
+        from the SQLModel async engine instance.
+        """
+
+        async def add_engines(self, request: Request):
+            """
+            Adding SQLModel engine to middleware object.
+            :param request: Request
+            :return:
+            """
+            self.engines.add(engine.sync_engine)
+
+    app.add_middleware(
+        DebugToolbarMiddleware,
+        panels=["app.SQLModelPanel"],
+        disable_panels=["debug_toolbar.panels.profiling.ProfilingPanel"],
+    )
+
+
+if __name__ == "__main__":
+    import uvicorn
+
+    if debug_arg:
+        uvicorn.run("app:app", port=5000, reload=True)
+    else:
+        uvicorn.run(
+            "app:app",
+            host="0.0.0.0",
+            port=get_val("APP_PORT", 5000),
+            workers=get_val("WORKERS", 1),
+        )
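To make the effect of the `RequestValidationError` handler in `src/app.py` concrete, here is a self-contained sketch, not part of the diff, that registers a handler in the same style on a throwaway FastAPI app and exercises it with `TestClient`. The `demo` app and the `/items/{item_id}` route are invented for illustration only.

```python
from fastapi import FastAPI, Request, status
from fastapi.exceptions import RequestValidationError
from fastapi.testclient import TestClient
from starlette.responses import JSONResponse

demo = FastAPI()


@demo.exception_handler(RequestValidationError)
async def validation_exception_handler(request: Request, exc: RequestValidationError):
    # Same response shape as the project's handler: HTTP 422 with a custom body.
    exc_str = f"{exc}".replace("\n", "; ")
    content = {"status_code": 10422, "message": exc_str, "data": None}
    return JSONResponse(
        content=content, status_code=status.HTTP_422_UNPROCESSABLE_ENTITY
    )


@demo.get("/items/{item_id}")
async def read_item(item_id: int) -> dict:
    return {"item_id": item_id}


client = TestClient(demo)
resp = client.get("/items/not-an-int")  # path param fails int validation
assert resp.status_code == 422
assert resp.json()["status_code"] == 10422
```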
+ """ + def __init__(self, name: str): self.name = name @app.exception_handler(UnicornException) async def unicorn_exception_handler(request: Request, exc: UnicornException): + """ + Returns a JSON response with a 418 response code instead of a default + 500. This provides some greater information for APIs/end-users. + + :param request: Request + :param exc: UnicornException + :return: JSONResponse + """ return JSONResponse( status_code=418, content={ @@ -49,3 +73,66 @@ app.mount( StaticFiles(directory=os.path.join(app_settings.frontend_dir, "static")), name="static", ) + +app.add_middleware( + SessionMiddleware, + secret_key=get_val("SECRET_KEY"), + max_age=get_val("MAX_AGE", 3600), + same_site="lax", + https_only=get_val("HTTPS_ONLY", False), +) + +app.include_router(user_frontend) + + +@app.get("/", response_class=HTMLResponse) +async def home_page(request: Request) -> HTMLResponse: + """ + default homepage for the web application + :param request: + :return: HTMLResponse + """ + return templates.TemplateResponse("home.html", {"request": request}) + + +if app_settings.DEBUG in (True, "True"): + from debug_toolbar.middleware import DebugToolbarMiddleware + from debug_toolbar.panels.sqlalchemy import SQLAlchemyPanel + + from backend.db.db import engine + + logger.debug(f"App Debug settings flag is {app_settings.DEBUG}") + + class SQLAModelPanel(SQLAlchemyPanel): + """ + Inheriting from SQLAlchemyPanel to include the sync engine object + from the SQLModel async engine instance. + """ + + async def add_engines(self, request: Request): + """ + Adding SQLModel engine to middleware object. + :param request: Request + :return: + """ + self.engines.add(engine.sync_engine) + + app.add_middleware( + DebugToolbarMiddleware, + panels=["app.SQLModelPanel"], + disable_panels=["debug_toolbar.panels.profiling.ProfilingPanel"], + ) + + +if __name__ == "__main__": + import uvicorn + + if debug_arg: + uvicorn.run("app:app", port=5000, reload=True) + else: + uvicorn.run( + "app:app", + host="0.0.0.0", + port=get_val("APP_PORT", 5000), + workers=get_val("WORKERS", 1), + ) diff --git a/src/backend/core/tools.py b/src/backend/core/tools.py index 7b8df00..ab77518 100644 --- a/src/backend/core/tools.py +++ b/src/backend/core/tools.py @@ -3,7 +3,7 @@ from bs4 import BeautifulSoup as soup from pydantic_ai import RunContext from backend.core.consts import AI_MODEL -from backend.core.core import SwotAgentDeps, swot_agent +from backend.core.core import SwotAgentDeps, SwotAnalysis, swot_agent from backend.core.utils import report_tool_usage from backend.logger import logger @@ -93,3 +93,32 @@ async def get_reddit_insights( ) return "\n".join(insights) + + +async def run_agent( + url: str, deps: SwotAgentDeps = SwotAgentDeps() +) -> SwotAnalysis | Exception: + """ + Runs the SWOT Analysis Agent + + :param url: str + :param deps: SwotAgentDeps + :return: SwotAnalysis | Exception + """ + try: + deps.tool_history = [] + result = await swot_agent.run( + f"Perform a comprehensive SWOT analysis for this product: {url}", + deps=deps, + ) + logger.info(f"Agent Result: {result}") + + if deps.update_status_func: + await deps.update_status_func(deps.request, "Analysis Complete") + except Exception as e: + logger.exception(f"Error during agent run: {type(e), e, e.args}") + + if deps.update_status_func: + await deps.update_status_func(deps.request, f"Error: {e}") + + return e diff --git a/src/backend/site/router.py b/src/backend/site/router.py index df45376..9db3aa0 100644 --- a/src/backend/site/router.py +++ 
diff --git a/src/backend/site/router.py b/src/backend/site/router.py
index df45376..9db3aa0 100644
--- a/src/backend/site/router.py
+++ b/src/backend/site/router.py
@@ -1,8 +1,5 @@
import asyncio
import os
-import random
-import time
-from typing import Any

from fastapi import APIRouter, Form, Request
from jinjax import Catalog, JinjaX
@@ -10,7 +7,6 @@ from starlette.responses import HTMLResponse
from starlette.staticfiles import StaticFiles
from starlette.templating import Jinja2Templates

-from backend.core.core import SwotAnalysis
from backend.logger import logger
from backend.settings import app_settings
from backend.site.consts import (
@@ -19,6 +15,7 @@ from backend.site.consts import (
    result_store,
    status_store,
)
+from backend.site.utils import run_agent_with_progress

user_frontend = APIRouter(prefix="", tags=["frontend"])
frontend = app_settings.frontend_dir
@@ -42,10 +39,6 @@ user_frontend.mount(
)


-def run_agent_with_progress(session_id, url):
-    pass
-
-
@user_frontend.post("analyze", response_class=HTMLResponse)
async def analyze_url(request: Request, url: str = Form(...)) -> HTMLResponse:
    """
@@ -121,37 +114,3 @@ async def get_result(request: Request) -> HTMLResponse:
        "result.html",
        {"request": request, "result": result},
    )
-
-
-def emulate_tool_completion(session_id: str, message: str) -> None:
-    """Pydantic AI doesn't provide a post-processing hook, so we need to emulate one."""
-
-    # Sleep a random amount of time between 0 and 5 seconds
-    time.sleep(random.randint(0, 5))
-    status_store[session_id].append(message)
-
-
-async def update_status(session_id: str, message: Any) -> None:
-    """Updates status messages and handles SWOT analysis results."""
-    logger.info(f"Updating status for session {session_id}: {message}")
-
-    # Handle SWOT analysis result
-    if isinstance(message, SwotAnalysis):
-        result_store[session_id] = message.model_dump()
-        status_store[session_id].append(ANALYSIS_COMPLETE_MESSAGE)
-        return
-
-    # Handle string messages
-    if isinstance(message, str):
-        # Instantly store first status message, emulate tool completion for others
-        if message == ANALYSIS_COMPLETE_MESSAGE:
-            status_store[session_id].append(message)
-        else:
-            loop = asyncio.get_running_loop()
-            await loop.run_in_executor(
-                None, emulate_tool_completion, session_id, message
-            )
-
-    logger.info(
-        f"Status messages for session {session_id}: {status_store[session_id]}"
-    )
diff --git a/src/backend/site/utils.py b/src/backend/site/utils.py
index e69de29..5355bb4 100644
--- a/src/backend/site/utils.py
+++ b/src/backend/site/utils.py
@@ -0,0 +1,84 @@
+import asyncio
+import random
+import time
+from typing import Any
+
+from loguru import logger
+
+from backend.core.core import SwotAgentDeps, SwotAnalysis
+from backend.core.tools import run_agent
+from backend.site.consts import (
+    ANALYSIS_COMPLETE_MESSAGE,
+    result_store,
+    status_store,
+)
+
+
+def emulate_tool_completion(session_id: str, message: str) -> None:
+    """Pydantic AI doesn't provide a post-processing hook, so we need to emulate one."""
+
+    # Sleep a random amount of time between 0 and 5 seconds
+    time.sleep(random.randint(0, 5))
+    status_store[session_id].append(message)
+
+
+async def update_status(session_id: str, message: Any) -> None:
+    """
+    Updates status messages and handles SWOT analysis results.
+
+    :param session_id: str
+    :param message: Any
+    :return: None
+    """
+    logger.info(f"Updating status for session {session_id}: {message}")
+
+    # Handle SWOT analysis result
+    if isinstance(message, SwotAnalysis):
+        result_store[session_id] = message.model_dump()
+        status_store[session_id].append(ANALYSIS_COMPLETE_MESSAGE)
+        return
+
+    # Handle string messages
+    if isinstance(message, str):
+        # Instantly store first status message, emulate tool completion for others
+        if message == ANALYSIS_COMPLETE_MESSAGE:
+            status_store[session_id].append(message)
+        else:
+            loop = asyncio.get_running_loop()
+            await loop.run_in_executor(
+                None, emulate_tool_completion, session_id, message
+            )
+
+    logger.info(
+        f"Status messages for session {session_id}: {status_store[session_id]}"
+    )
+
+
+async def run_agent_with_progress(session_id, url):
+    """
+    This provides ongoing progress updates for a running agent. A custom deps
+    object is used to store the session_id value and then triggers the
+    `run_agent` function
+    :param session_id: str
+    :param url: str
+    :return: None
+    """
+    try:
+        deps = SwotAgentDeps(
+            request=None,
+            update_status_func=lambda request, msg: update_status(
+                session_id, msg
+            ),
+        )
+
+        result = await run_agent(url=url, deps=deps)
+
+        if not isinstance(result, Exception):
+            logger.info(f"Successfully analyzed URL: {url}")
+            result_store[session_id] = result
+    except Exception as e:
+        logger.error(
+            f"An unexpected error occurred. See here: " f"{type(e), e, e.args}"
+        )
+        await update_status(session_id, f"Unexpected error: {e}")
+        raise
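The wiring between these helpers and the router is easiest to see in a small sketch: a route handler generates a session id, seeds `status_store`, and schedules `run_agent_with_progress` in the background while polling endpoints read the stores. The `start_analysis` function, the uuid-based session id, and the seed message below are hypothetical; the actual wiring lives in `src/backend/site/router.py`.

```python
import asyncio
import uuid

from backend.site.consts import status_store
from backend.site.utils import run_agent_with_progress


async def start_analysis(url: str) -> str:
    """Kick off a SWOT analysis and return the session id to poll (sketch only)."""
    session_id = str(uuid.uuid4())  # hypothetical session-id scheme
    status_store[session_id] = ["Starting analysis..."]  # assumed seed message
    # Fire-and-forget: further progress arrives via the update_status() callback.
    asyncio.create_task(run_agent_with_progress(session_id, url))
    return session_id
```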
"sha256:c940b654c9126f782ccff3ef5fbb48ab186dde5f3cabecd89c9157ab95b7da89", size = 177371 }, + { url = "https://files.pythonhosted.org/packages/d3/14/a05d510c63626162d1c3ea9ad25f1312b6dea2528514c772355499eb2ead/logfire-3.2.0-py3-none-any.whl", hash = "sha256:b18d88f2aab73ebfd73b03fdb8d0b494f23ebf0556ff82edc04b9d68fdca60d2", size = 177516 }, ] [package.optional-dependencies] @@ -1822,11 +1822,11 @@ sqlite3 = [ [[package]] name = "logfire-api" -version = "3.1.1" +version = "3.2.0" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/d4/76/de153d15dc8f790376fa7d14c7d291c17ed838e5e3e00cf5802243683b80/logfire_api-3.1.1.tar.gz", hash = "sha256:9eba3f3fb25d07d1fc1344c9cb1de4b9f9d55a2bc6b39c4d5f5227b6b1ea3532", size = 44557 } +sdist = { url = "https://files.pythonhosted.org/packages/37/73/896493fc411737ab40daf78e59a7474f6a0c71a9d0409ceee5cd6cf256f6/logfire_api-3.2.0.tar.gz", hash = "sha256:1423888a236010a5e1902aad21801265c701fed673424231451e00b6abc2bf51", size = 44609 } wheels = [ - { url = "https://files.pythonhosted.org/packages/9b/da/20047ea3cbb1713bcac78b45353bcc6476257df14c79928be83028c7c9e6/logfire_api-3.1.1-py3-none-any.whl", hash = "sha256:00e6e015ffc9eccf5afc2c84a71b596999c00e64fdf886f797e1c781025f3b51", size = 73819 }, + { url = "https://files.pythonhosted.org/packages/34/00/c567169fa83c9683ca4d6f7f40f3cbe0080226bca5e5dd687619fcdd5c89/logfire_api-3.2.0-py3-none-any.whl", hash = "sha256:653f01942305d9b48c9283de95aee6315bea585447156100bc4911dd1d9faf44", size = 73880 }, ] [[package]] @@ -2838,15 +2838,15 @@ wheels = [ [[package]] name = "referencing" -version = "0.36.0" +version = "0.36.1" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "attrs" }, { name = "rpds-py" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/84/7c/c7fb671169578ca7f31081365ccd28e3da845815d38254d337f9a37b12ca/referencing-0.36.0.tar.gz", hash = "sha256:246db964bb6101905167895cd66499cfb2aabc5f83277d008c52afe918ef29ba", size = 74566 } +sdist = { url = "https://files.pythonhosted.org/packages/27/32/fd98246df7a0f309b58cae68b10b6b219ef2eb66747f00dfb34422687087/referencing-0.36.1.tar.gz", hash = "sha256:ca2e6492769e3602957e9b831b94211599d2aade9477f5d44110d2530cf9aade", size = 74661 } wheels = [ - { url = "https://files.pythonhosted.org/packages/30/2f/a969d8bb4b86c2f1308cf020fba2f81a6c9c719b53a34b2ae83da2713629/referencing-0.36.0-py3-none-any.whl", hash = "sha256:01fc2916bab821aa3284d645bbbb41ba39609e7ff47072416a39ec2fb04d10d9", size = 26770 }, + { url = "https://files.pythonhosted.org/packages/cc/fa/9f193ef0c9074b659009f06d7cbacc6f25b072044815bcf799b76533dbb8/referencing-0.36.1-py3-none-any.whl", hash = "sha256:363d9c65f080d0d70bc41c721dce3c7f3e77fc09f269cd5c8813da18069a6794", size = 26777 }, ] [[package]] python_start.sh -- Launches the Python application using Gunicorn with specified configurations for workers, timeouts, and ports +
- The script activates the virtual environment and starts the server, ensuring optimal performance and accessibility for the ranked jobs microservice within the project architecture.- Launches the Python application using Gunicorn with specified configurations like the number of workers, timeout, and port
- The script activates the virtual environment and starts the server to handle incoming requests.