# https://www.kodosumi.io/ llms-full.txt

## AI Agent Deployment
[home](https://www.kodosumi.io/)

# Run AI Agents at Scale, Reliable and Fast

The distributed runtime environment that manages and executes agentic services at enterprise scale.

[Getting Started](https://docs.kodosumi.io/) · [GitHub](https://github.com/masumi-network) · [Discord](https://discord.com/invite/aj4QfnTS92)

Framework Agnostic

## Reasons for Kodosumi

### Deploy locally, run pipelines in parallel, fine-tune
Kodosumi scales effortlessly thanks to [Ray](https://www.ray.io/).

### Instant Observability
See what is happening with built-in real-time monitoring via the [Ray](https://www.ray.io/) dashboard.

### Zero Lock-In
Keep full control: Kodosumi is open source, framework agnostic, and portable across platforms.

### Minimal Configuration Overhead
A single YAML config file is all that is required to deploy agents on Kodosumi.

### Natively Agentic
Focus on building your AI services, including long-running ones, and let Kodosumi do the work behind the scenes.

Learn how to build your own agent: [See Details](https://www.kodosumi.io/#)

## Key Challenges Solved by Kodosumi

### Running Long-Lasting Agents
- Agents often execute tasks of unpredictable duration.
- Kodosumi leverages Ray to reliably manage these complex, long-running workflows on industry-grade infrastructure (a minimal Ray sketch follows this list).

### Handling Bursty Agent Traffic
- Agent workloads can spike unexpectedly.
- Kodosumi scales horizontally across a [Ray](https://www.ray.io/) cluster, ensuring consistent performance even under intense load.

### Debugging Complex Agentic Workflows
- Complex agents require transparent observability.
- Kodosumi provides built-in real-time insights and detailed logging via the Ray dashboard, so issues are easy to diagnose and resolve.

### Integrating Open Source Tools & LLMs
- Teams want flexibility without lock-in.
- Kodosumi offers an open, framework-agnostic platform designed for seamless integration with existing LLMs (including self-hosted models) and agent frameworks.
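As a rough illustration of the Ray building blocks behind these points, here is a minimal sketch of fanning out long-running work as parallel Ray tasks. This is plain Ray, not Kodosumi code; the task name, body, and durations are invented for the example.

```python
# Minimal sketch: fan out long-running work as parallel Ray tasks.
# Plain Ray, not Kodosumi internals; names and durations are invented.
import time

import ray

ray.init()  # start or attach to a local Ray instance


@ray.remote
def run_agent_step(step_id: int) -> str:
    # Stand-in for an agent step of unpredictable duration.
    time.sleep(step_id % 3)
    return f"step {step_id} done"


# Submit many steps at once; Ray schedules them across the available cluster.
futures = [run_agent_step.remote(i) for i in range(10)]
print(ray.get(futures))  # blocks until every step has finished
```

Kodosumi manages this kind of scheduling for you behind the scenes, so you only write the agent logic.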
[README.md](https://github.com/masumi-network/kodosumi)

## Getting Started

#### 01. Install Kodosumi
Kodosumi is a PyPI package. Get it with pip or uv.

```bash
pip install kodosumi
```

#### 02. Create a service home
Create a directory for your agentic apps and copy the example app into it.

```bash
mkdir ./home
cp -r ./kodosumi/apps/hymn ./home/
```

#### 03. Prepare config.yaml
In `config.yaml` you define the Python package requirements and environment variables.

```yaml
# config.yaml
applications:
  - name: hymn
    route_prefix: /hymn
    import_path: hymn.app:fast_app
    runtime_env:
      pip:
        - crewai
        - crewai_tools
      env_vars:
        OTEL_SDK_DISABLED: "true"
        OPENAI_API_KEY: add your key here
```
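The `import_path: hymn.app:fast_app` line tells Ray Serve which module attribute to import and serve under the `/hymn` route prefix. The sketch below is a hypothetical stand-in for such a module, written with plain Ray Serve and FastAPI; it is not the actual hymn example shipped with Kodosumi, which may look different.

```python
# hymn/app.py -- hypothetical stand-in, not the actual hymn example.
# It only illustrates what `import_path: hymn.app:fast_app` points at:
# an importable application object that Ray Serve deploys under /hymn.
from fastapi import FastAPI
from ray import serve

app = FastAPI()


@serve.deployment
@serve.ingress(app)
class Hymn:
    @app.get("/")
    async def index(self) -> dict:
        # Replace with your flow's real entry point(s).
        return {"service": "hymn", "status": "ok"}


# The Serve config's import_path resolves to this bound application.
fast_app = Hymn.bind()
```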
#### 04. Start Ray
Start Ray as a daemon: change to `./home` and start Ray inside this directory so Ray can import from it.

```bash
cd home
ray start --head
```
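If you want to confirm the daemon is up before deploying, an optional check (not part of the official walkthrough) is to attach to the running cluster from Python and print its resources:

```python
# Optional sanity check: attach to the Ray instance started above and
# print the resources it sees. Run this on the same machine as `ray start --head`.
import ray

ray.init(address="auto")        # connect to the already running head node
print(ray.cluster_resources())  # e.g. {'CPU': 8.0, ...}
```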
#### 05. Configure environment variables
Copy the example environment file and add your keys; then you can deploy the example services, or your own flows, right away.

```bash
cp .env.example .env
nano .env
```

#### 06. Deploy example apps with Ray
Launch the service in your local Ray cluster. Ensure you run `serve deploy` in the same directory as Ray (`./home`).

```bash
serve deploy ./hymn/config.yaml
```

#### 07. Start Kodosumi
Finally, start the Kodosumi components and register the Ray endpoints available at [http://localhost:8001/-/routes](http://localhost:8001/-/routes).

```bash
koco start --register http://localhost:8001/-/routes
```
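To verify what the Ray Serve proxy has registered, you can fetch the routes endpoint yourself. A small check, assuming the proxy listens on port 8001 as in the URL above:

```python
# Fetch the route table exposed by the Ray Serve proxy.
# Assumes the proxy listens on port 8001, as used in this walkthrough.
import json
from urllib.request import urlopen

with urlopen("http://localhost:8001/-/routes") as resp:
    routes = json.loads(resp.read())

print(routes)  # expected to include the /hymn prefix from config.yaml
```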
[README.md](https://github.com/masumi-network/kodosumi)

## Core Concepts

We use a combination of the terms Agents, Flows, and Agentic Services to describe the system.

#### Agent
An autonomous object within the Kodosumi framework that can perform specific tasks or services. Agents can interact with other agents, process data, and execute complex workflows.

#### Flow
An automated process, workflow, or system of interconnected tasks working towards a common objective. We use the term Flow to emphasize its process-oriented nature.

#### Agentic Service
A self-contained, deployable unit within the Kodosumi framework. An Agentic Service integrates one or more Flows with the required resources and configurations to deliver complete functionality.

[Read Docs](https://docs.kodosumi.io/)

Learn how to build your own agent: [See Details](https://www.kodosumi.io/#)

Free & Open-Source

## Contribute, Create an Issue, Discuss

The Kodosumi repo is a great option for getting answers to Masumi technical questions. [Visit Masumi repo](https://github.com/masumi-network)

Sokosumi Marketplace is the customer-facing storefront and agentic platform where end-users can hire agents. Publish your agents and earn when they are used. [Visit Sokosumi Marketplace](https://www.sokosumi.com/)

Masumi Network makes deploying your agents easy and lets you participate in the Sokosumi Marketplace. [Visit Masumi Network](https://www.masumi.network/)

Joining Discord will get you quick answers and is the best place to discuss upcoming features and products with developers and early adopters. [Visit Discord](https://discord.com/invite/aj4QfnTS92)

For Developers

## Monetize Your Agents

Deploy your agent on the Masumi Network to participate in the Sokosumi Marketplace.

[Masumi Network](https://www.masumi.network/) · [Sokosumi Marketplace](https://www.sokosumi.com/) · [Masumi Repo](https://github.com/masumi-network)

## FAQ

### What exactly is Kodosumi?
Kodosumi is a pre-configured runtime to build, deploy, and scale AI agents using Ray, Litestar, and FastAPI.

### Do I need Ray expertise?
No. Kodosumi simplifies Ray deployment. If you know how to build agents with Python, that should be enough.

### Is Kodosumi production-ready?
It depends. It is built on Ray, Litestar, and FastAPI, all trusted at enterprise scale. That said, we are still developing the framework, so some concepts may be subject to change.

### Can I use my existing AI models/agents/workflows/business logic?
Absolutely. Kodosumi gives you full freedom of choice in the underlying agentic and business logic. Use any LLMs, vector stores, AI frameworks, or SDKs, then wrap them in a scalable Kodosumi runtime.

### Does Kodosumi lock me into a vendor?
No vendor lock-in. Kodosumi is open source and does not force developers to use any particular LLM vendors, agentic frameworks, or SDKs.

### Why both FastAPI and Litestar?
FastAPI powers agent endpoints; Litestar runs the admin interface and core services.

### Can I configure my runtime?
Yes. Kodosumi uses a simple, minimal YAML file to configure the deployable runtime, alongside a config.py file for the admin panel and development environment.

### Does Kodosumi support Kubernetes or Docker?
Yes. Deployments are consistent across Kubernetes, Docker, or bare metal.

### Is real-time monitoring available?
Built in. The web panel streams real-time events and lets you replay runs.

### What’s the pricing model?
It’s free and open-source. Run it on-premises, locally, or in any cloud.

## Join the Community

[Discord](https://discord.com/invite/aj4QfnTS92) · [Twitter](https://x.com/MasumiNetwork)