Run AI Agents at Scale, Reliably and Fast
The distributed runtime environment that manages and executes agentic services at enterprise scale.
Built for production. From day one.
Scalability via Ray
Deploy locally, run pipelines in parallel, and fine-tune models. Kodosumi scales effortlessly thanks to Ray.
Real-time Monitoring
Built-in observability through Ray dashboard gives full visibility into agent operations and resource usage.
No Vendor Lock-in
Kodosumi is open source, framework agnostic, and portable across platforms. Use any AI framework you prefer.
Minimal Configuration
A single YAML config file is all you need. Define dependencies, environment variables, and deploy.
config.yaml
dependencies:
  - kodosumi
  - crewai
env_vars: [API_KEY]
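At runtime, the variables declared under env_vars can be read with standard tooling. A minimal sketch, assuming the API_KEY name from the config above is exported in the service's environment:

```python
import os

# Read the API key declared under env_vars in config.yaml.
# The "<unset>" fallback is illustrative only; a real service
# would typically fail fast if a required variable is missing.
api_key = os.environ.get("API_KEY", "<unset>")
```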
Three core concepts
Core Concept
Agent
An autonomous object that can perform specific tasks or services. Agents are the fundamental building blocks in Kodosumi, each encapsulating specialized capabilities that can be orchestrated into larger workflows.
Flow
An automated process, workflow, or system of interconnected tasks. Flows define how multiple agents coordinate, passing data between steps to accomplish complex objectives.
Agentic Service
A self-contained, deployable unit that integrates one or more Flows. Agentic Services are what you deploy to Kodosumi. They bundle agents, flows, and configuration into a single runtime.
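How the three concepts nest can be sketched in plain Python. This is an illustrative model only, not the Kodosumi API; all class and method names here are hypothetical stand-ins:

```python
# Hypothetical sketch of the Agent / Flow / Agentic Service hierarchy.
# Not the Kodosumi API: these classes only model the relationships.

class Agent:
    """An autonomous object performing one specialized task."""
    def __init__(self, name):
        self.name = name

    def run(self, data):
        # A real agent would call a model or tool here.
        return f"{self.name} processed {data!r}"

class Flow:
    """Interconnected tasks: agents run in order, passing data along."""
    def __init__(self, agents):
        self.agents = agents

    def run(self, data):
        for agent in self.agents:
            data = agent.run(data)
        return data

class AgenticService:
    """A self-contained, deployable unit bundling Flows plus config."""
    def __init__(self, flows, config=None):
        self.flows = flows
        self.config = config or {}

# One service, one flow, two coordinated agents.
service = AgenticService(
    flows=[Flow(agents=[Agent("researcher"), Agent("writer")])],
    config={"env_vars": ["API_KEY"]},
)
result = service.flows[0].run("market trends")
```

The nesting mirrors what you deploy: agents are the building blocks, a flow wires them together, and the service is the single unit Kodosumi runs.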
Start running agents at scale.
Install Kodosumi, write a config, deploy your agents. Built on Ray. Open source.
$ pip install kodosumi

# Define your agent
from kodosumi import Agent, Flow

class ResearchAgent(Agent):
    name = 'researcher'
    model = 'gpt-4'

flow = Flow(agents=[ResearchAgent])
flow.deploy()
# Running on Ray.
