kodosumi

Run AI Agents at Scale, Reliably and Fast

The distributed runtime environment that manages and executes agentic services at enterprise scale.

Built for production. From day one.

01

Scalability via Ray

Deploy locally, run pipelines in parallel, and scale up as load grows. Kodosumi scales effortlessly thanks to Ray.
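The fan-out pattern is easy to picture with the standard library; Ray swaps the local executor for a distributed scheduler across the cluster (a sketch only: `run_pipeline` and the queries are illustrative, not Kodosumi code):

```python
from concurrent.futures import ThreadPoolExecutor

def run_pipeline(query: str) -> str:
    # Placeholder for one agent pipeline; in a Kodosumi deployment
    # Ray would dispatch this work across cluster nodes.
    return f"result for {query}"

queries = ["market size", "competitors", "pricing"]

# Fan out the pipelines in parallel; map preserves input order.
with ThreadPoolExecutor() as pool:
    results = list(pool.map(run_pipeline, queries))
```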

1 node → 1,000+ nodes
02

Real-time Monitoring

Built-in observability through Ray dashboard gives full visibility into agent operations and resource usage.

All systems operational · CPU: 42% | GPU: 78%
03

No Vendor Lock-in

Kodosumi is open source, framework agnostic, and portable across platforms. Use any AI framework you prefer.

Compatible with CrewAI, LangChain, FastAPI, and any other Python framework.
04

Minimal Configuration

A single YAML config file is all you need. Define dependencies and environment variables, then deploy.

config.yaml

dependencies:
  - kodosumi
  - crewai
env_vars: [API_KEY]

1 file · deploy in seconds

Three core concepts

Core Concept

Agent

An autonomous object that can perform specific tasks or services. Agents are the fundamental building blocks in Kodosumi, each encapsulating specialized capabilities that can be orchestrated into larger workflows.

1. Receives task request: Research Query (processing)
2. Returns structured result: Analysis Report (complete)
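In plain Python, the agent idea reduces to a class with a uniform run interface (a conceptual sketch only; `ResearchAgent` and its fields are illustrative, not the Kodosumi API):

```python
# Conceptual sketch only -- not the Kodosumi API.
# An agent encapsulates one capability behind a uniform interface.
class Agent:
    name = "agent"

    def run(self, task: str) -> dict:
        raise NotImplementedError

class ResearchAgent(Agent):
    name = "researcher"

    def run(self, task: str) -> dict:
        # Receives a task request, returns a structured result.
        return {"agent": self.name, "task": task, "status": "complete"}

result = ResearchAgent().run("Research Query")
```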

Core Concept

Flow

An automated process, workflow, or system of interconnected tasks. Flows define how multiple agents coordinate, passing data between steps to accomplish complex objectives.

Flow Pipeline (3 steps)
1. Data Ingestion · Collector Agent (complete)
2. Analysis · Research Agent (running)
3. Report Generation · Writer Agent (queued)
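The coordination above can be sketched as an ordered list of steps, each feeding its output to the next (conceptual only; this `Flow` class is illustrative, not the Kodosumi API):

```python
# Conceptual sketch only -- not the Kodosumi API.
# A flow runs agents in order, passing each result to the next step.
def make_step(label):
    def step(data):
        return data + [label]
    return step

class Flow:
    def __init__(self, steps):
        self.steps = steps

    def run(self, payload):
        for step in self.steps:
            payload = step(payload)
        return payload

pipeline = Flow([
    make_step("data-ingestion"),   # Collector Agent
    make_step("analysis"),         # Research Agent
    make_step("report"),           # Writer Agent
])
trace = pipeline.run([])
```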

Core Concept

Agentic Service

A self-contained, deployable unit that integrates one or more Flows. Agentic Services are what you deploy to Kodosumi. They bundle agents, flows, and configuration into a single runtime.

Deployed Service · Running
market-research-service v1.2.0
Flows: data-collect · analyze · report

3 Agents · 3 Flows · 99.9% Uptime

ray | 4 replicas | GPU: A100
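One way such a bundle could be expressed is by extending the config.yaml style from section 04 (the `service` block and its keys are hypothetical, not documented Kodosumi options):

```yaml
# dependencies/env_vars follow the config.yaml shown in section 04;
# the "service" block below is hypothetical, for illustration only.
dependencies:
  - kodosumi
  - crewai
env_vars: [API_KEY]
service:
  name: market-research-service
  flows: [data-collect, analyze, report]
```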

Start running agents at scale.

Install Kodosumi, write a config, deploy your agents. Built on Ray. Open source.

$ pip install kodosumi

# Define your agent
from kodosumi import Agent, Flow

class ResearchAgent(Agent):
    name = 'researcher'
    model = 'gpt-4'

flow = Flow(agents=[ResearchAgent])
flow.deploy()
# Running on Ray.