Cycls is an open source AI platform for deploying and publishing AI agents. A hosted version is now available in private beta for early access.
Cycls's design philosophy is to stay out of your way: remove friction and unnecessary setup so you can stay focused on the AI.
pip install cycls
Your first AI agent with Cycls is just a few lines of code:
import cycls

agent = cycls.Agent(name="spark", keys=["ak-...", "as-..."])

@agent("hello")
def func(messages):
    return "Hello, world!"
    # return messages[-1].get("content")  # or echo the last message instead

agent.run()
Run it with:
python agent.py
Here's a minimal example showing how a Cycls agent is structured. You define the project name, optional package installs, and the agent handler function, all in one place:
import cycls

agent = cycls.Agent(
    name="spark",              # project name
    install=["openai"],        # optional: install packages in the agent runtime, zero config
    keys=["ak-...", "as-..."]  # your Cycls keys
)

@agent("hello")             # this is the agent name (e.g. hello.cycls.run)
def func(messages):         # messages = chat history (list of dicts)
    return "Hello, world!"  # return a string; supports Markdown, HTML, Tailwind

agent.run()                 # finally, launch it in development mode
- name: Name of your project. This shows up on your dashboard and is used internally.
- @agent("hello"): This defines an endpoint for your agent. It becomes a URL like https://hello.cycls.ai.
- install=[...]: Installs Python packages inside the agent runtime, no setup required.
- func(messages): The heart of your agent. You get the full message history and return a string, anything from plain text to rich HTML.

Multiple agents? Yes, you can declare as many as you want per file, as in the sketch below.
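For example, a single file can register two agents on one Agent instance. A minimal sketch; the endpoint names here are illustrative:

import cycls

agent = cycls.Agent(name="spark", keys=["ak-...", "as-..."])

@agent("hello")  # one endpoint
def hello(messages):
    return "Hello, world!"

@agent("echo")   # a second endpoint in the same file
def echo(messages):
    return messages[-1].get("content")  # echo the last user message

agent.run()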
Cycls supports async handlers and streaming output, making it ideal for working with LLM providers like OpenAI, Groq, and others.
import cycls

agent = cycls.Agent(name="spark",
                    install=["openai"],
                    keys=["ak-...", "as-..."])

async def llm(x):
    import openai
    client = openai.AsyncOpenAI(api_key="sk-...")
    model = "gpt-4o"
    response = await client.chat.completions.create(model=model, messages=x, temperature=0.2, stream=True)

    async def event_stream():
        async for chunk in response:
            content = chunk.choices[0].delta.content
            if content:
                yield content

    return event_stream()

@agent("hello")
async def func(messages):
    return await llm(messages)

agent.run()
In the example above, we added "openai" to the install list. This tells Cycls to install the package inside the agent's own runtime, not your local environment. That's why the import openai happens inside the function. Everything is self-contained and runs exactly where it needs to.
The OpenAI SDK also works with any provider that follows the same API format. For example, you can switch to Groq and stream Gemma, Google's open source model:
client = openai.AsyncOpenAI(
    base_url="https://api.groq.com/openai/v1",
    api_key="gsk..."
)
model = "gemma-2-9b-it"
That's it. You're now streaming output from an open source LLM, with zero infrastructure and minimal changes.
Cycls gives you full access to the raw messages list, so you can shape the prompt however you want before passing it to an LLM. This is useful for injecting system instructions, trimming history, or building your own message stack.
@agent("poet")
async def func(messages):
system = {
"role": "system",
"content": "You are a helpful assistant who speaks in poems."
}
prompt = [system] + messages
return await llm(prompt)
When your agent is ready to go live, just call:
agent.push()
That's all it takes. Your agent is instantly published to the web with a public URL and ready for real users. No extra setup, no deployment pipeline, no DevOps. Just push and it's live.
Cycls lets you return rich content from your agents, not just plain text. You can yield:
- Markdown (code blocks, tables, lists)
- Tailwind-styled HTML (cards, layouts, custom elements)
- Media (images, audio, video)
- Interactive snippets (buttons, links, forms; see the sketch after the media example below)

You can stream these outputs line by line, just like you'd stream a conversation.
@agent("readme")
def func(messages):
yield "# Cycls is awesome π\n"
yield "Here's a quick list:\n\n- Easy install\n- Instant deploy\n- Markdown + HTML output\n"
yield "```python\n"
yield "print('hello world')\n"
yield "```"
@agent("html")
def func(messages):
yield """<div class="p-4 bg-white rounded-2xl shadow-md">
<h2 class="text-xl font-bold text-gray-800">Welcome to Cycls</h2>
<p class="text-gray-600 mt-2">Build AI agents with zero config.</p>
</div>
"""
@agent("media")
def func(messages):
yield ""
yield """<audio controls class="mt-4">
<source src="https://upload.wikimedia.org/wikipedia/commons/2/24/Bourne_woods_Birdsong_and_rain_2020-06-17_0742.mp3" type="audio/mpeg" />
Your browser does not support the audio element.
</audio>
"""
yield """<video controls class="mt-4 w-full rounded-xl">
<source src="https://www.w3schools.com/html/mov_bbb.mp4" type="video/mp4" />
Your browser does not support the video tag.
</video>
"""
In this example, we build a small RAG agent: we populate a Chroma vector DB at runtime and use OpenAI embeddings for semantic search.
import cycls

agent = cycls.Agent(name="spark",
                    install=["openai", "chromadb"],
                    keys=["ak-...", "as-..."])

@agent("rag")
def func(messages):
    import chromadb
    import chromadb.utils.embedding_functions as embedding_functions
    client = chromadb.PersistentClient(path="db")
    embedder = embedding_functions.OpenAIEmbeddingFunction(
        api_key="sk-...",
        model_name="text-embedding-3-large")
    collection = client.get_or_create_collection(name="my_collection", embedding_function=embedder)
    collection.add(documents=["This is a document about pineapple",
                              "This is a document about oranges"],
                   ids=["id1", "id2"])
    results = collection.query(
        query_texts=[messages[-1].get("content")],  # e.g. "This is a query document about hawaii"
        n_results=1)
    return f"{results.get('documents')[0][0]}"  # or f"{results}" to see the full result

agent.run()
If you've already built your Chroma collection locally, just copy the directory into the agent runtime using copy.
import cycls

agent = cycls.Agent(name="spark",
                    install=["openai", "chromadb"],
                    copy=["db"],
                    keys=["ak-...", "as-..."])

@agent("rag")
def func(messages):
    import chromadb
    import chromadb.utils.embedding_functions as embedding_functions
    client = chromadb.PersistentClient(path="db")
    embedder = embedding_functions.OpenAIEmbeddingFunction(
        api_key="sk-...",
        model_name="text-embedding-3-large")
    collection = client.get_or_create_collection(name="my_collection", embedding_function=embedder)
    results = collection.query(
        query_texts=[messages[-1].get("content")],  # e.g. "This is a query document about hawaii"
        n_results=1)
    return f"{results.get('documents')[0][0]}"  # or f"{results}" to see the full result

agent.run()
Single HTML file.
You can also call a published agent directly over HTTP:

curl -X POST -H 'Authorization: Bearer sk-0123456789' 'https://agent.link' -d '{"messages":[{"role":"user","content":"what is cohomology?"}]}' --no-buffer
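The same request from Python, streamed; a minimal sketch using the requests library, reusing the placeholder URL and bearer key from the curl command above:

import requests

url = "https://agent.link"                           # placeholder endpoint from the curl example
headers = {"Authorization": "Bearer sk-0123456789"}  # placeholder key
payload = {"messages": [{"role": "user", "content": "what is cohomology?"}]}

# stream=True reads the body incrementally, like curl's --no-buffer
with requests.post(url, json=payload, headers=headers, stream=True) as response:
    response.raise_for_status()
    for chunk in response.iter_content(chunk_size=None, decode_unicode=True):
        print(chunk, end="", flush=True)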