# Agno
Memori Cloud integrates with Agno at the model layer. Register your Agno model with `llm.register(...)`, and Memori captures `run()`, `arun()`, and streamed responses automatically.
## Quick Start
### Agno Integration

```python
from agno.agent import Agent
from agno.models.openai import OpenAIChat
from memori import Memori

# Register the Agno model so Memori captures its calls
model = OpenAIChat(id="gpt-4o-mini")
mem = Memori().llm.register(openai_chat=model)

# Attribute captured conversations to a user and process
mem.attribution(entity_id="user_123", process_id="agno_agent")

agent = Agent(
    model=model,
    instructions=["Be helpful and concise"],
    markdown=True,
)

response = agent.run("Hello!", session_id="support-session")
print(response.content)
```
## Different Providers
Agno supports multiple model families. Use the matching registration keyword in Memori.
| Package | Model Class | Registration Keyword |
|---|---|---|
| `agno.models.openai` | `OpenAIChat` | `openai_chat=model` |
| `agno.models.anthropic` | `Claude` | `claude=model` |
| `agno.models.google` | `Gemini` | `gemini=model` |
| `agno.models.xai` | `xAI` | `xai=model` |
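Switching providers only changes the model class and the registration keyword; the rest of the flow is identical. A minimal sketch for Anthropic, assuming the `claude` keyword from the table above and an `ANTHROPIC_API_KEY` in the environment (the model id is a placeholder, substitute the one you use):

```python
from agno.agent import Agent
from agno.models.anthropic import Claude
from memori import Memori

# Same flow as the OpenAI quick start; only the model class
# and the registration keyword differ.
model = Claude(id="claude-3-5-sonnet-20241022")  # placeholder model id
mem = Memori().llm.register(claude=model)

agent = Agent(model=model)
response = agent.run("Hello!")
print(response.content)
```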
### Agno Providers

```python
from agno.models.openai import OpenAIChat
from memori import Memori

model = OpenAIChat(id="gpt-4o-mini")
mem = Memori().llm.register(openai_chat=model)
```
## Supported Modes
| Mode | Method |
|---|---|
| Sync | `agent.run()` |
| Async | `await agent.arun()` |
| Streamed | `agent.run(stream=True)` |
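All three modes go through the same registered model, so no extra Memori setup is needed. A sketch of the async and streamed calls, assuming the quick-start registration above:

```python
import asyncio

from agno.agent import Agent
from agno.models.openai import OpenAIChat
from memori import Memori

model = OpenAIChat(id="gpt-4o-mini")
Memori().llm.register(openai_chat=model)
agent = Agent(model=model)

# Async: Memori captures arun() the same way as run()
async def main() -> None:
    response = await agent.arun("Summarize our last session")
    print(response.content)

asyncio.run(main())

# Streamed: iterate over response chunks as they arrive;
# Memori records the assembled response
for chunk in agent.run("Hello!", stream=True):
    print(chunk.content, end="")
```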