Agno

Memori Cloud integrates with Agno at the model layer. Register your Agno model with llm.register(...), and Memori automatically captures run(), arun(), and streamed responses.

Quick Start

Agno Integration
from agno.agent import Agent
from agno.models.openai import OpenAIChat
from memori import Memori

model = OpenAIChat(id="gpt-4o-mini")

mem = Memori().llm.register(openai_chat=model)
mem.attribution(entity_id="user_123", process_id="agno_agent")

agent = Agent(
    model=model,
    instructions=["Be helpful and concise"],
    markdown=True,
)

response = agent.run("Hello!", session_id="support-session")
print(response.content)

Different Providers

Agno supports multiple model families. Use the matching registration keyword in Memori.

| Package | Model Class | Registration Keyword |
|---|---|---|
| agno.models.openai | OpenAIChat | openai_chat=model |
| agno.models.anthropic | Claude | claude=model |
| agno.models.google | Gemini | gemini=model |
| agno.models.xai | xAI | xai=model |
Agno Providers
from agno.models.openai import OpenAIChat

model = OpenAIChat(id="gpt-4o-mini")
mem = Memori().llm.register(openai_chat=model)

Supported Modes

| Mode | Method |
|---|---|
| Sync | agent.run() |
| Async | await agent.arun() |
| Streamed | agent.run(stream=True) |