# Anthropic

Memori supports all Anthropic Claude models. Register your client once and every conversation is automatically captured and stored. Note that `max_tokens` is a required parameter on every Anthropic `messages.create()` call.

Want a zero-setup option? Try Memori Cloud at [app.memorilabs.ai](https://app.memorilabs.ai).
## Quick Start

The example below registers an Anthropic client with Memori, backed by a local SQLite database:
```python
from anthropic import Anthropic
from memori import Memori
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

# SQLite-backed session factory for Memori's storage
engine = create_engine("sqlite:///memori.db")
SessionLocal = sessionmaker(bind=engine)

client = Anthropic()

# Register the client so every conversation is captured
mem = Memori(conn=SessionLocal).llm.register(client)
mem.config.storage.build()

# Attribute captured conversations to a user and process
mem.attribution(entity_id="user_123", process_id="claude_assistant")

response = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,  # required by the Anthropic API
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.content[0].text)
```
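Note that `response.content` is a list of content blocks rather than a single string; `response.content[0].text` reads the first text block. A minimal sketch of joining all text blocks in a response, using a stand-in dataclass in place of the SDK's `TextBlock` so the snippet runs without an API call:

```python
from dataclasses import dataclass


@dataclass
class TextBlock:
    # Stand-in for anthropic.types.TextBlock (same attributes, illustration only)
    type: str
    text: str


def join_text(content) -> str:
    """Concatenate the text of every text block in a response's content list."""
    return "".join(block.text for block in content if block.type == "text")


content = [TextBlock(type="text", text="Hello"), TextBlock(type="text", text=" world")]
print(join_text(content))  # prints: Hello world
```

With a real response you would call `join_text(response.content)`.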
## Supported Modes

| Mode | Method |
|---|---|
| Sync | `client.messages.create()` |
| Async | `await client.messages.create()` (on an `AsyncAnthropic` client) |
| Streamed | `client.messages.stream()` |
## System Prompts

Anthropic takes the system prompt as a top-level `system` parameter, separate from the `messages` array. Memori captures both.
```python
response = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    system="You are a helpful coding assistant.",
    messages=[{"role": "user", "content": "Explain Python decorators."}],
)
print(response.content[0].text)
```
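If you are porting code that carries the system prompt inside the messages list (OpenAI style), it has to be lifted out into the top-level parameter. A small illustrative helper, not part of either SDK:

```python
def to_anthropic(messages):
    """Split OpenAI-style messages into Anthropic's (system, messages) shape.

    Anthropic takes the system prompt as a top-level parameter rather than
    a {"role": "system"} entry in the messages array.
    """
    system = "".join(m["content"] for m in messages if m["role"] == "system")
    rest = [m for m in messages if m["role"] != "system"]
    return system, rest


system, messages = to_anthropic([
    {"role": "system", "content": "You are a helpful coding assistant."},
    {"role": "user", "content": "Explain Python decorators."},
])
# system  -> "You are a helpful coding assistant."
# messages -> [{"role": "user", "content": "Explain Python decorators."}]
```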