# Anthropic

Memori Cloud supports all Anthropic Claude models. The `max_tokens` parameter is required for all Anthropic API calls.

## Quick Start

Anthropic Integration

```python
from anthropic import Anthropic
from memori import Memori

client = Anthropic()

mem = Memori().llm.register(client)
mem.attribution(entity_id="user_123", process_id="claude_assistant")

response = client.messages.create(
    model="claude-sonnet-4-5-20250929",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}]
)
print(response.content[0].text)
```

## Supported Modes

| Mode | Method |
| --- | --- |
| Sync | `client.messages.create()` |
| Async | `await client.messages.create()` |
| Streamed | `client.messages.stream()` |

## System Prompts

Anthropic supports a top-level `system` parameter, separate from the `messages` array. Memori captures both the system prompt and the conversation messages.

```python
from anthropic import Anthropic
from memori import Memori

client = Anthropic()

mem = Memori().llm.register(client)
mem.attribution(entity_id="user_123", process_id="claude_assistant")

response = client.messages.create(
    model="claude-sonnet-4-5-20250929",
    max_tokens=1024,
    system="You are a helpful coding assistant.",
    messages=[
        {"role": "user", "content": "Explain Python decorators."}
    ]
)
print(response.content[0].text)
```