# OpenAI
Memori supports all OpenAI models and API styles. Register your client once and every call is automatically captured and stored.
Want a zero-setup option? Try the Memori Cloud at app.memorilabs.ai.
## Quick Start
**OpenAI Integration**

```python
from memori import Memori
from openai import OpenAI
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

# Any SQLAlchemy-supported database works; SQLite keeps the example local.
engine = create_engine("sqlite:///memori.db")
SessionLocal = sessionmaker(bind=engine)

# Register the client once; every subsequent call is captured automatically.
client = OpenAI()
mem = Memori(conn=SessionLocal).llm.register(client)
mem.config.storage.build()

# Attribute captured calls to a user and a process/agent.
mem.attribution(entity_id="user_123", process_id="my_agent")

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```
## Supported Modes
| Mode | Method |
|---|---|
| Sync | `client.chat.completions.create()` |
| Async | `await client.chat.completions.create()` (with `AsyncOpenAI`) |
| Streamed | `stream=True` parameter |
| Responses API | `client.responses.create()` |
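In streamed mode the reply arrives as a sequence of chunks, and the final text is the concatenation of each chunk's content delta. The sketch below simulates that chunk sequence with plain dicts so it runs without an API key or network access; with a real client you would iterate over `client.chat.completions.create(..., stream=True)` the same way.

```python
def assemble_stream(chunks):
    """Join the content deltas of a streamed chat completion into one string."""
    parts = []
    for chunk in chunks:
        delta = chunk["choices"][0]["delta"]
        content = delta.get("content")
        # Role-only and terminal chunks carry no content; skip them.
        if content is not None:
            parts.append(content)
    return "".join(parts)


# Simulated chunk sequence; the shape mirrors the OpenAI streaming payload.
simulated = [
    {"choices": [{"delta": {"role": "assistant"}}]},
    {"choices": [{"delta": {"content": "Hello"}}]},
    {"choices": [{"delta": {"content": ", world!"}}]},
    {"choices": [{"delta": {}}]},
]

print(assemble_stream(simulated))  # Hello, world!
```

Because Memori hooks the registered client itself, streamed calls are captured the same way as sync ones; your code only changes in how it consumes the response.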
## Multi-Turn Conversations
Memori captures each call and links the calls within the same session. Pass your full conversation history on each request, as usual.
```python
from memori import Memori
from openai import OpenAI
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

engine = create_engine("sqlite:///memori.db")
SessionLocal = sessionmaker(bind=engine)

client = OpenAI()
mem = Memori(conn=SessionLocal).llm.register(client)
mem.attribution(entity_id="user_123", process_id="my_agent")

# First turn: introduce a fact.
messages = [{"role": "user", "content": "My name is Alice."}]
response = client.chat.completions.create(model="gpt-4o-mini", messages=messages)

# Append the assistant's reply, then ask a follow-up in the same session.
messages.append({"role": "assistant", "content": response.choices[0].message.content})
messages.append({"role": "user", "content": "What's my name?"})
response = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(response.choices[0].message.content)
```