# OpenAI
Memori supports OpenAI's Chat Completions and Responses APIs. Both the sync and async clients are fully supported.
## Quick Start
**OpenAI Integration**

```python
from memori import Memori
from openai import OpenAI

client = OpenAI()

# Register the client with Memori and tag requests for attribution
mem = Memori().llm.register(client)
mem.attribution(entity_id="user_123", process_id="my_agent")

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```
## Supported Modes
| Mode | Method |
|---|---|
| Sync | `client.chat.completions.create()` |
| Async | `await client.chat.completions.create()` (via `AsyncOpenAI`) |
| Streamed | `stream=True` parameter |
| Responses API | `client.responses.create()` |
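In streamed mode, each chunk carries an incremental content delta rather than a full message, so the text has to be accumulated client-side. A minimal sketch of that logic — the `accumulate_deltas` helper is illustrative, not part of Memori or the OpenAI SDK:

```python
def accumulate_deltas(deltas):
    """Join streamed content deltas into the full assistant message.

    Each element mirrors chunk.choices[0].delta.content from the
    Chat Completions streaming API: a string, or None for chunks
    that carry no text (e.g. the role-only first chunk).
    """
    return "".join(d for d in deltas if d)

# With a live registered client, the deltas would come from:
#   stream = client.chat.completions.create(..., stream=True)
#   deltas = (c.choices[0].delta.content for c in stream)
print(accumulate_deltas(["Hel", None, "lo", "!"]))  # → Hello!
```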
## Multi-Turn Conversations
Memori automatically captures each interaction and links them within the same session.
```python
from memori import Memori
from openai import OpenAI

client = OpenAI()

mem = Memori().llm.register(client)
mem.attribution(entity_id="user_123", process_id="my_agent")

messages = [
    {"role": "user", "content": "My name is Alice."}
]

# First turn
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=messages,
)

# Record the assistant's reply, then ask a follow-up
messages.append({
    "role": "assistant",
    "content": response.choices[0].message.content,
})
messages.append({
    "role": "user",
    "content": "What's my name?",
})

# Second turn, captured in the same session
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=messages,
)
print(response.choices[0].message.content)
```
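The append bookkeeping in the example above is plain Python with no Memori-specific API; as an aside, it can be factored into a small helper (the `add_turn` function is hypothetical, shown here only to illustrate the pattern):

```python
def add_turn(messages, assistant_reply, next_user_message):
    """Record the assistant's reply, then queue the next user turn."""
    messages.append({"role": "assistant", "content": assistant_reply})
    messages.append({"role": "user", "content": next_user_message})
    return messages

history = [{"role": "user", "content": "My name is Alice."}]
# In the live example, assistant_reply would be
# response.choices[0].message.content from the first turn.
add_turn(history, "Nice to meet you, Alice!", "What's my name?")
print(len(history))  # → 3
```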