# LangChain
Memori Cloud supports any LangChain chat model. Each chat model class has its own registration keyword: `chatopenai` for `ChatOpenAI` (OpenAI), `chatanthropic` for `ChatAnthropic` (Anthropic), `chatbedrock` for `ChatBedrock` (Amazon Bedrock), and `chatgooglegenai` for `ChatGoogleGenerativeAI` (Google Gen AI).
## Quick Start
```python LangChain Integration
from langchain_openai import ChatOpenAI

from memori import Memori

client = ChatOpenAI(model="gpt-4o-mini")

mem = Memori().llm.register(chatopenai=client)
mem.attribution(entity_id="user_123", process_id="langchain_agent")

response = client.invoke("Hello!")
print(response.content)
```
## Different Providers
| Package | Chat Model | Registration Keyword |
|---|---|---|
| `langchain-openai` | `ChatOpenAI` | `chatopenai=client` |
| `langchain-anthropic` | `ChatAnthropic` | `chatanthropic=client` |
| `langchain-google-genai` | `ChatGoogleGenerativeAI` | `chatgooglegenai=client` |
| `langchain-aws` | `ChatBedrock` | `chatbedrock=client` |
```python LangChain Providers
from langchain_anthropic import ChatAnthropic

from memori import Memori

client = ChatAnthropic(model="claude-sonnet-4-5-20250929")

mem = Memori().llm.register(chatanthropic=client)
```
## Supported Modes
| Mode | Method |
|---|---|
| Sync | `client.invoke()` |
| Async | `await client.ainvoke()` |
| Streamed | `client.stream()` |
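All three modes are standard LangChain calls on the registered client; no extra Memori configuration is needed. A minimal sketch, assuming the Quick Start setup above (a registered `ChatOpenAI` client and a valid `OPENAI_API_KEY`):

```python
import asyncio

from langchain_openai import ChatOpenAI
from memori import Memori

client = ChatOpenAI(model="gpt-4o-mini")
Memori().llm.register(chatopenai=client)

# Sync: blocks until the full response is available.
print(client.invoke("Hello!").content)

# Streamed: yields chunks as they arrive; concatenate chunk.content
# for the full message text.
for chunk in client.stream("Hello!"):
    print(chunk.content, end="", flush=True)

# Async: same result as invoke(), but awaitable inside an event loop.
async def main() -> None:
    response = await client.ainvoke("Hello!")
    print(response.content)

asyncio.run(main())
```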