# AWS Bedrock

Memori supports AWS Bedrock through `langchain-aws`. Register the client with the `chatbedrock` keyword to access Claude, Llama, Mistral, and other Bedrock models.

Want a zero-setup option? Try Memori Cloud at app.memorilabs.ai.
## Quick Start

```python Bedrock Integration
from langchain_aws import ChatBedrock
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

from memori import Memori

# Database connection for Memori's storage
engine = create_engine("sqlite:///memori.db")
SessionLocal = sessionmaker(bind=engine)

# Bedrock chat client
client = ChatBedrock(
    model_id="anthropic.claude-3-5-sonnet-20241022-v2:0",
    region_name="us-east-1",
)

# Register the client so Memori records its conversations
mem = Memori(conn=SessionLocal).llm.register(chatbedrock=client)
mem.config.storage.build()
mem.attribution(entity_id="user_123", process_id="bedrock_agent")

response = client.invoke("Hello!")
print(response.content)
```
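`ChatBedrock` authenticates through the standard AWS credential chain (boto3), so credentials can come from environment variables, a shared credentials file, or an IAM role. A minimal environment setup using static keys might look like this (the values are placeholders, not real credentials):

```shell
# Standard boto3 environment variables picked up by ChatBedrock
export AWS_ACCESS_KEY_ID="your-access-key-id"
export AWS_SECRET_ACCESS_KEY="your-secret-access-key"
export AWS_DEFAULT_REGION="us-east-1"
```

If you already use `aws configure` or an instance role, no extra setup is needed.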
## Supported Modes

| Mode | Method |
|---|---|
| Sync | `client.invoke()` |
| Async | `await client.ainvoke()` |
| Streamed | `client.stream()` |
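All three modes are part of LangChain's standard chat-model interface, so the same registered client object handles each of them. A sketch of the calling patterns, using a minimal stub in place of `ChatBedrock` so the shapes are visible without AWS credentials (the stub class and its reply text are illustrative, not part of the Memori or langchain-aws APIs):

```python
import asyncio

class StubChat:
    """Illustrative stand-in for ChatBedrock with the same sync/async/stream surface."""

    def invoke(self, prompt):
        # Sync: blocks until the full reply is available
        return f"reply to: {prompt}"

    async def ainvoke(self, prompt):
        # Async: awaitable, for use inside an event loop
        return f"reply to: {prompt}"

    def stream(self, prompt):
        # Streamed: yields chunks as they arrive
        for chunk in ("reply ", "to: ", prompt):
            yield chunk

client = StubChat()

print(client.invoke("Hello!"))                # sync
print(asyncio.run(client.ainvoke("Hello!")))  # async
print("".join(client.stream("Hello!")))       # streamed, chunks joined
```

With the real `ChatBedrock` client, each call is recorded by Memori regardless of which mode you use.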
## Available Models

| Model | Model ID |
|---|---|
| Claude 3.5 Sonnet | `anthropic.claude-3-5-sonnet-20241022-v2:0` |
| Claude 3 Haiku | `anthropic.claude-3-haiku-20240307-v1:0` |
| Llama 3.1 70B | `meta.llama3-1-70b-instruct-v1:0` |
| Mistral Large | `mistral.mistral-large-2407-v1:0` |
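Switching models is a one-line change: pass a different `model_id` to `ChatBedrock`. One way to keep the choice explicit is a small mapping of friendly names to the IDs from the table above (the `MODEL_IDS` dict is illustrative, not part of the library):

```python
# Friendly names mapped to the Bedrock model IDs listed above
MODEL_IDS = {
    "claude-3.5-sonnet": "anthropic.claude-3-5-sonnet-20241022-v2:0",
    "claude-3-haiku": "anthropic.claude-3-haiku-20240307-v1:0",
    "llama-3.1-70b": "meta.llama3-1-70b-instruct-v1:0",
    "mistral-large": "mistral.mistral-large-2407-v1:0",
}

# e.g. client = ChatBedrock(model_id=MODEL_IDS["claude-3-haiku"], region_name="us-east-1")
print(MODEL_IDS["claude-3-haiku"])
```

Note that your AWS account must have access enabled for a model in the Bedrock console before you can invoke it.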