# Memori

> Memori is an open-source system that gives your AI agents a structured, persistent memory layer. It automatically captures conversations, extracts meaningful facts, and makes them searchable across entities, processes, and sessions.

## Memori Cloud Docs

- [Introduction](https://memorilabs.ai/docs/memori-cloud): Memori gives your AI agents structured, persistent memory — no database setup required.
- [Getting Started / Python SDK Quickstart](https://memorilabs.ai/docs/memori-cloud/getting-started/python-quickstart): Get started with Memori Cloud in under 3 minutes.
- [Getting Started / TypeScript SDK Quickstart](https://memorilabs.ai/docs/memori-cloud/getting-started/typescript-quickstart): Get started with Memori Cloud and the TypeScript SDK in under 3 minutes.
- [Getting Started / Installation](https://memorilabs.ai/docs/memori-cloud/getting-started/installation): Install Memori and set up your API key for Memori Cloud.
- [Getting Started / Use Cases](https://memorilabs.ai/docs/memori-cloud/getting-started/use-cases): Common use cases and applications for Memori.
- [Core Concepts / Architecture](https://memorilabs.ai/docs/memori-cloud/concepts/architecture): Understand how Memori's Cloud platform is designed — from your app to Memori Cloud, with managed storage, augmentation, and recall.
- [Core Concepts / How Memori Works](https://memorilabs.ai/docs/memori-cloud/concepts/how-memory-works): Understand the core concepts behind Memori — entities, processes, sessions, memory types, attribution, and how recall brings it all together.
- [Core Concepts / Advanced Augmentation](https://memorilabs.ai/docs/memori-cloud/concepts/advanced-augmentation): How Memori's Advanced Augmentation engine extracts structured facts, preferences, and knowledge from your AI conversations — all managed in Memori Cloud.
- [Core Concepts / Async Patterns](https://memorilabs.ai/docs/memori-cloud/concepts/async-patterns): Best practices for using Memori with async/await in Python and TypeScript.
- [Core Concepts / Multi-User Support](https://memorilabs.ai/docs/memori-cloud/concepts/multi-user-support): How Memori isolates memories across users, applications, and sessions so each user gets a personalized experience.
- [Core Concepts / Knowledge Graph](https://memorilabs.ai/docs/memori-cloud/concepts/knowledge-graph): How Memori automatically builds a knowledge graph from your AI conversations using semantic triples, and how to query it through the Recall API.
- [Benchmark / Overview](https://memorilabs.ai/docs/memori-cloud/benchmark/overview): Memori — A Persistent Memory Layer for Efficient, Context-Aware LLM Agents.
- [Benchmark / Experiments](https://memorilabs.ai/docs/memori-cloud/benchmark/experiments): These experiments evaluate the quality and accuracy of the memory assets produced by Memori's Advanced Augmentation pipeline.
- [Benchmark / Results](https://memorilabs.ai/docs/memori-cloud/benchmark/results): See how Memori's Advanced Augmentation performed on the LoCoMo benchmark. Using our LLM-as-a-Judge framework, we evaluated four reasoning categories (Multi-Hop, Temporal, Open-Domain, and Single-Hop) and compared Memori against several memory baselines and a Full-Context ceiling.
- [MCP / Overview](https://memorilabs.ai/docs/memori-cloud/mcp/overview): Use the Model Context Protocol (MCP) to connect AI agents directly to Memori for real-time recall and memory storage.
- [MCP / Client Setup](https://memorilabs.ai/docs/memori-cloud/mcp/client-setup): Configure the Memori MCP server in Cursor, Claude Code, Codex, Warp, Antigravity, and other MCP clients with custom connectors.
- [MCP / Agent Skills](https://memorilabs.ai/docs/memori-cloud/mcp/agent-skills): Add a SKILL.md file to teach your IDE agent when and how to use Memori MCP tools for reliable recall and memory storage.
- [OpenClaw / Overview](https://memorilabs.ai/docs/memori-cloud/openclaw/overview): Give your OpenClaw agents structured, persistent memory with the Memori plugin — auto-recall and auto-capture across every session.
- [OpenClaw / Quickstart](https://memorilabs.ai/docs/memori-cloud/openclaw/quickstart): Install and configure the Memori plugin for OpenClaw in under 5 minutes.
- [Dashboard / Overview](https://memorilabs.ai/docs/memori-cloud/dashboard/overview): An overview of the Memori Cloud dashboard at app.memorilabs.ai for managing API keys, testing memories, and monitoring usage.
- [Dashboard / Memories](https://memorilabs.ai/docs/memori-cloud/dashboard/memories): Explore memories and entities in table and graph views, then drill into details, associations, and retrieval behavior.
- [Dashboard / Analytics](https://memorilabs.ai/docs/memori-cloud/dashboard/analytics): Understand memory volume, retrieval performance, usage quotas, and top subjects in Memori Cloud analytics.
- [Dashboard / Playground](https://memorilabs.ai/docs/memori-cloud/dashboard/playground): Use Playground to validate memory extraction end to end with chat, extracted memories, and the memory graph.
- [Dashboard / API Keys](https://memorilabs.ai/docs/memori-cloud/dashboard/api-keys): Create and manage Memori API keys from the dashboard.
- [LLM Providers / Overview](https://memorilabs.ai/docs/memori-cloud/llm/overview): Memori is LLM-agnostic. Register any supported client and Memori handles memory capture, augmentation, and recall automatically.
- [LLM Providers / OpenAI](https://memorilabs.ai/docs/memori-cloud/llm/openai): Using Memori with OpenAI models including GPT-4o, GPT-4.1, and the Responses API on Memori Cloud.
- [LLM Providers / Anthropic](https://memorilabs.ai/docs/memori-cloud/llm/anthropic): Using Memori with Anthropic Claude models on Memori Cloud.
- [LLM Providers / Gemini](https://memorilabs.ai/docs/memori-cloud/llm/gemini): Using Memori with Google Gemini models on Memori Cloud.
- [LLM Providers / DeepSeek](https://memorilabs.ai/docs/memori-cloud/llm/deepseek): Using Memori with DeepSeek models via the OpenAI-compatible API on Memori Cloud.
- [LLM Providers / AWS Bedrock](https://memorilabs.ai/docs/memori-cloud/llm/aws-bedrock): Using Memori with AWS Bedrock models via the LangChain ChatBedrock adapter on Memori Cloud.
- [LLM Providers / xAI Grok](https://memorilabs.ai/docs/memori-cloud/llm/xai-grok): Using Memori with xAI Grok models via the OpenAI-compatible API on Memori Cloud.
- [LLM Providers / Agno](https://memorilabs.ai/docs/memori-cloud/llm/agno): Using Memori with Agno agents on Memori Cloud.
- [LLM Providers / Pydantic AI](https://memorilabs.ai/docs/memori-cloud/llm/pydantic-ai): Using Memori with Pydantic AI agents on Memori Cloud.
- [LLM Providers / LangChain](https://memorilabs.ai/docs/memori-cloud/llm/langchain): Using Memori with LangChain chat models on Memori Cloud.
- [Support / Troubleshooting](https://memorilabs.ai/docs/memori-cloud/support/troubleshooting): Common issues and solutions when using Memori Cloud.
- [Support / FAQ](https://memorilabs.ai/docs/memori-cloud/support/faq): Frequently asked questions about Memori Cloud.

## Memori BYODB Docs

- [Introduction](https://memorilabs.ai/docs/memori-byodb): Memori is an open-source, structured memory layer for AI agents — own your data, choose your database, keep full control.
- [Getting Started / Python SDK Quickstart](https://memorilabs.ai/docs/memori-byodb/getting-started/python-quickstart): Get started with Memori BYODB in under 3 minutes using SQLite and OpenAI.
- [Getting Started / Installation](https://memorilabs.ai/docs/memori-byodb/getting-started/installation): Install Memori and set up your database for Memori BYODB.
- [Getting Started / Use Cases](https://memorilabs.ai/docs/memori-byodb/getting-started/use-cases): Common use cases and applications for Memori open source.
- [Core Concepts / Architecture](https://memorilabs.ai/docs/memori-byodb/concepts/architecture): Understand how Memori's open-source architecture works — from your app to your own database, with local storage, augmentation, and recall.
- [Core Concepts / How Memori Works](https://memorilabs.ai/docs/memori-byodb/concepts/how-memory-works): Understand the core concepts behind Memori — entities, processes, sessions, memory types, attribution, and how recall brings it all together.
- [Core Concepts / Advanced Augmentation](https://memorilabs.ai/docs/memori-byodb/concepts/advanced-augmentation): How Memori's Advanced Augmentation engine extracts structured facts, preferences, and knowledge from your AI conversations — all stored in your own database.
- [Core Concepts / Async Patterns](https://memorilabs.ai/docs/memori-byodb/concepts/async-patterns): Best practices for using Memori with async/await — AsyncOpenAI, AsyncAnthropic, FastAPI, and thread safety.
- [Core Concepts / Multi-User Support](https://memorilabs.ai/docs/memori-byodb/concepts/multi-user-support): How Memori isolates memories across users, applications, and sessions so each user gets a personalized experience — all in your own database.
- [Core Concepts / Knowledge Graph](https://memorilabs.ai/docs/memori-byodb/concepts/knowledge-graph): How Memori automatically builds a knowledge graph from your AI conversations using semantic triples, stored in your own database where you can query it directly.
- [Core Concepts / CLI Quickstart](https://memorilabs.ai/docs/memori-byodb/concepts/cli-quickstart): Get started with the Memori command-line interface for setup, diagnostics, and management.
- [Dashboard / API Keys](https://memorilabs.ai/docs/memori-byodb/dashboard/api-keys): Create and manage Memori API keys from the dashboard.
- [LLM Providers / Overview](https://memorilabs.ai/docs/memori-byodb/llm/overview): Memori is LLM-agnostic. Register any supported client with a local database connection and Memori handles memory capture, augmentation, and recall automatically.
- [LLM Providers / OpenAI](https://memorilabs.ai/docs/memori-byodb/llm/openai): Using Memori with OpenAI models including GPT-4o, GPT-4.1, and the Responses API with Memori BYODB.
- [LLM Providers / Anthropic](https://memorilabs.ai/docs/memori-byodb/llm/anthropic): Using Memori with Anthropic Claude models and Memori BYODB.
- [LLM Providers / Gemini](https://memorilabs.ai/docs/memori-byodb/llm/gemini): Using Memori with Google Gemini models and Memori BYODB.
- [LLM Providers / DeepSeek](https://memorilabs.ai/docs/memori-byodb/llm/deepseek): Using Memori with DeepSeek models via the OpenAI-compatible API and Memori BYODB.
- [LLM Providers / AWS Bedrock](https://memorilabs.ai/docs/memori-byodb/llm/aws-bedrock): Using Memori with AWS Bedrock models via the LangChain ChatBedrock adapter and Memori BYODB.
- [LLM Providers / xAI Grok](https://memorilabs.ai/docs/memori-byodb/llm/xai-grok): Using Memori with xAI Grok models via the OpenAI-compatible API and Memori BYODB.
- [LLM Providers / Nebius AI Studio](https://memorilabs.ai/docs/memori-byodb/llm/nebius): Using Memori with Nebius AI Studio models via the OpenAI-compatible API and Memori BYODB.
- [LLM Providers / Agno](https://memorilabs.ai/docs/memori-byodb/llm/agno): Using Memori with Agno agents and Memori BYODB.
- [LLM Providers / Pydantic AI](https://memorilabs.ai/docs/memori-byodb/llm/pydantic-ai): Using Memori with Pydantic AI agents and Memori BYODB.
- [LLM Providers / LangChain](https://memorilabs.ai/docs/memori-byodb/llm/langchain): Using Memori with LangChain chat models and Memori BYODB.
- [Databases / Overview](https://memorilabs.ai/docs/memori-byodb/databases/overview): Memori supports SQLite, PostgreSQL, MySQL, MariaDB, Oracle, MongoDB, CockroachDB, and OceanBase, plus providers like Neon, Supabase, and AWS RDS/Aurora.
- [Databases / SQLite](https://memorilabs.ai/docs/memori-byodb/databases/sqlite): Set up Memori with SQLite — zero install, file-based storage perfect for development and prototyping.
- [Databases / MySQL](https://memorilabs.ai/docs/memori-byodb/databases/mysql): Set up Memori with MySQL — use your existing MySQL infrastructure for AI agent memory.
- [Databases / PostgreSQL](https://memorilabs.ai/docs/memori-byodb/databases/postgres): Set up Memori with PostgreSQL — recommended for production with connection pooling and high concurrency.
- [Databases / Oracle](https://memorilabs.ai/docs/memori-byodb/databases/oracle): Set up Memori with Oracle Database — enterprise-grade AI memory for Oracle infrastructure.
- [Databases / MongoDB](https://memorilabs.ai/docs/memori-byodb/databases/mongodb): Set up Memori with MongoDB — document-oriented AI memory using PyMongo.
- [Databases / CockroachDB](https://memorilabs.ai/docs/memori-byodb/databases/cockroachdb): Set up Memori with CockroachDB — distributed SQL database with PostgreSQL compatibility, automatic scaling, and strong consistency.
- [Support / Troubleshooting](https://memorilabs.ai/docs/memori-byodb/support/troubleshooting): Common issues and solutions when using Memori open source.
- [Support / FAQ](https://memorilabs.ai/docs/memori-byodb/support/faq): Frequently asked questions about Memori open source.
- [Contribute / Overview](https://memorilabs.ai/docs/memori-byodb/contribute/overview): How to contribute to the Memori open-source project — code, documentation, issues, and discussions.
- [Contribute / Development Setup](https://memorilabs.ai/docs/memori-byodb/contribute/development-setup): Set up your local development environment for contributing to the Memori open-source project.

## Repository

- [GitHub Repository](https://github.com/MemoriLabs/Memori)
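The capture → augmentation → recall loop that runs through these docs can be sketched as a toy in plain Python. Everything below (`ToyMemoryStore`, its methods, the regex-based "fact extraction") is invented for illustration and is not the Memori SDK API — see the Python SDK Quickstart for real usage:

```python
import re
from collections import defaultdict

class ToyMemoryStore:
    """Illustrative stand-in for a structured memory layer: captures
    conversation turns, extracts simple facts, and recalls them by keyword.
    NOT the Memori SDK -- a conceptual sketch only."""

    def __init__(self):
        # Facts are scoped per entity (e.g. per user), mirroring the
        # multi-user isolation described in the docs.
        self.facts = defaultdict(list)

    def capture(self, entity_id: str, message: str) -> None:
        # Naive "augmentation": pull out "my <subject> is <value>" statements.
        for subject, value in re.findall(r"my (\w+) is (\w+)", message.lower()):
            self.facts[entity_id].append((subject, value))

    def recall(self, entity_id: str, query: str) -> list[tuple[str, str]]:
        # Return only this entity's facts whose subject appears in the query.
        q = query.lower()
        return [(s, v) for s, v in self.facts[entity_id] if s in q]

store = ToyMemoryStore()
store.capture("user-1", "Hi! My name is Ada and my language is Python.")
print(store.recall("user-1", "What language does this user prefer?"))
# → [('language', 'python')]
```

Memori's real pipeline replaces the regex with LLM-driven extraction into typed memories and a knowledge graph, and the keyword match with the Recall API, but the shape of the loop — capture, extract, recall per entity — is the same.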