Merged
Commits
41 commits
c3472c4
fix: fix bugs when running local queue for memos
tangg555 Dec 25, 2025
4ef6fec
fix: remove an unnecessary function
tangg555 Dec 25, 2025
a2152f1
fix: update README.md
zZhangSir Dec 25, 2025
9e6f846
update requirement,Dockerfile
pursues Dec 25, 2025
1670ef3
fix: update README.md
zZhangSir Dec 25, 2025
85268ba
fix: update README.md (#786)
fridayL Dec 25, 2025
33fc280
update: requirement,Dockerfile (#784)
fridayL Dec 25, 2025
0990db6
Merge branch 'main' into hotfix/scheduler
tangg555 Dec 25, 2025
ede5e81
fix: update README.md
zZhangSir Dec 25, 2025
f0af4e1
hotfix: redis dependency in scheduler (#781)
CaralHsi Dec 25, 2025
1ce1a57
Merge branch 'main' into main_zhq_readme
CaralHsi Dec 25, 2025
760dd47
feat: update readme
lijicode Dec 25, 2025
335d426
feat: fix NACOS
lijicode Dec 25, 2025
1c7ef04
feat: add timer log
Dec 26, 2025
4de2d30
update requirements
pursues Dec 26, 2025
56d9166
Merge branch 'MemTensor:main' into main
pursues Dec 26, 2025
d8bace4
feat: add timer log (#793)
CaralHsi Dec 26, 2025
26c07a9
Merge remote-tracking branch 'origin/main' into main_zhq_readme
zZhangSir Dec 26, 2025
5152f29
fix: 12.26 update README.md
zZhangSir Dec 26, 2025
4b7c872
Merge branch 'MemTensor:main' into main
pursues Dec 26, 2025
7cdd740
change local server name
pursues Dec 26, 2025
35af8f6
requirements add neo4j (#792)
fridayL Dec 26, 2025
eb11e0a
update docker-compose.yml
pursues Dec 26, 2025
e115d3f
Merge branch 'MemTensor:main' into main
pursues Dec 26, 2025
9f71f0a
fix: issues caused by no reading default use_redis from env
tangg555 Dec 26, 2025
02abdcc
Merge branch 'main' into main_zhq_readme
lijicode Dec 26, 2025
d42ec55
Main zhq readme (#794)
fridayL Dec 26, 2025
4e9882b
update docker.compose.yml (#798)
fridayL Dec 26, 2025
20a0ac7
feat: fix requirements
lijicode Dec 26, 2025
e582a8f
feat: fix requirements (#800)
glin93 Dec 26, 2025
f6e716e
Merge branch 'main' into hotfix/scheduler
tangg555 Dec 29, 2025
ad56cda
fix: issues caused by no reading default use_redis from env (#799)
fridayL Dec 30, 2025
5c7e40e
add neo4j
pursues Dec 30, 2025
1afc89f
Merge branch 'main' of github.com:pursues/MemOS
pursues Dec 30, 2025
815a394
add neo4j (#809)
fridayL Dec 30, 2025
0d278af
fix: logs context and empty embedding
fridayL Jan 4, 2026
08f660f
fix: logs context and empty embedding (#819)
fridayL Jan 4, 2026
4d38ad0
fix reranker
pursues Jan 5, 2026
d5457f3
fix reranker (#823)
fridayL Jan 5, 2026
626eb29
fix: conflict
CaralHsi Jan 7, 2026
67b9d6b
fix: conflict
CaralHsi Jan 7, 2026
217 changes: 85 additions & 132 deletions README.md
@@ -118,89 +118,31 @@ showcasing its capabilities in **information extraction**, **temporal and cross-
- **🔌 Extensible**: Easily extend and customize memory modules, data sources, and LLM integrations.


## 📦 Installation
## 🚀 Quickstart Guide

### Install via pip

```bash
pip install MemoryOS
```

### Optional Dependencies

MemOS provides several optional dependency groups for different features. You can install them based on your needs.

| Feature | Package Name |
| --------------------- | ------------------------- |
| Tree Memory | `MemoryOS[tree-mem]` |
| Memory Reader | `MemoryOS[mem-reader]` |
| Memory Scheduler | `MemoryOS[mem-scheduler]` |

Example installation commands:

```bash
pip install MemoryOS[tree-mem]
pip install MemoryOS[tree-mem,mem-reader]
pip install MemoryOS[mem-scheduler]
pip install MemoryOS[tree-mem,mem-reader,mem-scheduler]
```

### External Dependencies

#### Ollama Support

To use MemOS with [Ollama](https://ollama.com/), first install the Ollama CLI:

```bash
curl -fsSL https://ollama.com/install.sh | sh
```

#### Transformers Support

To use functionalities based on the `transformers` library, ensure you have [PyTorch](https://pytorch.org/get-started/locally/) installed (CUDA version recommended for GPU acceleration).

#### Download Examples
### Get API Key
- Sign up and get started on the [`MemOS dashboard`](https://memos-dashboard.openmem.net/cn/quickstart/?source=landing).
- Open the API Keys Console in the MemOS dashboard and copy the API Key into the initialization code.
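
Rather than pasting the key directly into source files, you can read it from an environment variable first. A minimal stdlib-only sketch (the `MEMOS_API_KEY` variable name is an assumption, not an official convention):

```python
import os

# MEMOS_API_KEY is a hypothetical variable name, not an official convention;
# the placeholder fallback keeps the snippet runnable anywhere.
api_key = os.environ.get("MEMOS_API_KEY", "YOUR_API_KEY")
print(bool(api_key))  # → True
```

Keeping the key out of source files also keeps it out of version control.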

To download example code, data and configurations, run the following command:

```bash
memos download_examples
```


## 🚀 Getting Started

### ⭐️ MemOS online API
The easiest way to use MemOS. Equip your agent with memory **in minutes**!

Sign up and get started on the [`MemOS dashboard`](https://memos-dashboard.openmem.net/cn/quickstart/?source=landing).


### Self-Hosted Server
1. Get the repository.
```bash
git clone https://github.com/MemTensor/MemOS.git
cd MemOS
pip install -r ./docker/requirements.txt
```
### Install via pip

2. Configure `docker/.env.example` and copy to `MemOS/.env`
3. Start the service.
```bash
uvicorn memos.api.server_api:app --host 0.0.0.0 --port 8001 --workers 8
pip install MemoryOS -U
```

### Interface SDK
#### Here is a quick example showing how to create all interface SDK
### Basic Usage

This interface adds messages, supporting multiple content types and batch additions. MemOS automatically parses the messages and stores them as memory for reference in subsequent conversations.
- Initialize the MemOS client with your API Key to start sending requests
```python
# Please make sure MemOS is installed (pip install MemoryOS -U)
from memos.api.client import MemOSClient

# Initialize the client using the API Key
client = MemOSClient(api_key="YOUR_API_KEY")
```

- This API allows you to add one or more messages to a specific conversation. As illustrated in the examples below, you can add messages in real time during a user-assistant interaction, import historical messages in bulk, or enrich the conversation with user preferences and behavior data. All added messages are transformed into memories by MemOS, enabling their retrieval in future conversations to support chat-history management, user behavior tracking, and personalized interactions.
```python
messages = [
{"role": "user", "content": "I have planned to travel to Guangzhou during the summer vacation. What chain hotels are available for accommodation?"},
{"role": "assistant", "content": "You can consider [7 Days, All Seasons, Hilton], and so on."},
@@ -214,79 +156,90 @@
res = client.add_message(messages=messages, user_id=user_id, conversation_id=conversation_id)
print(f"result: {res}")
```
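
Since messages are plain dicts with `role` and `content` keys, it can help to sanity-check a batch before sending it. A stdlib-only sketch (the `validate_messages` helper is hypothetical, not part of the MemOS client):

```python
VALID_ROLES = {"user", "assistant"}

def validate_messages(messages):
    """Raise ValueError if any message lacks a valid role or non-empty content."""
    for i, msg in enumerate(messages):
        if msg.get("role") not in VALID_ROLES:
            raise ValueError(f"message {i}: invalid role {msg.get('role')!r}")
        if not msg.get("content"):
            raise ValueError(f"message {i}: empty content")
    return messages

messages = [
    {"role": "user", "content": "What chain hotels are available in Guangzhou?"},
    {"role": "assistant", "content": "You can consider 7 Days, All Seasons, or Hilton."},
]
print(len(validate_messages(messages)))  # → 2
```

Catching a malformed message locally is cheaper than debugging a rejected API call.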

This interface retrieves the memories of a specified user, returning the memory fragments most relevant to the input query for agent use. The recalled fragments include "factual memory", "preference memory", and "tool memory".
- This API allows you to query a user's memory and returns the fragments most relevant to the input. These can serve as references for the model when generating responses. As shown in the examples below, you can retrieve memory in real time during a user's conversation with the AI, or perform a global search across their entire memory to build user profiles or support personalized recommendations, improving both dialogue coherence and personalization.
In the latest update, in addition to "Fact Memory", the system now supports "Preference Memory", enabling the LLM to respond in a way that better understands the user.
```python
# Please make sure MemOS is installed (pip install MemoryOS -U)
from memos.api.client import MemOSClient

# Initialize the client using the API Key
client = MemOSClient(api_key="YOUR_API_KEY")

query = "I want to go out to play during National Day. Can you recommend a city I haven't been to and a hotel brand I haven't stayed at?"
user_id = "memos_user_123"
conversation_id = "0928"
conversation_id = "0610"
res = client.search_memory(query=query, user_id=user_id, conversation_id=conversation_id)

print(f"result: {res}")
```

This interface deletes the memories of specified users and supports batch deletion.
```python
# Please make sure MemOS is installed (pip install MemoryOS -U)
from memos.api.client import MemOSClient

# Initialize the client using the API Key
client = MemOSClient(api_key="YOUR_API_KEY")

user_ids = ["memos_user_123"]
# Replace with the memory ID
memory_ids = ["6b23b583-f4c4-4a8f-b345-58d0c48fea04"]
res = client.delete_memory(user_ids=user_ids, memory_ids=memory_ids)

print(f"result: {res}")
```

This interface is used to add feedback to messages in the current session, allowing MemOS to correct its memory based on user feedback.
```python
# Please make sure MemOS is installed (pip install MemoryOS -U)
from memos.api.client import MemOSClient

# Initialize the client using the API Key
client = MemOSClient(api_key="YOUR_API_KEY")

user_id = "memos_user_123"
conversation_id = "memos_feedback_conv"
feedback_content = "No, let's change it now to a meal allowance of 150 yuan per day and a lodging subsidy of 700 yuan per day for first-tier cities; for second- and third-tier cities, it remains the same as before."
# Replace with the knowledgebase ID
allow_knowledgebase_ids = ["basee5ec9050-c964-484f-abf1-ce3e8e2aa5b7"]

res = client.add_feedback(
user_id=user_id,
conversation_id=conversation_id,
feedback_content=feedback_content,
allow_knowledgebase_ids=allow_knowledgebase_ids
)

print(f"result: {res}")
```

This interface creates a knowledgebase associated with a project.
```python
# Please make sure MemOS is installed (pip install MemoryOS -U)
from memos.api.client import MemOSClient

# Initialize the client using the API Key
client = MemOSClient(api_key="YOUR_API_KEY")

knowledgebase_name = "Financial Reimbursement Knowledge Base"
knowledgebase_description = "A compilation of all knowledge related to the company's financial reimbursements."
### Self-Hosted Server
1. Get the repository.
```bash
git clone https://github.com/MemTensor/MemOS.git
cd MemOS
pip install -r ./docker/requirements.txt
```
2. Configure `docker/.env.example` and copy it to `MemOS/.env`
- The `OPENAI_API_KEY`, `MOS_EMBEDDER_API_KEY`, `MEMRADER_API_KEY`, and other keys can be obtained from [`BaiLian`](https://bailian.console.aliyun.com/?spm=a2c4g.11186623.0.0.2f2165b08fRk4l&tab=api#/api).
- Fill in the corresponding configuration in the `MemOS/.env` file.
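
As a rough illustration only (the values are placeholders, and any keys beyond those listed above come from your own `docker/.env.example`), a configured `.env` might look like:

```env
OPENAI_API_KEY=sk-your-key-here
MOS_EMBEDDER_API_KEY=sk-your-key-here
MEMRADER_API_KEY=sk-your-key-here
```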
3. Start the service.

res = client.create_knowledgebase(
knowledgebase_name=knowledgebase_name,
knowledgebase_description=knowledgebase_description
)
print(f"result: {res}")
```
- Launch via Docker
###### Tips: Please ensure that Docker Compose is installed and that you have navigated to the `docker` directory (via `cd docker`) before executing the following command.
```bash
# Run from the docker directory
docker compose up
```
##### For detailed Docker deployment steps, see the [`Docker Reference`](https://docs.openmem.net/open_source/getting_started/rest_api_server/#method-1-docker-use-repository-dependency-package-imagestart-recommended-use).

- Launch via the uvicorn command line interface (CLI)
###### Tips: Please ensure that Neo4j and Qdrant are running before executing the following command.
```bash
uvicorn memos.api.server_api:app --host 0.0.0.0 --port 8001 --workers 1
```
##### For detailed integration steps, see the [`CLI Reference`](https://docs.openmem.net/open_source/getting_started/rest_api_server/#method-3client-install-with-CLI).



Example
- Add User Message
```python
import requests
import json

data = {
"user_id": "8736b16e-1d20-4163-980b-a5063c3facdc",
"mem_cube_id": "b32d0977-435d-4828-a86f-4f47f8b55bca",
"messages": [
{
"role": "user",
"content": "I like strawberry"
}
],
"async_mode": "sync"
}
headers = {
"Content-Type": "application/json"
}
url = "http://localhost:8000/product/add"

res = requests.post(url=url, headers=headers, data=json.dumps(data))
print(f"result: {res.json()}")
```
- Search User Memory
```python
import requests
import json

data = {
"query": "What do I like",
"user_id": "8736b16e-1d20-4163-980b-a5063c3facdc",
"mem_cube_id": "b32d0977-435d-4828-a86f-4f47f8b55bca"
}
headers = {
"Content-Type": "application/json"
}
url = "http://localhost:8000/product/search"

res = requests.post(url=url, headers=headers, data=json.dumps(data))
print(f"result: {res.json()}")
```

## 💬 Community & Support
