The ARKOS Memory Module implements a dual-tier memory system: PostgreSQL stores short-term conversation context, while Mem0 (backed by a Supabase vector store) provides semantic long-term memory for personalized AI interactions.
All memory operations use Pydantic message classes:
```python
from model_module.ArkModelNew import (
    Message,
    UserMessage,
    AIMessage,
    SystemMessage,
    ToolMessage,
)

# User message
user_msg = UserMessage(content="Hello, how are you?")

# AI response
ai_msg = AIMessage(content="I'm doing well, thanks!")

# System instruction
system_msg = SystemMessage(content="You are a helpful assistant")

# Tool result
tool_msg = ToolMessage(content='{"result": "success"}')
```
Store a message to both PostgreSQL (immediate) and Mem0 (background):
```python
from model_module.ArkModelNew import UserMessage, AIMessage

# Add user message
memory.add_memory(UserMessage(content="My favorite color is blue"))

# Add AI response
memory.add_memory(AIMessage(content="I'll remember that you like blue!"))
```
Messages are stored in PostgreSQL immediately (fast, synchronous) and sent to Mem0 in a background thread (non-blocking) for long-term storage.
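The write path can be illustrated with a minimal sketch. The class and attribute names below are illustrative stand-ins, not the module's real API: a list stands in for the PostgreSQL insert and another for the Mem0 call, but the synchronous-then-background shape matches the description above.

```python
import threading

class DualTierMemory:
    """Sketch of the dual-write pattern: synchronous fast store,
    background slow store. Names here are illustrative only."""

    def __init__(self):
        self.short_term = []   # stands in for the PostgreSQL insert
        self.long_term = []    # stands in for the Mem0 add call
        self._threads = []

    def add_memory(self, message):
        # 1. Write to the fast store immediately (blocking, cheap)
        self.short_term.append(message)
        # 2. Ship to the slow store in a daemon thread (non-blocking)
        t = threading.Thread(
            target=self._to_long_term, args=(message,), daemon=True
        )
        t.start()
        self._threads.append(t)

    def _to_long_term(self, message):
        self.long_term.append(message)

    def flush(self):
        # Wait for pending background writes (useful in tests / shutdown)
        for t in self._threads:
            t.join()

mem = DualTierMemory()
mem.add_memory({"role": "user", "content": "My favorite color is blue"})
mem.flush()
```

Because the long-term write happens off the caller's thread, a crash between the two writes can leave a message in PostgreSQL but not in Mem0; the short-term store is the source of truth for the current session.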
```python
# Get last 5 conversation turns
context = memory.retrieve_short_memory(turns=5)

# Returns list of Message objects in chronological order
for msg in context:
    print(f"{msg.role}: {msg.content}")
```
Search for semantically relevant memories using Mem0:
```python
# Get recent context first
context = memory.retrieve_short_memory(turns=2)

# Retrieve relevant long-term memories
long_term = memory.retrieve_long_memory(
    context=context,  # Used to build search query
    mem0_limit=10,    # Max results to return
)

# Returns a SystemMessage with retrieved memories
print(long_term.content)
# Output: "retrieved memories:\nuser: My favorite color is blue\n..."
```
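Since `retrieve_long_memory` returns a `SystemMessage`, a natural way to use it is to place it between the base system prompt and the recent turns when assembling the model's input. The module does not prescribe this ordering; the sketch below (using plain dicts in place of the Pydantic message classes) shows one reasonable assembly.

```python
# Sketch of prompt assembly; message shapes and ordering are assumptions
def build_prompt(system_prompt, long_term, short_term):
    """Order: base system prompt, retrieved memories, recent turns."""
    return [system_prompt, long_term, *short_term]

system = {"role": "system", "content": "You are a helpful assistant"}
long_term = {
    "role": "system",
    "content": "retrieved memories:\nuser: My favorite color is blue",
}
short_term = [
    {"role": "user", "content": "What's my favorite color?"},
]

prompt = build_prompt(system, long_term, short_term)
# prompt[1] carries the retrieved memories, ahead of the live conversation
```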
```sql
CREATE TABLE conversation_context (
    id         SERIAL PRIMARY KEY,
    user_id    VARCHAR(255) NOT NULL,
    session_id VARCHAR(255) NOT NULL,
    role       VARCHAR(50) NOT NULL,
    message    TEXT NOT NULL,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

CREATE INDEX idx_user_id ON conversation_context(user_id);
CREATE INDEX idx_session_id ON conversation_context(session_id);
```
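Against this schema, retrieving the last N turns typically means selecting newest-first (so `LIMIT` keeps the most recent rows) and then reversing into chronological order. The module's actual SQL is not shown here, so the query and ordering below are assumptions consistent with the schema.

```python
# Assumed shape of the short-term retrieval query for this schema
RECENT_TURNS_SQL = """
    SELECT role, message
    FROM conversation_context
    WHERE user_id = %s AND session_id = %s
    ORDER BY created_at DESC, id DESC
    LIMIT %s
"""

def to_chronological(rows):
    """The query returns newest-first (for LIMIT); callers want oldest-first."""
    return list(reversed(rows))

# Example with rows as the query would return them (newest first):
turns = to_chronological([
    ("assistant", "I'll remember that you like blue!"),
    ("user", "My favorite color is blue"),
])
# turns[0] is now the oldest message
```

Ordering by `id` as a tiebreaker keeps same-timestamp rows stable, since `created_at` has only second-level default precision in some configurations.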
The module uses a global connection pool for efficiency:
```python
# Pool is initialized lazily on first use
# Configured with:
#   - minconn=1  (minimum connections)
#   - maxconn=10 (maximum connections)

# Connections are checked out and returned explicitly
conn = pool.getconn()
try:
    cur = conn.cursor()
    cur.execute(...)
    conn.commit()
finally:
    pool.putconn(conn)  # Return to pool
```
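The try/finally checkout pattern above can be wrapped in a context manager so every call site gets commit-on-success, rollback-on-error, and guaranteed return to the pool. This wrapper is not part of the module; it is a sketch that works with any psycopg2-style pool exposing `getconn()`/`putconn()`. A minimal fake pool is included so the flow can be demonstrated without a database.

```python
from contextlib import contextmanager

@contextmanager
def pooled_connection(pool):
    """Check out a connection; commit on success, rollback on error,
    always return the connection to the pool."""
    conn = pool.getconn()
    try:
        yield conn
        conn.commit()
    except Exception:
        conn.rollback()
        raise
    finally:
        pool.putconn(conn)

# Minimal stand-ins to demonstrate the flow without a real database
class FakeConn:
    def __init__(self):
        self.committed = False
    def commit(self):
        self.committed = True
    def rollback(self):
        pass

class FakePool:
    def __init__(self):
        self.conn = FakeConn()
        self.returned = False
    def getconn(self):
        return self.conn
    def putconn(self, conn):
        self.returned = True

pool = FakePool()
with pooled_connection(pool) as conn:
    pass  # run queries here
```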
```bash
# PostgreSQL connection (REQUIRED)
DB_URL=postgresql://user:password@host:port/database

# OpenAI API key (required by Mem0, can be a placeholder)
OPENAI_API_KEY=sk-placeholder
```
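A small startup check can fail fast when `DB_URL` is missing rather than erroring deep inside the pool. The variable names come from this document; the `load_config` helper and its failure behavior are assumptions, not part of the module.

```python
import os

def load_config(env=os.environ):
    """Validate the environment variables the memory module needs."""
    db_url = env.get("DB_URL")
    if not db_url:
        raise RuntimeError("DB_URL is required for the PostgreSQL store")
    # Mem0 initializes an OpenAI client, so the key must be set
    # even when it is only a placeholder
    api_key = env.get("OPENAI_API_KEY", "sk-placeholder")
    return {"db_url": db_url, "openai_api_key": api_key}

config = load_config({"DB_URL": "postgresql://user:pw@localhost:5432/arkos"})
```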