Integration
Tutorial
April 13, 2026
Persistent Memory for Phidata Agent Teams
Phidata makes building agent teams straightforward with its Agent and Team classes, but agent memory resets between runs. This guide integrates REM Labs so your Phidata agents carry knowledge across sessions, share context within teams, and retrieve it accurately with multi-signal fusion.
The Problem: Agents That Forget
Phidata agents can use tools, maintain conversation history within a session, and delegate to sub-agents. But when the Python process ends, everything the agent learned disappears. For agents that handle recurring tasks -- customer support, research, project management -- this makes each interaction start from zero.
Step 1: Install
```shell
pip install remlabs-memory phidata openai
```
Step 2: Create REM Memory Tools
```python
from phi.tools import Toolkit
from remlabs import RemMemory


class RemMemoryTools(Toolkit):
    def __init__(self, api_key: str, namespace: str = "phidata-agent"):
        super().__init__(name="rem_memory")
        self.mem = RemMemory(api_key=api_key)
        self.namespace = namespace
        self.register(self.search_memory)
        self.register(self.store_memory)

    def search_memory(self, query: str) -> str:
        """Search persistent memory for relevant context from past interactions."""
        results = self.mem.search(query, namespace=self.namespace, limit=5)
        if not results:
            return "No relevant memories found."
        return "\n".join(f"- {r['value']}" for r in results)

    def store_memory(self, value: str) -> str:
        """Store an important fact or observation for future reference."""
        self.mem.store(value=value, namespace=self.namespace, tags=["agent-learned"])
        return f"Stored in memory: {value}"
```
The RemMemoryTools toolkit exposes two functions that Phidata agents can call: search_memory for retrieval and store_memory for persistence. Both are automatically available as tools the agent can invoke during conversation.
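To see what the agent actually receives from each tool call, here is a minimal sketch that exercises the same search-and-format logic against an in-memory stand-in. The `FakeRemMemory` class and its naive keyword matching are hypothetical, purely for illustration; the real client is `remlabs.RemMemory`:

```python
# Hypothetical in-memory stand-in for RemMemory, so the toolkit's
# search/store round trip can be exercised without network access.
class FakeRemMemory:
    def __init__(self):
        self.records = []

    def store(self, value, namespace, tags=None):
        self.records.append({"value": value, "namespace": namespace, "tags": tags or []})

    def search(self, query, namespace, limit=5):
        # Naive keyword match, standing in for REM's multi-signal retrieval.
        hits = [r for r in self.records
                if r["namespace"] == namespace
                and any(word.lower() in r["value"].lower() for word in query.split())]
        return hits[:limit]


mem = FakeRemMemory()
mem.store("Sarah from Acme Corp is migrating to v2", namespace="support")

# The tool result string the agent would see from search_memory:
results = mem.search("Acme migration", namespace="support")
print("\n".join(f"- {r['value']}" for r in results))
# → - Sarah from Acme Corp is migrating to v2
```

Because each method's docstring becomes the tool description the model reads, keep those docstrings specific: they are how the agent decides when to search versus store.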
Step 3: Attach to an Agent
```python
from phi.agent import Agent
from phi.model.openai import OpenAIChat

agent = Agent(
    name="Support Agent",
    model=OpenAIChat(id="gpt-4o"),
    tools=[RemMemoryTools(api_key="sk-slop-...", namespace="support")],
    instructions=[
        "You are a customer support agent with persistent memory.",
        "At the start of each conversation, search memory for relevant context.",
        "When you learn important facts about a customer, store them in memory.",
    ],
    show_tool_calls=True,
)

# The agent will search memory and store new facts automatically
agent.print_response("Hi, I'm Sarah from Acme Corp. We discussed migrating to v2 last week.")

# On next run (even after restart), the agent recalls:
agent.print_response("What did Sarah from Acme Corp want to discuss?")
```
The agent searches REM before responding, finds the stored conversation about Sarah and the v2 migration, and provides a personalized answer -- even if the process was restarted between calls.
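The persistence contract is the key property here: what one process writes, a later process can read. A file-backed stand-in makes that concrete; the `FileMemory` class and JSON file below are illustrative only, not REM's storage format (REM persists server-side):

```python
import json
import os
import tempfile

# Illustrative file-backed store demonstrating the persistence contract.
class FileMemory:
    def __init__(self, path):
        self.path = path

    def store(self, value, namespace):
        records = self._load()
        records.append({"value": value, "namespace": namespace})
        with open(self.path, "w") as f:
            json.dump(records, f)

    def search(self, query, namespace):
        return [r for r in self._load()
                if r["namespace"] == namespace and query.lower() in r["value"].lower()]

    def _load(self):
        if not os.path.exists(self.path):
            return []
        with open(self.path) as f:
            return json.load(f)


path = os.path.join(tempfile.gettempdir(), "rem_demo.json")
if os.path.exists(path):
    os.remove(path)

# "First run": the agent stores a fact, then the process exits.
FileMemory(path).store("Sarah from Acme Corp wants to migrate to v2", namespace="support")

# "Second run": a brand-new object (a new process) still finds it.
hits = FileMemory(path).search("Sarah", namespace="support")
print(hits[0]["value"])  # → Sarah from Acme Corp wants to migrate to v2
```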
Step 4: Shared Memory Across Agent Teams
```python
from phi.agent import Agent
from phi.model.openai import OpenAIChat

researcher = Agent(
    name="Researcher",
    model=OpenAIChat(id="gpt-4o"),
    tools=[RemMemoryTools(api_key="sk-slop-...", namespace="team-research")],
    instructions=["Research topics and store findings in memory."],
)

writer = Agent(
    name="Writer",
    model=OpenAIChat(id="gpt-4o"),
    tools=[RemMemoryTools(api_key="sk-slop-...", namespace="team-research")],
    instructions=["Search memory for research findings and write reports."],
)

# Researcher stores findings
researcher.print_response("Research the latest pricing changes from Competitor X.")

# Writer retrieves them (same namespace)
writer.print_response("Write a summary of competitor pricing changes.")
```
Both agents share the team-research namespace. What one agent stores, the other can retrieve. This works across processes, machines, and restarts.
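The namespace semantics can be sketched with a plain dict (illustrative only; REM scopes memories server-side): agents pointed at the same namespace key share state, while other namespaces stay isolated.

```python
from collections import defaultdict

# Illustrative in-memory model of REM namespaces: each namespace key
# holds its own list of memories.
store = defaultdict(list)

def remember(namespace, value):
    store[namespace].append(value)

def recall(namespace, query):
    return [v for v in store[namespace] if query.lower() in v.lower()]

# Researcher and writer both write to "team-research"; support has its own.
remember("team-research", "Competitor X cut enterprise pricing by 20%")
remember("support", "Sarah from Acme Corp is migrating to v2")

print(recall("team-research", "pricing"))  # the writer sees the researcher's finding
print(recall("support", "pricing"))        # the support namespace is isolated: []
```

In practice this means namespace choice is an architectural decision: one namespace per team for shared context, one per customer or project for isolation.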
Multi-Signal Retrieval
Every memory is indexed three ways: vector embeddings, full-text search, and entity graphs. When an agent calls search_memory, REM fuses results from all three signals using reciprocal rank fusion. This fused retrieval scores 90% on the LongMemEval benchmark, handling proper nouns, temporal queries, and knowledge updates that vector-only search misses.
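Reciprocal rank fusion itself is simple to sketch: each signal contributes 1/(k + rank) to a document's score, so documents that rank well in more than one signal rise to the top. The three ranked lists below are made up for illustration; k = 60 is the conventional default constant:

```python
def rrf(rankings, k=60):
    """Fuse ranked result lists: score(d) = sum over lists of 1 / (k + rank)."""
    scores = {}
    for ranking in rankings:
        for rank, doc in enumerate(ranking, start=1):
            scores[doc] = scores.get(doc, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical ranked lists from the three signals for one query:
vector = ["mem-7", "mem-2", "mem-9"]    # embedding similarity
fulltext = ["mem-2", "mem-7", "mem-4"]  # keyword match
graph = ["mem-2", "mem-5"]              # entity-graph neighbors

print(rrf([vector, fulltext, graph]))
# → ['mem-2', 'mem-7', 'mem-5', 'mem-9', 'mem-4']
```

mem-2 wins because all three signals rank it, even though vector search alone put mem-7 first; that cross-signal agreement is what the fusion rewards.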
Works with any Phidata model: The REM memory toolkit works with any model Phidata supports -- OpenAI, Anthropic, Groq, Ollama. The memory layer is model-agnostic.
Give your Phidata agents a memory
Free tier. No credit card. pip install and go.
Get started free →