Use Supabase as AI Memory Backend with REM

If you already use Supabase for your app backend, you can pipe your data into REM Labs for AI memory without leaving your stack. This guide walks through connecting Supabase tables, syncing with Edge Functions, and querying memories with multi-signal retrieval.

Why Supabase + REM Labs

Supabase gives you Postgres, auth, and real-time subscriptions out of the box. REM Labs adds the memory layer your AI features need: semantic search, entity extraction, temporal decay, and neural reranking. Together, you keep your data in Supabase while giving your AI agents a proper memory system.

Step 1: Install the SDK

```
npm install @remlabs/sdk @supabase/supabase-js
```

Step 2: Sync Supabase Rows to REM

Create a Supabase Edge Function that fires on inserts and pushes data to REM Labs:

```typescript
// supabase/functions/sync-to-rem/index.ts
import { serve } from "https://deno.land/std@0.168.0/http/server.ts";
// Edge Functions run on Deno, so npm packages need the npm: specifier
import { RemClient } from "npm:@remlabs/sdk";

const rem = new RemClient({ apiKey: Deno.env.get("REMLABS_API_KEY")! });

serve(async (req) => {
  const { type, record, table } = await req.json();

  if (type === "INSERT" || type === "UPDATE") {
    await rem.remember({
      content: JSON.stringify(record),
      namespace: `supabase:${table}`,
      tags: [table, record.category].filter(Boolean),
      metadata: { supabase_id: record.id, table },
    });
  }

  return new Response(JSON.stringify({ ok: true }), {
    headers: { "Content-Type": "application/json" },
  });
});
```
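With the standard Supabase CLI workflow, deploying the function and storing the API key as a function secret looks like this (the secret name REMLABS_API_KEY matches what the function reads above; substitute your own key):

```shell
# Deploy the Edge Function from your project root
supabase functions deploy sync-to-rem

# Make the REM Labs API key available to the function
supabase secrets set REMLABS_API_KEY=your-rem-labs-key
```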

Step 3: Set Up the Database Webhook

In the Supabase dashboard, navigate to Database > Webhooks and create a trigger that calls your Edge Function on INSERT and UPDATE events for the tables you want to sync.

```sql
-- Or use SQL to create the trigger directly.
-- net.http_post comes from the pg_net extension:
CREATE EXTENSION IF NOT EXISTS pg_net;

CREATE OR REPLACE FUNCTION notify_rem_sync()
RETURNS trigger AS $$
BEGIN
  PERFORM net.http_post(
    url := 'https://your-project.supabase.co/functions/v1/sync-to-rem',
    -- pg_net expects a jsonb body, so build jsonb rather than casting to text
    body := jsonb_build_object(
      'type', TG_OP,
      'table', TG_TABLE_NAME,
      'record', row_to_json(NEW)
    ),
    headers := '{"Authorization": "Bearer your-service-key", "Content-Type": "application/json"}'::jsonb
  );
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER sync_notes_to_rem
AFTER INSERT OR UPDATE ON notes
FOR EACH ROW EXECUTE FUNCTION notify_rem_sync();
```

Step 4: Query Memories from Your App

Now your AI features can recall context from any synced table:

```typescript
import { RemClient } from "@remlabs/sdk";

const rem = new RemClient({ apiKey: process.env.REMLABS_API_KEY });

// Recall relevant memories for a user query
const results = await rem.recall({
  query: "What did the user say about their budget?",
  namespace: "supabase:conversations",
  limit: 5,
});

// Each result includes content, score, and metadata
results.forEach((m) => {
  console.log(m.content, m.score, m.metadata.supabase_id);
});
```
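A common next step is to fold recalled memories into an LLM prompt. Here is a minimal sketch; the buildContext helper, the score threshold, and the result shape are illustrative, not part of the SDK:

```typescript
// Illustrative helper: turn recalled memories into a context block
// for an LLM prompt. The shape mirrors the recall results above.
type Memory = { content: string; score: number };

function buildContext(memories: Memory[]): string {
  return memories
    .filter((m) => m.score > 0.5) // drop weak matches (threshold is arbitrary)
    .map((m, i) => `[${i + 1}] ${m.content}`)
    .join("\n");
}

// Example: two recalled memories, one below the threshold
const context = buildContext([
  { content: "Budget is $500/month", score: 0.91 },
  { content: "Prefers email contact", score: 0.32 },
]);
// context === "[1] Budget is $500/month"
```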

Real-Time Sync with Supabase Channels

For real-time use cases, subscribe to Supabase changes and push to REM on the fly:

```typescript
import { createClient } from "@supabase/supabase-js";
import { RemClient } from "@remlabs/sdk";

const supabase = createClient(SUPABASE_URL, SUPABASE_KEY);
const rem = new RemClient({ apiKey: REMLABS_API_KEY });

supabase
  .channel("notes-sync")
  .on(
    "postgres_changes",
    { event: "INSERT", schema: "public", table: "notes" },
    async (payload) => {
      await rem.remember({
        content: payload.new.body,
        namespace: "supabase:notes",
        tags: payload.new.tags || [],
        metadata: { supabase_id: payload.new.id },
      });
    }
  )
  .subscribe();
```

Namespace strategy: Use supabase:table_name as your namespace pattern. This lets you query across all Supabase data or scope to a single table when recalling memories.
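To query across several synced tables, one option is to recall from each table's namespace and merge the results by score. A sketch, assuming only the recall call shown earlier; mergeByScore is a hypothetical helper, not an SDK function:

```typescript
// Hypothetical helper: merge recall results from multiple namespaces,
// dedupe by the supabase_id stored in metadata, and keep the top hits.
type Result = {
  content: string;
  score: number;
  metadata: { supabase_id: string };
};

function mergeByScore(batches: Result[][], limit: number): Result[] {
  const seen = new Set<string>();
  return batches
    .flat()
    .sort((a, b) => b.score - a.score)
    .filter((r) => {
      if (seen.has(r.metadata.supabase_id)) return false;
      seen.add(r.metadata.supabase_id);
      return true;
    })
    .slice(0, limit);
}
```

In practice you would call rem.recall once per namespace (e.g. supabase:notes, supabase:conversations) and pass the result arrays to mergeByScore.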

Give your Supabase app an AI memory

Free tier. One SDK. Multi-signal retrieval on your existing data.

Get Started