MemoryRouter

Open WebUI

Connect Open WebUI to MemoryRouter for persistent, semantic memory across every conversation and every model. Upload your existing conversation history once, and every new session builds on it.

Two steps: backfill your history with the upload function, then connect MemoryRouter as a provider so new conversations are stored automatically.


Quick Start

1. Get your API key

Sign up free at app.memoryrouter.ai and copy your memory key (mk_xxx).

2. Add MemoryRouter as a provider

In Open WebUI:

  1. Go to Settings → Connections
  2. Add a new OpenAI-compatible connection:
    • URL: https://api.memoryrouter.ai/v1
    • API Key: your mk_xxx key
  3. Save — MemoryRouter models now appear in your model selector

From this point on, every conversation through MemoryRouter is automatically stored. Your AI remembers what you talked about yesterday, last week, and last month.
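Because the connection is OpenAI-compatible, the same endpoint can be reached from any HTTP client, not just Open WebUI. A stdlib-only sketch of the request Open WebUI sends on your behalf (the model name is a placeholder for whatever appears in your selector):

```python
import json
import urllib.request

MEMORYROUTER_URL = "https://api.memoryrouter.ai/v1/chat/completions"
API_KEY = "mk_xxx"  # your MemoryRouter memory key

payload = {
    "model": "gpt-4o",  # placeholder: use any model your providers expose
    "messages": [{"role": "user", "content": "What did we discuss yesterday?"}],
}

request = urllib.request.Request(
    MEMORYROUTER_URL,
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)

# urllib.request.urlopen(request) would send it; the proxy stores the
# exchange in your vault and forwards the call to your provider.
```

Any OpenAI-compatible SDK works the same way: point its base URL at https://api.memoryrouter.ai/v1 and pass your mk_xxx key.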

3. Upload your history (one-time)

Install the MemoryRouter Upload function to backfill all your existing conversations:

  1. Go to Admin → Functions → Add
  2. Paste the function from GitHub or install from the Open WebUI community
  3. Open the function's Valves (settings) and enter your mk_xxx key
  4. Click the Upload History to MemoryRouter button on any message

That's it. Your AI now has full context of everything you've ever discussed.


Memory Mode via Key Suffix

Control memory behavior directly from your API key — no headers or code needed. Just append a suffix:

Connection Key   Retrieve   Store   Use Case
mk_xxx           ✓          ✓       Full memory (default)
mk_xxx:read      ✓          ✗       Query past memories, don't store new ones
mk_xxx:write     ✗          ✓       Store conversations without injecting memories
mk_xxx:off       ✗          ✗       Pure proxy — no memory at all

Pro tip: Create two connections in Open WebUI — one with mk_xxx for full memory, one with mk_xxx:read for when you want context from past conversations without adding to your vault. Switch between them from the model dropdown.

The suffix is stripped before authentication — same key, different behavior.
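The suffix table above maps each key variant to a retrieve/store pair. A minimal client-side sketch of that mapping (parse_memory_key is a hypothetical helper for illustration, not part of any MemoryRouter SDK):

```python
def parse_memory_key(key: str) -> tuple[str, bool, bool]:
    """Split an mk_ key into (base_key, retrieve, store).

    Mirrors the suffix table: the proxy strips the suffix before
    authenticating, so the base key is the same in every mode.
    """
    base, _, suffix = key.partition(":")
    modes = {
        "": (True, True),        # full memory (default)
        "read": (True, False),   # query past memories, don't store new ones
        "write": (False, True),  # store without injecting memories
        "off": (False, False),   # pure proxy
    }
    retrieve, store = modes[suffix]
    return base, retrieve, store
```

For example, parse_memory_key("mk_xxx:read") yields ("mk_xxx", True, False): the proxy authenticates with mk_xxx, retrieves memories, and stores nothing new.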


How It Works

┌──────────────┐         ┌──────────────────┐         ┌──────────────┐
│  Open WebUI  │ ──────► │  MemoryRouter    │ ──────► │  AI Provider │
│  (your UI)   │         │  (proxy + vault) │         │  (any model) │
└──────────────┘         └──────────────────┘         └──────────────┘
                          Stores every conversation
                          automatically

New conversations: When you use MemoryRouter as your provider, every message flows through the proxy. The proxy stores the conversation in your vault and injects relevant memories into the context window — automatically.

History backfill: The upload function reads your Open WebUI chat database, extracts all user and assistant messages, and uploads them to your vault in batches. This is a one-time operation — after that, the proxy handles everything.
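The injection step can be pictured as prepending retrieved memories to the message list before the request reaches the model. This is an illustrative sketch of the idea, not the proxy's actual injection format:

```python
def inject_memories(messages: list[dict], memories: list[str]) -> list[dict]:
    """Prepend retrieved memories as a system message.

    Illustrative only: shows how memory context can be added to an
    OpenAI-style message list without touching the user's messages.
    """
    if not memories:
        return messages
    context = "Relevant memories:\n" + "\n".join(f"- {m}" for m in memories)
    return [{"role": "system", "content": context}] + messages
```

With no memories retrieved, the message list passes through unchanged, which is effectively what the :off and :write key modes do.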


Upload Function Details

The upload function handles the full complexity of Open WebUI's chat format:

Feature                Detail
Message extraction     Only user and assistant roles (skips system, tool calls)
Large messages         Auto-chunked at ~4,000 characters with natural boundary splitting
Batch uploads          Groups messages into batches (max 100 items or 2MB per batch)
Duplicate protection   Tracks upload status — safe to click multiple times
Progress updates       Real-time status bar during processing and upload
Tree walking           Follows Open WebUI's branching message tree (active branch only)
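The chunking and batching rules above can be sketched in a few lines. This is a simplified illustration of the limits described in the table, not the function's actual code:

```python
def chunk_message(text: str, limit: int = 4000) -> list[str]:
    """Split a long message at natural boundaries (newline, then space),
    keeping each chunk at or under `limit` characters."""
    chunks = []
    while len(text) > limit:
        cut = text.rfind("\n", 0, limit)
        if cut <= 0:
            cut = text.rfind(" ", 0, limit)
        if cut <= 0:
            cut = limit  # no natural boundary: hard split
        chunks.append(text[:cut])
        text = text[cut:].lstrip()
    if text:
        chunks.append(text)
    return chunks


def batch_items(items: list[str], max_items: int = 100,
                max_bytes: int = 2_000_000) -> list[list[str]]:
    """Group items into batches of at most `max_items` entries or
    `max_bytes` of UTF-8 payload, whichever limit is hit first."""
    batches, current, size = [], [], 0
    for item in items:
        item_size = len(item.encode())
        if current and (len(current) >= max_items or size + item_size > max_bytes):
            batches.append(current)
            current, size = [], 0
        current.append(item)
        size += item_size
    if current:
        batches.append(current)
    return batches
```

Splitting at a newline or space keeps sentences intact, and the dual item/byte cap keeps each upload request within the API's batch limits.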

Valves (Settings)

Setting                 Description
memoryrouter_api_key    Your mk_xxx key from app.memoryrouter.ai
memoryrouter_endpoint   API endpoint (default: https://api.memoryrouter.ai)
history_uploaded        Auto-set after upload. Reset to false to re-upload everything.

FAQ

Do I need to upload again after new conversations?

No. Once MemoryRouter is set as your provider, new conversations are stored automatically through the proxy. The upload function is only for backfilling history that happened before you connected.

Does it work with any model?

Yes. MemoryRouter is a proxy — it works with OpenAI, Anthropic, Google, Mistral, Groq, and any other provider. You use your own API keys. MemoryRouter adds the memory layer on top.

What about private/sensitive conversations?

Your data is stored in your personal vault and is never shared. MemoryRouter doesn't train on your data or expose it to other users.

Can I use my own API keys?

Yes. MemoryRouter proxies requests to your chosen provider using your own API keys. We don't provide model access — we provide the memory layer.

I clicked upload twice — will it duplicate?

No. After the first successful upload, the function marks itself as complete. Clicking again shows "Already uploaded" and exits. To force a re-upload (e.g., after a memory reset), set history_uploaded to false in the function's Valves.

How much does it cost?

MemoryRouter is free for up to 50 million tokens of stored memory. The upload function itself is free and open source.
