Memory

CommandLane's memory system gives the AI Agent long-term recall across sessions. Important facts, preferences, and decisions are saved automatically so the agent can reference them later — without you having to repeat yourself.

How It Works

  1. Automatic extraction: During conversations, the agent identifies important information and saves it as a memory
  2. Core summary: At the start of each session, the agent loads a summary of your most important memories
  3. Context-aware recall: When you ask a question, the agent searches your memories alongside your knowledge base
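
The recall step above can be sketched as a merge of two result sets. This is a hypothetical illustration, not CommandLane's actual retrieval code; the naive word-overlap scoring is an assumption for clarity.

```python
# Hypothetical sketch of step 3: hits from the memory store and the
# knowledge base are scored and ranked together. Real retrieval would
# use embeddings or full-text search; word overlap stands in here.
def recall(query, memories, knowledge_base, top_k=3):
    def score(text):
        # Naive relevance: number of words shared with the query.
        return len(set(query.lower().split()) & set(text.lower().split()))

    hits = [(score(t), t) for t in memories + knowledge_base]
    return [t for s, t in sorted(hits, reverse=True) if s > 0][:top_k]
```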

Memory Categories

Memories are organized into four categories:

Category   | What it stores              | Example
-----------|-----------------------------|------------------------------------------------
Fact       | Objective information       | "Project deadline is March 15"
Preference | Your habits and preferences | "Prefers async communication"
Decision   | Strategic decisions         | "Using Rust for the backend"
General    | Everything else             | "Had a productive meeting with the design team"
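
A memory record that carries one of these categories might be modeled as below. The type and field names are assumptions for illustration, not CommandLane's actual schema.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical data model: one enum variant per memory category.
class Category(Enum):
    FACT = "fact"
    PREFERENCE = "preference"
    DECISION = "decision"
    GENERAL = "general"

@dataclass
class Memory:
    content: str
    category: Category
    importance: float  # 0.0 to 1.0, covered under "Importance Scoring"

deadline = Memory("Project deadline is March 15", Category.FACT, 0.8)
```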

Saving Memories

Automatic

The agent automatically detects and saves important facts during conversation. You don't need to do anything — it happens in the background.

Explicit

You can also ask the agent to remember something specific:

Remember that the API key rotates every 90 days

Save this: our team uses the #deployments channel for release announcements
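
Detecting an explicit request like the ones above could look like the following sketch. The trigger phrases and function name are assumptions; CommandLane's agent likely uses the language model itself rather than fixed prefixes.

```python
# Hypothetical sketch: recognize an explicit "remember" request and
# extract the content to save. Trigger phrases are assumptions.
TRIGGERS = ("remember that ", "save this: ")

def explicit_memory(message: str):
    lowered = message.lower()
    for trigger in TRIGGERS:
        if lowered.startswith(trigger):
            # Keep the original casing of the content itself.
            return message[len(trigger):].strip()
    return None  # not an explicit save request
```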

Importance Scoring

Each memory receives an importance score from 0.0 to 1.0. Higher-scored memories:

  • Surface more prominently in search results
  • Are more likely to appear in the core summary
  • Persist longer before natural decay
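
One way the first effect could work is to blend a memory's search relevance with its importance score, so higher-importance memories rank higher for equal relevance. The blending formula and weight are assumptions for illustration.

```python
# Hypothetical sketch: bias search ranking toward important memories.
def blended_score(relevance: float, importance: float, weight: float = 0.3) -> float:
    # Weighted mix: mostly relevance, nudged by importance.
    return (1 - weight) * relevance + weight * importance

memories = [
    {"text": "Project deadline is March 15", "importance": 0.9, "relevance": 0.6},
    {"text": "Had a productive meeting",     "importance": 0.3, "relevance": 0.7},
]
ranked = sorted(
    memories,
    key=lambda m: blended_score(m["relevance"], m["importance"]),
    reverse=True,
)
# The less relevant but more important memory wins: 0.69 vs 0.58.
```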

Memory Decay

Memories that aren't accessed gradually lose relevance over time. This prevents stale or outdated information from cluttering your memory. Important memories that you reference regularly stay fresh.
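
Decay of this kind is commonly modeled as an exponential falloff from the last access time. The half-life value below is an assumption, not CommandLane's actual parameter.

```python
# Hypothetical sketch: a memory's effective importance halves every
# `half_life_days` since it was last accessed.
SECONDS_PER_DAY = 86_400

def decayed_importance(importance: float, last_access_ts: float,
                       now_ts: float, half_life_days: float = 30.0) -> float:
    age_days = (now_ts - last_access_ts) / SECONDS_PER_DAY
    return importance * 0.5 ** (age_days / half_life_days)
```

Accessing a memory resets `last_access_ts`, which is why regularly referenced memories stay fresh.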

Window-Level Scoping

Memories are scoped to the app context you're working in. When you're in VS Code, the agent prioritizes memories related to your development work. When you're in a browser, it prioritizes web-related context. This keeps the agent's recall relevant to what you're doing right now.
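
This prioritization could be implemented as a score boost for memories tagged with the active app. The tag names and boost value below are assumptions for illustration.

```python
# Hypothetical sketch: boost memories whose app tag matches the
# window you're currently working in.
def scoped_ranking(memories, active_app, boost=0.2):
    return sorted(
        memories,
        key=lambda m: m["score"] + (boost if m.get("app") == active_app else 0.0),
        reverse=True,
    )

mems = [
    {"text": "Uses pytest for unit tests",  "app": "vscode",  "score": 0.5},
    {"text": "Prefers dark mode on docs",   "app": "browser", "score": 0.6},
]
```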

Privacy

All memories are stored locally in your SQLite database. They are never sent to external servers unless you interact with the AI Agent using a cloud provider (OpenAI or Anthropic), in which case memory context is included in the conversation to help the agent respond accurately.

If you use Ollama as your AI provider, everything stays entirely on your machine.
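
A local SQLite store like the one described above can be sketched with Python's standard library. The table layout is an assumption for illustration, not CommandLane's actual schema, and an in-memory database stands in for the local file.

```python
import sqlite3

# Hypothetical sketch of a local memory store. CommandLane would use a
# file-backed database; ":memory:" keeps this example self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE memories (
        id         INTEGER PRIMARY KEY,
        content    TEXT NOT NULL,
        category   TEXT NOT NULL,
        importance REAL NOT NULL
    )
""")
conn.execute(
    "INSERT INTO memories (content, category, importance) VALUES (?, ?, ?)",
    ("API key rotates every 90 days", "fact", 0.8),
)
rows = conn.execute(
    "SELECT content FROM memories WHERE importance >= 0.5 ORDER BY importance DESC"
).fetchall()
```

Because everything lives in a local database file, nothing leaves your machine unless a cloud provider is in the loop.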