How It Works

BaseLayer turns your AI conversations into a searchable knowledge graph through three stages: Capture, Dream, and Recall.

Capture

BaseLayer watches your AI conversations as they happen.
  • Chrome extension for web AI: chat with Claude, ChatGPT, Gemini, OpenRouter, or Open WebUI, and the extension captures each conversation automatically.
  • IDE watchers for coding tools: the desktop app monitors Cursor, Claude Code, Windsurf, GitHub Copilot, and Aider, picking up conversations automatically with no plugins needed. The desktop app is available on macOS (10.15+).
Captured conversations are sent securely to your memory in the cloud.

Dream

This is where raw conversations become structured knowledge. After a conversation is captured, BaseLayer’s dream engine runs in the background. It reads through what was discussed and extracts:
  • People: colleagues, contacts, anyone mentioned by name
  • Projects: codebases, products, initiatives you’re working on
  • Organizations: companies, teams, groups
  • Technologies: languages, frameworks, services, infrastructure
  • Concepts: ideas, patterns, decisions, strategies
  • Relationships: how all of the above connect to each other
BaseLayer supports 14 entity types in total: person, place, project, plan, organization, concept, event, media, technology, goal, document, belief, habit, and product. For example, if you told Claude: “I’m working with Sarah on the payments migration to Stripe”, BaseLayer would extract:
  • A person named Sarah
  • A project called payments migration
  • A technology reference to Stripe
  • Relationships linking them together
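Conceptually, the extraction result is a small graph of typed nodes and labeled edges. A minimal Python sketch of the example above (the field names and relation labels here are illustrative, not BaseLayer's actual schema):

```python
# Sketch of the knowledge extracted from "I'm working with Sarah on the
# payments migration to Stripe". Entity types are drawn from BaseLayer's
# 14 supported types; the dict fields and relation labels are assumptions.
entities = [
    {"name": "Sarah", "type": "person"},
    {"name": "payments migration", "type": "project"},
    {"name": "Stripe", "type": "technology"},
]

relationships = [
    ("Sarah", "works_on", "payments migration"),
    ("payments migration", "uses", "Stripe"),
]

# Print the graph edges.
for src, rel, dst in relationships:
    print(f"{src} --{rel}--> {dst}")
```

Every conversation the dream engine processes adds to or refines this graph, which is what later makes the Recall tools possible.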
The dream engine is event-driven and runs in the cloud. Processing speed depends on your tier: Pro users get near-realtime extraction (roughly 2 minutes), while Free tier conversations are processed every 2 hours.
“Dreaming” is our term for the background knowledge extraction process. It happens automatically; you don’t need to do anything.

Recall

Any MCP-connected AI can search your knowledge graph. When you connect BaseLayer to Claude, ChatGPT, or another AI tool via MCP (Model Context Protocol), that AI gains access to 11 tools:

Querying

  • ask_question: Get a synthesized prose answer from your memory, with citations

Searching

  • memory_search: Semantic search across entities and facts
  • get_entity: Fetch a full dossier on a specific entity
  • get_entity_relations: Explore connections and facts between entities
  • get_entity_provenance: Trace where a piece of knowledge came from
  • list_entities: Browse all entities of a given type
  • recent_conversations: Load recent conversation context for session orientation
  • retrieve_evidence: Search raw conversation history for exact quotes

Writing

  • record_memory: Persist a new observation or decision to your memory
  • vault_fact: Assert a specific fact as a graph edge between two entities
  • update_plan: Create or update a persistent plan that tracks multi-step efforts
The recommended flow: start with ask_question for synthesized answers, use memory_search when you need to browse or explore, and call get_entity for deep dives into specific people, projects, or concepts.
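Under the hood, MCP is JSON-RPC 2.0, and a connected AI invokes these tools via the protocol's standard tools/call method. As a sketch, here is what a call to ask_question might look like on the wire (tools/call is the real MCP method; the "question" argument name is an assumption about BaseLayer's tool schema):

```python
import json

# JSON-RPC 2.0 request an MCP client sends to invoke a server tool.
# "tools/call" is the standard MCP method name; the tool name comes from
# the tables above, but the "question" argument is a guessed parameter.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "ask_question",
        "arguments": {
            "question": "What did we decide about the payments migration?"
        },
    },
}

print(json.dumps(request, indent=2))
```

The connected AI handles this plumbing itself; you just ask questions in natural language and it chooses which tool to call.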

The result

Your knowledge compounds across every AI tool you use. A conversation in ChatGPT becomes context available in Claude. A decision made in Cursor is remembered in Gemini. A preference stated once is never forgotten. One memory. Every AI. All your knowledge.