The LLM context window is temporary. Everything in it disappears when the conversation ends. The memory system is permanent: documents written to memory survive indefinitely and are retrievable via search from any session. This means the agent must be proactive about writing. Before answering a question about prior work, the agent should search memory. Before ending a task that produced useful information, the agent should write a summary.
The agent is instructed to call memory_search before answering questions about past work, prior decisions, or previously stored information. If you feel the agent has forgotten something, try asking it to search memory explicitly.
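The search-before-answer workflow can be sketched as follows. This is a minimal illustration, not the real tool: `memory_search`, the `MEMORY` dict, and `answer_about_past_work` are hypothetical stand-ins, and the toy search is plain substring matching rather than the system's actual retrieval.

```python
# Hypothetical in-memory workspace standing in for the real memory system.
MEMORY = {
    "context/vision.md": "Goal: ship the ironclaw beta by Q2.",
    "daily/2024-01-15.md": "Decided to use Postgres for the task queue.",
}

def memory_search(query: str) -> list[tuple[str, str]]:
    """Toy search: return (path, text) pairs whose text mentions the query."""
    q = query.lower()
    return [(p, t) for p, t in MEMORY.items() if q in t.lower()]

def answer_about_past_work(question: str) -> str:
    # Per the workflow above: search memory before answering a question
    # about prior work, rather than relying on the context window.
    for term in question.lower().split():
        hits = memory_search(term.strip("?.,!"))
        if hits:
            path, text = hits[0]
            return f"From {path}: {text}"
    return "No record found in memory."
```

For example, `answer_about_past_work("What did we decide about postgres?")` surfaces the stored decision from `daily/2024-01-15.md` instead of guessing.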

Workspace Structure

The workspace uses a filesystem-like path hierarchy. Documents live at paths you define:
| Example Path | Use Case |
| --- | --- |
| context/vision.md | Project goals and direction |
| context/architecture.md | System design decisions |
| daily/2024-01-15.md | Daily notes and logs |
| daily/standup.md | Latest standup draft (overwritten daily) |
| projects/ironclaw/notes.md | Project-specific notes |
| inbox/task-20240115.md | Incoming tasks to process |
| processed/task-20240115.md | Processed and archived tasks |
| ops/incidents/2024-01-15.md | Incident records |
| AGENTS.md | Agent behavior instructions |
| SOUL.md | Agent values and personality |
Paths are arbitrary strings. Use whatever structure makes sense for your workflow. The memory_tree tool shows all paths organized as a directory tree.
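Since paths are just strings, a tree view can be derived from them directly. The sketch below shows one way memory_tree might nest flat paths into directories; `build_tree` is an illustrative helper, not the actual tool.

```python
def build_tree(paths: list[str]) -> dict:
    """Nest flat 'a/b/c.md' path strings into a dict-of-dicts directory tree."""
    tree: dict = {}
    for path in paths:
        node = tree
        for part in path.split("/"):
            node = node.setdefault(part, {})
    return tree

paths = [
    "context/vision.md",
    "context/architecture.md",
    "daily/2024-01-15.md",
    "AGENTS.md",
]
# build_tree(paths) groups files under their directories:
# {"context": {"vision.md": {}, "architecture.md": {}},
#  "daily": {"2024-01-15.md": {}}, "AGENTS.md": {}}
```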

Four Memory Tools

| Tool | Description |
| --- | --- |
| memory_search | Hybrid FTS + vector search. Call this before answering questions about prior work. Returns ranked results. |
| memory_write | Write a document to a path. Creates or overwrites. Supports structured content (markdown, JSON, plain text). |
| memory_read | Read a specific document by exact path. |
| memory_tree | List all paths in the workspace as a tree. Use for discovery and navigation. |
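The four tools' contracts can be sketched as a toy workspace. The `Workspace` class below is an assumption for illustration: the real memory_search is hybrid FTS + vector search, while this sketch substitutes plain substring matching, and memory_tree here returns a flat sorted listing rather than a rendered tree.

```python
class Workspace:
    """Toy stand-in for the memory workspace; real tools are server-side."""

    def __init__(self) -> None:
        self.docs: dict[str, str] = {}

    def memory_write(self, path: str, content: str) -> None:
        self.docs[path] = content          # creates or overwrites

    def memory_read(self, path: str) -> str:
        return self.docs[path]             # exact-path lookup

    def memory_search(self, query: str) -> list[str]:
        q = query.lower()                  # substring match, not hybrid search
        return [p for p, t in self.docs.items() if q in t.lower()]

    def memory_tree(self) -> list[str]:
        return sorted(self.docs)           # flat listing of all paths

ws = Workspace()
ws.memory_write("daily/standup.md", "Blocked on the ironclaw migration.")
ws.memory_write("daily/standup.md", "Migration unblocked; shipping today.")
```

Note the overwrite semantics: writing twice to daily/standup.md keeps only the latest draft, which is exactly the intended behavior for stable paths like the standup example in the table above.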

You can configure memory to be persisted as a vector store, which enables fast semantic search and retrieval. This is ideal for larger workspaces or when you want the agent to have quick access to a large amount of information.
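To make the semantic-search idea concrete, here is a minimal sketch of vector-backed retrieval. It is not the system's implementation: real deployments use learned embeddings, while this example substitutes bag-of-words count vectors and cosine similarity so it stays self-contained; `embed`, `vector_search`, and the sample documents are all hypothetical.

```python
from collections import Counter
import math

def embed(text: str) -> Counter:
    """Toy 'embedding': a bag-of-words count vector."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) \
         * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

docs = {
    "context/architecture.md": "queue worker postgres schema design",
    "daily/2024-01-15.md": "standup notes blocked migration",
}
vectors = {path: embed(text) for path, text in docs.items()}

def vector_search(query: str) -> str:
    """Return the path whose vector is most similar to the query's."""
    qv = embed(query)
    return max(vectors, key=lambda p: cosine(qv, vectors[p]))
```

The payoff is ranking by similarity rather than exact keyword hits, which is what makes retrieval fast and forgiving over a large workspace.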