Adds knowledge to your project’s memory. The configured LLM parses your natural language input into structured fields (category, subject, content) and stores it in Backboard.
```shell
tribal remember [OPTIONS] [TEXT]
```
| Argument | Description |
| --- | --- |
| `TEXT` | Knowledge to remember (or pipe via stdin) |
| Option | Description |
| --- | --- |
| `--json`, `-j` | Output result as JSON (for agent consumption) |
```shell
tribal remember "numpy 1.26 breaks on Python 3.13 — pin to <1.26"
tribal remember "if redis gives ECONNREFUSED, restart the container"
tribal remember "our staging DB is on port 5433, not 5432"
echo "use --legacy-peer-deps for React 18 installs" | tribal remember
tribal remember --json "webpack needs NODE_OPTIONS=--openssl-legacy-provider"
```
1. Your text is sent to the configured LLM (Anthropic, OpenAI, or Google)
2. The LLM parses it into structured fields: `category`, `subject`, and `content`
3. The structured entry is stored in Backboard as a vector embedding
4. The memory is now searchable via `tribal recall`
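As an illustration of the parsing step, the first example command might produce an entry like the one below. The field names (`category`, `subject`, `content`) come from this page; the values and the exact output shape depend on the LLM and are only illustrative.

```json
{
  "category": "dependencies",
  "subject": "numpy 1.26 on Python 3.13",
  "content": "numpy 1.26 breaks on Python 3.13 — pin to <1.26"
}
```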
- **Write in natural language** — the LLM handles structuring
- **Be specific** — include context about why something matters, not just what
- **Don't duplicate** — search with `tribal recall` first to check whether the knowledge already exists
- **Store one fact per call** — each `tribal remember` call creates one memory entry, so splitting facts across calls gives better search granularity
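Putting the last two tips together, a typical workflow might look like the sketch below. The plain-text query syntax for `tribal recall` is an assumption here; this page only names the command.

```shell
# Check whether the knowledge already exists (query syntax assumed)
tribal recall "staging DB port"

# Nothing relevant found — store it, one fact per call
tribal remember "our staging DB is on port 5433, not 5432"
tribal remember "staging DB credentials live in 1Password, not .env"
```

The two `tribal remember` calls store two separate facts (the second is a made-up example) rather than combining them into one entry, which keeps each memory individually searchable.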