Fallback
An alternative approach used when the primary method fails (e.g., if Ollama fails, the Claude API takes over).
Why it matters
Reliability requires backup plans. Fallback chains ensure your system stays operational even when individual components fail.
In practice
Our LLM routing: cache check (free) → FAQ match (free) → local Ollama (nearly free) → Claude API (paid, tracked). Each layer is a fallback for the one before it.
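The chain above can be sketched as an ordered list of handlers, each tried in turn until one returns an answer. This is a minimal illustration, not the actual routing code; the handler names (cache_check, faq_match, ask_ollama, ask_claude) and their signatures are assumptions for the sketch.

```python
# Minimal fallback-chain sketch. Handlers are tried cheapest-first;
# a layer that raises or returns None simply falls through to the next.
# All handler names and bodies here are illustrative stand-ins.

def cache_check(query, cache):
    """Free: reuse a previously stored answer, if any."""
    return cache.get(query)

def faq_match(query, faqs):
    """Free: exact match against a static FAQ table."""
    return faqs.get(query)

def ask_ollama(query):
    """Nearly free: local model. Returns None here to simulate a miss."""
    return None  # stand-in for a real local-model call

def ask_claude(query):
    """Paid, tracked: the last resort in the chain."""
    return f"[paid answer to: {query}]"  # stand-in for a real API call

def route(query, cache, faqs):
    handlers = [
        lambda q: cache_check(q, cache),
        lambda q: faq_match(q, faqs),
        ask_ollama,
        ask_claude,
    ]
    for handler in handlers:
        try:
            answer = handler(query)
        except Exception:
            answer = None  # a failing layer just falls through
        if answer is not None:
            cache[query] = answer  # warm the cache for next time
            return answer
    raise RuntimeError("all fallbacks exhausted")
```

Usage: `route("hi", {}, {"hi": "hello"})` is answered by the free FAQ layer; only a query that misses every free layer reaches the paid one. Writing the winning answer back into the cache is what makes repeated queries free.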
Related terms
Ollama
A tool for running AI models locally. Free, private, fast.
Cache (LLM Cache)
Storing previous AI responses for reuse. Saves costs and speeds up repeated queries.
Self-Healing
A system's ability to detect and fix errors without human intervention.
n8n-First Principle
A design philosophy: every repeatable task is a workflow, not an AI call. AI is the last resort.