Connect AI Assistants to Your Content Platform
Give Claude, Cursor, and other MCP clients direct access to brand voices, digital twins, content generation, and audience insights.
Three Interaction Modes
The server tells the LLM how to interact with the neuroflash API. You just ask your question, and the LLM picks the best approach automatically.
Traditional Mode
The LLM calls one tool per API endpoint directly. Best for simple, single-step operations like "list my brand voices" or "get workspace details".
46 tools
Code Mode BETA
The LLM writes Python code that orchestrates multiple API calls in a sandbox. It handles conditional logic, data transformation, and multi-step workflows autonomously, using roughly 98% fewer tokens than Traditional Mode.
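As a rough illustration, here is the kind of orchestration code an LLM might write in the Code Mode sandbox. The client interface, method names, and data fields are hypothetical, not the real neuroflash SDK; a stub client stands in for the live API so the sketch runs on its own.

```python
# Hypothetical sketch: multi-step orchestration in the Code Mode sandbox.
# `api.list_brand_voices` and the voice fields are illustrative assumptions.

def find_latest_active_voice(api, workspace_id):
    """Fetch brand voices, filter to active ones, return the newest."""
    voices = api.list_brand_voices(workspace_id)            # API call
    active = [v for v in voices if v["status"] == "active"] # conditional logic
    if not active:
        return None
    return max(active, key=lambda v: v["updated_at"])       # transformation

# Stub client so the sketch is runnable without the real API.
class StubAPI:
    def list_brand_voices(self, workspace_id):
        return [
            {"id": "bv-1", "status": "active",   "updated_at": "2024-05-01"},
            {"id": "bv-2", "status": "archived", "updated_at": "2024-06-01"},
            {"id": "bv-3", "status": "active",   "updated_at": "2024-06-15"},
        ]

print(find_latest_active_voice(StubAPI(), "ws-1")["id"])  # → bv-3
```

The point is that filtering, branching, and aggregation happen inside one sandbox run instead of round-tripping each intermediate result through the model.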
2 tools · ~1k tokens
Exploratory Mode BETA
The LLM progressively discovers the API surface — first browsing domains, then drilling into actions, then comparing data across resources. Ideal when the question is open-ended.
3 tools
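The three-step discovery flow can be sketched as follows. The tool names and the catalog contents are made up for illustration; the real API surface is whatever the server exposes.

```python
# Hypothetical sketch of the exploratory flow: browse, drill in, compare.
# CATALOG and the function names are illustrative assumptions.

CATALOG = {
    "brand_voices": {"list": "List brand voices", "get": "Get one voice"},
    "content": {"generate": "Generate a text", "rewrite": "Rewrite a text"},
}

def browse_domains():
    """Step 1: discover which domains the API exposes."""
    return sorted(CATALOG)

def browse_actions(domain):
    """Step 2: drill into the actions available in one domain."""
    return sorted(CATALOG[domain])

def compare(domains):
    """Step 3: line up data from several resources side by side."""
    return {d: browse_actions(d) for d in domains}

print(browse_domains())           # ['brand_voices', 'content']
print(browse_actions("content"))  # ['generate', 'rewrite']
```

Because each step narrows the search, the model only loads the slice of the API it actually needs, which is what makes this mode suited to open-ended questions.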