Let Your AI Assistant Speak Coreflux
The Model Context Protocol (MCP) is an open standard that lets AI assistants — like Claude, GitHub Copilot, or any MCP-compatible client — interact directly with external tools and data sources. With the Coreflux MCP Server, your AI assistant gains real-time access to the entire Coreflux documentation — searching for specific LoT (Language of Things) syntax, looking up configuration patterns, and getting step-by-step guidance — without you ever leaving your conversation. Instead of switching tabs between your AI assistant and the docs website, you can simply ask and get answers grounded in the official Coreflux documentation.

When to Use This
- You want your AI assistant to answer Coreflux questions accurately using official documentation
- You need LoT syntax help while coding in an AI-powered editor (Cursor, VS Code with Copilot)
- You want to search the documentation through natural language without leaving your workflow
- You’re building with Coreflux and want your assistant to have up-to-date reference material
In This Page
- What is MCP? — Quick overview of the protocol
- Setup — Installation instructions for Cursor, Claude Desktop, Claude.ai, and VS Code
- Available Tools — Reference for the tools provided by the Coreflux MCP Server
- Best Practices — Tips for getting the most out of the integration
- Next Steps — Where to go from here
What is MCP?
MCP (Model Context Protocol) is a standard created by Anthropic that defines how AI assistants communicate with external services. It works like a plugin system: an MCP server exposes a set of tools, and an MCP client (your AI assistant) discovers and calls those tools during conversation. The Coreflux MCP Server exposes documentation search and AI-assisted Q&A as tools that any MCP-compatible AI assistant can use. When you ask your assistant “how do I create a time-based LoT action?”, it can call the Coreflux MCP to search the official docs and return a grounded, accurate answer.

| Component | Role |
|---|---|
| MCP Server | Coreflux’s hosted service that exposes documentation tools |
| MCP Client | Your AI assistant (Claude, Copilot, etc.) that calls those tools |
| Transport | HTTP connection between client and server |
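For illustration, tool discovery in MCP happens over JSON-RPC 2.0: the client sends a `tools/list` request, and the server replies with the tools it exposes. A sketch of that exchange (the response is trimmed — real responses also carry each tool’s input schema, and the descriptions below are paraphrased, not taken verbatim from the live server):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/list"
}
```

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      { "name": "consult_documentation", "description": "Search the Coreflux documentation" },
      { "name": "consult_internal_documentation", "description": "Ask the Coreflux documentation assistant" }
    ]
  }
}
```

You never write these messages by hand — your MCP client handles the exchange automatically when it connects.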
Setup
The Coreflux MCP Server is available as a hosted service. To connect your AI assistant, add the server URL to your client’s MCP configuration.

Coreflux MCP Server URL:

- Cursor
- Claude Desktop
- Claude.ai
- VS Code
Cursor supports MCP servers natively. Add the Coreflux MCP to your global configuration or to a specific project.
Open MCP Configuration
Navigate to your Cursor MCP configuration file. You can use the global configuration at
~/.cursor/mcp.json, or create a project-level .cursor/mcp.json in your workspace root.

Restart Cursor
Restart Cursor or reload the window for the MCP server to be detected. You should see the Coreflux tools become available in your AI assistant’s tool list.
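As a sketch, a minimal ~/.cursor/mcp.json for the hosted server could look like the following. The url value is a placeholder — replace it with the Coreflux MCP Server URL given above — and the "url" key assumes Cursor’s configuration format for remote (HTTP) MCP servers:

```json
{
  "mcpServers": {
    "coreflux": {
      "url": "https://<coreflux-mcp-server-url>"
    }
  }
}
```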
If you already have other MCP servers configured, simply add the "coreflux" entry alongside them inside the existing mcpServers object.

Available Tools
Once connected, the Coreflux MCP Server provides two documentation tools. Your AI assistant discovers these tools automatically and uses them when your questions relate to Coreflux.

Search Documentation
Tool name: consult_documentation
Performs semantic and keyword search across the full Coreflux documentation. Returns ranked results with titles, URLs, and content snippets from actual documentation pages.
| Parameter | Type | Required | Description |
|---|---|---|---|
| query | string | Yes | The search query. Be specific for better results |
| page_size | integer | No | Number of results to return (1–20). Default: 10 |
| response_format | string | No | Output format: markdown or json. Default: markdown |
Best for:
- Looking up specific syntax (e.g., “CALL PYTHON syntax”, “ON TOPIC trigger”)
- Finding code examples for a particular feature
- Browsing documentation pages on a topic
- Quick reference lookups
Example prompts:
- “Search the Coreflux docs for Modbus TCP route configuration”
- “Find examples of LoT actions that use Python integration”
- “Look up the syntax for data storage routes”
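Under the hood, a search runs as a JSON-RPC tools/call request. As an illustrative sketch (the id and argument values are made up for this example, not taken from a real session), a search for Modbus route configuration could look like:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "consult_documentation",
    "arguments": {
      "query": "Modbus TCP route configuration",
      "page_size": 5,
      "response_format": "markdown"
    }
  }
}
```

Your assistant issues this call for you; the sketch is only to show how the parameters in the table map onto an actual request.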
Ask Documentation Assistant
Tool name: consult_internal_documentation
An AI assistant trained on Coreflux documentation that provides synthesized, natural language answers. It understands context, compares approaches, and includes source references from the documentation.
| Parameter | Type | Required | Description |
|---|---|---|---|
| query | string | Yes | Your question about Coreflux, LoT, or related topics |
| conversation_history | string | No | Previous conversation in JSON format for multi-turn context |
Best for:
- Learning new concepts with explanations (e.g., “What is the difference between Actions and Models?”)
- Step-by-step guidance (e.g., “How do I create a time-based action?”)
- Comparing approaches and understanding best practices
- Getting architecture advice (e.g., “How should I structure a system to monitor 100 sensors?”)
Example prompts:
- “Ask the Coreflux docs: what are the best practices for error handling in LoT actions?”
- “How do I set up a PostgreSQL route to store sensor data?”
- “What’s the difference between LoT Rules and LoT Actions?”
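Because conversation_history is passed as a JSON string, a multi-turn follow-up call might look like the sketch below. The role/content structure of the history string is an assumption for illustration — check the tool’s input schema for the exact expected format:

```json
{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "tools/call",
  "params": {
    "name": "consult_internal_documentation",
    "arguments": {
      "query": "How do I add error handling to that action?",
      "conversation_history": "[{\"role\": \"user\", \"content\": \"How do I create a time-based action?\"}, {\"role\": \"assistant\", \"content\": \"...\"}]"
    }
  }
}
```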
Using Both Tools Together
The two tools complement each other. A good workflow is:

- Start with the assistant (consult_internal_documentation) to understand concepts and get guided explanations
- Follow up with search (consult_documentation) to find additional code examples or dive deeper into specific reference pages
Best Practices
Be Specific with Your Questions
The more specific your question, the better the results. Instead of asking “tell me about routes,” try “how do I configure a PostgreSQL data storage route with authentication.” Specific queries help the tools return more relevant documentation.
Ask for Source References
When you need to verify information, ask your assistant to include the documentation source links. The consult_internal_documentation tool includes source references in its responses, letting you navigate directly to the relevant docs page.

Use for Code Review
Before deploying LoT code, ask your assistant to validate the syntax against the documentation. For example: “Check the Coreflux docs — is this the correct syntax for a Modbus TCP route?” This ensures your code follows the latest documented patterns.
Leverage Conversation Context
The documentation assistant supports multi-turn conversations. Ask follow-up questions to drill deeper into a topic without repeating context — the assistant remembers what you discussed previously in the same conversation.

