The Bridge Between AI and Your Coreflux Environment
Modern AI assistants are powerful — but they don’t natively understand LoT (Language of Things) syntax, Coreflux broker configuration, or the nuances of industrial IoT architecture. Without access to the official documentation, an AI assistant will guess, often producing code that looks plausible but uses invented syntax. The Model Context Protocol (MCP) solves this. MCP is an open standard that lets AI assistants — like Claude, GitHub Copilot, or any MCP-compatible client — call external tools during a conversation. The Coreflux MCP Server exposes the entire Coreflux documentation as a set of tools your assistant can query in real time. When you ask “how do I create a time-based LoT Action?”, your assistant doesn’t guess — it looks up the answer in the official docs and responds with verified syntax and working examples. The result: you stay in your editor, describe what you want in plain English, and get accurate LoT code grounded in real documentation.

When to Use This
- You want your AI assistant to answer Coreflux questions accurately using official documentation
- You need LoT syntax help while coding in an AI-powered editor (Cursor, VS Code with Copilot)
- You want to search the documentation through natural language without leaving your workflow
- You’re building with Coreflux and want your assistant to have up-to-date reference material
In This Page
- How It Works — The architecture connecting your AI to Coreflux
- Setup — Installation instructions for Cursor, Claude Desktop, Claude.ai, and VS Code
- Available Tools — Reference for the tools provided by the Coreflux MCP Server
- Verify Your Connection — Test that everything works
- Best Practices — Tips for getting the most out of the integration
- Next Steps — Where to go from here
How It Works
MCP (Model Context Protocol) is a standard created by Anthropic that defines how AI assistants communicate with external services. It works like a plugin system: an MCP server exposes a set of tools, and an MCP client (your AI assistant) discovers and calls those tools during conversation. Here is what happens when you ask your AI assistant a Coreflux question:

| Step | What Happens |
|---|---|
| 1. You ask | You type a question in natural language — “Create a LoT Action that monitors temperature sensors” |
| 2. AI recognizes the domain | Your assistant detects this is a Coreflux question and decides to consult the MCP tools |
| 3. MCP tool call | The assistant calls the Coreflux MCP Server — searching the documentation or asking the docs assistant |
| 4. Documentation responds | The MCP server returns relevant documentation snippets, syntax references, and code examples |
| 5. AI synthesizes | Your assistant combines the documentation with your specific requirements to produce accurate, grounded LoT code |
| 6. You review | You receive a response with correct syntax, proper patterns, and source references you can verify |
The integration involves three components:

| Component | Role |
|---|---|
| MCP Server | Coreflux’s hosted service that exposes documentation tools |
| MCP Client | Your AI assistant (Claude, Copilot, etc.) that calls those tools |
| Transport | HTTP connection between client and server |
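Concretely, step 3 of the flow above is a standard MCP `tools/call` request sent over the transport. A simplified sketch of the JSON-RPC message the client emits — the tool name and argument come from this page, the envelope from the MCP specification, and the query value is illustrative:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "consult_documentation",
    "arguments": { "query": "time-based LoT Action syntax" }
  }
}
```

Your assistant constructs and sends this automatically; you never write it by hand.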
Setup
The Coreflux MCP Server is available as a hosted service. To connect your AI assistant, add the server URL to your client’s MCP configuration.

Coreflux MCP Server URL:

Setup instructions are available for the following clients:

- Cursor
- Claude Desktop
- Claude.ai
- VS Code
Cursor supports MCP servers natively. Add the Coreflux MCP to your global configuration or to a specific project.
Open MCP Configuration
Navigate to your Cursor MCP configuration file. You can use the global configuration at ~/.cursor/mcp.json, or create a project-level .cursor/mcp.json in your workspace root.

Restart Cursor
Restart Cursor or reload the window for the MCP server to be detected. You should see the Coreflux tools become available in your AI assistant’s tool list.
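A minimal sketch of the configuration entry, assuming Cursor’s `mcpServers` format for remote servers; the URL below is a placeholder, so substitute the actual Coreflux MCP Server URL:

```json
{
  "mcpServers": {
    "coreflux": {
      "url": "https://<coreflux-mcp-server-url>"
    }
  }
}
```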
If you already have other MCP servers configured, simply add the "coreflux" entry alongside them inside the existing mcpServers object.

Available Tools
Once connected, the Coreflux MCP Server provides two documentation tools. Your AI assistant discovers these tools automatically and uses them when your questions relate to Coreflux.

Search Documentation
Tool name: consult_documentation
Performs semantic and keyword search across the full Coreflux documentation. Returns ranked results with titles, URLs, and content snippets from actual documentation pages.
| Parameter | Type | Required | Description |
|---|---|---|---|
| query | string | Yes | The search query. Be specific for better results |
| page_size | integer | No | Number of results to return (1–20). Default: 10 |
| response_format | string | No | Output format: markdown or json. Default: markdown |
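For example, a sketch of the arguments an assistant might pass for a targeted search (values are illustrative, parameter names from the table above):

```json
{
  "query": "Modbus TCP route configuration",
  "page_size": 5,
  "response_format": "markdown"
}
```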
Best for:

- Looking up specific syntax (e.g., “CALL PYTHON syntax”, “ON TOPIC trigger”)
- Finding code examples for a particular feature
- Browsing documentation pages on a topic
- Quick reference lookups
Example prompts:

- “Search the Coreflux docs for Modbus TCP route configuration”
- “Find examples of LoT actions that use Python integration”
- “Look up the syntax for data storage routes”
Ask Documentation Assistant
Tool name: consult_internal_documentation
An AI assistant trained on Coreflux documentation that provides synthesized, natural language answers. It understands context, compares approaches, and includes source references from the documentation.
| Parameter | Type | Required | Description |
|---|---|---|---|
| query | string | Yes | Your question about Coreflux, LoT, or related topics |
| conversation_history | string | No | Previous conversation in JSON format for multi-turn context |
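For illustration, a sketch of a multi-turn call. The exact conversation_history schema is not specified on this page; the role/content shape below is an assumption, and the message contents are placeholders:

```json
{
  "query": "How do I add error handling to that action?",
  "conversation_history": "[{\"role\": \"user\", \"content\": \"(earlier question)\"}, {\"role\": \"assistant\", \"content\": \"(earlier answer)\"}]"
}
```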
Best for:

- Learning new concepts with explanations (e.g., “What is the difference between Actions and Models?”)
- Step-by-step guidance (e.g., “How do I create a time-based action?”)
- Comparing approaches and understanding best practices
- Getting architecture advice (e.g., “How should I structure a system to monitor 100 sensors?”)
Example prompts:

- “Ask the Coreflux docs: what are the best practices for error handling in LoT actions?”
- “How do I set up a PostgreSQL route to store sensor data?”
- “What’s the difference between LoT Rules and LoT Actions?”
Using Both Tools Together
The two tools complement each other. A good workflow is:

1. Start with the assistant (consult_internal_documentation) to understand concepts and get guided explanations
2. Follow up with search (consult_documentation) to find additional code examples or dive deeper into specific reference pages
Verify Your Connection
After setup, confirm the MCP is working by sending a test prompt to your AI assistant. The specific prompt doesn’t matter — what matters is that the assistant calls the Coreflux MCP tools rather than answering from memory.

Quick Test
Send this prompt to your AI assistant:

It’s working if your assistant calls one of the Coreflux tools (consult_documentation or consult_internal_documentation) during its response. In Cursor, you’ll see the tool calls in the assistant’s output. In Claude Desktop, look for the hammer icon indicating tool usage.

What a Connected Response Looks Like
When the MCP is active, your assistant’s response will:

- Reference specific LoT syntax from the documentation (not invented patterns)
- Include working code examples that match the official docs
- Cite source pages you can open to verify the information
- Use correct terminology — LoT, Actions, Models, Rules, Routes — exactly as defined in the documentation
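As an illustration, any documentation-grounded question works as the quick-test prompt; the wording below is ours, not an official prompt:

```text
Using the Coreflux documentation tools, what is the correct syntax
for a time-based LoT Action? Cite the docs page you used.
```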
Best Practices
Be Specific with Your Questions
The more specific your question, the better the results. Instead of asking “tell me about routes,” try “how do I configure a PostgreSQL data storage route with authentication.” Specific queries help the tools return more relevant documentation.
Ask for Source References
When you need to verify information, ask your assistant to include the documentation source links. The consult_internal_documentation tool includes source references in its responses, letting you navigate directly to the relevant docs page.

Use for Code Review
Before deploying LoT code, ask your assistant to validate the syntax against the documentation. For example: “Check the Coreflux docs — is this the correct syntax for a Modbus TCP route?” This ensures your code follows the latest documented patterns.
Leverage Conversation Context
The documentation assistant supports multi-turn conversations. Ask follow-up questions to drill deeper into a topic without repeating context — the assistant remembers what you discussed previously in the same conversation.

