
Let Your AI Assistant Speak Coreflux

The Model Context Protocol (MCP) is an open standard that lets AI assistants — like Claude, GitHub Copilot, or any MCP-compatible client — interact directly with external tools and data sources. With the Coreflux MCP Server, your AI assistant gains real-time access to the entire Coreflux documentation — searching for specific LoT (Language of Things) syntax, looking up configuration patterns, and getting step-by-step guidance — without you ever leaving your conversation. Instead of switching tabs between your AI assistant and the docs website, you can simply ask and get answers grounded in the official Coreflux documentation.
Like giving your AI assistant a library card for Coreflux. Instead of guessing about LoT syntax or broker configuration, your assistant can look it up directly in the official docs and give you an accurate, sourced answer.

When to Use This

  • You want your AI assistant to answer Coreflux questions accurately using official documentation
  • You need LoT syntax help while coding in an AI-powered editor (Cursor, VS Code with Copilot)
  • You want to search the documentation through natural language without leaving your workflow
  • You’re building with Coreflux and want your assistant to have up-to-date reference material

In This Page

  • What is MCP? — Quick overview of the protocol
  • Setup — Installation instructions for Cursor, Claude Desktop, Claude.ai, and VS Code
  • Available Tools — Reference for the tools provided by the Coreflux MCP Server
  • Best Practices — Tips for getting the most out of the integration
  • Next Steps — Where to go from here

What is MCP?

MCP (Model Context Protocol) is a standard created by Anthropic that defines how AI assistants communicate with external services. It works like a plugin system: an MCP server exposes a set of tools, and an MCP client (your AI assistant) discovers and calls those tools during conversation. The Coreflux MCP Server exposes documentation search and AI-assisted Q&A as tools that any MCP-compatible AI assistant can use. When you ask your assistant “how do I create a time-based LoT action?”, it can call the Coreflux MCP to search the official docs and return a grounded, accurate answer.
| Component | Role |
| --- | --- |
| MCP Server | Coreflux’s hosted service that exposes documentation tools |
| MCP Client | Your AI assistant (Claude, Copilot, etc.) that calls those tools |
| Transport | HTTP connection between client and server |
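Under the hood, MCP messages use JSON-RPC. As an illustrative sketch (your AI assistant constructs and sends these messages for you, so you never write them by hand), a tool call from client to server looks roughly like this:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "consult_documentation",
    "arguments": {
      "query": "time-based LoT action syntax"
    }
  }
}
```

The server replies with a result payload containing the tool’s output, which the assistant then weaves into its answer.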

Setup

The Coreflux MCP Server is available as a hosted service. To connect your AI assistant, add the server URL to your client’s MCP configuration. Coreflux MCP Server URL:
https://mcp-beta.coreflux.org/mcp
Cursor supports MCP servers natively. Add the Coreflux MCP to your global configuration or to a specific project.
1. Open MCP Configuration

Navigate to your Cursor MCP configuration file. You can use the global configuration at ~/.cursor/mcp.json, or create a project-level .cursor/mcp.json in your workspace root.

2. Add the Coreflux MCP Server

Add the following entry to the mcpServers object:

```json
{
  "mcpServers": {
    "coreflux": {
      "url": "https://mcp-beta.coreflux.org/mcp"
    }
  }
}
```

3. Restart Cursor

Restart Cursor or reload the window so the MCP server is detected. The Coreflux tools should then appear in your AI assistant’s tool list.

4. Verify the Connection

Open Cursor Settings and navigate to the MCP section. You should see coreflux listed as a connected server with its tools available.
If you already have other MCP servers configured, simply add the "coreflux" entry alongside them inside the existing mcpServers object.
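For clients that only support locally launched (stdio) MCP servers, one common approach is to bridge to the hosted URL with the mcp-remote package. As a sketch for Claude Desktop’s claude_desktop_config.json (assuming your installed version uses the standard mcpServers format; check your client’s documentation for current remote-server support):

```json
{
  "mcpServers": {
    "coreflux": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://mcp-beta.coreflux.org/mcp"]
    }
  }
}
```

Here npx launches a small local proxy that forwards MCP traffic to the hosted Coreflux server over HTTP.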

Available Tools

Once connected, the Coreflux MCP Server provides two documentation tools. Your AI assistant discovers these tools automatically and uses them when your questions relate to Coreflux.

Search Documentation

Tool name: consult_documentation

Performs semantic and keyword search across the full Coreflux documentation. Returns ranked results with titles, URLs, and content snippets from actual documentation pages.
| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| query | string | Yes | The search query. Be specific for better results |
| page_size | integer | No | Number of results to return (1–20). Default: 10 |
| response_format | string | No | Output format: markdown or json. Default: markdown |
Best for:
  • Looking up specific syntax (e.g., “CALL PYTHON syntax”, “ON TOPIC trigger”)
  • Finding code examples for a particular feature
  • Browsing documentation pages on a topic
  • Quick reference lookups
Example prompts:
  • “Search the Coreflux docs for Modbus TCP route configuration”
  • “Find examples of LoT actions that use Python integration”
  • “Look up the syntax for data storage routes”
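Putting the parameter table above together, the arguments object an assistant might pass to this tool could look like the following sketch (the query text is just an example):

```json
{
  "query": "Modbus TCP route configuration",
  "page_size": 5,
  "response_format": "markdown"
}
```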

Ask Documentation Assistant

Tool name: consult_internal_documentation

An AI assistant trained on Coreflux documentation that provides synthesized, natural language answers. It understands context, compares approaches, and includes source references from the documentation.
| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| query | string | Yes | Your question about Coreflux, LoT, or related topics |
| conversation_history | string | No | Previous conversation in JSON format for multi-turn context |
Best for:
  • Learning new concepts with explanations (e.g., “What is the difference between Actions and Models?”)
  • Step-by-step guidance (e.g., “How do I create a time-based action?”)
  • Comparing approaches and understanding best practices
  • Getting architecture advice (e.g., “How should I structure a system to monitor 100 sensors?”)
Example prompts:
  • “Ask the Coreflux docs: what are the best practices for error handling in LoT actions?”
  • “How do I set up a PostgreSQL route to store sensor data?”
  • “What’s the difference between LoT Rules and LoT Actions?”
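A sketch of the arguments for a follow-up question. The exact conversation_history schema isn’t documented here, so the role/content shape below is an assumption modeled on common chat-message formats:

```json
{
  "query": "And how do I add authentication to that route?",
  "conversation_history": "[{\"role\": \"user\", \"content\": \"How do I set up a PostgreSQL route to store sensor data?\"}, {\"role\": \"assistant\", \"content\": \"...\"}]"
}
```

In practice your AI assistant manages this context for you; the field matters mainly if you call the tool programmatically.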

Using Both Tools Together

The two tools complement each other. A good workflow is:
  1. Start with the assistant (consult_internal_documentation) to understand concepts and get guided explanations
  2. Follow up with search (consult_documentation) to find additional code examples or dive deeper into specific reference pages
Your AI assistant may use both tools automatically in a single response when it needs both conceptual understanding and specific code references to answer your question fully.

Best Practices

The more specific your question, the better the results. Instead of asking “tell me about routes,” try “how do I configure a PostgreSQL data storage route with authentication.” Specific queries help the tools return more relevant documentation.
When you need to verify information, ask your assistant to include the documentation source links. The consult_internal_documentation tool includes source references in its responses, letting you navigate directly to the relevant docs page.
Before deploying LoT code, ask your assistant to validate the syntax against the documentation. For example: “Check the Coreflux docs — is this the correct syntax for a Modbus TCP route?” This helps ensure your code follows the latest documented patterns.
The documentation assistant supports multi-turn conversations. Ask follow-up questions to drill deeper into a topic without repeating context — the assistant remembers what you discussed previously in the same conversation.
The Coreflux MCP Server is currently in beta. The server URL and available tools may change as the service evolves. Check this page for the latest configuration instructions.

Next Steps