AI promises to revolutionize incident response. But without real-time access to your observability stack, even the most advanced LLM can’t help you troubleshoot production issues.
Until recently, connecting an LLM to a new data source required significant engineering effort. Each connection was a custom, vendor-specific integration, creating a complex and brittle web of one-off solutions: every new AI model or external tool meant another bespoke connection to build and maintain. This approach slowed innovation, increased development costs, and created vendor lock-in, making it difficult for teams to adopt the best tools for the job.
The industry needed a universal standard to untangle this mess, and the Model Context Protocol (MCP), introduced by Anthropic in 2024, fits the bill.
Introducing the Chronosphere MCP Server: Your AI's gateway to guided observability
In alignment with the industry’s shift towards standardized, context-aware AI, Chronosphere is announcing the general availability of the Chronosphere MCP Server. This new component serves as a secure, open, and controlled bridge, connecting AI agents and large language models directly to the wealth of curated observability data within the Chronosphere platform.
General availability: The Chronosphere MCP Server is here
The Chronosphere MCP Server is now generally available to all Chronosphere customers. It is an open-source, read-only bridge that connects large language models (LLMs) and AI agents directly and securely to observability data within the Chronosphere platform. By implementing the MCP standard, the server provides a robust and universal interface for AI tools to query telemetry data, inspect platform configurations, and gain the situational awareness needed to accelerate troubleshooting and analysis.
The Chronosphere MCP Server is available in two flavors. Chronosphere hosts and runs a production instance of the MCP server that customers can use to connect their LLM applications to their own Chronosphere tenants. In addition, the Chronosphere MCP Server is fully open source and available on GitHub for customers to download and self-host.
Core principles: Secure, open, and aware
The design of the Chronosphere MCP Server is founded on three core principles:
- Secure by design: The server provides strictly read-only access to the Chronosphere platform. AI agents can query and analyze data, but they are architecturally prevented from performing mutating actions such as writing metrics, updating dashboards, or deleting monitors.
- Open and extensible: The Chronosphere MCP Server is fully open source, with its Go language-based source code available on GitHub. This transparency aligns with the open-standards ethos of the DevOps and SRE communities.
- Context aware: The MCP Server allows LLMs to reap the benefits of Chronosphere’s temporal knowledge graph. The knowledge graph builds context and relationships within the sea of observability telemetry to accelerate guided troubleshooting and enhance system knowledge. The MCP Server acts as the controlled gateway for AI to access dashboards, monitors, and Service Level Objectives (SLOs) powered by these relationships, ensuring agents operate on a foundation of reliable truth.
What can your AI agent do? A tour of the available tools
The Chronosphere MCP Server exposes a comprehensive set of read-only tools that allow AI agents to deeply explore your observability landscape. Here are a few of the things you can support with our MCP server:
- Query telemetry data: Execute PromQL queries, fetch logs by ID, query log ranges, list traces, and correlate change events with system behavior.
- Inspect platform configuration: Get and list dashboards, fetch monitor configurations, and review SLOs.
- Understand data shaping and control: List all data shaping rules (Drop, Mapping, Rollup, and Recording Rules) to give AI insight into how data is being optimized by the Control Plane.
Check out the full list of resources within Chronosphere that the MCP Server can access in the official documentation.
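To make this concrete, here is a rough sketch of what a tool invocation looks like on the wire. MCP clients speak JSON-RPC 2.0, so an agent running a PromQL query issues a tools/call request; the tool name and argument names below are illustrative placeholders, not the server's exact schema.
JSON
{
  "jsonrpc": "2.0",
  "id": 42,
  "method": "tools/call",
  "params": {
    "name": "prometheus_range_query",
    "arguments": {
      "query": "sum(rate(http_requests_total{service=\"checkout\"}[5m]))",
      "start": "2025-01-01T00:00:00Z",
      "end": "2025-01-01T01:00:00Z",
      "step": "60s"
    }
  }
}
The server replies with a standard JSON-RPC result containing the query output, which the LLM folds into its reasoning. In practice, an MCP-aware client such as Cursor or Claude Code constructs these messages for you; you never write them by hand.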
Getting started with the Chronosphere MCP Server
Adopting the Chronosphere MCP Server is a straightforward process. The central resource is its public GitHub repository, which contains the source code and detailed documentation.
- Primary Resource: https://github.com/chronosphereio/chronosphere-mcp
To integrate the server with an AI-enabled tool like Cursor, you only need to add a small JSON snippet to your configuration file defining the server's endpoint and your API token. Replace the placeholders below with your tenant's subdomain and a valid API token.
JSON
{
  "mcpServers": {
    "chronosphere": {
      "url": "https://<your-tenant>.chronosphere.io/api/mcp/mcp",
      "headers": {
        "Authorization": "Bearer <your-api-token>"
      }
    }
  }
}
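If you self-host the open-source server instead of using the Chronosphere-hosted instance, the client configuration has the same shape; only the URL changes to wherever your instance is listening. The address, port, and path below are placeholders rather than documented defaults, and whether the Authorization header is still required depends on how you configure authentication for your deployment; see the repository README for specifics.
JSON
{
  "mcpServers": {
    "chronosphere": {
      "url": "http://localhost:8080/mcp",
      "headers": {
        "Authorization": "Bearer <your-api-token>"
      }
    }
  }
}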
For comprehensive details on all available tools and advanced configuration, consult the official product documentation.
- Official Documentation: https://docs.chronosphere.io/integrate/mcp-server
Summary
The launch of the Chronosphere MCP Server marks a pivotal moment in the evolution of modern observability. As cloud native systems grow more complex, the gap between data and actionable insight widens. By embracing the open MCP standard, we are providing a secure, controlled, and open-source bridge that connects the power of AI directly to the curated, high-signal data within the Chronosphere platform. This isn’t just about adding another tool; it’s about fundamentally transforming how SRE, DevOps, and developer teams interact with their systems. We’re empowering engineers to move from reactive firefighting to proactive, AI-assisted problem-solving, laying the groundwork for a future of more intelligent and resilient operations.
Frequently Asked Questions
What exactly is the Model Context Protocol (MCP)?
The Model Context Protocol (MCP) is a universal standard for AI, much like USB-C is for hardware. It provides a common, open protocol that allows AI models to securely connect to external tools and data sources, giving them the real-time context they need to be effective.
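Under the hood, MCP is a JSON-RPC 2.0 protocol: a client connects to a server, asks what tools it offers with a tools/list request, and then calls them with tools/call. The response sketched below is illustrative; the Chronosphere server's actual tool catalog is documented in the repository.
JSON
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      {
        "name": "list_dashboards",
        "description": "List dashboards in the Chronosphere tenant",
        "inputSchema": { "type": "object", "properties": {} }
      }
    ]
  }
}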
Is the Chronosphere MCP Server secure?
Yes. The server is strictly read-only. It empowers AI agents to query data and read configurations but architecturally prevents them from making any changes to your production environment.
Is this a proprietary Chronosphere feature?
No. The Chronosphere MCP Server is fully open source, and the code is available in our public GitHub repository. We are committed to open, flexible standards that prevent vendor lock-in.
Which AI tools can I use with the server?
The server is designed to be compatible with any tool that supports the open MCP standard, including a growing ecosystem of popular AI-enabled developer tools like Claude Code, Cursor, Codex, and Gemini.
How does the MCP Server relate to the Chronosphere Control Plane?
They work together. The Control Plane refines your raw observability data to reduce noise and control costs. The MCP Server then provides a secure gateway for AI agents to access that high-quality, curated data, ensuring your AI operates on a foundation of ground truth.