LLM-Ready API MCP Servers
Overview
The Kensho LLM-Ready API supports Model Context Protocol (MCP) integration, enabling seamless connectivity with leading AI platforms. Choose between our managed remote server for immediate deployment or a local implementation for enhanced control and customization.
Deployment Options
Remote MCP Server
Connect to Kensho's hosted MCP server for zero-configuration setup and automatic updates. Our remote server provides enterprise-grade reliability and performance across multiple AI platforms.
Supported Platforms:
- Claude — Full support for Claude Desktop and claude.ai
- Amazon Quick Suite — Full support
- Databricks — Full support for Databricks AI
- ChatGPT — Beta integration available
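As a rough illustration, remote MCP servers are commonly registered with a client by URL; stdio-only clients often bridge to them through a proxy such as `mcp-remote`. The server name, proxy choice, and URL below are placeholders, not Kensho's actual endpoint — consult your Kensho documentation for the real values.

```json
{
  "mcpServers": {
    "kensho-llm-ready-api": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://example.com/mcp"]
    }
  }
}
```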
Local MCP Server
For organizations requiring on-premises deployment or custom configurations, our local MCP server provides full control over your integration environment.
Get started using the Local MCP Server Setup Guide.
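For clients that launch local servers over stdio (such as Claude Desktop), a local MCP server entry typically specifies the command used to start the server process. The module name and environment variable below are hypothetical placeholders — the actual values are given in the Local MCP Server Setup Guide.

```json
{
  "mcpServers": {
    "kensho-local": {
      "command": "python",
      "args": ["-m", "kensho_mcp_server"],
      "env": { "KENSHO_API_KEY": "<your-api-key>" }
    }
  }
}
```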
Authentication
For secure access to the Kensho LLM-Ready API through MCP servers, configure your authentication credentials using one of our supported methods.
Get started using the Authentication Setup Guide.
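A common pattern for API-key authentication is to read the credential from an environment variable and attach it as a bearer token on each request. The sketch below uses only the Python standard library; the `KENSHO_API_KEY` variable name and the URL are illustrative assumptions, not the documented configuration — see the Authentication Setup Guide for the supported methods.

```python
import os
import urllib.request

# Hypothetical env var name for illustration; the real credential
# configuration is described in the Authentication Setup Guide.
API_KEY = os.environ.get("KENSHO_API_KEY", "demo-key")

def build_request(url: str) -> urllib.request.Request:
    """Build a request carrying the API key as a bearer token."""
    return urllib.request.Request(
        url,
        headers={"Authorization": f"Bearer {API_KEY}"},
    )

# Placeholder URL; no network call is made until the request is opened.
req = build_request("https://example.com/llm-ready-api/v1/query")
```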