LLM-ready API
Integrations

Integrations with LLMs

The LLM-ready API supports both code generation and function calling as data access patterns. In code generation, the LLM generates Python code that interacts with the API, which the LLM application or user must execute in a Python environment. In function calling, the LLM directly invokes API operations within the interaction. Code generation offers more flexibility and control, while function calling is simpler and limited to predefined capabilities. Which approach is best depends on your use case.

Using the examples provided in the Playground Notebooks, you can experiment with the LLM-ready API without integrating it with your own LLM or application.

Code Generation

An example of a single prompt-response pass with a model:

# Follow previous steps of usage guide...

# Ask your question
question = "FAANG companies annual revenues over the last 5 years?"

# Combine your question with the prompt into one string
input_prompt = f"{client.prompt}\nQUESTION: {question}"

from openai import OpenAI

openai_client = OpenAI(api_key="<Your OpenAI key>")  # replace with your own key
open_ai_model_response = openai_client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {
            "role": "user",
            "content": input_prompt,
        }
    ],
    temperature=0,
    seed=0,
).choices[0].message.content
print(open_ai_model_response)

Then execute the resulting code in a Python environment.
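Models typically return the generated code wrapped in a markdown fence. Below is a minimal sketch of extracting and running such code; `extract_python_code` is a hypothetical helper (not part of the library), and the stub response stands in for an actual model reply:

```python
import re


def extract_python_code(response_text: str) -> str:
    """Return the first fenced Python block from a model response,
    or the whole response if no fence is found (hypothetical helper)."""
    match = re.search(r"```(?:python)?\n(.*?)```", response_text, re.DOTALL)
    return match.group(1) if match else response_text


# Stub response standing in for open_ai_model_response
stub_response = "```python\nresult = 2 + 2\n```"
code = extract_python_code(stub_response)

namespace = {}
exec(code, namespace)  # run the generated code in an isolated namespace
print(namespace["result"])  # → 4
```

Since the code comes from a model, treat it as untrusted: review it or run it in a sandboxed environment rather than calling `exec` directly in production.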

When incorporated as a tool in an LLM application (with other tools orchestrated as a system), the LLM-ready API can drive richer and more complex end-user interactions.

Function Calling

We provide kfinance_client.tools as external tools that can be connected to models, enabling them to fetch data, take actions, and handle responses. To learn more, check out the function-calling documentation from OpenAI, Anthropic, and Google.

Below is an example of integrating the LLM-ready API via function calling with an OpenAI GPT model; the same pattern applies to Claude and Gemini models with their respective tool descriptions:

from openai import OpenAI

response = OpenAI(api_key="<Your OpenAI key>").chat.completions.create(
    model="<Your model of choice>",
    messages=<Your system prompt and other chat calls>,
    tools=kfinance_client.openai_tool_descriptions,
)
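When the model responds with a tool call rather than text, the application must execute the named tool and send the result back to the model. A minimal dispatch sketch is below; the stub tool, its name, and the price data are purely illustrative (in practice, the callables come from kfinance_client.tools):

```python
import json


# Stub tool standing in for a kfinance tool; name and data are illustrative.
def get_latest_price(ticker: str) -> float:
    prices = {"AAPL": 189.84}
    return prices[ticker]


tools = {"get_latest_price": get_latest_price}


def dispatch(name: str, arguments: str) -> str:
    """Execute one model-issued tool call and JSON-serialize the result
    so it can be returned to the model as a tool message."""
    result = tools[name](**json.loads(arguments))
    return json.dumps(result)


# A model's tool call arrives as a tool name plus a JSON argument string
print(dispatch("get_latest_price", '{"ticker": "AAPL"}'))  # → 189.84
```

The serialized result is appended to the conversation as a tool message, and the model is called again to produce its final answer.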

The LLM-ready API can augment various third-party applications with S&P Global's trusted and timely financial data.

ChatGPT

The LLM-ready API is available via the S&P Global GPT in ChatGPT. Simply log in to ChatGPT, search for "S&P Global" in the GPT store, and add it to your workspace.

To use the S&P Global GPT, navigate to it in the sidebar or type S&P Global into the chat bar. After you submit a question, you will be prompted to sign in to the LLM-ready API in your browser via the Okta authentication window. Log in using your LLM-ready API credentials.

After successful authentication, you will be redirected to your ChatGPT session, where the results of the LLM-ready API call will be displayed.

Like the LLM-ready API, the GPT is in Beta. We look forward to improving its performance and learning more about how it can be most useful to your workflows.