Getting started with Not Diamond

Not Diamond is an AI model router that automatically determines which LLM is best suited to respond to any query. It improves output quality by combining multiple LLMs into a meta-model that learns when to call each one.

Installation

Python: Requires Python 3.10+. It's recommended that you create and activate a virtualenv before installing the package. For this example, we'll install the optional create dependencies, which you can learn more about here.

pip install notdiamond[create]
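
For example, on macOS or Linux you could create and activate a virtualenv and then install the package like this (the .venv directory name is just a convention, and quoting the extras avoids shell globbing issues in zsh):

python3 -m venv .venv
source .venv/bin/activate
pip install "notdiamond[create]"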

Setting up

Create a .env file with your Not Diamond API key and the API keys of the models you want to route between:

NOTDIAMOND_API_KEY="YOUR_NOTDIAMOND_API_KEY"
OPENAI_API_KEY="YOUR_OPENAI_API_KEY"
ANTHROPIC_API_KEY="YOUR_ANTHROPIC_API_KEY"
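
If you want to confirm these keys are visible to your process before routing, here is a minimal sketch using python-dotenv (a common library for loading .env files; it is not part of this quickstart, so install it with pip if needed):

import os
from dotenv import load_dotenv

# Load variables from the .env file in the current working directory
load_dotenv()

# Sanity-check that each key is set before calling Not Diamond
for key in ("NOTDIAMOND_API_KEY", "OPENAI_API_KEY", "ANTHROPIC_API_KEY"):
    print(key, "is set" if os.getenv(key) else "is MISSING")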

Sending your first Not Diamond API request

Create a new file in the same directory as your .env file, then copy in and run the code below:

from notdiamond import NotDiamond

# Define the Not Diamond routing client
client = NotDiamond()

# The best LLM is determined by Not Diamond based on the messages and specified models
result, session_id, provider = client.chat.completions.create(
   messages=[
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "Consiely explain merge sort."}  # Adjust as desired
   ],
   model=['openai/gpt-3.5-turbo', 'openai/gpt-4o', 'anthropic/claude-3-5-sonnet-20240620']
)

print("ND session ID: ", session_id)  # A unique ID of Not Diamond's recommendation
print("LLM called: ", provider.model)  # The LLM routed to
print("LLM output: ", result.content)  # The LLM response