Integrating Router
OpenAI Python
Using the OpenAI Python SDK, the following call
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_OPENAI_API_KEY",  # defaults to os.environ.get("OPENAI_API_KEY")
)

chat_completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {
            "role": "user",
            "content": "Say this is a test",
        }
    ],
)
becomes
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_UMAMIAI_API_KEY",  # defaults to os.environ.get("UMAMIAI_API_KEY")
    base_url="https://api.umamiai.xyz/router",
)

chat_completion = client.chat.completions.create(
    model="router",
    messages=[
        {
            "role": "user",
            "content": "Say this is a test",
        }
    ],
)
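Responses from the router follow the standard OpenAI completion schema, so the reply text is read the same way as before. As a minimal sketch, a hypothetical helper (ask_router is not part of either SDK) that wraps the call and returns just the text:

```python
# Hypothetical helper (not part of the OpenAI SDK or the router API):
# wraps a router call and returns only the reply text. The response
# follows the standard OpenAI schema, where the text lives at
# choices[0].message.content.
def ask_router(client, prompt, **extra_body):
    completion = client.chat.completions.create(
        model="router",
        messages=[{"role": "user", "content": prompt}],
        extra_body=extra_body or None,
    )
    return completion.choices[0].message.content
```

Called as ask_router(client, "Say this is a test"), this returns the routed model's reply as a plain string.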
Setting Parameters
To route across a subset of models, specify the list of allowed models in extra_body:
chat_completion = client.chat.completions.create(
    model="router",
    messages=[
        {
            "role": "user",
            "content": "Say this is a test",
        }
    ],
    extra_body={
        "models": ["gpt-3.5-turbo", "claude-2.1"]
    }
)
To set a maximum cost per request and a willingness to pay for a 10% improvement in model quality, set the parameters as follows:
client.chat.completions.create(
    model="router",
    messages=[
        {"role": "user", "content": "Hello world!"}
    ],
    extra_body={
        "models": ["gpt-3.5-turbo", "claude-2.1"],
        "max_cost": 0.02,
        "willingness_to_pay": 0.01
    },
)
Finally, to associate metadata (for instance about a user or region) with each request, set the extra field of extra_body like so:
client.chat.completions.create(
    model="router",
    messages=[
        {"role": "user", "content": "Write a small poem"},
    ],
    extra_body={
        "models": ["gpt-3.5-turbo", "claude-2.1"],
        "max_cost": 0.02,
        "willingness_to_pay": 0.01,
        "extra": {
            "ip": "123.123.123.123",
            "Timezone": "UTC+0",
            "Country": "US",
            "City": "New York",
        }
    },
)
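When several of these parameters are set on every request, it can help to assemble extra_body in one place. A small sketch of a hypothetical helper (build_extra_body is not part of the SDK) whose fields mirror the router parameters shown above:

```python
# Hypothetical convenience helper (not part of the SDK): assembles the
# extra_body dict used in the examples above, so per-user metadata and
# cost settings can be attached in one place.
def build_extra_body(models, max_cost=None, willingness_to_pay=None, extra=None):
    body = {"models": list(models)}
    if max_cost is not None:
        body["max_cost"] = max_cost
    if willingness_to_pay is not None:
        body["willingness_to_pay"] = willingness_to_pay
    if extra:
        body["extra"] = dict(extra)
    return body
```

The result is passed directly as extra_body=build_extra_body(...) in the create call.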
OpenAI Node
Using the OpenAI Node SDK, the following call
import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: 'YOUR_OPENAI_API_KEY', // defaults to process.env["OPENAI_API_KEY"]
});

async function main() {
  const chatCompletion = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo',
    messages: [{ role: 'user', content: 'Say this is a test' }],
  });
}

main();
becomes
import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: 'YOUR_UMAMIAI_API_KEY', // defaults to process.env["UMAMIAI_API_KEY"]
  baseURL: 'https://api.umamiai.xyz/router',
});

async function main() {
  const chatCompletion = await openai.chat.completions.create({
    model: 'router',
    messages: [{ role: 'user', content: 'Say this is a test' }],
  });
}

main();
OpenAI cURL
Usage
The following request
curl https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [
      {
        "role": "user",
        "content": "Hello world!"
      },
      {
        "role": "assistant",
        "content": "Hello! How can I assist you today?"
      }
    ]
  }'
becomes
curl https://api.umamiai.xyz/router/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_UMAMIAI_API_KEY" \
  -d '{
    "model": "router",
    "messages": [
      {
        "role": "user",
        "content": "Hello world!"
      },
      {
        "role": "assistant",
        "content": "Hello! How can I assist you today?"
      }
    ]
  }'
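For completeness, the same router request can be built from Python using only the standard library. This sketch constructs the request but does not send it, since sending requires a valid key; build_router_request is a hypothetical name, not part of any SDK:

```python
import json
import urllib.request

# Hypothetical helper: builds the same HTTP request as the cURL
# command above, using only the Python standard library. Call
# urllib.request.urlopen(req) with a valid key to actually send it.
def build_router_request(api_key, messages):
    payload = json.dumps({"model": "router", "messages": messages}).encode()
    return urllib.request.Request(
        "https://api.umamiai.xyz/router/chat/completions",
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )
```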
LangChain Python
Usage
The following code snippet
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

llm = ChatOpenAI(
    model_name="gpt-3.5-turbo",
    openai_api_key="YOUR_OPENAI_API_KEY"
)

messages = [HumanMessage(content="Say this is a test")]
llm.invoke(messages)
becomes
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

llm = ChatOpenAI(
    model_name="router",
    openai_api_base="https://api.umamiai.xyz/router",
    openai_api_key="YOUR_UMAMIAI_API_KEY"
)

messages = [HumanMessage(content="Say this is a test")]
llm.invoke(messages)
LangChain JS
Usage
The following code snippet
import { ChatOpenAI } from "langchain/chat_models/openai";
import { HumanMessage } from "langchain/schema";

async function main() {
  const chat = new ChatOpenAI({
    modelName: "gpt-3.5-turbo",
    openAIApiKey: "YOUR_OPENAI_API_KEY",
  });
  await chat.call([new HumanMessage("Say this is a test")]);
}

main();
becomes
import { ChatOpenAI } from "langchain/chat_models/openai";
import { HumanMessage } from "langchain/schema";

async function main() {
  const chat = new ChatOpenAI({
    modelName: "router",
    configuration: {
      baseURL: "https://api.umamiai.xyz/router",
    },
    openAIApiKey: "YOUR_UMAMIAI_API_KEY",
  });
  await chat.call([new HumanMessage("Say this is a test")]);
}

main();