Together AI

Together AI provides fast inference on open-source models with an OpenAI-compatible API.

Install

pip install synapsekit[openai]

Together AI exposes an OpenAI-compatible API, so this integration requires the openai package.

Usage

import asyncio

from synapsekit import LLMConfig
from synapsekit.llm.together import TogetherLLM

llm = TogetherLLM(LLMConfig(
    model="meta-llama/Llama-3.3-70B-Instruct-Turbo",
    api_key="...",
))

async def main():
    async for token in llm.stream("What is RAG?"):
        print(token, end="", flush=True)

asyncio.run(main())

Available models

Model                 Model ID
Llama 3.3 70B Turbo   meta-llama/Llama-3.3-70B-Instruct-Turbo
Mixtral 8x7B          mistralai/Mixtral-8x7B-Instruct-v0.1
Qwen 2.5 72B          Qwen/Qwen2.5-72B-Instruct-Turbo
DeepSeek V3           deepseek-ai/DeepSeek-V3

See the full list at together.ai/models.

Function calling

result = await llm.call_with_tools(messages, tools)
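Since the API is OpenAI-compatible, the messages and tools arguments presumably follow the OpenAI chat-completions format. A minimal sketch of what those structures look like; the get_weather tool and its schema are hypothetical, invented for illustration:

```python
# Hypothetical tool schema in the OpenAI function-calling format.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool, for illustration only
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]

# Chat messages in the OpenAI format.
messages = [
    {"role": "user", "content": "What's the weather in Paris?"},
]

# Inside an async context (assuming the llm object from the Usage section):
# result = await llm.call_with_tools(messages, tools)
```

The model decides whether to call the tool based on the schema's name and description, so keep those fields accurate.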

Custom base URL

llm = TogetherLLM(config, base_url="http://localhost:8000/v1")

Parameters

Parameter   Description
model       Together AI model ID
api_key     Your Together AI API key
base_url    Custom API base URL (default: https://api.together.xyz/v1)