Fireworks AI

Fireworks AI provides optimized inference for open-source models with an OpenAI-compatible API.

Install

pip install "synapsekit[openai]"

Fireworks AI uses the OpenAI-compatible API, so it requires the openai package.

Usage

from synapsekit import LLMConfig
from synapsekit.llm.fireworks import FireworksLLM

llm = FireworksLLM(LLMConfig(
    model="accounts/fireworks/models/llama-v3p3-70b-instruct",
    api_key="...",
))

async for token in llm.stream("What is RAG?"):
    print(token, end="", flush=True)
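Note that `async for` only works inside a coroutine, so in a script you would drive the loop with asyncio. A minimal, self-contained sketch of the consumption pattern, with a stub async generator standing in for a real FireworksLLM (the stub and its tokens are purely illustrative):

```python
import asyncio

async def fake_stream(prompt: str):
    # Stand-in for llm.stream(): yields response tokens one at a time.
    for token in ["RAG ", "combines ", "retrieval ", "with ", "generation."]:
        yield token

async def main() -> str:
    chunks = []
    async for token in fake_stream("What is RAG?"):
        print(token, end="", flush=True)
        chunks.append(token)
    return "".join(chunks)

answer = asyncio.run(main())
```

Swapping `fake_stream("What is RAG?")` for `llm.stream("What is RAG?")` inside `main()` gives the real streaming loop.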

Available models

Model         | ID
Llama 3.3 70B | accounts/fireworks/models/llama-v3p3-70b-instruct
Mixtral 8x7B  | accounts/fireworks/models/mixtral-8x7b-instruct
Qwen 2.5 72B  | accounts/fireworks/models/qwen2p5-72b-instruct

See the full list at fireworks.ai/models.

Function calling

Pass a list of chat messages and tool definitions:

result = await llm.call_with_tools(messages, tools)
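The exact shape of `tools` is not shown above; since Fireworks exposes an OpenAI-compatible API, a reasonable assumption is that synapsekit accepts OpenAI-style tool definitions. A sketch under that assumption (the `get_weather` tool is hypothetical):

```python
# Assumed: OpenAI function-calling schema, since Fireworks is
# OpenAI-compatible. Verify against synapsekit's own docs.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool name
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string"},
                },
                "required": ["city"],
            },
        },
    }
]

messages = [{"role": "user", "content": "What's the weather in Paris?"}]
```

With these in place, `await llm.call_with_tools(messages, tools)` would return the model's tool-call decision, assuming synapsekit mirrors the OpenAI response shape.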

Custom base URL

The client can target any OpenAI-compatible endpoint, such as a locally hosted server:

llm = FireworksLLM(config, base_url="http://localhost:8000/v1")

Parameters

Parameter | Description
model     | Fireworks model ID
api_key   | Your Fireworks API key
base_url  | Custom API base URL (default: https://api.fireworks.ai/inference/v1)