AWS Bedrock

Run Claude, Titan, Llama, and other models via AWS Bedrock.

Install

pip install synapsekit[bedrock]

AWS credentials must be configured (e.g. via ~/.aws/credentials, environment variables, or an IAM role).
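For local development, the quickest option is static credentials via environment variables. This is standard AWS credential-chain configuration, not anything specific to SynapseKit; the key values shown are placeholders:

    # Option 1: environment variables (picked up by the AWS credential chain)
    export AWS_ACCESS_KEY_ID="AKIA..."
    export AWS_SECRET_ACCESS_KEY="..."
    export AWS_DEFAULT_REGION="us-east-1"

    # Option 2: a shared credentials file at ~/.aws/credentials
    # [default]
    # aws_access_key_id = AKIA...
    # aws_secret_access_key = ...

In production, prefer an IAM role attached to the instance or container so no long-lived keys are stored.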

Via the RAG facade

from synapsekit import RAG

# Claude on Bedrock
rag = RAG(
    model="anthropic.claude-3-sonnet-20240229-v1:0",
    api_key="env",  # uses the AWS credential chain
    provider="bedrock",
)
rag.add("Your document text here")
answer = rag.ask_sync("Summarize the document.")

Direct usage

from synapsekit.llm.bedrock import BedrockLLM
from synapsekit.llm.base import LLMConfig

llm = BedrockLLM(
    LLMConfig(
        model="anthropic.claude-3-haiku-20240307-v1:0",
        api_key="env",
        provider="bedrock",
        temperature=0.3,
        max_tokens=1024,
    ),
    region="us-east-1",
)

import asyncio

async def main():
    async for token in llm.stream("What is SynapseKit?"):
        print(token, end="", flush=True)

asyncio.run(main())
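If you want the full response as a single string rather than printing tokens as they arrive, the stream can be gathered with a small helper. The sketch below is generic over any async iterator of text chunks; the stub generator merely stands in for `llm.stream`, which is the only streaming interface shown on this page:

    import asyncio
    from typing import AsyncIterator

    async def collect(stream: AsyncIterator[str]) -> str:
        """Join streamed text chunks into one string."""
        parts = []
        async for token in stream:
            parts.append(token)
        return "".join(parts)

    # Stub stand-in for llm.stream(...) so the helper runs standalone.
    async def fake_stream() -> AsyncIterator[str]:
        for token in ["Synapse", "Kit ", "is ", "a ", "RAG ", "toolkit."]:
            yield token

    answer = asyncio.run(collect(fake_stream()))
    print(answer)  # → SynapseKit is a RAG toolkit.

The same helper works unchanged with the real stream: `asyncio.run(collect(llm.stream("...")))`.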

Supported model families

Family              Example model ID
Anthropic Claude    anthropic.claude-3-sonnet-20240229-v1:0
Amazon Titan        amazon.titan-text-express-v1
Meta Llama          meta.llama2-13b-chat-v1

See AWS Bedrock docs for the full list.