
Environment Variables

Complete reference for all environment variables read by SynapseKit.

LLM Provider API Keys

| Variable | Provider / Tool | Example |
| --- | --- | --- |
| `OPENAI_API_KEY` | OpenAI | `sk-proj-...` |
| `ANTHROPIC_API_KEY` | Anthropic | `sk-ant-...` |
| `COHERE_API_KEY` | Cohere | ... |
| `MISTRAL_API_KEY` | Mistral | ... |
| `GROQ_API_KEY` | Groq | `gsk_...` |
| `DEEPSEEK_API_KEY` | DeepSeek | ... |
| `OPENROUTER_API_KEY` | OpenRouter | `sk-or-...` |
| `TOGETHER_API_KEY` | Together AI | ... |
| `FIREWORKS_API_KEY` | Fireworks AI | ... |
| `PERPLEXITY_API_KEY` | Perplexity | `pplx-...` |
| `CEREBRAS_API_KEY` | Cerebras | ... |
| `GOOGLE_API_KEY` | Gemini | `AIza...` |
| `TAVILY_API_KEY` | TavilySearchTool | `tvly-...` |
| `BRAVE_SEARCH_API_KEY` | BraveSearchTool | `BSA...` |
| `GITHUB_TOKEN` | GitHubAPITool | `ghp_...` |
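At startup it can be useful to see which of the keys above are actually set. This is an illustrative stdlib helper, not part of SynapseKit:

```python
import os

# Hypothetical helper: report which provider keys from the table above
# are present and non-empty in the environment.
PROVIDER_KEYS = [
    "OPENAI_API_KEY", "ANTHROPIC_API_KEY", "COHERE_API_KEY",
    "MISTRAL_API_KEY", "GROQ_API_KEY", "GOOGLE_API_KEY",
]

def configured_providers(env=None) -> list[str]:
    """Return the provider key names that are set and non-empty."""
    env = os.environ if env is None else env
    return [k for k in PROVIDER_KEYS if env.get(k)]

print(configured_providers({"OPENAI_API_KEY": "sk-proj-test"}))
# ['OPENAI_API_KEY']
```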

AWS Bedrock

| Variable | Description |
| --- | --- |
| `AWS_ACCESS_KEY_ID` | AWS access key |
| `AWS_SECRET_ACCESS_KEY` | AWS secret key |
| `AWS_DEFAULT_REGION` | AWS region (e.g. `us-east-1`) |
| `AWS_PROFILE` | Named profile from `~/.aws/credentials` |
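These variables follow the standard AWS SDK resolution order: explicit keys take precedence over a named profile, which takes precedence over the default credential chain. A sketch of that logic (the helper itself is illustrative, not a SynapseKit API):

```python
# Illustrative: mirror the usual AWS SDK credential resolution order.
def resolve_aws_auth(env: dict) -> dict:
    region = env.get("AWS_DEFAULT_REGION", "us-east-1")
    if env.get("AWS_ACCESS_KEY_ID") and env.get("AWS_SECRET_ACCESS_KEY"):
        return {"mode": "static-keys", "region": region}
    if env.get("AWS_PROFILE"):
        return {"mode": "profile:" + env["AWS_PROFILE"], "region": region}
    # Fall through to the SDK's default chain (instance role, SSO, etc.)
    return {"mode": "default-chain", "region": region}

print(resolve_aws_auth({"AWS_PROFILE": "dev"}))
# {'mode': 'profile:dev', 'region': 'us-east-1'}
```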

Azure OpenAI

| Variable | Description |
| --- | --- |
| `AZURE_OPENAI_API_KEY` | Azure OpenAI key |
| `AZURE_OPENAI_ENDPOINT` | Endpoint URL, e.g. `https://your-resource.openai.azure.com/` |
| `AZURE_OPENAI_API_VERSION` | API version (e.g. `2024-10-21`) |
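To see how these three variables fit together, here is how the standard Azure OpenAI request URL is assembled from them (the deployment name is illustrative; the URL shape is Azure's documented convention):

```python
# Build an Azure OpenAI chat-completions URL from the variables above.
def azure_chat_url(env: dict, deployment: str) -> str:
    endpoint = env["AZURE_OPENAI_ENDPOINT"].rstrip("/")
    version = env["AZURE_OPENAI_API_VERSION"]
    return (
        f"{endpoint}/openai/deployments/{deployment}"
        f"/chat/completions?api-version={version}"
    )

env = {
    "AZURE_OPENAI_ENDPOINT": "https://your-resource.openai.azure.com/",
    "AZURE_OPENAI_API_VERSION": "2024-10-21",
}
print(azure_chat_url(env, "gpt-4o-mini"))
```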

Backends

| Variable | Module | Example |
| --- | --- | --- |
| `REDIS_URL` | RedisConversationMemory, RedisCheckpointer, RedisLLMCache | `redis://localhost:6379/0` |
| `DATABASE_URL` | PostgresCheckpointer | `postgresql://user:pass@localhost/dbname` |
| `OLLAMA_BASE_URL` | OllamaLLM | `http://localhost:11434` (default) |
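A malformed scheme in one of these URLs is a common source of confusing startup errors. A quick stdlib sanity check (illustrative helper, not a SynapseKit API):

```python
from urllib.parse import urlparse

# Expected URL schemes for each backend variable in the table above.
EXPECTED_SCHEMES = {
    "REDIS_URL": ("redis", "rediss"),
    "DATABASE_URL": ("postgresql", "postgres"),
    "OLLAMA_BASE_URL": ("http", "https"),
}

def check_backend_url(name: str, value: str) -> None:
    scheme = urlparse(value).scheme
    if scheme not in EXPECTED_SCHEMES[name]:
        raise ValueError(f"{name}: unexpected scheme {scheme!r} in {value!r}")

check_backend_url("REDIS_URL", "redis://localhost:6379/0")  # ok
```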

Email Tool

| Variable | Description |
| --- | --- |
| `SMTP_HOST` | SMTP server hostname |
| `SMTP_PORT` | SMTP port (default: 587) |
| `SMTP_USER` | SMTP username |
| `SMTP_PASSWORD` | SMTP password |
| `SMTP_FROM` | Default from address |
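As a sketch of how these variables combine, including the documented default port of 587 (the helper is illustrative, not a SynapseKit API):

```python
# Collect the SMTP_* variables into one config dict, applying the
# documented default port of 587 (STARTTLS) when SMTP_PORT is unset.
def smtp_config(env: dict) -> dict:
    return {
        "host": env["SMTP_HOST"],
        "port": int(env.get("SMTP_PORT", "587")),
        "user": env.get("SMTP_USER"),
        "password": env.get("SMTP_PASSWORD"),
        "from_addr": env.get("SMTP_FROM"),
    }

print(smtp_config({"SMTP_HOST": "smtp.example.com"})["port"])
# 587
```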

Slack Tool

| Variable | Description |
| --- | --- |
| `SLACK_WEBHOOK_URL` | Incoming webhook URL |
| `SLACK_BOT_TOKEN` | Bot token (starts with `xoxb-`) |
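The two variables correspond to the two standard Slack auth styles: webhook URLs are posted to directly, while bot tokens go in an `Authorization: Bearer` header (Slack Web API convention). An illustrative selection helper:

```python
# Illustrative: choose between the two Slack auth styles above.
def slack_auth(env: dict) -> dict:
    token = env.get("SLACK_BOT_TOKEN")
    if token:
        if not token.startswith("xoxb-"):
            raise ValueError("SLACK_BOT_TOKEN should start with 'xoxb-'")
        return {"mode": "api", "header": f"Bearer {token}"}
    url = env.get("SLACK_WEBHOOK_URL")
    if url:
        return {"mode": "webhook", "url": url}
    raise EnvironmentError("Set SLACK_WEBHOOK_URL or SLACK_BOT_TOKEN")

print(slack_auth({"SLACK_BOT_TOKEN": "xoxb-123"})["mode"])
# api
```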

Jira Tool

| Variable | Description |
| --- | --- |
| `JIRA_URL` | Jira instance URL |
| `JIRA_USER` | Jira username / email |
| `JIRA_API_TOKEN` | Jira API token |
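Jira Cloud authenticates with HTTP basic auth built from the email and API token (standard Atlassian convention). A minimal sketch of the header construction:

```python
import base64

# Build the HTTP basic-auth header value from JIRA_USER and JIRA_API_TOKEN.
def jira_auth_header(user: str, token: str) -> str:
    raw = f"{user}:{token}".encode()
    return "Basic " + base64.b64encode(raw).decode()

print(jira_auth_header("me@example.com", "abc123"))
```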

PromptHub

| Variable | Default | Description |
| --- | --- | --- |
| `PROMPT_HUB_DIR` | `~/.synapsekit/prompts` | Override hub storage directory |
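The override logic amounts to: use `PROMPT_HUB_DIR` if set, otherwise fall back to the documented default with `~` expanded. An illustrative sketch:

```python
from pathlib import Path

# Resolve the prompt hub directory: PROMPT_HUB_DIR wins, otherwise the
# documented default of ~/.synapsekit/prompts.
def prompt_hub_dir(env: dict) -> Path:
    return Path(env.get("PROMPT_HUB_DIR", "~/.synapsekit/prompts")).expanduser()

print(prompt_hub_dir({"PROMPT_HUB_DIR": "/srv/prompts"}))
# /srv/prompts
```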

Managing API Keys

.env file (development)

```bash
# .env
OPENAI_API_KEY=sk-proj-...
ANTHROPIC_API_KEY=sk-ant-...
REDIS_URL=redis://localhost:6379/0
```

Load it before constructing any LLM:

```python
from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()

from synapsekit.llms.openai import OpenAILLM

llm = OpenAILLM(model="gpt-4o-mini")  # reads OPENAI_API_KEY automatically
```

Secrets manager (production)

Use a secrets manager so keys are never stored in files or in the environment at rest; fetch them at process startup instead.

AWS Secrets Manager:

```python
import os

import boto3

def get_secret(name: str) -> str:
    client = boto3.client("secretsmanager")
    return client.get_secret_value(SecretId=name)["SecretString"]

os.environ["OPENAI_API_KEY"] = get_secret("prod/openai-api-key")
os.environ["ANTHROPIC_API_KEY"] = get_secret("prod/anthropic-api-key")
```

GCP Secret Manager:

```python
import os

from google.cloud import secretmanager

def get_gcp_secret(project_id: str, secret_id: str) -> str:
    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/{project_id}/secrets/{secret_id}/versions/latest"
    response = client.access_secret_version(request={"name": name})
    return response.payload.data.decode("UTF-8")

os.environ["OPENAI_API_KEY"] = get_gcp_secret("my-project", "openai-api-key")
```

Azure Key Vault:

```python
import os

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

client = SecretClient(
    vault_url="https://my-vault.vault.azure.net/",
    credential=DefaultAzureCredential(),
)
os.environ["OPENAI_API_KEY"] = client.get_secret("openai-api-key").value
```

Passing keys explicitly (avoid — prefer env vars)

```python
from synapsekit.llms.openai import OpenAILLM

# Only do this for testing or when env vars are impossible.
# Hardcoding keys in source code is a security risk.
llm = OpenAILLM(model="gpt-4o-mini", api_key="sk-...")
```

Kubernetes Secrets

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: synapsekit-secrets
type: Opaque
stringData:
  OPENAI_API_KEY: "sk-proj-..."
  ANTHROPIC_API_KEY: "sk-ant-..."
---
apiVersion: apps/v1
kind: Deployment
spec:
  template:
    spec:
      containers:
        - name: app
          envFrom:
            - secretRef:
                name: synapsekit-secrets
```

Docker secrets (compose)

```yaml
services:
  app:
    environment:
      OPENAI_API_KEY_FILE: /run/secrets/openai_key
    secrets:
      - openai_key
secrets:
  openai_key:
    file: ./secrets/openai_key.txt
```
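Note that the `*_FILE` convention is not read automatically by most libraries: your entrypoint must load the secret file into the plain variable before the app starts. A minimal sketch of that loader (illustrative, not a SynapseKit API):

```python
import os

# Resolve every FOO_FILE variable into FOO by reading the referenced file.
def load_file_secrets(env=None) -> None:
    env = os.environ if env is None else env
    for name in list(env):
        if name.endswith("_FILE"):
            target = name[: -len("_FILE")]
            with open(env[name]) as f:
                env[target] = f.read().strip()

# Call load_file_secrets() at startup; afterwards OPENAI_API_KEY holds
# the contents of the file that OPENAI_API_KEY_FILE pointed at.
```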

Validation at startup

It is good practice to validate required variables early so your app fails fast with a clear message rather than at the first LLM call.

```python
from synapsekit.utils import require_env

# Raises EnvironmentError with a helpful message if the variable is missing
OPENAI_API_KEY = require_env("OPENAI_API_KEY")
REDIS_URL = require_env("REDIS_URL", default="redis://localhost:6379/0")
```

Or roll your own:

```python
import os

def require_env(name: str, default: str | None = None) -> str:
    value = os.environ.get(name, default)
    if value is None:
        raise EnvironmentError(
            f"Required environment variable '{name}' is not set. "
            f"See https://synapsekit.github.io/synapsekit-docs/reference/env-vars"
        )
    return value
```
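A variant that checks several variables at once can give a deployment one complete error message instead of failing on the first missing key. This helper is illustrative, not part of SynapseKit:

```python
import os

# Validate a batch of required variables and report every missing one.
def require_all(*names: str) -> dict:
    missing = [n for n in names if not os.environ.get(n)]
    if missing:
        raise EnvironmentError(
            "Missing required environment variables: " + ", ".join(missing)
        )
    return {n: os.environ[n] for n in names}

# Example (assuming both variables are set):
# settings = require_all("OPENAI_API_KEY", "REDIS_URL")
```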