# Python

## Overview
Floopy is a zero-SDK gateway. You do not need to install a Floopy-specific library. Just point the standard OpenAI Python SDK at https://api.floopy.ai/v1 and use your Floopy API key. All features — caching, rate limiting, fallbacks, observability — work automatically through headers.
## Installation

```shell
pip install openai
# or
poetry add openai
```

## Configuration
```python
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.floopy.ai/v1",
    api_key=os.environ["FLOOPY_API_KEY"],  # starts with fp_
)
```

Set `FLOOPY_API_KEY` in your environment. You can create one in the dashboard.
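For local development, one common approach is to export the key in your shell before starting your app (the value below is a placeholder, not a real key):

```shell
# Placeholder key — substitute the fp_-prefixed key from your Floopy dashboard.
export FLOOPY_API_KEY="fp_your_key_here"
```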
## Basic Request
```python
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "user", "content": "Explain quantum computing in one sentence."}
    ],
)

print(response.choices[0].message.content)
```

Switch providers by changing the model name — no other code changes needed:
```python
# Anthropic
response = client.chat.completions.create(
    model="claude-sonnet-4-20250514",
    messages=[{"role": "user", "content": "Hello!"}],
)
```

## Streaming
```python
stream = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Write a short poem."}],
    stream=True,
)

for chunk in stream:
    content = chunk.choices[0].delta.content
    if content:
        print(content, end="")
```

## Custom Headers
Pass Floopy-specific headers using `extra_headers`:

```python
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello"}],
    extra_headers={
        "Floopy-Cache": "semantic",
        "floopy-property-environment": "production",
        "floopy-property-feature": "chat",
        "floopy-fallback": "claude-sonnet-4-20250514",
    },
)
```

| Header | Description |
|---|---|
| `Floopy-Cache` | Cache strategy: `semantic` or `exact` |
| `floopy-property-*` | Attach custom metadata for filtering in the dashboard |
| `floopy-fallback` | Fallback model if the primary provider fails |
| `floopy-session-id` | Group related requests into a session |
| `floopy-user-id` | Associate requests with an end user |
See the Headers Reference for the full list.
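If every request should carry the same Floopy headers, the OpenAI SDK's standard `default_headers` client option avoids repeating `extra_headers` on each call. The header values below are illustrative:

```python
import os

from openai import OpenAI

# default_headers is a standard OpenAI SDK client option; per-request
# extra_headers are merged on top of these defaults.
client = OpenAI(
    base_url="https://api.floopy.ai/v1",
    # Placeholder fallback so the snippet runs without the env var set.
    api_key=os.environ.get("FLOOPY_API_KEY", "fp_placeholder"),
    default_headers={
        "floopy-property-environment": "production",
        "Floopy-Cache": "semantic",
    },
)
```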