# Google Gemini

## Overview
Google Gemini is a family of multimodal AI models built by Google DeepMind. Floopy automatically translates OpenAI-format requests to Google's `generateContent` API, so you can use Gemini models with the same code you use for any other provider.
## Supported Models

| Model | Context Window | Notes |
|---|---|---|
| gemini-2.5-pro | 1M | Most capable Gemini model |
| gemini-2.5-flash | 1M | Fast, high-quality |
| gemini-2.0-flash | 1M | Previous-generation flash |
| gemini-1.5-pro | 2M | Long-context specialist |
| gemini-1.5-flash | 1M | Legacy flash model |
## Setup

1. Go to Settings > Providers in the dashboard.
2. Click Add provider and select Google Gemini.
3. Paste your Google AI API key and click Save.
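The dashboard step above stores your Google AI key; requests to Floopy itself authenticate with a separate Floopy API key, which the examples below read from the `FLOOPY_API_KEY` environment variable. A minimal setup, assuming the placeholder is replaced with a real key from your Floopy dashboard:

```shell
# Export your Floopy API key (not the Google AI key) so the SDK and
# curl examples can read it. The value here is a placeholder.
export FLOOPY_API_KEY="YOUR_FLOOPY_API_KEY"
echo "$FLOOPY_API_KEY"
```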
## Usage

TypeScript:

```ts
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.floopy.ai/v1",
  apiKey: process.env.FLOOPY_API_KEY,
});

const response = await client.chat.completions.create({
  model: "gemini-2.5-flash",
  messages: [{ role: "user", content: "Explain quantum computing." }],
});
```

Python:

```python
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.floopy.ai/v1",
    api_key=os.environ["FLOOPY_API_KEY"],
)

response = client.chat.completions.create(
    model="gemini-2.5-flash",
    messages=[{"role": "user", "content": "Explain quantum computing."}],
)
```

curl:

```bash
curl https://api.floopy.ai/v1/chat/completions \
  -H "Authorization: Bearer $FLOOPY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "gemini-2.5-flash", "messages": [{"role": "user", "content": "Explain quantum computing."}]}'
```

## Provider-Specific Features
- Automatic format translation — Floopy converts the OpenAI chat completion format to Google's `generateContent` API and back. System instructions, function calling, and multi-turn conversations are all supported.
- Long context — Gemini models support context windows of up to 2M tokens. Floopy passes the full context through without truncation.
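Since function calling is translated along with the rest of the request, tools are declared in the standard OpenAI `tools` format. The sketch below builds (but does not send) a request body with a hypothetical `get_weather` tool; the tool name, schema, and prompt are illustrative, not part of Floopy's API:

```python
import json

# Hypothetical tool definition in the OpenAI "tools" format. Per the
# feature list above, Floopy translates this to Gemini's function
# declarations; the name and schema here are illustrative only.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

# Request body you would POST to /v1/chat/completions.
payload = {
    "model": "gemini-2.5-flash",
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": [weather_tool],
}

print(json.dumps(payload, indent=2))
```

The same `payload` can be passed to `client.chat.completions.create(**payload)` with the SDK setup shown in the Usage section.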
## Fallback
Route to OpenAI if Gemini is unavailable:
```bash
curl https://api.floopy.ai/v1/chat/completions \
  -H "Authorization: Bearer $FLOOPY_API_KEY" \
  -H "x-floopy-fallback-provider: openai" \
  -H "x-floopy-fallback-model: gpt-4o" \
  -H "Content-Type: application/json" \
  -d '{"model": "gemini-2.5-flash", "messages": [{"role": "user", "content": "Hello"}]}'
```
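The same fallback headers can be attached from code. A minimal stdlib sketch that builds (but does not send) the request, so the header names are easy to verify; the API key value is a placeholder:

```python
import json
import urllib.request

# Fallback routing headers from the curl example above.
headers = {
    "Authorization": "Bearer YOUR_FLOOPY_API_KEY",  # placeholder key
    "x-floopy-fallback-provider": "openai",
    "x-floopy-fallback-model": "gpt-4o",
    "Content-Type": "application/json",
}

body = json.dumps({
    "model": "gemini-2.5-flash",
    "messages": [{"role": "user", "content": "Hello"}],
}).encode("utf-8")

req = urllib.request.Request(
    "https://api.floopy.ai/v1/chat/completions",
    data=body,
    headers=headers,
    method="POST",
)

# Send with urllib.request.urlopen(req) once a real key is in place.
# Note that urllib capitalizes stored header names, so the fallback
# header is retrieved as "X-floopy-fallback-provider".
print(req.get_header("X-floopy-fallback-provider"))
```

If you use an OpenAI SDK instead, the same headers can typically be supplied through the client's default-headers or per-request extra-headers option.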