# Node SDK (floopy-sdk)

`floopy-sdk` is the official Node/TypeScript client for the Floopy gateway. It wraps the upstream `openai` package by composition (no fork, no reimplementation) and adds first-class typed methods for every Floopy-only endpoint.
- npm: `floopy-sdk`
- source: github.com/floopy/floopy-sdk-node
- minimum Node: `>=20`
## Install

```sh
pnpm add floopy-sdk
# or: npm i floopy-sdk / bun add floopy-sdk
```
## Quick start

```ts
import { Floopy } from "floopy-sdk";

const floopy = new Floopy({
  apiKey: process.env.FLOOPY_API_KEY!,
});

const response = await floopy.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello from Floopy!" }],
});

console.log(response.choices[0]?.message?.content);
```

`floopy.chat`, `floopy.embeddings`, and `floopy.models` delegate to a lazy `openai` client preconfigured to talk to `https://api.floopy.ai/v1`. Types match the upstream `openai` package one-for-one.
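The wrap-by-composition design can be sketched in a few lines. This is an illustration of the pattern, not the SDK's actual source; `UpstreamClient` and `FloopySketch` are stand-in names:

```ts
// Stand-in for the upstream `openai` package's client.
class UpstreamClient {
  constructor(readonly opts: { apiKey: string; baseURL: string }) {}
  chat = {
    completions: {
      create: async (_req: { model: string }) => ({ id: "stub" }),
    },
  };
}

// Wrap-by-composition: the inner client is built lazily on first use,
// preconfigured with the gateway base URL, and its surface is re-exposed.
class FloopySketch {
  private inner?: UpstreamClient;

  constructor(private readonly opts: { apiKey: string; baseURL?: string }) {}

  private client(): UpstreamClient {
    this.inner ??= new UpstreamClient({
      apiKey: this.opts.apiKey,
      baseURL: this.opts.baseURL ?? "https://api.floopy.ai/v1",
    });
    return this.inner;
  }

  // Delegated surface: same types and behavior as the upstream client.
  get chat() {
    return this.client().chat;
  }
}
```

Because the delegated resources return the upstream objects unchanged, upstream types flow through without re-declaration.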
## Floopy options

Pass typed gateway behavior toggles through `options`. They map to `Floopy-*` headers and apply to every request the client makes:
```ts
const floopy = new Floopy({
  apiKey: process.env.FLOOPY_API_KEY!,
  options: {
    cache: { enabled: true, bucketMaxSize: 3 },
    promptId: "cd4249d5-44d5-46c8-8961-9eb3861e1f7e",
    promptVersion: "1",
    llmSecurityEnabled: true,
  },
});
```

| Option | Header | Purpose |
|---|---|---|
| `cache.enabled` | `Floopy-Cache-Enabled` | Toggle exact + semantic cache |
| `cache.bucketMaxSize` | `Floopy-Cache-Bucket-Max-Size` | Max entries per semantic bucket |
| `promptId` | `Floopy-Prompt-Id` | Stored prompt to resolve |
| `promptVersion` | `Floopy-Prompt-Version` | Pinned version for `promptId` |
| `llmSecurityEnabled` | `Floopy-Llm-Security-Enabled` | LLM firewall pre-check |
Per-call overrides go through the second argument of every Floopy resource method.
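Conceptually, the options object compiles down to a header map, with per-call values winning over client-level defaults. A sketch under that assumption (`toFloopyHeaders` and `mergeHeaders` are illustrative names, not SDK exports):

```ts
// Illustrative mapping from typed options to Floopy-* request headers.
interface FloopyOptions {
  cache?: { enabled?: boolean; bucketMaxSize?: number };
  promptId?: string;
  promptVersion?: string;
  llmSecurityEnabled?: boolean;
}

function toFloopyHeaders(opts: FloopyOptions): Record<string, string> {
  const h: Record<string, string> = {};
  if (opts.cache?.enabled !== undefined)
    h["Floopy-Cache-Enabled"] = String(opts.cache.enabled);
  if (opts.cache?.bucketMaxSize !== undefined)
    h["Floopy-Cache-Bucket-Max-Size"] = String(opts.cache.bucketMaxSize);
  if (opts.promptId) h["Floopy-Prompt-Id"] = opts.promptId;
  if (opts.promptVersion) h["Floopy-Prompt-Version"] = opts.promptVersion;
  if (opts.llmSecurityEnabled !== undefined)
    h["Floopy-Llm-Security-Enabled"] = String(opts.llmSecurityEnabled);
  return h;
}

// Per-call options override the client-level defaults key by key.
function mergeHeaders(
  clientOpts: FloopyOptions,
  callOpts: FloopyOptions,
): Record<string, string> {
  return { ...toFloopyHeaders(clientOpts), ...toFloopyHeaders(callOpts) };
}
```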
## Floopy-only resources

Each resource maps to a public `/v1/*` endpoint. Errors throw `FloopyError` subclasses (see below).
### feedback

```ts
const r = await floopy.chat.completions.create({ /* ... */ });
await floopy.feedback.submit({ score: 9, useful: true, sessionId: r.id });
```
### decisions

```ts
const decision = await floopy.decisions.get(requestId);
const page = await floopy.decisions.list({ from, to, limit: 50 });

for await (const d of floopy.decisions.iterate({ from })) { /* one decision */ }
for await (const p of floopy.decisions.pages({ from })) { /* one page */ }
```
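`iterate` behaves like `pages` flattened to single decisions. A minimal sketch of that relationship, using a hypothetical cursor-paged `fetchPage` in place of the real endpoint:

```ts
// Sketch: cursor pagination, and flattening pages into a per-item iterator.
interface Page<T> {
  items: T[];
  nextCursor?: string;
}

async function* pages<T>(
  fetchPage: (cursor?: string) => Promise<Page<T>>,
): AsyncGenerator<Page<T>> {
  let cursor: string | undefined;
  do {
    const page = await fetchPage(cursor);
    yield page;
    cursor = page.nextCursor;
  } while (cursor);
}

// iterate = pages, flattened: yield each item of each page in order.
async function* iterate<T>(
  fetchPage: (cursor?: string) => Promise<Page<T>>,
): AsyncGenerator<T> {
  for await (const page of pages(fetchPage)) yield* page.items;
}
```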
### experiments

```ts
const exp = await floopy.experiments.create({
  name: "cost-vs-quality",
  variantARoutingRuleId: ruleA,
  variantBRoutingRuleId: ruleB,
});
const results = await floopy.experiments.results(exp.id);
await floopy.experiments.rollback(exp.id);
```

`create` and `rollback` automatically include the `X-Floopy-Confirm: experiments` header the gateway requires.
### constraints

```ts
const current = await floopy.constraints.get();
await floopy.constraints.put({ costLimitMonthlyUsd: 100 });
```

`put` is full-replace: omitted fields are reset to `null`.
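Because omitted fields are nulled, a partial update should read first and PUT the merged object back. A sketch of that read-modify-write pattern (the in-memory `makeStore` stands in for the real endpoint, and `requestLimitDaily` is a hypothetical second field):

```ts
// Sketch: safe partial update against a full-replace PUT endpoint.
interface Constraints {
  costLimitMonthlyUsd: number | null;
  requestLimitDaily: number | null; // hypothetical second field
}

function makeStore(initial: Constraints) {
  let state = initial;
  return {
    get: async () => state,
    // Full replace: anything omitted is reset to null, like the gateway does.
    put: async (next: Partial<Constraints>) => {
      state = {
        costLimitMonthlyUsd: next.costLimitMonthlyUsd ?? null,
        requestLimitDaily: next.requestLimitDaily ?? null,
      };
      return state;
    },
  };
}

// Read-modify-write: merge the patch into the current value before PUT,
// so unrelated fields survive the full replace.
async function patchConstraints(
  store: ReturnType<typeof makeStore>,
  patch: Partial<Constraints>,
) {
  const current = await store.get();
  return store.put({ ...current, ...patch });
}
```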
### export

```ts
for await (const row of floopy.export.decisions({ from, to })) {
  // streamed JSONL, parsed and typed
}

const { rows, trailer } = floopy.export.decisionsWithTrailer({ from, to });
for await (const r of rows) { /* ... */ }
console.log(trailer.value); // populated after iteration completes
```
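The rows-plus-trailer shape can be pictured as follows. This is a standalone sketch, not the SDK's parser; in particular the `trailer: true` discriminator is an assumption about the wire format:

```ts
// Sketch: iterate JSONL rows, capturing a final trailer record.
function withTrailer<T, U>(lines: AsyncIterable<string> | Iterable<string>) {
  const trailer: { value?: U } = {};

  async function* rows(): AsyncGenerator<T> {
    for await (const line of lines as AsyncIterable<string>) {
      if (!line.trim()) continue;
      const parsed = JSON.parse(line);
      if (parsed.trailer === true) {
        // Assumed marker; only set once iteration reaches the last line.
        trailer.value = parsed as U;
      } else {
        yield parsed as T;
      }
    }
  }

  // trailer.value stays undefined until the rows iterator is exhausted.
  return { rows: rows(), trailer };
}
```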
### evaluations

```ts
const run = await floopy.evaluations.create({ datasetId, model: "gpt-4o" });
const status = await floopy.evaluations.get(run.id);
const page = await floopy.evaluations.results(run.id, { limit: 100 });
await floopy.evaluations.cancel(run.id);
```
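A common pattern on top of `create` and `get` is polling until the run reaches a terminal state. A sketch with assumed status names and an injected `getStatus` fetcher (neither is part of the documented SDK surface):

```ts
// Sketch: poll an evaluation run until it reaches a terminal status.
type RunStatus = "queued" | "running" | "succeeded" | "failed";

async function waitForRun(
  getStatus: () => Promise<RunStatus>,
  { intervalMs = 2000, timeoutMs = 10 * 60_000 } = {},
  sleep: (ms: number) => Promise<void> = (ms) => new Promise((r) => setTimeout(r, ms)),
): Promise<RunStatus> {
  const deadline = Date.now() + timeoutMs;
  while (true) {
    const status = await getStatus();
    if (status === "succeeded" || status === "failed") return status;
    if (Date.now() > deadline) throw new Error("evaluation run timed out");
    await sleep(intervalMs);
  }
}
```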
### routing.explain

```ts
const explain = await floopy.routing.explain({ model, messages });
console.log(explain.wouldSelect, explain.firewallDecision);
```

Pro plan only. `wouldSelect` is `null` when the firewall would block the request.
## Streaming

Chat streaming is delegated to openai-node and returns an `AsyncIterable<ChatCompletionChunk>`:

```ts
const stream = await floopy.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "stream a haiku" }],
  stream: true,
});
for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```

`floopy.export.decisions` is a true async iterator over the gateway's JSONL stream and respects the trailer record.
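If you want the complete text as well as the live stream, accumulate the deltas while printing. A sketch over any chunk iterable (the `DeltaChunk` interface mirrors only the relevant slice of `ChatCompletionChunk`):

```ts
// Sketch: accumulate streamed delta content into the full completion text.
interface DeltaChunk {
  choices: { delta?: { content?: string } }[];
}

async function collectStream(
  stream: AsyncIterable<DeltaChunk>,
  onDelta: (text: string) => void = () => {},
): Promise<string> {
  let full = "";
  for await (const chunk of stream) {
    const piece = chunk.choices[0]?.delta?.content ?? "";
    if (piece) {
      onDelta(piece); // e.g. process.stdout.write(piece)
      full += piece;
    }
  }
  return full;
}
```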
## Error handling

```ts
import {
  FloopyAuthError,
  FloopyPlanError,
  FloopyRateLimitError,
  FloopyValidationError,
  FloopyServerError,
} from "floopy-sdk";

try {
  await floopy.export.decisions({ from, to });
} catch (err) {
  if (err instanceof FloopyRateLimitError) {
    await sleep((err.retryAfterSeconds ?? 1) * 1000); // sleep: any promise-based delay helper
  } else if (err instanceof FloopyPlanError) {
    console.error(`Upgrade plan: feature ${err.feature} not in current plan`);
  } else {
    throw err;
  }
}
```

`chat.completions` and `embeddings` throw `OpenAI.APIError` subclasses from the upstream SDK.
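The rate-limit branch above generalizes into a small retry helper. A sketch of the idea; `RateLimitish` is a local stand-in that mimics `FloopyRateLimitError`'s `retryAfterSeconds` field, not the SDK class:

```ts
// Stand-in for FloopyRateLimitError, so the sketch runs standalone.
class RateLimitish extends Error {
  constructor(readonly retryAfterSeconds?: number) {
    super("rate limited");
  }
}

// Retry a call on rate-limit errors, honoring the server's retry-after hint
// when present and falling back to exponential backoff otherwise.
async function withRetries<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  sleep: (ms: number) => Promise<void> = (ms) => new Promise((r) => setTimeout(r, ms)),
): Promise<T> {
  for (let attempt = 1; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (!(err instanceof RateLimitish) || attempt >= maxAttempts) throw err;
      const seconds = err.retryAfterSeconds ?? 2 ** (attempt - 1);
      await sleep(seconds * 1000);
    }
  }
}
```

Non-rate-limit errors are rethrown immediately, matching the `else throw err` pattern above.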
## Self-hosting / custom base URL

```ts
const floopy = new Floopy({
  apiKey: process.env.FLOOPY_API_KEY!,
  baseURL: "https://gateway.internal.acme.com/v1",
});
```
## Roadmap

The SDK is on 0.x while the surface stabilizes. Coming up:

- React hooks helper (`floopy-sdk/react`).
- Edge / Deno entry points.
- Generated types from the gateway's Rust models (today the types are hand-written and kept in sync via tests).