
LangChain.js

Overview

Floopy is a zero-SDK gateway. The @langchain/openai package already supports custom base URLs, so you can route all LangChain.js requests through Floopy without any extra dependencies. You get caching, rate limiting, fallbacks, and observability for free.

Installation

npm install @langchain/openai

Configuration

import { ChatOpenAI } from "@langchain/openai";

const model = new ChatOpenAI({
  configuration: {
    baseURL: "https://api.floopy.ai/v1",
    apiKey: process.env.FLOOPY_API_KEY, // starts with fp_
  },
  modelName: "gpt-4o",
});

Set FLOOPY_API_KEY in your environment. You can create a key in the dashboard.
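For example, in a shell (the placeholder value is illustrative; real keys start with fp_ as noted above):

```shell
# Substitute your actual key from the dashboard
export FLOOPY_API_KEY="fp_..."
```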

Basic Request

const response = await model.invoke("Explain quantum computing in one sentence.");
console.log(response.content);

Switch providers by changing the model name:

const anthropicModel = new ChatOpenAI({
  configuration: {
    baseURL: "https://api.floopy.ai/v1",
    apiKey: process.env.FLOOPY_API_KEY,
  },
  modelName: "claude-sonnet-4-20250514",
});
const response = await anthropicModel.invoke("Hello!");
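Because every model routes through the same gateway, the shared options can be factored out. A minimal sketch with a hypothetical floopyOptions helper (the helper is illustrative, not part of Floopy or LangChain); spread its result into the ChatOpenAI constructor:

```typescript
// Hypothetical helper: builds the constructor options shared by every
// Floopy-routed model, so only the model name varies per call site.
function floopyOptions(modelName: string) {
  return {
    configuration: {
      baseURL: "https://api.floopy.ai/v1",
      apiKey: process.env.FLOOPY_API_KEY,
    },
    modelName,
  };
}

// Usage: new ChatOpenAI(floopyOptions("gpt-4o"))
//        new ChatOpenAI(floopyOptions("claude-sonnet-4-20250514"))
```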

Streaming

const stream = await model.stream("Write a short poem about AI.");
for await (const chunk of stream) {
  process.stdout.write(chunk.content as string);
}
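If you need the full text after the stream ends, the chunk contents can be accumulated instead of printed. A minimal sketch (collectStream is an illustrative helper, not a LangChain API) that assumes each chunk's content is a plain string, which holds for text-only responses:

```typescript
// Accumulates streamed chunk contents into the complete response text.
// Non-string content (e.g. multimodal parts) is skipped.
async function collectStream(
  stream: AsyncIterable<{ content: unknown }>
): Promise<string> {
  let text = "";
  for await (const chunk of stream) {
    if (typeof chunk.content === "string") {
      text += chunk.content;
    }
  }
  return text;
}

// Usage: const poem = await collectStream(await model.stream("Write a short poem about AI."));
```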

Custom Headers

Pass Floopy-specific headers using configuration.defaultHeaders:

const model = new ChatOpenAI({
  configuration: {
    baseURL: "https://api.floopy.ai/v1",
    apiKey: process.env.FLOOPY_API_KEY,
    defaultHeaders: {
      "Floopy-Cache": "semantic",
      "floopy-property-environment": "production",
      "floopy-property-feature": "chat",
      "floopy-fallback": "claude-sonnet-4-20250514",
    },
  },
  modelName: "gpt-4o",
});
| Header | Description |
| --- | --- |
| Floopy-Cache | Cache strategy: semantic or exact |
| floopy-property-* | Attach custom metadata for filtering in the dashboard |
| floopy-fallback | Fallback model if the primary provider fails |
| floopy-session-id | Group related requests into a session |
| floopy-user-id | Associate requests with an end user |

See the Headers Reference for the full list.
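When attaching several floopy-property-* headers, a small helper can build them from a single metadata object. A sketch under the naming convention shown above (floopyPropertyHeaders is illustrative, not a Floopy API):

```typescript
// Builds floopy-property-* headers from a metadata object, for use as
// (part of) configuration.defaultHeaders.
function floopyPropertyHeaders(
  props: Record<string, string>
): Record<string, string> {
  const headers: Record<string, string> = {};
  for (const [key, value] of Object.entries(props)) {
    headers[`floopy-property-${key}`] = value;
  }
  return headers;
}

// floopyPropertyHeaders({ environment: "production", feature: "chat" })
// → { "floopy-property-environment": "production",
//     "floopy-property-feature": "chat" }
```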