# OpenAI

FlowKit supports OpenAI models via the `OpenAIAdapter`.

## Setup

```bash
export OPENAI_API_KEY=sk-...
```

## Usage

```typescript
import { FlowEngine, OpenAIAdapter } from "@andresaya/flowkit";

const adapter = new OpenAIAdapter({
  apiKey: process.env.OPENAI_API_KEY!,
  model: "gpt-4o-mini",
});

// `flow` and `storage` are defined elsewhere (see the full example below).
const engine = new FlowEngine(flow, { llm: adapter, storage });
```

## Configuration

```typescript
interface OpenAIConfig {
  /** OpenAI API key (required) */
  apiKey: string;
  /** Model to use (default: gpt-4o-mini) */
  model?: string;
  /** Temperature (default: 0) */
  temperature?: number;
  /** Timeout in ms (default: 60000) */
  timeout?: number;
  /** Base URL (default: https://api.openai.com/v1) */
  baseUrl?: string;
  /** Enable streaming (default: false) */
  streaming?: boolean;
}
```
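To make the defaults above concrete, here is a small standalone sketch of how a partial config could be resolved against them. The `withDefaults` helper is hypothetical and not part of FlowKit; the interface is re-declared only so the snippet is self-contained, and the default values are the ones documented above.

```typescript
// Re-declaration of the documented config shape, for self-containment.
interface OpenAIConfig {
  apiKey: string;
  model?: string;
  temperature?: number;
  timeout?: number;
  baseUrl?: string;
  streaming?: boolean;
}

// Hypothetical helper: fills in the documented defaults for any
// field the caller omits. Caller-supplied fields win via the spread.
function withDefaults(config: OpenAIConfig): Required<OpenAIConfig> {
  return {
    model: "gpt-4o-mini",
    temperature: 0,
    timeout: 60_000,
    baseUrl: "https://api.openai.com/v1",
    streaming: false,
    ...config,
  } as Required<OpenAIConfig>;
}

const resolved = withDefaults({ apiKey: "sk-test" });
console.log(resolved.model);   // "gpt-4o-mini"
console.log(resolved.timeout); // 60000
```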

## Streaming

Enable streaming for real-time responses:

```typescript
const adapter = new OpenAIAdapter({
  apiKey: process.env.OPENAI_API_KEY!,
  model: "gpt-4o-mini",
  streaming: true,
});
```

## Available Models

| Model | Context | Best For | Cost |
|-------|---------|----------|------|
| gpt-4o-mini | 128K | Fast, cheap, good quality | $ |
| gpt-4o | 128K | Best quality | $$$ |
| gpt-5.2 | 400K | Complex tasks, coding, agentic workflows | $$$ |
| gpt-5-mini | 400K | Faster, cost-efficient, well-defined tasks | $$ |
| gpt-5-nano | 400K | Fastest, most cost-efficient, basic tasks | $ |
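If you select models programmatically, the context windows from the table can be kept as a plain lookup. This is purely illustrative data taken from the table above, not anything FlowKit or the OpenAI API exposes:

```typescript
// Context window sizes (tokens) for the models listed above.
const CONTEXT_WINDOW: Record<string, number> = {
  "gpt-4o-mini": 128_000,
  "gpt-4o": 128_000,
  "gpt-5.2": 400_000,
  "gpt-5-mini": 400_000,
  "gpt-5-nano": 400_000,
};

console.log(CONTEXT_WINDOW["gpt-4o-mini"]); // 128000
```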

## Example

```typescript
import {
  agent, flow, FlowEngine, MemoryStorage, OpenAIAdapter,
  name, yesNo
} from "@andresaya/flowkit";

const bot = agent("Alex")
  .company("MyCompany")
  .personality("friendly")
  .build();

const myFlow = flow("greeting", bot)
  .ask("name", "What's your name?", name(), "user_name")
  .then("confirm")
  .ask("confirm", "Nice to meet you {{user_name}}! Need help?", yesNo(), "needs_help")
  .when({ yes: "help", no: "bye" })
  .say("help", "How can I help?")
  .done()
  .say("bye", "Goodbye!")
  .done()
  .build();

const engine = new FlowEngine(myFlow, {
  llm: new OpenAIAdapter({
    apiKey: process.env.OPENAI_API_KEY!,
    model: "gpt-4o-mini",
  }),
  storage: new MemoryStorage(),
});

const result = await engine.start("session-1");
console.log(result.message);
```

## Features

| Feature | Supported |
|---------|-----------|
| Streaming | Yes |
| Tool Calling | Yes |
| JSON Mode | Yes |

## Tips

1. **Use gpt-4o-mini for most use cases** - best balance of cost and quality
2. **Enable streaming for better UX** - users see responses in real time
3. **Set temperature to 0** - for more consistent, deterministic responses
4. **Use JSON mode** - FlowKit automatically enables JSON mode for structured extraction
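Tips 1-3 can be combined in a single adapter configuration. This is a sketch using only the documented `OpenAIConfig` fields from this page (`temperature: 0` is already the default and is shown here only for emphasis):

```typescript
import { OpenAIAdapter } from "@andresaya/flowkit";

const adapter = new OpenAIAdapter({
  apiKey: process.env.OPENAI_API_KEY!,
  model: "gpt-4o-mini", // tip 1: best cost/quality balance
  temperature: 0,       // tip 3: consistent, deterministic responses
  streaming: true,      // tip 2: real-time output for better UX
});
```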

Released under the MIT License.