What is FlowKit?

FlowKit is a TypeScript library for building structured AI conversational flows. It provides a declarative way to define conversation logic while leveraging Large Language Models (LLMs) for natural language understanding.

The Problem

Building conversational AI applications typically requires:

  • Complex state management - Tracking conversation context, user data, and flow position
  • Regex-based parsing - Brittle pattern matching for extracting information
  • Provider lock-in - Code tightly coupled to a specific LLM provider
  • Boilerplate code - Hundreds of lines just to handle basic conversations

The Solution

FlowKit abstracts away the complexity:

```typescript
const myFlow = flow("support", agent)
  .ask("issue", "What's the problem?", text(), "issue")
  .then("category")
  .ask("category", "Is this about: billing, technical, or other?",
       oneOf(["billing", "technical", "other"]), "category")
  .when({
    billing: "billing_help",
    technical: "tech_help",
    other: "general_help"
  })
  // ... branches continue
```
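Conceptually, `.when()` maps each extracted value to the ID of the next step. As a rough sketch (this is an illustration of the idea, not FlowKit's internals), it behaves like a lookup table with a fallback:

```typescript
// Illustrative sketch of when()-style branching: the extracted category
// selects the next step ID. Names and the fallback choice are assumptions.
const branches: Record<string, string> = {
  billing: "billing_help",
  technical: "tech_help",
  other: "general_help",
};

// Resolve the next step; fall back to a default branch for unexpected values.
const next = (extracted: string): string => branches[extracted] ?? "general_help";

console.log(next("billing")); // "billing_help"
```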

Key Concepts

Agents

An Agent defines your bot's personality, language, and behavior:

```typescript
const bot = agent("Alex")
  .company("TechCorp")
  .personality("friendly and professional")
  .language("en")
  .build();
```
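Conceptually, these builder settings steer the LLM's replies, much like a system prompt assembled from the configured fields. A hypothetical illustration of that idea (field names and prompt wording are assumptions, not FlowKit's actual output):

```typescript
// Hypothetical sketch: how agent settings might be folded into a system prompt.
// This is NOT FlowKit's implementation; it only illustrates the concept.
const settings = {
  name: "Alex",
  company: "TechCorp",
  personality: "friendly and professional",
  language: "en",
};

const systemPrompt =
  `You are ${settings.name}, an assistant for ${settings.company}. ` +
  `Your tone is ${settings.personality}. Reply in language "${settings.language}".`;

console.log(systemPrompt);
```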

Flows

A Flow defines the conversation structure with steps, transitions, and data extraction:

```typescript
const onboardingFlow = flow("onboarding", bot)
  .ask("name", "What's your name?", name(), "user_name")
  .then("email")
  .ask("email", "What's your email?", email(), "user_email")
  .done()
  .build();
```

Extractors

Extractors use the LLM to pull structured data from natural language:

  • name() - Extract names
  • email() - Extract email addresses
  • number() - Extract numbers
  • yesNo() - Extract yes/no responses
  • oneOf([...]) - Extract from a list of options
  • custom(...) - Custom extraction logic
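Conceptually, an extractor pairs an instruction for the LLM with validation of its reply. The sketch below shows that idea for `oneOf()` using a plain string match instead of an LLM; the `Extractor` interface and all names here are illustrative assumptions, not FlowKit's internal API:

```typescript
// Conceptual sketch of an extractor: an instruction for the model plus a
// parser that validates the reply. Not FlowKit's actual types.
interface Extractor<T> {
  instruction: string;           // guidance appended to the LLM prompt
  parse(raw: string): T | null;  // validate/normalize the model's answer
}

// Simplified oneOf(): accept the reply only if it matches an allowed option.
function oneOfSketch(options: string[]): Extractor<string> {
  return {
    instruction: `Answer with exactly one of: ${options.join(", ")}`,
    parse(raw) {
      const normalized = raw.trim().toLowerCase();
      return options.find((o) => o.toLowerCase() === normalized) ?? null;
    },
  };
}

const category = oneOfSketch(["billing", "technical", "other"]);
console.log(category.parse("  Billing ")); // "billing"
console.log(category.parse("refunds"));    // null
```

In FlowKit itself, the parsing step is LLM-powered, which is what lets it handle free-form answers like "it's about my invoice" rather than exact matches.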

FlowEngine

The FlowEngine orchestrates everything:

```typescript
const engine = new FlowEngine(flow, {
  llm: new OllamaAdapter({ model: "llama3.2" }),
  storage: new MemoryStorage(),
});

const result = await engine.start("session-123");
const response = await engine.handle("session-123", "I'm John");
```
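The shape of that contract can be sketched with a self-contained mock: `start()` opens a session and returns the first prompt, and each `handle()` stores the user's input and returns the next prompt. This is a conceptual illustration only; the session shape, step list, and return values are assumptions, and a real FlowEngine extracts data via the LLM rather than storing raw input:

```typescript
// Conceptual mock of a start()/handle() loop, NOT FlowKit's implementation.
// Session state is kept in a Map, keyed by session ID.
type Session = { step: number; data: Record<string, string> };

class MockEngine {
  private sessions = new Map<string, Session>();
  private steps = [
    { key: "user_name", prompt: "What's your name?" },
    { key: "user_email", prompt: "What's your email?" },
  ];

  async start(id: string): Promise<string> {
    this.sessions.set(id, { step: 0, data: {} });
    return this.steps[0].prompt; // first question opens the conversation
  }

  async handle(id: string, input: string): Promise<string> {
    const s = this.sessions.get(id);
    if (!s) return this.start(id);
    s.data[this.steps[s.step].key] = input; // a real engine extracts via the LLM
    s.step += 1;
    return s.step < this.steps.length ? this.steps[s.step].prompt : "All done!";
  }
}

(async () => {
  const engine = new MockEngine();
  console.log(await engine.start("s1"));                      // "What's your name?"
  console.log(await engine.handle("s1", "John"));             // "What's your email?"
  console.log(await engine.handle("s1", "john@example.com")); // "All done!"
})();
```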

Features at a Glance

| Feature | Description |
| --- | --- |
| Multi-Provider | Ollama, OpenAI, Anthropic, Gemini, Groq, OpenRouter |
| Storage Adapters | Memory, File, Redis, SQLite |
| Smart Extraction | LLM-powered data extraction from natural language |
| Flow Control | Branching, loops, conditions, inline actions |
| Events | Track conversation progress for analytics |
| Tools | Execute custom functions during flows |
| Middleware | Pre/post processing hooks |
| Handoff | Detect when human intervention is needed |
| Multi-Channel | WhatsApp, Telegram, Slack, Web adapters |

Next Steps

  • Quick Start - Get up and running in 5 minutes
  • Agents - Learn about agent configuration
  • Flows - Master the flow builder API

Released under the MIT License.