# Examples

Complete examples showing FlowKit in action. Each example demonstrates different features and use cases.

## Available Examples

> **TIP**
>
> These examples are fully functional and can be run with `npx tsx examples/<name>.ts`.

| Example | Description | Features |
| --- | --- | --- |
| Customer Service Bot | Spanish telecom support bot | Strict mode, branching, multilingual |
| Support Bot | English tech support | Flexible mode, natural conversation |
| Lead Capture | Lead collection with tools | Tools, email/name extraction |

## Running Examples

```bash
# Clone the repository
git clone https://github.com/andresayac/flowkit
cd flowkit

# Install dependencies
pnpm install

# Make sure Ollama is running (for local examples)
ollama run qwen3:4b

# Run an example
npx tsx examples/simple-bot.ts
```

## Example Structure

All examples follow a similar pattern:

```typescript
import { agent, flow, FlowEngine, MemoryStorage, OllamaAdapter } from "@andresaya/flowkit";

// 1. Define the agent
const myAgent = agent("Bot Name")
  .personality("...")
  .build();

// 2. Build the flow
const myFlow = flow("flow-id", myAgent)
  .ask(...)
  .then(...)
  .say(...)
  .done()
  .build();

// 3. Create the engine
const engine = new FlowEngine(myFlow, {
  llm: new OllamaAdapter({ model: "qwen3:4b" }),
  storage: new MemoryStorage(),
});

// 4. Run the conversation
const result = await engine.start("session-id");
```
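The `MemoryStorage` passed to the engine keeps per-session conversation state in memory, keyed by the session id you give to `start()`. The following is a plain-TypeScript sketch of that idea, not FlowKit's actual storage interface (the `SessionState` shape and the `get`/`set` signatures are assumptions for illustration):

```typescript
// Minimal in-memory session store, sketching the role MemoryStorage plays:
// each session id maps to its own isolated state.
type SessionState = { step: number; vars: Record<string, string> };

class InMemorySessions {
  private sessions = new Map<string, SessionState>();

  // Return existing state for a session, or create a fresh one.
  get(sessionId: string): SessionState {
    let state = this.sessions.get(sessionId);
    if (!state) {
      state = { step: 0, vars: {} };
      this.sessions.set(sessionId, state);
    }
    return state;
  }

  set(sessionId: string, state: SessionState): void {
    this.sessions.set(sessionId, state);
  }
}

const store = new InMemorySessions();
store.set("session-a", { step: 2, vars: { name: "Ana" } });
console.log(store.get("session-a").step); // 2
console.log(store.get("session-b").step); // 0: unknown sessions start fresh
```

Because state lives only in memory, restarting the process discards all sessions; a persistent storage adapter would be needed for anything beyond local experiments.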

## Quick Tips

### Use the right mode

- **Strict mode** for scripted, predictable flows (customer service, verification)
- **Flexible mode** for natural, adaptive conversations (general assistance)
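The distinction can be sketched as a dispatch policy over scripted steps. This is plain TypeScript illustrating the concept, not FlowKit's API (`Mode`, `script`, and `nextStep` are illustrative names):

```typescript
type Mode = "strict" | "flexible";

// Scripted steps the flow expects, in order.
const script = ["ask-name", "ask-issue", "confirm"];

// In strict mode the flow always advances to the next scripted step.
// In flexible mode the model's suggestion may redirect the conversation,
// as long as it names a step that actually exists in the script.
function nextStep(mode: Mode, current: number, modelSuggestion?: string): string {
  if (mode === "flexible" && modelSuggestion && script.includes(modelSuggestion)) {
    return modelSuggestion;
  }
  return script[Math.min(current + 1, script.length - 1)];
}

console.log(nextStep("strict", 0, "confirm"));   // "ask-issue": the script wins
console.log(nextStep("flexible", 0, "confirm")); // "confirm": the model may jump ahead
```

Strict mode trades adaptability for predictability, which is why it suits verification and customer-service scripts where skipping a step is a bug, not a feature.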

### Choose your model wisely

- `qwen3:4b` - Best for JSON extraction (recommended for strict mode)
- `llama3.2` - Good general purpose
- `gpt-4o-mini` - Best quality with OpenAI
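Since the model is just a configuration value, the choice can be centralized in one place instead of scattered across examples. A small sketch (the `modelFor` helper and its task names are illustrative; the model names come from the list above):

```typescript
// Map a use case to a model name; names taken from the tips above.
function modelFor(task: "strict-extraction" | "general" | "best-quality"): string {
  switch (task) {
    case "strict-extraction": return "qwen3:4b";    // reliable JSON extraction
    case "general":           return "llama3.2";    // good all-rounder
    case "best-quality":      return "gpt-4o-mini"; // hosted, highest quality
  }
}

console.log(modelFor("strict-extraction")); // "qwen3:4b"
```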

### Event Handling

Remember to use `onEvent` in the engine config, not `engine.on()`:

```typescript
const engine = new FlowEngine(flow, {
  llm, storage,
  onEvent: (event) => console.log(event.type)
});
```
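If you want typed handling rather than just logging `event.type`, a discriminated union pairs well with a callback like `onEvent`. These event shapes are illustrative only; FlowKit's actual event payloads may differ:

```typescript
// Illustrative event shapes; the real FlowKit event types may differ.
type FlowEvent =
  | { type: "message"; text: string }
  | { type: "step"; stepId: string }
  | { type: "done" };

// Narrowing on `type` gives full type safety in each branch.
function describe(event: FlowEvent): string {
  switch (event.type) {
    case "message": return `bot said: ${event.text}`;
    case "step":    return `entered step ${event.stepId}`;
    case "done":    return "flow finished";
  }
}

console.log(describe({ type: "step", stepId: "ask-name" })); // "entered step ask-name"
```

A handler built this way can be passed straight to the config as `onEvent: (e) => console.log(describe(e))`, assuming the real event objects match the shapes you declare.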

Released under the MIT License.