# Quick Start
Get FlowKit running in 5 minutes.

## Prerequisites
- Node.js 18+
- An LLM provider (Ollama for local, or OpenAI/OpenRouter API key)

## Installation

```bash
npm install @andresaya/flowkit
# or
pnpm add @andresaya/flowkit
```

## Your First Bot

### 1. Create a file `bot.ts`

```typescript
import {
  agent,
  flow,
  FlowEngine,
  MemoryStorage,
  OllamaAdapter,
  name,
  yesNo,
} from "@andresaya/flowkit";

// Define your agent
const assistant = agent("Alex")
  .company("MyCompany")
  .personality("friendly and helpful")
  .language("en")
  .build();

// Define your conversation flow
const greetingFlow = flow("greeting", assistant)
  // Step 1: Ask for name
  .ask("ask_name", "Hello! I'm Alex. What's your name?", name(), "user_name")
  .then("ask_help")
  // Step 2: Ask if they need help
  .ask("ask_help", "Nice to meet you, {{user_name}}! Can I help you with something?", yesNo(), "wants_help")
  .when({ "yes": "offer_help", "no": "say_bye" })
  // Step 3a: Offer help
  .say("offer_help", "Great! I'm here to help. What do you need, {{user_name}}?")
  .done()
  // Step 3b: Say goodbye
  .say("say_bye", "No problem! Have a wonderful day, {{user_name}}!")
  .done()
  .build();

// Create the engine
const engine = new FlowEngine(greetingFlow, {
  llm: new OllamaAdapter({
    model: "llama3.2",
    baseUrl: "http://localhost:11434",
  }),
  storage: new MemoryStorage(),
});

// Run the conversation
async function main() {
  const sessionId = "user-123";

  // Start the conversation
  let result = await engine.start(sessionId);
  console.log("Bot:", result.message);

  // Simulate user responses
  result = await engine.handle(sessionId, "I'm Sarah");
  console.log("Bot:", result.message);

  result = await engine.handle(sessionId, "Yes please!");
  console.log("Bot:", result.message);

  // Get collected data
  console.log("Collected:", result.state.slots);
}
main();
```
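The `{{user_name}}` placeholders in the prompts above are rendered from the slots collected so far. As a standalone illustration of what that substitution amounts to (this sketch is ours, not FlowKit's internal implementation):

```typescript
// Illustrative only: minimal {{slot}} substitution, not FlowKit's internals.
function fillTemplate(template: string, slots: Record<string, string>): string {
  return template.replace(/\{\{(\w+)\}\}/g, (match, key) =>
    key in slots ? slots[key] : match // leave unknown placeholders untouched
  );
}

const rendered = fillTemplate("Nice to meet you, {{user_name}}!", { user_name: "Sarah" });
console.log(rendered); // → "Nice to meet you, Sarah!"
```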

### 2. Run it

```bash
# Make sure Ollama is running with llama3.2
ollama run llama3.2
# Run your bot
npx tsx bot.ts
```

### 3. Output

```
Bot: Hello! I'm Alex. What's your name?
Bot: Nice to meet you, Sarah! Can I help you with something?
Bot: Great! I'm here to help. What do you need, Sarah?
Collected: { user_name: "Sarah", wants_help: "yes" }
```

## Next Steps
- Learn about Agents - Customize your bot's personality
- Build Complex Flows - Branching, loops, and more
- Add Tools - Execute code during conversations
- Use Other Providers - OpenAI, Claude, etc.

## Common Patterns

### Interactive CLI Bot

```typescript
import readline from "node:readline/promises";
import { stdin, stdout } from "node:process";

async function chat() {
  const rl = readline.createInterface({ input: stdin, output: stdout });
  const sessionId = `session-${Date.now()}`;

  let result = await engine.start(sessionId);
  console.log(`Bot: ${result.message}`);

  while (!result.done) {
    const input = await rl.question("You: ");
    result = await engine.handle(sessionId, input);
    console.log(`Bot: ${result.message}`);
  }
  rl.close();
}

chat();
```
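The loop above runs until `result.done` is true. To see that control flow in isolation, here is a self-contained sketch with a stubbed engine (the stub and its scripted replies are ours; a real `FlowEngine` would produce results from your flow):

```typescript
// Illustrative stub standing in for FlowEngine: two questions, then done.
type StepResult = { message: string; done: boolean };

function makeStubEngine() {
  const script: StepResult[] = [
    { message: "What's your name?", done: false },
    { message: "Can I help you?", done: false },
    { message: "Goodbye!", done: true },
  ];
  let i = 0;
  return {
    start: async (_sessionId: string) => script[i++],
    handle: async (_sessionId: string, _input: string) => script[i++],
  };
}

async function run() {
  const engine = makeStubEngine();
  const inputs = ["Sarah", "yes"];

  let result = await engine.start("demo");
  const transcript = [result.message];

  // Same shape as the CLI loop: feed input while the flow is not done.
  for (const input of inputs) {
    if (result.done) break;
    result = await engine.handle("demo", input);
    transcript.push(result.message);
  }
  return transcript;
}

run().then((t) => console.log(t.join(" | ")));
// → "What's your name? | Can I help you? | Goodbye!"
```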

### Web API Integration

```typescript
// Express.js example
app.post("/chat", async (req, res) => {
  const { sessionId, message } = req.body;

  let result;
  if (message) {
    result = await engine.handle(sessionId, message);
  } else {
    result = await engine.start(sessionId);
  }

  res.json({
    message: result.message,
    done: result.done,
    data: result.state.slots,
  });
});
```
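The handler above trusts `req.body` as-is; in practice you may want to validate it before calling the engine. A minimal sketch (the helper name and shape are ours, not part of FlowKit or Express):

```typescript
// Illustrative validation helper for the /chat request body.
type ChatRequest = { sessionId: string; message?: string };

function parseChatRequest(body: unknown): ChatRequest | null {
  if (typeof body !== "object" || body === null) return null;
  const { sessionId, message } = body as Record<string, unknown>;
  if (typeof sessionId !== "string" || sessionId.length === 0) return null;
  if (message !== undefined && typeof message !== "string") return null;
  return { sessionId, message: message as string | undefined };
}

console.log(parseChatRequest({ sessionId: "user-123", message: "hi" })); // valid request
console.log(parseChatRequest({ message: "hi" })); // null: sessionId missing
```

In the route, reject with a 400 when the helper returns `null`. The Express example also assumes `app.use(express.json())` is registered so `req.body` is parsed.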

### Pre-fill User Data

```typescript
// If you already know user info
await engine.setSlot(sessionId, "user_name", "John");
await engine.setSlot(sessionId, "user_email", "john@example.com");

// Or seed slot values when starting the session
const result = await engine.start(sessionId, {
  user_name: "John",
  is_premium: true,
});
```