Features
AI Models
Kpow supports 'bring your own' (BYO) AI Models for optional AI features and MCP tool use within the product.
Configuration
You can configure one or more AI model providers to use within Kpow. In your user preferences, you can set the default model that Kpow's AI features will use.
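For example, to make both an OpenAI and an Anthropic model available in a single Kpow deployment, export both sets of variables (documented in the provider sections below):
export OPENAI_API_KEY="XXXX"     # OpenAI features use gpt-4o-mini unless OPENAI_MODEL is set
export ANTHROPIC_API_KEY="XXXX"  # Anthropic features use claude-3-7-sonnet-20250219 unless ANTHROPIC_MODEL is set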
OpenAI
Kpow supports integration with OpenAI for AI/LLM features such as filter generation and other natural language tools.
To enable OpenAI models, set the following environment variables:
Variable | Description | Default | Example |
---|---|---|---|
OPENAI_API_KEY | Your OpenAI API key | (required) | XXXX |
OPENAI_MODEL | Model ID to use | gpt-4o-mini | o3-mini |
See OpenAI's model documentation for a list of supported OpenAI models.
Example
export OPENAI_API_KEY="XXXX"
export OPENAI_MODEL="o3-mini" # default gpt-4o-mini
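As an optional sanity check outside of Kpow, you can verify the key before starting the product. This sketch assumes curl is available and uses OpenAI's model-listing endpoint:
curl -s https://api.openai.com/v1/models -H "Authorization: Bearer $OPENAI_API_KEY"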
Anthropic
Kpow supports integration with Anthropic for AI/LLM use cases.
To enable Anthropic models, set the following environment variables:
Variable | Description | Default | Example |
---|---|---|---|
ANTHROPIC_API_KEY | Your Anthropic API key | (required) | XXXX |
ANTHROPIC_MODEL | Model ID to use | claude-3-7-sonnet-20250219 | claude-opus-4-20250514 |
See Anthropic's model documentation for a list of supported Anthropic models. Kpow supports any model listed in the Anthropic API column.
Example
export ANTHROPIC_API_KEY="XXXX"
export ANTHROPIC_MODEL="claude-opus-4-20250514" # default claude-3-7-sonnet-20250219
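Similarly, you can sanity-check the key outside of Kpow. This sketch assumes curl is available and uses Anthropic's model-listing endpoint; the anthropic-version value shown is an assumption and may need updating:
curl -s https://api.anthropic.com/v1/models -H "x-api-key: $ANTHROPIC_API_KEY" -H "anthropic-version: 2023-06-01"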
Ollama
Kpow supports integration with Ollama for AI/LLM use cases.
To enable Ollama models, set the following environment variables:
Variable | Description | Default | Example |
---|---|---|---|
OLLAMA_MODEL | Model ID to use | (required) | llama3.1:8b |
OLLAMA_URL | URL of the Ollama model server | http://localhost:11434 | https://prod.ollama.mycorp.io |
Note: Kpow only supports Ollama models with tool-calling capability.
Example
export OLLAMA_MODEL="llama3.1:8b"
export OLLAMA_URL="https://prod.ollama.mycorp.io" # default http://localhost:11434
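Before pointing Kpow at an Ollama server, you can confirm a tool-capable model is available. This sketch assumes the standard Ollama CLI and HTTP API:
ollama pull llama3.1:8b                  # download a model with tool support
curl -s http://localhost:11434/api/tags  # list the models available on the server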
Missing AI provider?
Contact [email protected] to request support for your AI model.
AI features
kJQ filter generation
Transform natural language queries into powerful kJQ filters with AI-assisted query generation. This feature empowers users of all technical backgrounds to extract insights from Kafka topics without requiring deep JQ programming knowledge.
How it works
Simply describe what you're looking for in plain English, and the AI model generates a syntactically correct kJQ filter tailored to your data. The system leverages:
- Natural language processing: Convert conversational prompts like "show me all orders over $100 from the last hour" into precise kJQ expressions.
- Schema-aware generation: When topic schemas are available, the AI optionally incorporates field names, data types, and structure to create more accurate filters.
- Validation integration: Generated filters are automatically validated against Kpow's kJQ engine to ensure syntactic correctness before execution.
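As an illustration only (the field name is hypothetical, and the generated filter depends on your topic's schema and the model in use), a prompt such as "show me all orders over $100" might produce a filter along the lines of:
.value.amount > 100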
Usage
Navigate to any topic's Data Inspect view and select the AI Filter option. Enter your query in natural language, and Kpow will generate the corresponding kJQ filter. You can then execute, modify, or save the generated filter for future use.
The AI filter generator works best when provided with specific, actionable descriptions of the data you want to find. Include field names, value ranges, example data and logical operators in your natural language query for optimal results.
AI feature notes
- We cannot control how AI providers process your data. For sensitive data, use local models or enterprise AI services with verified privacy guarantees.
- AI-generated filters are probabilistic and may miss edge cases. Provide detailed context and examples for more accurate results.
- We recommend models with capabilities equivalent to gpt-4o-mini or claude-3-5-sonnet-20241022, or newer.