- Core LLM integration infrastructure
- AI-generated performance, security, and design reports for servers, databases, and schemas, where appropriate.
- An AI Assistant in the Query Tool to help with query generation.
- An AI Insights panel in the Query Tool's EXPLAIN UI to provide analysis and recommendations based on query plans.
Support is included for use with Anthropic and OpenAI in the cloud (for best results), or with Ollama or Docker Model Runner on local infrastructure (including the same machine), using models such as qwen3-coder or gemma.
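As an illustration of the local option, the sketch below sends a prompt to a locally hosted model through Ollama's OpenAI-compatible endpoint. This is not pgAdmin's internal integration code; the endpoint, model name, and prompt are assumptions chosen for demonstration.

```python
# Minimal sketch: talking to a local model served by Ollama via its
# OpenAI-compatible API. Endpoint, model, and prompt are illustrative
# assumptions, not pgAdmin internals.
from openai import OpenAI

# By default Ollama serves an OpenAI-compatible API on localhost:11434.
# No real API key is needed locally, but the client requires a value.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

response = client.chat.completions.create(
    model="qwen3-coder",  # or "gemma"; first pull it, e.g. `ollama pull qwen3-coder`
    messages=[
        {
            "role": "user",
            "content": "Suggest an index for: SELECT * FROM orders WHERE customer_id = 42;",
        }
    ],
)
print(response.choices[0].message.content)
```

Because both Ollama and Docker Model Runner expose OpenAI-compatible APIs, the same client code works for either backend by changing only the base URL and model name.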