Core LLM integration infrastructure to allow pgAdmin to connect to AI providers #9641

@akshay-joshi

Description

  • Core LLM integration infrastructure.
  • AI-generated reports for performance, security, and design on servers, databases, and schemas where appropriate.
  • An AI Assistant in the Query Tool to help with query generation.
  • An AI Insights panel on the Query Tool's EXPLAIN UI that provides analysis and recommendations based on query plans.

Support is included for use with Anthropic and OpenAI in the cloud (for best results), or with Ollama or Docker Model Runner on local infrastructure (including the same machine), with models such as qwen3-coder or gemma.
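As a rough illustration of how a pluggable provider layer like this could be called, here is a minimal sketch assuming an OpenAI-compatible /v1/chat/completions endpoint (which OpenAI, Ollama, and Docker Model Runner expose; Anthropic uses its own Messages API instead). The `LLMProvider` class, its fields, and the `chat` method are hypothetical names for illustration only, not pgAdmin's actual implementation.

```python
# Hypothetical sketch of a provider-agnostic chat call; not pgAdmin's real API.
import json
import urllib.request
from dataclasses import dataclass


@dataclass
class LLMProvider:
    """Points at any OpenAI-compatible chat completions endpoint."""
    base_url: str       # e.g. "http://localhost:11434/v1" for a local Ollama
    model: str          # e.g. "qwen3-coder" locally, or a cloud model name
    api_key: str = ""   # required for cloud providers, typically unused locally

    def chat(self, prompt: str) -> str:
        """Send a single-turn chat request and return the reply text."""
        payload = json.dumps({
            "model": self.model,
            "messages": [{"role": "user", "content": prompt}],
        }).encode("utf-8")
        request = urllib.request.Request(
            f"{self.base_url}/chat/completions",
            data=payload,
            headers={
                "Content-Type": "application/json",
                "Authorization": f"Bearer {self.api_key}",
            },
        )
        with urllib.request.urlopen(request) as response:
            body = json.load(response)
        # Standard OpenAI-style response shape.
        return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # Local example; swap base_url, model, and api_key for a cloud provider.
    provider = LLMProvider(base_url="http://localhost:11434/v1",
                           model="qwen3-coder")
    print(provider.chat("Explain this query plan: Seq Scan on orders ..."))
```

Because all of the listed backends other than Anthropic speak the same wire format, switching between cloud and local inference in a sketch like this is just a matter of changing the endpoint, model name, and key.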

Metadata

Status: In Testing
