…UI for top_k/num_hops
- Fix chatbot agent using wrong model (llm_model instead of chat_model)
- Ensure get_completion_config always returns chat_model with llm_model fallback
- Restore startup validation for llm_service and llm_model
- Add _config_file_lock to prevent concurrent config file overwrites
- Replace clear()+update() with atomic dict updates in reload functions
- Load community summarization prompt at call time instead of import time
- Add top_k and num_hops fields to GraphRAG config UI
- Fix ECC URL defaults to match docker-compose service names
- Document all supported config parameters in README
- Bump TigerGraph version to 4.2.2
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
User description
Updated code for:
- Pages for configuration
- Role-based access to setup configurations
- Graph-level access to change configuration
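The role-based gate above can be sketched as a small check applied to the setup endpoints. The role names and the `require_setup_role` helper are assumptions for illustration, not the PR's actual implementation.

```python
# Hypothetical allowed roles for editing setup configuration.
SETUP_ROLES = {"superuser", "globaldesigner"}

def require_setup_role(user_roles: set) -> None:
    """Raise if the caller holds none of the roles allowed to edit setup."""
    if SETUP_ROLES.isdisjoint(user_roles):
        raise PermissionError("setup configuration requires an admin role")
```

In a web framework this check would typically run as a dependency or middleware before each setup route handler.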
PR Type
Enhancement, Bug fix
Description
Add setup UI for configs and prompts
Enforce role-based access across endpoints
Support graph-specific LLM/prompt configurations
Fix VertexAI imports and model naming
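The graph-specific configuration support described above amounts to resolving per-graph overrides on top of global defaults. A minimal sketch, assuming hypothetical `GLOBAL_CONFIG`/`GRAPH_CONFIGS` stores and a `resolve_config` helper (none of these names are from the PR):

```python
# Global defaults, overridden per graph where a graph-level config exists.
GLOBAL_CONFIG = {"llm_service": "openai", "chat_model": "gpt-4o", "top_k": 5, "num_hops": 2}
GRAPH_CONFIGS = {"supply_chain": {"chat_model": "gpt-4o-mini", "num_hops": 3}}

def resolve_config(graph_name: str) -> dict:
    """Merge graph-level overrides on top of the global config."""
    return {**GLOBAL_CONFIG, **GRAPH_CONFIGS.get(graph_name, {})}
```

Graphs without an override simply fall through to the global settings, so existing deployments keep working unchanged.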
File Walkthrough
16 files:
- Add config/prompts APIs with role-based access
- Graph-specific config resolution and reload utilities
- Use per-graph completion config for agents
- Reload configs at job start and consistency routes
- LLM provider selection via per-graph config
- Load community summary prompt from configured path
- Pass graph to LLM provider for summarization
- KG admin page for init, ingest, refresh
- UI to edit GraphRAG processing settings
- Show Setup link based on resolved roles
- Router for setup sections and redirects
- Ingestion UI for local and cloud sources
- UI to edit and test LLM services
- UI to edit and test DB connection
- UI to view and save prompt files
- Shared layout for setup navigation

2 files:
- Fix VertexAI embeddings import and parameters
- Switch to official VertexAI client and args

1 file:
- Add reverse proxy rules for /setup routes

1 file:
- Add langchain-google-vertexai dependency

6 files