feat: add Novita AI as LLM provider (#1847)
Open
Alex-wuhu wants to merge 1 commit into unclecode:main
Conversation
Register Novita AI models in `PROVIDER_MODELS` and `PROVIDER_MODELS_PREFIXES` so that `NOVITA_API_KEY` is auto-resolved when the `novita/` prefix is used. Novita AI exposes an OpenAI-compatible endpoint; callers should set `base_url="https://api.novita.ai/openai"` in `LLMConfig`.

Supported models:

- `novita/moonshotai/kimi-k2.5` (default, 262k context)
- `novita/zai-org/glm-5`
- `novita/minimax/minimax-m2.5`

Also document `NOVITA_API_KEY` and `NOVITA_BASE_URL` in the Docker README.
Summary
- Register Novita AI models in `PROVIDER_MODELS` and `PROVIDER_MODELS_PREFIXES` (`crawl4ai/config.py`) so `NOVITA_API_KEY` is auto-resolved for any `novita/…` provider string
- Document `NOVITA_API_KEY` and `NOVITA_BASE_URL` in the Docker deployment README

Usage
Novita AI exposes an OpenAI-compatible endpoint. Use it with `LLMConfig`.

Supported models:

- `novita/moonshotai/kimi-k2.5` (default)
- `novita/zai-org/glm-5`
- `novita/minimax/minimax-m2.5`

For Docker deployments, pass the `NOVITA_API_KEY` and `NOVITA_BASE_URL` environment variables.
The existing `get_llm_base_url()` utility already picks up `NOVITA_BASE_URL` via the generic `{PROVIDER_NAME}_BASE_URL` pattern, so no Docker-layer changes were needed.

How it works
crawl4ai routes all LLM calls through litellm. For OpenAI-compatible providers, litellm accepts any model string with an explicit `api_base`. The `PROVIDER_MODELS_PREFIXES` entry ensures the API key is resolved automatically from `NOVITA_API_KEY` whenever a `novita/…` provider string is used, matching the pattern of existing providers (deepseek, groq, etc.).