Checked other resources
System Info
Windows
Code Version
Latest Release
Description
I already changed VLLM_MODEL_DICT in model_config.py and downloaded chatglm-6b into the llm_models folder. The Docker image has dependency errors, so I used local services instead. Even though I modified the code as described in fastchat.md, I cannot connect to the local LLM, and no logs are recorded in llm_api.log or sdfile_api.log.
Example Code
# This is how I changed VLLM_MODEL_DICT
VLLM_MODEL_DICT = VLLM_MODEL_DICT or {
    'chatglm2-6b': "chatglm-6b",
}
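As a debugging aid, here is a small sketch for checking that the mapping above actually resolves to an existing model folder before starting the local services. The `resolve_model_path` helper and the `llm_models` root path are illustrative assumptions, not part of the project's API:

```python
import os

# The mapping reported above: served model name -> local folder name.
VLLM_MODEL_DICT = {
    'chatglm2-6b': "chatglm-6b",
}

def resolve_model_path(model_name, root="llm_models"):
    """Hypothetical check: map a model name through VLLM_MODEL_DICT
    and verify the corresponding local folder exists under `root`."""
    folder = VLLM_MODEL_DICT.get(model_name, model_name)
    path = os.path.join(root, folder)
    if not os.path.isdir(path):
        raise FileNotFoundError(f"model folder not found: {path}")
    return path
```

If this raises `FileNotFoundError`, the download location does not match what the mapping expects, which would also explain a silent failure to connect.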
Error Message and Stack Trace (if applicable)
No response