
Conversation


@thuanha-groove thuanha-groove commented Nov 5, 2024

This PR enables compatibility with OpenAI-compatible APIs. It also adds the missing nodemon dependency and an ignore rule for locally generated data.

I have tested with:

  • LM Studio
  • OpenRouter
  • Jan
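
For anyone trying this out, here is a minimal sketch of how a request to an OpenAI-compatible endpoint can be assembled. The base URL, model name, and API key below are placeholders, not values from this PR — substitute whatever your provider uses (LM Studio's default local endpoint is `http://localhost:1234/v1`):

```javascript
// Build a chat-completions request for an OpenAI-compatible endpoint.
// All values passed in are provider-specific placeholders.
function buildChatRequest(baseUrl, model, messages, apiKey) {
  return {
    url: `${baseUrl.replace(/\/$/, "")}/chat/completions`,
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
      },
      body: JSON.stringify({ model, messages }),
    },
  };
}

// Example with placeholder values; send with fetch(req.url, req.options)
const req = buildChatRequest(
  "http://localhost:1234/v1",
  "local-model",
  [{ role: "user", content: "hello" }],
  "lm-studio"
);
```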

@raidendotai
Owner

  • aiming to move all inference outside of the main API
  • to have LangSmith-like eval ops that can also trace back to which architecture module (and which architecture) called the inference
  • prompts to live at that same abstraction level, so benchmarks can be run in batch against prompts

@FeelsDaumenMan

I tried it with Jan, but I don't seem to get it working. A tutorial would be helpful.


kntjspr commented Nov 29, 2024

It's working on OpenRouter, but it doesn't work with other OpenAI-compatible APIs like https://glhf.chat/.
It returns 'Failed validation: model must begin with valid prefix (eg. hf:)'.

I assume the same applies to any Hugging Face models.
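
Going by that error message, glhf.chat seems to expect model ids prefixed with `hf:`. A small normalization guard could paper over the difference between providers — this is a sketch based only on the error above, and the provider check is an assumption for illustration:

```javascript
// Normalize a model id for providers with prefix requirements.
// The glhf.chat "hf:" prefix is inferred from the validation error;
// other providers are left untouched.
function normalizeModel(model, baseUrl) {
  if (baseUrl.includes("glhf.chat") && !model.startsWith("hf:")) {
    return `hf:${model}`;
  }
  return model;
}
```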
