Pinned

- OpenAI-compatible proxy for DeepSeek V4 Flash with intelligent auto context compression. The pinned file preview starts:

  ```python
  #!/usr/bin/env python3
  """
  Zero-dependency OpenAI-compatible proxy for DeepSeek V4 Flash.

  Author: g023 -
  ```
- aroc (Public) · Python
  Agentic Read-Only Chat: a rich terminal chat interface powered by locally installed llama.cpp and g023/g023-Qwen3.5-9B-GGUF:IQ2_M.
- harnessharvest (Public)
  A self-learning, self-correcting, LLM-powered harness creation and management system with FAISS-powered RAG, sandboxed execution, and autonomous improvement modes. Powered by Ollama and offline mod…
- localmodelrouter (Public) · Python · 2
  Local LLM server that provides drop-in API compatibility with both Ollama and OpenAI, using your locally installed [llama.cpp](https://github.com/ggerganov/llama.cpp)'s `llama-server` as the infer…
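"Drop-in API compatibility" here means a client can aim a standard OpenAI-style request at a local endpoint instead of api.openai.com. A minimal sketch of what that looks like from the client side, assuming a compatible server listening at `http://localhost:8080/v1` (the URL and model name are illustrative, not taken from the repo):

```python
import json
import urllib.request

# Assumed local endpoint; llama-server and localmodelrouter-style proxies
# expose OpenAI-shaped routes under a /v1 prefix.
LOCAL_BASE_URL = "http://localhost:8080/v1"

def build_chat_request(prompt: str, model: str = "local-model") -> urllib.request.Request:
    """Build an OpenAI-style /chat/completions POST aimed at a local server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{LOCAL_BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Hello")
# urllib.request.urlopen(req) would return an OpenAI-shaped JSON response
# when a compatible server is running; here we only construct the request.
```

Because the request shape matches OpenAI's, existing OpenAI client libraries can also be pointed at such a server by overriding their base URL.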
- g023-OllamaMan (Public) · PHP · 7
  A concept Ollama Server Management OS that runs in a web browser.
