Self-hosted personal AI assistant
Feishu channel · Built-in Web console · LiteLLM multi-model
OpenFox runs on your own machine: it brings LLM chat, scheduled jobs, Feishu bot, browser tools, MCP, and local skills together in one HTTP service. It ships with an embedded Web UI at /web, and you can also talk to the assistant from Feishu.
| Path | Description |
|---|---|
| `~/.openfox/config.json` | LLM, Feishu, `cors_origin_list`, MCP, and related settings |
| `~/.openfox/storage.db` | SQLite storage for sessions and scheduling |
| `~/.openfox/skills` | Local Skills root (`SKILLS_PATH` in `openfox/utils/const.py`); on first run, if missing, copied from the packaged `openfox/skills` (see `openfox/core/skills.py`) |
| Capability | Description |
|---|---|
| Web console | Chat, session list, usage metrics, skill upload/management, Cron scheduling, JSON config editor (login required; use os_security_key from config) |
| Feishu | Event and message intake; DM and group chat (mention the bot) |
| Scheduled jobs | Built-in scheduler; enable with tools.scheduler (SchedulerConfig). Agent toolkit SchedulerTools creates recurring tasks (cron → POST Agent run endpoint) |
| Tools | See “Built-in Agent tools” below; you can also attach MCP via config.mcps. JSON config editing in the Web console uses ConfigTools (not an Agent chat tool). |
| Skills | SKILL.md under SKILLS_PATH (~/.openfox/skills, LocalSkills); upload skill packs from the Web UI |
| Models | LiteLLM for OpenAI-compatible APIs (see “Models” below) |
| Tool class | What it does |
|---|---|
| ShellTools | Run shell commands on the host where OpenFox runs (Agno) |
| SchedulerTools | Registered when tools.scheduler.activate is true. Create / list / get / delete / disable jobs; cron expressions invoke this Agent's run endpoint |
| FeishuTools | Feishu-related actions (e.g. messaging with the channel) |
| MCPConfigTools | Add / remove / update MCP-related config in chat to extend tools dynamically |
| WebSearchTools | Search the web |
| ArxivTools | Search arXiv papers and metadata |
| HackerNewsTools | Read Hacker News stories and discussions |
| PubmedTools | Search PubMed biomedical literature |
| WikipediaTools | Look up Wikipedia articles and summaries |
| Crawl4aiTools | Fetch and parse page content with Crawl4AI (suited to structured extraction) |
| CalculatorTools | Evaluate math expressions |
| DockerTools | Manage local Docker: containers, images, volumes, networks (Docker must be available) |
| YouTubeTools | YouTube: metadata, captions, timestamps, etc. (requires youtube_transcript_api) |
| WebBrowserTools | Open a URL in the system default browser on this machine (new tab or new window) |
Environment: Python 3.12+. Install dependencies with uv from the repo root:

```bash
uv sync  # or: pip install -e .
```

First run: if `~/.openfox/config.json` is missing, an interactive setup runs (API docs toggle, auth, `os_security_key`, timezone, LLM, Feishu, etc.), then the server starts.

```bash
python -m openfox
# Binds to 0.0.0.0:7777 by default
```

For a custom `--host` / `--port`, run `python -m openfox --help` to see the CLI subcommands and flags.
- Web UI: open `http://127.0.0.1:7777/web` (adjust the port as needed).
- Auth token: the `os_security_key` value from the config; enter it on the Web login page.
- Non-7777 ports: the default CORS list covers `/web` on `:7777`. If you change the port, add e.g. `http://127.0.0.1:<port>` and `http://localhost:<port>` to `cors_origin_list`, or the frontend's API calls may fail.
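For example, when serving on port 8080, `cors_origin_list` might look like this (a sketch only; merge the new origins with whatever defaults your config already contains):

```json
{
  "cors_origin_list": [
    "http://127.0.0.1:7777",
    "http://localhost:7777",
    "http://127.0.0.1:8080",
    "http://localhost:8080"
  ]
}
```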
To re-run setup, delete `~/.openfox/config.json` and run `python -m openfox` again. If the config file exists, the wizard is skipped and the server starts directly.
The request path prefix is `/feishu`.
- Event / webhook example: `http://<your-host-or-domain>/feishu/event` (match whatever Feishu Open Platform expects and your routes).
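When you save the request URL, Feishu Open Platform first sends a one-time `url_verification` event and expects the `challenge` value echoed back. A minimal, framework-agnostic sketch of that handshake (OpenFox's actual handler under `/feishu` may differ):

```python
def handle_feishu_callback(payload: dict) -> dict:
    """Handle a Feishu event callback body (already decrypted, if Encrypt Key is set)."""
    # One-time URL verification: Feishu expects {"challenge": <same value>} in the response.
    if payload.get("type") == "url_verification":
        return {"challenge": payload["challenge"]}
    # Any other event would be routed to the bot's message handlers.
    return {"ok": True}
```

Once the verification round-trip succeeds, Feishu starts delivering message events to the same URL.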
Steps in brief:
- Create an app on Feishu Open Platform and obtain App ID and App Secret.
- Configure event subscription and message permissions; set the request URL to your service (public URL or tunnel); fill Encrypt Key and Verification Token.
- Write these into `channels.feishu` in `~/.openfox/config.json`, then restart `python -m openfox`.
- In a Feishu DM or group chat, use the app (e.g. @ the bot) to talk to OpenFox.
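The credentials from the steps above end up under `channels.feishu`. The field names below are an assumption for illustration (check the config generated by the setup wizard for the exact keys); the values are placeholders:

```json
{
  "channels": {
    "feishu": {
      "app_id": "cli_xxx",
      "app_secret": "xxx",
      "encrypt_key": "xxx",
      "verification_token": "xxx"
    }
  }
}
```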
OpenFox uses LiteLLM. For any provider on LiteLLM’s supported list that exposes an OpenAI-style Chat Completions API, you usually only need to set `llm.model_name`, `llm.api_base`, and `llm.api_key`.
Example model strings (see the official LiteLLM docs for the full list):

```
openai/gpt-4o-mini
deepseek/deepseek-chat
dashscope/qwen-max
ollama/llama3.1
...
```
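For instance, pointing the assistant at DeepSeek might look like this in `~/.openfox/config.json`. The three `llm.*` key names come from this README; the `api_base` and `api_key` values are placeholders you must replace with your provider's endpoint and credentials:

```json
{
  "llm": {
    "model_name": "deepseek/deepseek-chat",
    "api_base": "https://api.deepseek.com",
    "api_key": "sk-..."
  }
}
```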
Web console pages: Chat · Sessions · Usage · Skills · Scheduled jobs · Config
| Aspect | OpenClaw | OpenFox |
|---|---|---|
| Stack | Node / TypeScript | Python, Agno, FastAPI |
| Channels | Many IM platforms | Feishu-first (extensible) |
| Extensions | Browser, Canvas, Cron, etc. | Cron, Shell, browser (Playwright), MCP, local Skills |
| Focus | Cross-platform personal assistant | Self-hosted, bilingual-friendly, lightweight control plane with integrated Web UI |
Stars help us keep improving.
When joining the community group, say you’re here for openfox.