
ondeckllm

Author: admin | Source: ClawHub
Version: V 1.4.3
Security check: passed
Downloads: 90 | Favorites: 0
# OnDeckLLM — AI Model Lineup Manager

## Prerequisites

```bash
npm install -g ondeckllm
```

Verify: `ondeckllm --help` or check the install with `npm list -g ondeckllm`.

## What It Does

OnDeckLLM is a localhost web dashboard that:

- **Auto-discovers** LLM providers from OpenClaw config (`~/.openclaw/openclaw.json`)
- **Manages** a batting-order priority list for model routing (primary + fallbacks)
- **Tests** provider health and latency
- **Syncs** the model lineup back to OpenClaw config with one click
- **Tracks** session costs (JSONL usage log + Chart.js)
- **Supports** Anthropic, OpenAI, Google AI, Groq, xAI/Grok, Ollama (local + remote), Mistral, DeepSeek, Together, OpenRouter

## Starting the Dashboard

```bash
# Default port 3900
ondeckllm

# Custom port
PORT=3901 ondeckllm
```

The dashboard runs at `http://localhost:3900` (or the custom port).

### As a Background Service

Use the helper script to check status or start OnDeckLLM:

```bash
node scripts/status.js
```

Output: JSON with `running` (bool), `port`, `url`, and `pid` if active.

## Agent Workflow

### Check if OnDeckLLM is running

```bash
node scripts/status.js
```

### Open the dashboard for the user

Direct them to `http://localhost:3900` (or the configured port/URL).

### Provider management

OnDeckLLM reads provider config from `~/.openclaw/openclaw.json` automatically. Changes made in the dashboard sync back to the OpenClaw config. No separate API or CLI commands needed — it's a web UI tool.

## Configuration

OnDeckLLM stores its data in `~/.ondeckllm/`:

- `config.json` — provider settings, port, Ollama URL
- `usage.jsonl` — cost tracking log
- `profiles/` — saved batting-order profiles

### Remote Ollama

To connect to a remote Ollama instance, configure it in the dashboard UI: Settings → Ollama → Remote URL (e.g., `http://192.168.55.80:11434`)

## Links

- 🌐 [ondeckllm.com](https://ondeckllm.com)
- 📦 [npm](https://www.npmjs.com/package/ondeckllm)
- 🐛 [GitHub Issues](https://github.com/canonflip/ondeckllm/issues)
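An agent or wrapper script can act on the status helper's output. A minimal sketch, assuming the documented one-line JSON shape (`running`, `port`, `url`, `pid`); the `$status` sample below stands in for live `node scripts/status.js` output, and the launch command is commented out:

```bash
# Decide whether to start OnDeckLLM based on the status JSON.
# In practice, capture real output first:
#   status=$(node scripts/status.js)
status='{"running":true,"port":3900,"url":"http://localhost:3900","pid":12345}'

if printf '%s' "$status" | grep -q '"running":true'; then
  echo "OnDeckLLM is already up"
else
  # Not running: launch it in the background on the default port.
  # PORT=3900 ondeckllm &
  echo "OnDeckLLM is not running"
fi
```

A plain `grep` check like this avoids a `jq` dependency, but it assumes the status script prints compact JSON with no whitespace around the colon.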

Tags

skill ai

Install via Conversation

This skill can be installed via conversation on the following platforms:

OpenClaw WorkBuddy QClaw Kimi Claude

Option 1: Install SkillHub and the skill

Help me install SkillHub and the ondeckllm-1776080048 skill

Option 2: Set SkillHub as the preferred skill source

Set SkillHub as my preferred skill installation source, then help me install the ondeckllm-1776080048 skill

Install via Command Line

skillhub install ondeckllm-1776080048

Download Zip Package

⬇ Download ondeckllm v1.4.3

File size: 2.63 KB | Published: 2026-04-14 11:47

v1.4.3 (latest) 2026-04-14 11:47
Republished under canonflip-git; LLM dashboard skill
