Multi-LLM Fallback
🤖 AI Models · by leohan123123
Intelligent multi-LLM switching. Use the 'multi llm' command to activate local model selection based on task type. Defaults to Claude Opus 4.5.
Install
openclaw plugins install leohan123123/mlti-llm-fallback

On ClawHub, this skill has no security information.
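A minimal sketch of installing the skill from a shell, assuming the `openclaw` CLI is on your PATH; the guard and messages are illustrative, not part of the listing:

```shell
# Install the Multi-LLM Fallback skill, but only if the openclaw CLI is available.
if command -v openclaw >/dev/null 2>&1; then
  # Command taken verbatim from the listing above.
  openclaw plugins install leohan123123/mlti-llm-fallback
  echo "installed"
else
  # Hypothetical fallback message for environments without openclaw.
  echo "openclaw CLI not found"
fi
```

After installation, the listing suggests typing 'multi llm' in a session to activate task-based model selection.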
ClawStack independently scans every skill for permissions, network requests, author reputation, and more. Learn how we score →
Security Analysis
Score: 45/100

Reviews (17)
No reviews yet
Be the first to review this skill!
Details
- Author
- @leohan123123
- Source
- GitHub
- ClawHub
- View
- Category
- 🤖 AI Models
- Rating
- ★ 3.9 (17)
Similar Skills
Parquet Converter
Data storage and processing challenges:
Llmrouter
Intelligent LLM proxy that routes requests to appropriate models based on complexity. Save money by using cheaper models for simple tasks. Tested with Anthropic, OpenAI, Gemini, Kimi/Moonshot, and Ollama.
Ollama Local
Manage and use local Ollama models. Use for model management (list/pull/remove), chat/completions, embeddings, and tool-use with local LLMs. Covers OpenClaw sub-agent integration and model selection guidance.