Show HN: Skeet – A local-friendly command-line copilot that works with any LLM
I've been using GitHub Copilot CLI, and while it's great, I found myself wanting something that works with any LLM (including local models running through Ollama), so I built Skeet.
The key features that make it different:
- Works with any LLM provider through LiteLLM (OpenAI, Anthropic, local models, etc.)
- Automatically retries and adapts commands when they fail
- Can generate and execute Python scripts with dependencies (powered by uv) without virtual environment hassles
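The retry-and-adapt behavior can be sketched roughly like this (a minimal illustration, not Skeet's actual code; `suggest_fix` is a hypothetical stand-in for the LLM call that proposes a revised command from the error output):

```python
import subprocess
from typing import Callable

def run_with_retries(command: str,
                     suggest_fix: Callable[[str, str], str],
                     max_attempts: int = 3) -> str:
    """Run a shell command; on failure, ask suggest_fix (an LLM in
    practice) for an adapted command and try again."""
    for _ in range(max_attempts):
        result = subprocess.run(command, shell=True,
                                capture_output=True, text=True)
        if result.returncode == 0:
            return result.stdout
        # Feed the failing command and its stderr back to get a new attempt.
        command = suggest_fix(command, result.stderr)
    raise RuntimeError(f"command still failing after {max_attempts} attempts")

# Stubbed "LLM" that fixes a typo'd command, for illustration only.
fixes = {"ehco hello": "echo hello"}
print(run_with_retries("ehco hello", lambda cmd, err: fixes.get(cmd, cmd)))
```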
You can try simple tasks like:
```
skeet show me system information
skeet what is using port 8000
skeet --python "what's the current time on the ISS?"
```
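The `--python` mode leans on uv's support for inline script metadata (PEP 723): a generated script declares its own dependencies in a comment header, and `uv run` resolves them in an ephemeral environment with no manual venv setup. A script of that shape might look like this (a hypothetical example, not Skeet's actual output):

```python
# /// script
# dependencies = ["requests"]
# ///
import requests

# Illustrative task: fetch the ISS position from a public API.
r = requests.get("http://api.open-notify.org/iss-now.json", timeout=10)
print(r.json()["iss_position"])
```

Running it with `uv run script.py` installs `requests` transparently before execution.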
Demo: https://asciinema.org/a/697092
Code: https://github.com/knowsuchagency/skeet
I built it for myself, and I've been really happy with the results. It's interesting to see how different models fare against one another on everyday tasks. For local models, I've had decent luck with ollama_chat/phi3:medium, but I'm curious to hear what others use.
Cheers!