A free, provider-agnostic Claude Code clone built for Termux.
21 AI providers. Auto-failover. Full agentic loop. Zero cost.
```
 ██████╗ ██╗  ██╗ ██████╗ ███████╗████████╗
██╔════╝ ██║  ██║██╔═══██╗██╔════╝╚══██╔══╝
██║  ███╗███████║██║   ██║███████╗   ██║
██║   ██║██╔══██║██║   ██║╚════██║   ██║
╚██████╔╝██║  ██║╚██████╔╝███████║   ██║
 ╚═════╝ ╚═╝  ╚═╝ ╚═════╝ ╚══════╝   ╚═╝
```
- 21 AI providers with automatic failover — if one fails, the next kicks in silently
- Full agentic loop — reads/writes files, runs shell commands, edits code
- Claude Code-style tools — `read_file`, `write_file`, `edit_file`, `run_shell`, `search_files`, `list_files`
- Zero dependencies beyond the Python 3 stdlib (no pip required)
- Termux-native — works on Android without root
- Session persistence — conversation history saved across restarts
- One-shot + interactive modes
```bash
# 1. Clone or download
git clone https://github.com/yourhandle/ghost
cd ghost

# 2. Install
bash install.sh

# 3. Add at least one API key
nano ~/.ghost/.env

# 4. Run
ghost
```

Minimum setup: one API key is enough. GROQ and GEMINI are recommended — both have generous free tiers.
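For reference, a minimal `~/.ghost/.env` might look like the sketch below. The `GROQ_API_KEY` and `GEMINI_API_KEY` names match the export examples later in this README; the OpenRouter variable name is an assumption — run `ghost --status` to see which keys are actually detected.

```shell
# ~/.ghost/.env — one key is enough; each extra key deepens the failover chain
GROQ_API_KEY="gsk_..."
GEMINI_API_KEY="AIza..."
OPENROUTER_API_KEY="sk-or-..."   # optional, name assumed — verify with ghost --status
```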
| # | Provider | Free Tier | Speed |
|---|---|---|---|
| 1 | GROQ | ✓ Generous | ⚡⚡⚡ Fastest |
| 2 | CEREBRAS | ✓ Good | ⚡⚡⚡ |
| 3 | GEMINI | ✓ Very generous | ⚡⚡ |
| 4 | OPENROUTER | ✓ Free models | ⚡⚡ |
| 5 | MISTRAL | ✓ Free tier | ⚡⚡ |
| 6 | TOGETHER | ✓ Credits | ⚡⚡ |
| 7 | COHERE | ✓ Free tier | ⚡⚡ |
| 8 | HYPERBOLIC | Paid | ⚡⚡ |
| 9 | NVIDIA NIM | Credits | ⚡⚡ |
| 10 | GITHUB Models | ✓ Free with GH | ⚡⚡ |
| 11 | XAI (Grok) | Paid | ⚡⚡ |
| 12 | CLOUDFLARE | ✓ Free tier | ⚡ |
| 13 | VENICE | Paid | ⚡ |
| 14 | MOONSHOT | Credits | ⚡ |
| 15 | ZAI (GLM) | ✓ Free | ⚡ |
| 16 | LONGCAT | ✓ Free | ⚡ |
| 17 | OPENAI | Paid | ⚡⚡ |
| 18 | ANTHROPIC | Paid | ⚡⚡ |
| 19 | HUGGINGFACE | ✓ Free | ⚡ |
| 20 | OLLAMA | ✓ Local/Free | Varies |
| 21 | OLLAMA Cloud | Varies | Varies |
```bash
# Interactive mode
ghost

# Set the workspace directory
ghost --workspace ~/myproject

# One-shot mode
ghost "write a Python web scraper for HackerNews"
ghost "explain this error: segfault at 0x0"

# Plain chat, no tool use
ghost --no-tools "what is async/await"

# Pick a provider
ghost --provider GEMINI
ghost --provider OPENROUTER
ghost --provider OLLAMA        # use local Ollama

# Pick a provider + model
ghost --model GROQ mixtral-8x7b-32768
ghost --model OPENROUTER deepseek/deepseek-coder-v2:free
ghost --model OLLAMA codellama

# Inspect providers
ghost --status                 # show all providers + key status
ghost --list-providers        # list provider IDs
```

| Command | Description |
|---|---|
| `/status` | Show all providers + which have keys |
| `/provider GEMINI` | Switch primary provider |
| `/model GROQ mixtral-8x7b` | Switch provider + model |
| `/tools` | Toggle agentic tool use on/off |
| `/stats` | Session stats (tokens, turns, providers) |
| `/clear` | Clear conversation context |
| `/sessions` | List saved sessions |
| `/cd <path>` | Change workspace directory |
| `/help` | Show all commands |
| `/exit` | Save and quit |
GHOST runs a full agentic loop like Claude Code:
```
User input
    ↓
LLM generates response
    ↓
Does it contain a tool call?
 ├── YES → Execute tool → Feed result back → Loop
 └── NO  → Print response → Wait for next input
```
Available tools the AI can use autonomously:

- `read_file` — Read any file with line numbers
- `write_file` — Create or overwrite files
- `edit_file` — Targeted string replacement (safer than a full rewrite)
- `run_shell` — Execute shell commands (with a 60s timeout)
- `list_files` — Directory listing (recursive or flat)
- `search_files` — Grep-style search across files
- `create_dir` — Create directories
- `delete_file` — Delete files or directories
- `file_info` — File metadata (size, permissions, mtime)
- `done` — Signal task completion with a summary
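The loop above can be sketched in plain Python. This is an illustration, not GHOST's actual source: `call_llm` is a placeholder for the provider layer, and the sketch assumes the model emits tool calls as a single JSON object.

```python
import json

def parse_tool_call(text):
    """Assume tool calls arrive as a JSON object like {"tool": ..., "args": {...}}."""
    try:
        obj = json.loads(text)
        return obj if isinstance(obj, dict) and "tool" in obj else None
    except ValueError:
        return None

def run_tool(name, args, tools):
    """Dispatch a tool call to its handler and return the result as text."""
    handler = tools.get(name)
    if handler is None:
        return f"unknown tool: {name}"
    return handler(**args)

def agentic_loop(user_input, call_llm, tools, max_turns=10):
    """Feed tool results back to the model until it stops calling tools."""
    messages = [{"role": "user", "content": user_input}]
    for _ in range(max_turns):
        reply = call_llm(messages)                    # raw model text
        messages.append({"role": "assistant", "content": reply})
        call = parse_tool_call(reply)
        if call is None:                              # plain answer: stop looping
            return reply
        result = run_tool(call["tool"], call.get("args", {}), tools)
        messages.append({"role": "user", "content": f"TOOL RESULT:\n{result}"})
    return "max turns reached"
```

The key design point is that tool output goes back into the conversation as an ordinary message, so the model can chain reads, edits, and shell runs without any special casing.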
```
~/.ghost/
├── ghost.py              # Entry point + provider registry
├── provider_engine.py    # API layer + failover router
├── tools.py              # Agentic tool execution
├── session.py            # Conversation history + persistence
├── repl.py               # Interactive shell + display
├── .env                  # Your API keys (edit this!)
└── sessions/             # Saved conversation sessions
    └── <id>.json
```
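Session persistence along these lines can be sketched with the stdlib alone. The file layout follows the tree above; the function names and JSON shape are hypothetical, not GHOST's actual `session.py` API.

```python
import json
from pathlib import Path

SESSIONS_DIR = Path.home() / ".ghost" / "sessions"

def save_session(session_id, messages):
    """Write conversation history to ~/.ghost/sessions/<id>.json."""
    SESSIONS_DIR.mkdir(parents=True, exist_ok=True)
    path = SESSIONS_DIR / f"{session_id}.json"
    path.write_text(json.dumps({"id": session_id, "messages": messages}, indent=2))
    return path

def load_session(session_id):
    """Restore history on restart; an empty list means a fresh session."""
    path = SESSIONS_DIR / f"{session_id}.json"
    if not path.exists():
        return []
    return json.loads(path.read_text())["messages"]
```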
| Provider | Get Key |
|---|---|
| GROQ | https://console.groq.com |
| CEREBRAS | https://cloud.cerebras.ai |
| GEMINI | https://aistudio.google.com |
| OPENROUTER | https://openrouter.ai |
| MISTRAL | https://console.mistral.ai |
| TOGETHER | https://api.together.xyz |
| COHERE | https://dashboard.cohere.com |
| NVIDIA NIM | https://build.nvidia.com |
| GITHUB | https://github.com/settings/tokens |
| CLOUDFLARE | https://dash.cloudflare.com |
| HUGGINGFACE | https://huggingface.co/settings/tokens |
| ZAI | https://bigmodel.cn |
- Zero pip dependencies — uses only the Python stdlib (`urllib`, `json`, `subprocess`, `pathlib`)
- jq optional — only needed for the legacy bash `ai()` wrapper in your shell config
- Provider failover — HTTP 401/403 permanently blacklists a provider for the session; rate limits trigger a retry on the next call
- Context window management — auto-trims the oldest messages to stay within ~60K chars
- Workspace isolation — file tools operate relative to `--workspace` (default: current dir)
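The failover and context-trimming rules above can be sketched as follows. This is a minimal illustration of the policy, not the actual `provider_engine.py`: the `HTTPError` class and the `(name, call_fn)` provider interface are assumptions.

```python
class HTTPError(Exception):
    """Stand-in for an HTTP failure carrying a status code."""
    def __init__(self, status):
        self.status = status

class FailoverRouter:
    """Try providers in order. 401/403 blacklists a provider for the whole
    session (bad key, never retry); 429/5xx just skips to the next provider
    for this call, so rate-limited providers are retried on the next call."""

    def __init__(self, providers):
        self.providers = providers      # list of (name, call_fn) pairs, in priority order
        self.blacklisted = set()        # auth failures: dead for the session

    def complete(self, messages):
        for name, call_fn in self.providers:
            if name in self.blacklisted:
                continue
            try:
                return call_fn(messages)
            except HTTPError as e:
                if e.status in (401, 403):
                    self.blacklisted.add(name)
                # otherwise fall through to the next provider silently
        raise RuntimeError("all providers failed")

def trim_context(messages, max_chars=60_000):
    """Drop the oldest messages until the history fits the character budget."""
    while len(messages) > 1 and sum(len(m["content"]) for m in messages) > max_chars:
        messages.pop(0)
    return messages
```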
```bash
# Fastest setup: GROQ + GEMINI alone covers most use cases
export GROQ_API_KEY="gsk_..."
export GEMINI_API_KEY="AIza..."
ghost

# Use Ollama for fully offline operation
ollama pull codellama
ghost --provider OLLAMA

# Pipe files directly
cat myfile.py | ghost "review this code for bugs"

# Use as a shell tool in scripts
RESULT=$(ghost --no-tools "generate a UUID")
echo "$RESULT"
```

MIT — hack it, fork it, ship it. 👻