# free-inference

Here are 2 public repositories matching this topic...


For OpenClaw, Hermes, and more. Finds free and low-cost LLM inference and lets you use it directly. Provides both a CLI and an MCP server that know which free-tier LLM APIs exist, which ones you have keys for, and which one fits your task. Returns endpoints so you can call models directly: no proxy, no middleware, no latency tax.

  • Updated Apr 6, 2026
  • Rust
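The "knows which free-tier APIs exist and which ones you have keys for" idea can be sketched as a simple lookup: keep a catalog of known endpoints, check the environment for each API's key, and return the first usable endpoint so the caller can talk to it directly. This is a minimal illustration, not the project's actual code; the catalog entries, env-var names, and function name are all hypothetical.

```python
# Hypothetical catalog of free-tier, OpenAI-compatible inference APIs.
# Names, base URLs, and env-var keys are illustrative, not taken from the project.
CATALOG = [
    {"name": "groq", "endpoint": "https://api.groq.com/openai/v1", "key_env": "GROQ_API_KEY"},
    {"name": "openrouter", "endpoint": "https://openrouter.ai/api/v1", "key_env": "OPENROUTER_API_KEY"},
]

def pick_endpoint(catalog, env):
    """Return the first catalog endpoint whose API key is set in `env`, else None."""
    for api in catalog:
        if env.get(api["key_env"]):
            return api["endpoint"]
    return None
```

Because the function hands back a plain endpoint URL, the client then calls the model provider directly with its own key, which is the "no proxy, no middleware" property the description advertises.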
