How to Run Claude Code for Free (OpenRouter + Ollama Cloud Models)

Claude Code is one of the most capable AI coding tools available right now. It reads your entire codebase, writes real changes to your files, runs terminal commands, and reasons through multi-step problems all from your command line.
The catch is cost. Claude Code runs against Anthropic's API by default, and an agentic coding session burns through tokens fast. Every file it reads, every edit it makes, and every command it runs is billed.
The thing is, Claude Code doesn't actually need to talk to Anthropic. It just needs something that speaks Anthropic's API format, and there are free services that do exactly that.
Option 1: Run Claude Code for Free Using OpenRouter
What is OpenRouter?
OpenRouter is a platform that gives you access to dozens of AI models from different companies — Meta, Google, Alibaba, Mistral — through a single API. Many of these models have a free tier, meaning you can use them without spending anything.
The reason this works with Claude Code is that OpenRouter already speaks Anthropic's message format. So instead of pointing Claude Code at Anthropic's servers, you redirect it to OpenRouter, pick a free model, and everything works the same way.
Same interface. Same agentic workflows. Someone else's compute budget.
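Concretely, "speaks Anthropic's message format" means the request bodies are interchangeable. Here is a rough sketch of the JSON Claude Code sends to the Anthropic-style Messages endpoint, with an OpenRouter free-tier model ID dropped into the model field:

```shell
# Roughly the JSON body Claude Code sends to the Anthropic-style Messages
# endpoint. Because OpenRouter accepts the same shape, only the base URL
# and the model ID need to change -- the rest of the request is identical.
BODY=$(cat <<'EOF'
{
  "model": "meta-llama/llama-3.3-70b-instruct:free",
  "max_tokens": 256,
  "messages": [
    {"role": "user", "content": "Refactor utils.py to remove duplication"}
  ]
}
EOF
)
echo "$BODY"
```

The prompt and max_tokens values above are illustrative, not what Claude Code actually sends; the point is the shape of the payload.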
What You'll Need
- Claude Code installed (npm install -g @anthropic-ai/claude-code)
- A free OpenRouter account at openrouter.ai
- A terminal
That's it. No special hardware. No GPU. Works on any machine.
Step 1: Create an OpenRouter Account
Go to openrouter.ai and sign up. Verification is quick.
Once you're in, navigate to the API Keys section in your dashboard and create a new key. Copy it — you'll need it shortly.
Step 2: Pick a Free Model
In your OpenRouter dashboard, go to the Models section and filter by free. Free models are marked with a :free suffix in their model ID.
Good options for coding tasks:
- meta-llama/llama-3.3-70b-instruct:free — Meta's 70B model, strong at reasoning and code
- google/gemma-3-27b-it:free — Google's Gemma 3, solid general performance
- qwen/qwen3-8b:free — Alibaba's Qwen3, good at code specifically
For serious coding work, the Llama 3.3 70B free tier is the strongest option available at no cost. It will outperform most small local models by a significant margin.
Copy the full model ID of whichever you choose.
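If you'd rather script this than click through the dashboard, OpenRouter's model catalog is also available from its public models endpoint (https://openrouter.ai/api/v1/models, no key required). A minimal sketch of filtering that response for free-tier IDs, run here against a hand-written stand-in for the real JSON:

```shell
# Sketch: extract model IDs ending in ":free" from the JSON shape returned
# by OpenRouter's public /api/v1/models endpoint. The sample below is a
# hand-written stand-in for the real response; swap in
# `curl -s https://openrouter.ai/api/v1/models` to query the live list.
sample='{"data":[{"id":"meta-llama/llama-3.3-70b-instruct:free"},{"id":"anthropic/claude-3.5-sonnet"},{"id":"qwen/qwen3-8b:free"}]}'
echo "$sample" | grep -o '"id":"[^"]*:free"' | cut -d'"' -f4
# prints only the two :free IDs from the sample
```

A JSON-aware tool like jq would be more robust than grep against the real response; the pipeline above just shows the idea.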
Step 3: Configure Claude Code
Open your terminal and run these three export commands, replacing the placeholders with your actual values:
export ANTHROPIC_BASE_URL=https://openrouter.ai/api
export ANTHROPIC_AUTH_TOKEN=your-openrouter-api-key
export ANTHROPIC_API_KEY=""
A few things worth knowing:
- ANTHROPIC_BASE_URL redirects Claude Code away from Anthropic and toward OpenRouter
- ANTHROPIC_AUTH_TOKEN is where your OpenRouter key goes
- ANTHROPIC_API_KEY must be set to empty — this prevents Claude Code from trying to authenticate with Anthropic directly
These variables apply only to your current terminal session. If you open a new terminal window, you'll need to run them again. To make them permanent, add the three lines to your ~/.zshrc or ~/.bashrc file.
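The append itself can be scripted. A minimal sketch, targeting ~/.zshrc by default (point PROFILE at ~/.bashrc if you use bash):

```shell
# Persist the three overrides so every new terminal picks them up.
# "your-openrouter-api-key" is a placeholder -- substitute your real key.
PROFILE="${PROFILE:-$HOME/.zshrc}"
{
  echo 'export ANTHROPIC_BASE_URL=https://openrouter.ai/api'
  echo 'export ANTHROPIC_AUTH_TOKEN=your-openrouter-api-key'
  echo 'export ANTHROPIC_API_KEY=""'
} >> "$PROFILE"
echo "Added 3 lines to $PROFILE"
```

Open a new terminal (or source the file) afterward for the change to take effect.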
Step 4: Launch Claude Code
In the same terminal window where you set those variables, navigate to your project folder and run:
claude --model meta-llama/llama-3.3-70b-instruct:free
Replace the model name with whichever one you chose from OpenRouter's free list.
Claude Code will start up and behave exactly as it normally does. You can ask it to read files, write code, run tests, debug errors — the full workflow.
What to Expect
Free tier models on OpenRouter have rate limits. If you hit one, you'll see an error message and need to wait a few minutes or switch to a different free model. Adding even a small credit balance to your OpenRouter account (a few dollars) removes most of this friction.
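If you wrap Claude Code in scripts, one way to soften the rate limits is to fall back through several free models in order. A sketch of that pattern, where run_model is a hypothetical stand-in for your real claude --model invocation (here it pretends only the last model is available):

```shell
# Sketch: try free models in order, settling on the first that works.
# run_model is a hypothetical placeholder -- replace its body with your
# real `claude --model "$1" ...` call. A model "fails" here when the
# command exits non-zero (e.g. a rate-limit error).
run_model() {
  [ "$1" = "qwen/qwen3-8b:free" ]   # pretend only this one isn't rate-limited
}

for model in \
  "meta-llama/llama-3.3-70b-instruct:free" \
  "google/gemma-3-27b-it:free" \
  "qwen/qwen3-8b:free"; do
  if run_model "$model"; then
    echo "using $model"   # prints: using qwen/qwen3-8b:free
    break
  fi
done
```

In interactive use, simply rerunning claude with a different --model value achieves the same thing by hand.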
Performance is real. The 70B models available on OpenRouter's free tier are genuinely capable for most coding tasks. They won't match Claude Opus on complex architectural reasoning, but for writing functions, debugging, refactoring, and explaining code, they hold up well.
Option 2: Run Claude Code for Free Using Ollama
Download Ollama at ollama.com/download. Alongside local models, it offers cloud-hosted models accessible through the same interface you use for local ones.
Then run Claude Code through Ollama:
- Open Ollama and sign in, making sure cloud models are enabled.
- In your terminal, run ollama launch claude and hit Enter.
- Select one of the recommended cloud models, and that's it.
From there, use Claude Code as normal. Because the model runs in the cloud, it responds quickly even on modest hardware.
This is the easiest path if Ollama is already part of your workflow. The compute still happens in the cloud. It just routes through Ollama's interface.
The Bottom Line
Claude Code is designed to work with a variety of models. Features like the agentic interface, file reading, and terminal commands do not depend specifically on Anthropic's models. OpenRouter and Ollama allow you to use this interface with free models that are truly capable of handling real coding tasks.