Anon Kode looks to be a version of Claude Code that you can use with local models:

Terminal-based AI coding tool that can use any model that supports the OpenAI-style API.

Fixes your spaghetti code
Explains wtf that function does
Runs tests, shell commands and stuff
Whatever else claude-code can do, depending on the model you use

Sounds amazing, especially since Claude Code (which is great) is currently quite expensive. It probably should be, but I can’t afford dollars-a-day costs for a coding assistant. Can a local version compare?

Here’s my experience setting up and using Anon Kode with Ollama and Qwen2.5-Coder 14B on my 36GB M3 Pro MacBook Pro.

TL;DR: I got it talking to Ollama and Qwen2.5-Coder, but then couldn’t get it to do any useful work and don’t have time to debug. If you’ve had success with Anon Kode, please message me on Bluesky and tell me about it!

Setup

Install Ollama and run ollama run qwen2.5-coder:14b if you don’t have an existing local LLM set up.
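
If you’re starting from a clean machine, the Homebrew route looks roughly like this (the installer from ollama.com works just as well; treat this as a sketch rather than the only way):

brew install ollama              # or grab the installer from ollama.com
ollama serve                     # start the server if it isn't already running
ollama run qwen2.5-coder:14b     # pulls the model on first use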

Hiccup: I’ve been having issues connecting to the Ollama API via localhost because my machine prefers IPv6 for localhost (::1), and Ollama only binds to IPv4 (127.0.0.1). In my own code I just use 127.0.0.1 instead of localhost, but the default Ollama configuration for Kode uses localhost, so I “solved” the problem by commenting out the IPv6 entry for localhost in my /etc/hosts file. This is not a good solution, but that’s a future Chris problem. If anyone knows how to fix this properly, please let me know.
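
For reference, the relevant part of my /etc/hosts now looks roughly like this (again, not a fix I’d recommend keeping around):

127.0.0.1       localhost
# ::1           localhost      <- commented out so "localhost" resolves to IPv4 only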

Install and run Kode:

npm install -g anon-kode
cd your-project
kode

I walked through the config screens and picked Ollama as my API Provider:

Kode provider selection screen

I set the API KEY to “ollama”. This shouldn’t be necessary, but Kode isn’t happy if you leave the API KEY empty.

Kode provider API KEY screen

Pick your model; I used the same model for both “large” and “small”.

Kode model selection screen

I went with “Default” for tokens.

Kode tokens config screen

Looks good.

Kode model confirmation screen

Success! “Hello” gets a response from Ollama.

Kode response from Ollama
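
If you want to sanity-check the Ollama side independently of Kode, a quick curl against its OpenAI-compatible endpoint should come back with a chat completion (assuming the default port 11434; I’m using 127.0.0.1 rather than localhost because of the IPv6 issue above):

curl http://127.0.0.1:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "qwen2.5-coder:14b", "messages": [{"role": "user", "content": "Hello"}]}'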

Use

Now that Kode is talking to Ollama, will it work?

I was halfway through writing this post, so I thought I would ask Kode to insert the remaining images above for me. It didn’t go well.

Kode failed prompt

Looks like an issue with understanding the project? Maybe running /init will help?

Kode failed init

🫠 That’s unfortunate.

It seems like there are a number of issues: file access, missing tools… Unfortunately I don’t have time to dig further right now, so it’s back to Claude.

Result (for now)

I really want this to work, and I’ll try again when I have time. If you’ve had more success with Anon Kode please message me on Bluesky and tell me about it!