Claude Code with Local LLMs and ANTHROPIC_BASE_URL: Ollama, LM Studio, llama.cpp, vLLM
Native Anthropic endpoints, tool-call compatibility, and context-window sizing for local Claude Code. Last tested: April 2026. See Changelog at the bottom.

Goal

- MacBook Air: Gemma 4 26B-A4B Q4, 32K context, LM Studio or Ollama
- MacBook Pro: Gemma 4 26B-A4B Q4 / UD-Q4, 64K context, llama.cpp or LM Studio
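As a starting point, pointing Claude Code at a local server comes down to overriding `ANTHROPIC_BASE_URL` before launching the CLI. A minimal sketch, assuming your server exposes an Anthropic-compatible Messages endpoint (otherwise a translation proxy such as LiteLLM sits in between); the port shown is Ollama's default, and the token value is a placeholder since local servers typically ignore it:

```shell
# Redirect Claude Code from api.anthropic.com to a local server.
# 11434 is Ollama's default port; use 1234 for LM Studio, 8080 for
# llama.cpp's server, or 8000 for vLLM, depending on your setup.
export ANTHROPIC_BASE_URL="http://localhost:11434"

# Claude Code expects an auth value to be present; local servers
# generally do not validate it, so any placeholder works.
export ANTHROPIC_AUTH_TOKEN="local-dummy-key"

# Then launch Claude Code in your project directory:
# claude
```

With both variables exported, `claude` started from the same shell will send its requests to the local endpoint instead of Anthropic's hosted API.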