OpenCode Adds Qwen3-Coder Model Across Major Platforms
OpenCode now supports Qwen3-Coder via config.json on Linux, macOS, Windows
OpenCode’s latest update brings the Qwen3‑Coder model into its toolbox, but the addition isn’t automatic. To get the new engine running, users must point OpenCode at the right configuration file. The process differs slightly depending on the operating system: Linux and macOS store settings under a hidden “.config” directory, while Windows relies on the %APPDATA% folder.
Editing that JSON file is the only step required to activate the model, and any plain‑text editor—whether it’s VS Code, Notepad++, or a terminal‑based nano—will do the job. This small tweak unlocks the full potential of the Qwen3‑Coder integration, letting developers tap into its capabilities without leaving their preferred environment. Below are the exact instructions you’ll need to add the necessary entry.
- Now we need to tell OpenCode to use this model. OpenCode looks for a config.json file in ~/.config/opencode/ (on Linux/macOS) or %APPDATA%\opencode\config.json (on Windows).
- Using a text editor (like VS Code, Notepad++, or even nano in the terminal), create or edit the config.json file and add the following content:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "qwen2.5-coder:7b-16k": {
          "tools": true
        }
      }
    }
  }
}
```

This configuration does a few important things: it tells OpenCode to use Ollama's OpenAI-compatible API endpoint (which runs at http://localhost:11434/v1) and registers the model, with tool use enabled via the "tools": true flag.
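Before restarting OpenCode, it can help to confirm that Ollama's OpenAI-compatible endpoint is actually serving. A minimal Python sketch, assuming Ollama is running on its default port 11434 (the /v1/models route is part of the OpenAI-compatible API surface):

```python
import json
import urllib.request

BASE_URL = "http://localhost:11434/v1"  # Ollama's OpenAI-compatible endpoint

def model_ids(payload: dict) -> list[str]:
    """Extract model IDs from an OpenAI-style /v1/models response."""
    # OpenAI-compatible APIs return {"object": "list", "data": [{"id": ...}, ...]}
    return [m["id"] for m in payload.get("data", [])]

def list_models(base_url: str = BASE_URL) -> list[str]:
    """Fetch /v1/models from the local Ollama server and return the model IDs."""
    with urllib.request.urlopen(f"{base_url}/models", timeout=5) as resp:
        return model_ids(json.load(resp))

# With Ollama running, list_models() should include the model you pulled,
# e.g. "qwen2.5-coder:7b-16k". If the call fails, Ollama is not listening
# on the baseURL you put in config.json.
```

If `list_models()` raises a connection error, fix the Ollama side first; no amount of editing config.json will help while the endpoint is down.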
Running a local AI coder is now within reach. By pointing OpenCode at Qwen3‑Coder through a simple config.json, users on Linux, macOS, or Windows can launch the model without an internet connection. The steps involve locating the configuration directory — ~/.config/opencode/ on Unix‑like systems or %APPDATA%\opencode\ on Windows — and inserting the appropriate JSON snippet using any text editor, from VS Code to nano.
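The steps above can also be sketched programmatically. A hypothetical Python helper that resolves the platform-specific directory and writes the guide's config (the paths follow the article; OpenCode itself does not require Python, and a real setup should merge with any existing config rather than overwrite it as this sketch does):

```python
import json
import os
import sys
from pathlib import Path

def opencode_config_path() -> Path:
    """Return the platform-appropriate location of OpenCode's config.json."""
    if sys.platform == "win32":
        base = Path(os.environ["APPDATA"])   # %APPDATA%\opencode\config.json
    else:
        base = Path.home() / ".config"       # ~/.config/opencode/config.json
    return base / "opencode" / "config.json"

# The provider block from the guide, expressed as a Python dict.
CONFIG = {
    "$schema": "https://opencode.ai/config.json",
    "provider": {
        "ollama": {
            "npm": "@ai-sdk/openai-compatible",
            "options": {"baseURL": "http://localhost:11434/v1"},
            "models": {"qwen2.5-coder:7b-16k": {"tools": True}},
        }
    },
}

def write_config(path: Path) -> None:
    """Create the directory if needed and write the config as pretty JSON.

    Warning: overwrites any existing file at `path`.
    """
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(CONFIG, indent=2))

# Usage (uncomment to actually write the file):
# write_config(opencode_config_path())
```

This is just the manual edit automated; doing it by hand in any text editor is equally valid.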
No subscription fees are required, and the setup remains entirely offline, which may appeal to developers concerned about data privacy. However, the guide doesn’t provide benchmarks, so the actual speed and accuracy of the coding assistance remain unclear.
Likewise, it leaves unanswered how resource‑intensive the model is on typical hardware. Still, the guide demonstrates that a functional, private coding assistant can be assembled with freely available tools. Whether this approach scales for larger projects or integrates smoothly with existing development workflows is something users will have to evaluate themselves.
The simplicity of editing a single JSON file suggests low entry barriers, though users unfamiliar with command‑line environments may need additional guidance.
Further Reading
- Qwen3-Coder-Next: The Complete 2026 Guide to Running Powerful AI Coding Agents Locally - Dev.to
- How to Setup OpenCode on Mac/MacOS | Zero API Costs, Full AI Coding Power - YouTube
- Qwen3-Coder-Next: How to Run Locally - Unsloth Documentation
- OpenCode - Ollama's documentation - Ollama Documentation
Common Questions Answered
How do I configure Qwen3-Coder in OpenCode across different operating systems?
The configuration process involves editing the config.json file located in different directories based on your OS. On Linux and macOS, the file is found in ~/.config/opencode/, while on Windows, it's located in %APPDATA%\opencode\config.json. You'll need to add the specific JSON configuration for the Ollama provider to activate the Qwen3-Coder model.
What text editors can I use to modify the OpenCode config.json file?
You can use a wide variety of text editors to modify the config.json file, including VS Code, Notepad++, nano in the terminal, and other plain-text editing tools. The key requirement is to accurately input the JSON configuration for the Qwen3-Coder model without introducing syntax errors.
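Because a single stray comma will make the file unparseable, it is worth validating the JSON after editing. A small sketch using Python's standard library (the `validate_config` helper and its structural check are illustrative, not part of OpenCode):

```python
import json
from pathlib import Path

def validate_config(path: Path) -> dict:
    """Parse config.json; json.JSONDecodeError reports line/column on failure."""
    data = json.loads(path.read_text())
    # Minimal structural check: the guide's snippet defines an "ollama" provider.
    if "ollama" not in data.get("provider", {}):
        raise ValueError("missing 'provider.ollama' section")
    return data

# Usage (Linux/macOS path; use %APPDATA%\opencode\config.json on Windows):
# validate_config(Path.home() / ".config" / "opencode" / "config.json")
```

A one-liner alternative for a pure syntax check is `python -m json.tool config.json`, which prints the parsed JSON or an error with the offending position.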
Can I run the Qwen3-Coder model in OpenCode without an internet connection?
Yes, the Qwen3-Coder model can be run locally in OpenCode without an internet connection. By configuring the config.json file to point to a local Ollama provider at http://localhost:11434/v1, users can launch and use the AI coding model entirely offline, without any subscription fees.