Symptom:

```
❌ Sync failed: fetch failed
Error: connect ECONNREFUSED 127.0.0.1:1234
```

Cause: Your AI server is not running or not accessible.

Solutions:

- Start your AI server:

  ```bash
  # For LM Studio: start the application and enable Local Server
  # For Ollama:
  ollama serve
  # For vLLM:
  python -m vllm.entrypoints.openai.api_server --model your-model
  ```

- Verify the endpoint:

  ```bash
  curl http://localhost:1234/v1/models
  # Should return JSON with available models
  ```

- Check firewall/network:

  ```bash
  # Ensure the port is accessible
  nc -zv localhost 1234
  ```
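The same reachability check can be done from Node, which is handy on systems without netcat. A minimal sketch — the `portOpen` helper is hypothetical, not part of the sync scripts:

```javascript
// Port-reachability check, roughly equivalent to `nc -zv localhost 1234`.
// Run as an ES module (e.g. save as check-port.mjs).
import net from "node:net";

function portOpen(host, port, timeoutMs = 1000) {
  return new Promise((resolve) => {
    const socket = net.connect({ host, port });
    socket.setTimeout(timeoutMs);
    socket.on("connect", () => { socket.destroy(); resolve(true); });
    socket.on("error", () => { socket.destroy(); resolve(false); });
    socket.on("timeout", () => { socket.destroy(); resolve(false); });
  });
}

// Is the LM Studio default port accepting connections?
console.log(await portOpen("127.0.0.1", 1234));
```

Printing `false` here means the same thing as the ECONNREFUSED error above: nothing is listening on that port.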
Symptom:

```
⚠️ No models found at http://localhost:1234/v1/models
Make sure your AI server is running and accessible
```

Cause: Server running but no models loaded.

Solutions:

- For LM Studio: Load a model in the UI
- For Ollama: Pull a model first:

  ```bash
  ollama pull llama3.1:8b
  ```

- For vLLM: Specify the model on startup:

  ```bash
  vllm serve meta-llama/Meta-Llama-3.1-8B-Instruct
  ```
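Once a model is loaded, `/v1/models` should list it. A small Node sketch for pulling the ids out of that response — the `modelIds` helper is hypothetical, and the response shape assumed is the standard OpenAI-compatible list format:

```javascript
// Extract model ids from an OpenAI-compatible /v1/models response,
// to confirm the server actually has a model loaded.
function modelIds(response) {
  return (response.data ?? []).map((m) => m.id);
}

// Against a live server (uncomment to use):
// const res = await fetch("http://localhost:1234/v1/models").then((r) => r.json());
// console.log(modelIds(res));

console.log(modelIds({ data: [{ id: "llama3.1:8b" }] })); // → [ 'llama3.1:8b' ]
```

An empty array means the server is up but has nothing loaded — exactly the symptom above.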
Cause: OpenCode is using a different config file.

Solutions:

- Check which config OpenCode is using:

  ```bash
  # Open using a custom config location
  OPENCODE_CONFIG=~/.config/opencode/opencode.json opencode
  ```

- Verify the sync script updated the correct file:

  ```bash
  cat ~/.config/opencode/opencode.json | jq '.provider.local.models'
  ```

- In OpenCode, check the available models with the `/models` command.
Symptom:

```
❌ Error reading config (might contain JSON comments?): Unexpected token
```

Cause: The config file contains comments (JSONC format) that a strict JSON parser can't handle.

Solutions:

- Remove the comments from opencode.json
- Or use a tool to strip comments before syncing:

  ```bash
  npm install -g strip-json-comments-cli
  strip-json-comments ~/.config/opencode/opencode.json > ~/.config/opencode/opencode.json.clean
  ```
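If installing a global CLI is not an option, comments can be stripped in a few lines of Node. A hand-rolled sketch — it covers the common cases but fewer edge cases than the strip-json-comments package:

```javascript
// Remove // and /* */ comments outside of strings, turning JSONC into JSON.
function stripJsonComments(text) {
  let out = "", i = 0, inString = false;
  while (i < text.length) {
    const c = text[i], next = text[i + 1];
    if (inString) {
      out += c;
      if (c === "\\") { out += next ?? ""; i += 2; continue; }  // keep escaped char
      if (c === '"') inString = false;
      i++;
    } else if (c === '"') {
      inString = true; out += c; i++;
    } else if (c === "/" && next === "/") {
      while (i < text.length && text[i] !== "\n") i++;          // skip line comment
    } else if (c === "/" && next === "*") {
      i += 2;
      while (i < text.length && !(text[i] === "*" && text[i + 1] === "/")) i++;
      i += 2;                                                   // skip past "*/"
    } else {
      out += c; i++;
    }
  }
  return out;
}

const jsonc = '{ "model": "llama", /* local */ "tools": false } // note';
console.log(JSON.parse(stripJsonComments(jsonc)));
```

The string-state tracking matters: a naive regex would mangle URLs like `"http://localhost:1234"` inside the config.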
Symptom:

```
$ opencode
bash: opencode: command not found
```

Cause: Bash functions not sourced or PATH not updated.

Solutions:

- Source your bashrc:

  ```bash
  source ~/.bashrc
  ```

- Check if the functions exist:

  ```bash
  type opencode
  # Should show: opencode is a function
  ```

- Verify the OpenCode installation:

  ```bash
  which opencode
  # Should show a path like /home/user/.opencode/bin/opencode
  ```
Symptom: Model behaves strangely or tries to call non-existent tools.

Cause: Your model doesn't support tool/function calling.

Solutions:

- Edit opencode.json and set `tools: false` for problematic models:

  ```json
  {
    "provider": {
      "local": {
        "models": {
          "my-small-model": {
            "name": "my-small-model",
            "tools": false
          }
        }
      }
    }
  }
  ```

- Or modify the sync script to detect model capabilities.
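The edit above can also be applied programmatically, which helps if a re-sync keeps regenerating the file. A sketch assuming the config shape shown above — the `disableTools` helper is hypothetical, not part of the sync scripts:

```javascript
// Flip tools off for one model in a parsed opencode.json object.
function disableTools(config, providerId, modelId) {
  const model = config?.provider?.[providerId]?.models?.[modelId];
  if (!model) throw new Error(`model ${providerId}/${modelId} not found`);
  model.tools = false;
  return config;
}

// Round-trip against the real file (assumes the default config location):
// import { readFileSync, writeFileSync } from "node:fs";
// const path = `${process.env.HOME}/.config/opencode/opencode.json`;
// const cfg = disableTools(JSON.parse(readFileSync(path, "utf8")), "local", "my-small-model");
// writeFileSync(path, JSON.stringify(cfg, null, 2));
```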
Symptom:

```
Error: Cannot find module '/home/user/.config/opencode/sync-local-models.mjs'
```

Cause: Script not installed or path incorrect.

Solutions:

- Re-run the install script:

  ```bash
  ./scripts/install.sh
  ```

- Or manually copy the scripts:

  ```bash
  cp scripts/providers.mjs ~/.config/opencode/
  cp scripts/sync-core.mjs ~/.config/opencode/
  cp scripts/sync-provider.mjs ~/.config/opencode/
  cp scripts/sync-on-launch.mjs ~/.config/opencode/
  cp scripts/sync-local-models.mjs ~/.config/opencode/
  ```
Symptom:

```
❌ Error: Provider "xyz" not found
```

Cause: Provider not synced yet or API key missing.

Solutions:

- Sync a specific provider:

  ```bash
  export LOCAL_API_BASE="https://api.provider.com/v1"
  export API_KEY="your-key"
  node scripts/sync-provider.mjs
  ```

- Or sync all providers:

  ```bash
  ./scripts/sync-all-providers.sh
  ```

- Verify the provider exists:

  ```bash
  cat ~/.config/opencode/opencode.json | jq '.provider | keys'
  ```
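If jq is not installed, the same check works from Node. A minimal sketch — `listProviders` is a hypothetical helper; it mirrors `jq '.provider | keys'`, except that it preserves insertion order where jq sorts:

```javascript
// jq-free equivalent of `.provider | keys` on a parsed opencode.json.
function listProviders(config) {
  return Object.keys(config.provider ?? {});
}

// Against the real file (uncomment to use):
// import { readFileSync } from "node:fs";
// const path = `${process.env.HOME}/.config/opencode/opencode.json`;
// console.log(listProviders(JSON.parse(readFileSync(path, "utf8"))));

console.log(listProviders({ provider: { local: {}, xyz: {} } })); // → [ 'local', 'xyz' ]
```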
Run the sync script with more output:

```bash
node ~/.config/opencode/sync-local-models.mjs
```

Test your API endpoint:

```bash
curl -s http://localhost:1234/v1/models | jq
```

Check the final config:

```bash
cat ~/.config/opencode/opencode.json | jq '.provider.local'
```

Override the endpoint temporarily:

```bash
LOCAL_API_BASE=http://localhost:11434/v1 node scripts/sync-local-models.mjs
```

If you still have issues:

- Check the API Reference
- Review the OpenCode docs: https://opencode.ai/docs/
- Open an issue with:
  - The command you're running
  - The full error output
  - Output of `node --version`
  - Output of `curl $LOCAL_API_BASE/v1/models`
  - Contents of opencode.json