feat(ai): add OpenAI and custom API provider support #1424
Conversation
appleboy commented Nov 15, 2025
- Expand AI provider support to include OpenAI (gpt-4o, gpt-4o-mini) and custom OpenAI-compatible APIs
- Add support for configuring AI API base URL and skipping SSL verification
- Update documentation to list all supported AI providers and clarify configuration options with examples
- Refactor AI client initialization to fall back to an OpenAI-compatible API for unknown models
- Add OpenAI client implementation using openai-go library
- Update tests to validate OpenAI-compatible fallback behavior
- Add openai-go dependency to go.mod
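For readers skimming the change, here is a rough sketch of how the two new flags could be registered in cmd/gosec/main.go using the standard library flag package; the variable names and help strings are illustrative, not the PR's exact code:

```go
package main

import (
	"flag"
	"fmt"
)

var (
	// Illustrative flag variables; the actual names in cmd/gosec/main.go may differ.
	flagAIBaseURL = flag.String("ai-base-url", "", "custom base URL for OpenAI-compatible AI APIs")
	flagAISkipSSL = flag.Bool("ai-skip-ssl", false, "skip SSL certificate verification for the AI API")
)

func main() {
	flag.Parse()
	// In the real command these values are forwarded to the AI client
	// initialization alongside the provider name and API key.
	fmt.Println(*flagAIBaseURL, *flagAISkipSSL)
}
```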
@ccojocar please help to review.
ccojocar left a comment
LGTM, try to rebase. Thanks
Pull request overview
This PR expands gosec's AI-powered autofix capabilities by adding OpenAI support (GPT-4o, GPT-4o-mini) and enabling custom OpenAI-compatible API integrations. The implementation follows the existing pattern established by Gemini and Claude clients, using a wrapper struct that implements the GenAIClient interface.
Key changes:
- Adds OpenAI client implementation with support for custom base URLs and SSL verification options
- Refactors AI client initialization to use OpenAI-compatible API as a fallback for unknown models
- Extends the command-line interface with `ai-base-url` and `ai-skip-ssl` flags for flexible API configuration
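For orientation, below is a minimal sketch of the wrapper pattern described above. The `GenAIClient` interface and `GenerateSolution` method are named in this PR, but the exact signature, struct fields, and error handling shown here are assumptions, not the PR's code:

```go
package autofix

import (
	"context"
	"errors"

	openai "github.com/openai/openai-go/v3"
)

// openAISketch is an illustrative wrapper around the openai-go client, mirroring
// how the existing Gemini and Claude wrappers adapt their SDKs to GenAIClient.
type openAISketch struct {
	client openai.Client
	model  string
}

// GenerateSolution sends the prompt as a single user message and returns the
// first completion choice. The method name matches the interface referenced in
// this PR; the exact signature used here is an assumption.
func (o *openAISketch) GenerateSolution(ctx context.Context, prompt string) (string, error) {
	resp, err := o.client.Chat.Completions.New(ctx, openai.ChatCompletionNewParams{
		Model: openai.ChatModel(o.model),
		Messages: []openai.ChatCompletionMessageParamUnion{
			openai.UserMessage(prompt),
		},
	})
	if err != nil {
		return "", err
	}
	if len(resp.Choices) == 0 {
		return "", errors.New("no completion choices returned")
	}
	return resp.Choices[0].Message.Content, nil
}
```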
Reviewed changes
Copilot reviewed 6 out of 7 changed files in this pull request and generated 8 comments.
| File | Description |
|---|---|
| go.mod | Adds openai-go/v3 v3.8.1 dependency |
| go.sum | Updates checksums for openai-go/v3 v3.8.1 |
| cmd/gosec/main.go | Adds ai-base-url and ai-skip-ssl flags, passes them to GenerateSolution |
| autofix/openai.go | New OpenAI client implementation with SSL and custom URL support |
| autofix/ai.go | Updates AIProviderFlagHelp, adds GPT model detection, implements OpenAI-compatible fallback |
| autofix/ai_test.go | Updates test to validate fallback behavior instead of unsupported provider error |
| README.md | Comprehensive documentation update with all providers, configuration options, and examples |
- `ai-api-provider`: the name of the AI API provider. Supported providers:
  - **Gemini**: `gemini-2.5-pro`, `gemini-2.5-flash`, `gemini-2.5-flash-lite`, `gemini-2.0-flash`, `gemini-2.0-flash-lite` (default)
  - **Claude**: `claude-sonnet-4-0` (default), `claude-opus-4-0`, `claude-opus-4-1`, `claude-sonnet-3-7`
Copilot AI commented Dec 10, 2025
Missing Claude models in the documentation. Line 266 lists only 4 Claude models but the code in autofix/ai.go line 16 mentions 6 models: claude-sonnet-4-0, claude-sonnet-4-5, claude-opus-4-0, claude-opus-4-1, claude-haiku-4-5, and claude-sonnet-3-7.
Update line 266 to include all supported models:
Suggested change:

```diff
-- **Claude**: `claude-sonnet-4-0` (default), `claude-opus-4-0`, `claude-opus-4-1`, `claude-sonnet-3-7`
+- **Claude**: `claude-sonnet-4-0` (default), `claude-sonnet-4-5`, `claude-opus-4-0`, `claude-opus-4-1`, `claude-haiku-4-5`, `claude-sonnet-3-7`
```
```go
case strings.HasPrefix(model, "gpt"):
	config := OpenAIConfig{
		Model:   model,
		APIKey:  aiAPIKey,
		BaseURL: baseURL,
		SkipSSL: skipSSL,
	}
	client, err = NewOpenAIClient(config)
default:
	// Default to OpenAI-compatible API for custom models
	config := OpenAIConfig{
		Model:   model,
		APIKey:  aiAPIKey,
		BaseURL: baseURL,
		SkipSSL: skipSSL,
	}
	client, err = NewOpenAIClient(config)
```
Copilot AI commented Dec 10, 2025
[nitpick] Code duplication: The same OpenAIConfig struct initialization appears twice (lines 40-45 and 49-54). Consider refactoring to reduce duplication:
```go
case strings.HasPrefix(model, "gpt"):
	config := OpenAIConfig{
		Model:   model,
		APIKey:  aiAPIKey,
		BaseURL: baseURL,
		SkipSSL: skipSSL,
	}
	client, err = NewOpenAIClient(config)
default:
	// Default to OpenAI-compatible API for custom models
	config := OpenAIConfig{
		Model:   model,
		APIKey:  aiAPIKey,
		BaseURL: baseURL,
		SkipSSL: skipSSL,
	}
	client, err = NewOpenAIClient(config)
```

Could be simplified to:
```go
case strings.HasPrefix(model, "gpt"):
	fallthrough
default:
	// Default to OpenAI-compatible API for custom models (including gpt-*)
	config := OpenAIConfig{
		Model:   model,
		APIKey:  aiAPIKey,
		BaseURL: baseURL,
		SkipSSL: skipSSL,
	}
	client, err = NewOpenAIClient(config)
```

```go
const (
	ModelGPT4o           = openai.ChatModelGPT4o
	ModelGPT4oMini       = openai.ChatModelGPT4oMini
	DefaultOpenAIBaseURL = "https://api.openai.com/v1"
```
Copilot AI commented Dec 10, 2025
[nitpick] The constant DefaultOpenAIBaseURL is defined but never used in the code. If this is intended to be a default value when config.BaseURL is empty, it should be applied. Otherwise, this constant should be removed as dead code.
If you want to use it as a default:

```go
// Support custom base URL (for OpenAI-compatible APIs)
baseURL := config.BaseURL
if baseURL == "" {
	baseURL = DefaultOpenAIBaseURL
}
options = append(options, option.WithBaseURL(baseURL))
```

However, note that the OpenAI SDK likely has its own default base URL, so this may not be necessary.
Suggested change (remove the unused constant):

```diff
-	DefaultOpenAIBaseURL = "https://api.openai.com/v1"
```
```go
- claude-sonnet-4-0 (claude, default), claude-sonnet-4-5, claude-opus-4-0, claude-opus-4-1, claude-haiku-4-5, claude-sonnet-3-7`
- gpt-4o (openai, default), gpt-4o-mini`
```
Copilot AI commented Dec 10, 2025
The multi-line constant string AIProviderFlagHelp is missing a semicolon after line 16 and has inconsistent formatting. Line 17 should continue the list format with proper line ending.
The constant should be:
```go
AIProviderFlagHelp = `AI API provider to generate auto fixes to issues. Valid options are:
- gemini-2.5-pro, gemini-2.5-flash, gemini-2.5-flash-lite, gemini-2.0-flash, gemini-2.0-flash-lite (gemini, default);
- claude-sonnet-4-0 (claude, default), claude-sonnet-4-5, claude-opus-4-0, claude-opus-4-1, claude-haiku-4-5, claude-sonnet-3-7;
- gpt-4o (openai, default), gpt-4o-mini`
```

Note the semicolon added after line 16 to match the pattern of the previous lines.

Suggested change:

```diff
-- claude-sonnet-4-0 (claude, default), claude-sonnet-4-5, claude-opus-4-0, claude-opus-4-1, claude-haiku-4-5, claude-sonnet-3-7`
-- gpt-4o (openai, default), gpt-4o-mini`
+- claude-sonnet-4-0 (claude, default), claude-sonnet-4-5, claude-opus-4-0, claude-opus-4-1, claude-haiku-4-5, claude-sonnet-3-7;
+- gpt-4o (openai, default), gpt-4o-mini;`
```
```go
}

// Set optional parameters if available
// Using WithMaxTokens and WithTemperature methods if they exist in v3
```
Copilot AI commented Dec 10, 2025
The `maxTokens` and `temperature` fields are set in the `openaiWrapper` struct but never used in the `GenerateSolution` method. The comments on lines 92-93 suggest they should be used, but the implementation doesn't include them in the API call parameters.
You should add these parameters to the request:
```go
params := openai.ChatCompletionNewParams{
	Model: o.model,
	Messages: []openai.ChatCompletionMessageParamUnion{
		openai.UserMessage(prompt),
	},
	MaxTokens:   openai.Int(o.maxTokens),
	Temperature: openai.Float(o.temperature),
}
```

This ensures consistency with the Claude client, which explicitly sets `MaxTokens: 1024`.

Suggested change:

```diff
-}
-// Set optional parameters if available
-// Using WithMaxTokens and WithTemperature methods if they exist in v3
+	MaxTokens:   openai.Int(o.maxTokens),
+	Temperature: openai.Float(o.temperature),
+}
```
```go
temperature := config.Temperature
if temperature == 0 {
	temperature = 0.7
}
```
Copilot AI commented Dec 10, 2025
Temperature default value of 0 is problematic. In the condition if temperature == 0, a user-provided value of 0.0 (which is a valid temperature for deterministic outputs) would be overridden with 0.7.
Consider using a pointer or a sentinel value to distinguish between "not set" and "explicitly set to 0":
```go
temperature := config.Temperature
if temperature == 0 {
	temperature = 0.7
}
```

Or better, document that 0 is not a valid input and will be replaced with the default, or use a negative sentinel value like -1 to indicate "use default".
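One possible pointer-based variant, sketched under the assumption that the config struct is free to change its field type; the `sketchConfig` type and `Temperature *float64` field below are hypothetical illustrations, not the PR's code:

```go
package autofix

// sketchConfig illustrates distinguishing "not set" from "explicitly 0":
// a nil Temperature means "use the default", while a pointer to 0.0 is
// honored as a valid deterministic setting.
type sketchConfig struct {
	Temperature *float64 // hypothetical replacement for a plain float64 field
}

const defaultTemperature = 0.7

// resolveTemperature returns the configured temperature, falling back to the
// default only when the field was never set.
func resolveTemperature(cfg sketchConfig) float64 {
	if cfg.Temperature == nil {
		return defaultTemperature
	}
	return *cfg.Temperature
}
```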
- For Claude, get your API key from [Anthropic Console](https://console.anthropic.com/)
- For OpenAI, get your API key from [OpenAI Platform](https://platform.openai.com/api-keys)
- `ai-base-url`: (optional) custom base URL for OpenAI-compatible APIs (e.g., Azure OpenAI, LocalAI, Ollama)
- `ai-skip-ssl`: (optional) skip SSL certificate verification for AI API (useful for self-signed certificates)
Copilot AI commented Dec 10, 2025
The ai-skip-ssl option explicitly disables TLS certificate verification for AI API calls, which makes man-in-the-middle attacks against the AI endpoint much easier and can expose the ai-api-key and scan contents. Even for self-signed internal endpoints, relying on disabled verification is risky because any on-path attacker can impersonate the service if this flag is used. Consider discouraging this flag for production use and instead documenting safer approaches (e.g., configuring a custom CA bundle or trusting the internal CA) so connections remain authenticated.
Suggested change:

```diff
-- `ai-skip-ssl`: (optional) skip SSL certificate verification for AI API (useful for self-signed certificates)
+- `ai-skip-ssl`: (optional) **Disables SSL/TLS certificate verification for AI API calls.**
+  > **Warning:** Disabling certificate verification makes connections vulnerable to man-in-the-middle attacks and can expose your `ai-api-key` and scan contents. This option should **never** be used in production or with sensitive data. For self-signed or internal endpoints, prefer configuring a custom CA bundle or trusting your internal CA using the appropriate system/environment settings, so connections remain authenticated and secure.
```
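To illustrate the safer approach the comment recommends, here is a minimal sketch of pointing the client at an internal OpenAI-compatible endpoint while trusting a custom CA bundle instead of disabling verification. The function and variable names are illustrative, and the use of `option.WithHTTPClient` to inject a TLS-configured client is an assumption about how this could be wired, not the PR's implementation:

```go
package autofix

import (
	"crypto/tls"
	"crypto/x509"
	"fmt"
	"net/http"
	"os"

	openai "github.com/openai/openai-go/v3"
	"github.com/openai/openai-go/v3/option"
)

// newClientWithCustomCA builds an OpenAI-compatible client that trusts an
// internal CA bundle, keeping certificate verification enabled.
func newClientWithCustomCA(apiKey, baseURL, caFile string) (openai.Client, error) {
	caPEM, err := os.ReadFile(caFile)
	if err != nil {
		return openai.Client{}, fmt.Errorf("reading CA bundle: %w", err)
	}
	pool := x509.NewCertPool()
	if !pool.AppendCertsFromPEM(caPEM) {
		return openai.Client{}, fmt.Errorf("no certificates parsed from %s", caFile)
	}
	httpClient := &http.Client{
		Transport: &http.Transport{
			// Verification stays on; we only extend the trusted roots.
			TLSClientConfig: &tls.Config{RootCAs: pool},
		},
	}
	client := openai.NewClient(
		option.WithAPIKey(apiKey),
		option.WithBaseURL(baseURL),
		option.WithHTTPClient(httpClient),
	)
	return client, nil
}
```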
Change-Id: I1cb556a42e2bd9e9b2051d6db99889c6c9f7ccdb Signed-off-by: Cosmin Cojocar <[email protected]>
Change-Id: I3689b96205f494920dbbd03344e8f132a30f40b3 Signed-off-by: Cosmin Cojocar <[email protected]>
Codecov Report: ❌ Patch coverage is

Additional details and impacted files:

```diff
@@            Coverage Diff             @@
##           master    #1424      +/-   ##
==========================================
- Coverage   68.49%   64.35%   -4.15%
==========================================
  Files          75       77       +2
  Lines        4384     4679     +295
==========================================
+ Hits         3003     3011       +8
- Misses       1233     1522     +289
+ Partials      148      146       -2
```

☔ View full report in Codecov by Sentry.