
feat(anthropic): enable 1M context window for Claude models#1134

Open
kilo-code-bot[bot] wants to merge 1 commit into main from feat/enable-claude-1m-context

Conversation

kilo-code-bot bot commented Mar 16, 2026

Summary

  • Add the context-1m-2025-08-07 Anthropic beta header to the standard OpenRouter proxy path (applyAnthropicModelSettings). Previously only the custom LLM path in customLlmRequest.ts sent this header; the main proxy route did not, leaving Claude models limited to a 200K context window.
  • Override context_length from 200K to 1M for non-Haiku Anthropic models in the enhanced model list returned by GET /api/openrouter/models. OpenRouter reports 200K, but with the beta header enabled, Claude Sonnet/Opus models actually support 1M tokens.
  • Export ANTHROPIC_1M_CONTEXT_LENGTH constant from the anthropic provider module for shared use.
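
The model-list override described above can be sketched as follows. Function and field names here are illustrative (only `ANTHROPIC_1M_CONTEXT_LENGTH` and `context_length` appear in the PR itself), and the Haiku check is a guess at the exclusion logic:

```typescript
// Mirrors the constant this PR exports from the anthropic provider module.
const ANTHROPIC_1M_CONTEXT_LENGTH = 1_000_000;

interface OpenRouterModel {
  id: string;
  context_length: number;
}

// Hypothetical sketch: bump context_length to 1M for non-Haiku Anthropic
// models, leaving every other model entry untouched.
function overrideAnthropicContextLength(model: OpenRouterModel): OpenRouterModel {
  const isAnthropic = model.id.startsWith("anthropic/");
  const isHaiku = model.id.includes("haiku");
  if (isAnthropic && !isHaiku) {
    return { ...model, context_length: ANTHROPIC_1M_CONTEXT_LENGTH };
  }
  return model;
}
```

Applied over the list returned by GET /api/openrouter/models, this would rewrite Sonnet/Opus entries to 1M while leaving Haiku and non-Anthropic models at their reported lengths.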

Verification

  • pnpm typecheck — passed (all 30 workspace projects)
  • pnpm format:check — passed (all matched files use Prettier code style)
  • pnpm lint — passed (all workspace projects, no errors)
  • Verified that eslint passes on the changed files specifically
  • Tests could not run in CI environment (no database), but test fixture files updated to match new behavior

Visual Changes

N/A

Reviewer Notes

  • Haiku models are excluded from the 1M override since they don't support the extended context feature.
  • The context-1m-2025-08-07 beta header is appended alongside the existing fine-grained-tool-streaming-2025-05-14 header using the same appendAnthropicBetaHeader helper, which correctly comma-joins multiple beta flags.
  • The kilo-auto/frontier model already advertised 1M context (kilo-auto-model.ts:31), but the underlying Claude models it routes to were still listed at 200K — this change makes the direct Claude model entries consistent.
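
The comma-joining behavior mentioned above can be sketched as a minimal reconstruction of the helper, assuming it writes Anthropic's standard anthropic-beta header (the real appendAnthropicBetaHeader in the provider module may differ in detail):

```typescript
// Minimal sketch of a comma-joining beta-header helper. "anthropic-beta"
// is Anthropic's documented header for opting into beta features; multiple
// flags are sent as a single comma-separated value.
function appendAnthropicBetaHeader(
  headers: Record<string, string>,
  betaFlag: string
): void {
  const existing = headers["anthropic-beta"];
  headers["anthropic-beta"] = existing ? `${existing},${betaFlag}` : betaFlag;
}

const headers: Record<string, string> = {};
appendAnthropicBetaHeader(headers, "fine-grained-tool-streaming-2025-05-14");
appendAnthropicBetaHeader(headers, "context-1m-2025-08-07");
```

After both calls, the single anthropic-beta header carries both flags, comma-separated, rather than overwriting the first with the second.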

Add the context-1m-2025-08-07 beta header to the standard OpenRouter
proxy path and override context_length from 200K to 1M for non-Haiku
Anthropic models in the enhanced model list.
```ts
	extraHeaders: Record<string, string>
) {
	appendAnthropicBetaHeader(extraHeaders, 'fine-grained-tool-streaming-2025-05-14');
	appendAnthropicBetaHeader(extraHeaders, 'context-1m-2025-08-07');
```

WARNING: Gate the 1M beta header to non-Haiku models

applyAnthropicModelSettings() runs for every anthropic/* request, so this now adds context-1m-2025-08-07 to Haiku too. The model-list change in this PR explicitly excludes Haiku, which suggests the beta is not meant to apply there; if Anthropic validates beta flags per model, Haiku requests will start failing even though the UI still advertises the smaller context window.
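
One way to gate the header, sketched under the assumption that the model id is available where the headers are applied (the helper body and the Haiku check are reconstructions, not the repository's actual code):

```typescript
// Minimal comma-joining helper, reconstructed for this sketch.
function appendAnthropicBetaHeader(
  headers: Record<string, string>,
  betaFlag: string
): void {
  const existing = headers["anthropic-beta"];
  headers["anthropic-beta"] = existing ? `${existing},${betaFlag}` : betaFlag;
}

// Hypothetical gate: always send the tool-streaming beta, but only add the
// 1M-context beta for models that actually support it (i.e. not Haiku),
// keeping the headers consistent with the model-list override in this PR.
function applyAnthropicBetaHeaders(
  modelId: string,
  headers: Record<string, string>
): void {
  appendAnthropicBetaHeader(headers, "fine-grained-tool-streaming-2025-05-14");
  if (!modelId.includes("haiku")) {
    appendAnthropicBetaHeader(headers, "context-1m-2025-08-07");
  }
}
```

With this gate, Haiku requests keep only the tool-streaming flag, so a per-model beta validation on Anthropic's side would not start rejecting them.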


kilo-code-bot bot commented Mar 16, 2026

Code Review Summary

Status: 1 Issue Found | Recommendation: Address before merge

Overview

  • CRITICAL: 0
  • WARNING: 1
  • SUGGESTION: 0


Issue Details

WARNING

  • src/lib/providers/anthropic.ts, line 86: context-1m-2025-08-07 is now sent on all anthropic/* requests, including Haiku, even though the model-list override explicitly excludes Haiku from 1M support.

No CRITICAL or SUGGESTION issues.
Other Observations (not in diff)

Issues found in unchanged code that cannot receive inline comments:

N/A

Files Reviewed (4 files)
  • src/lib/providers/anthropic.ts - 1 issue
  • src/lib/providers/openrouter/index.ts - 0 issues
  • src/tests/openrouter-models-sorting.approved.json - 0 issues
  • src/tests/openrouter-models.test.approved.json - 0 issues

Reviewed by gpt-5.4-20260305 · 530,299 tokens

