Fix seed parameter TypeError in BaseDRLearner bootstrap CI (#857) #879

Merged: jeongyoonlee merged 2 commits into uber:master from mohsinm-dev:fix/857-drlearner-bootstrap-seed on Mar 6, 2026
Conversation

@mohsinm-dev (Contributor):

Fixes #857

BaseDRLearner.estimate_ate() passed seed=seed to self.bootstrap(), which does not accept that parameter, raising a TypeError whenever bootstrap_ci=True.

Changes:

  • Remove stray seed=seed from bootstrap call
  • Add local RNG seeding so seed controls bootstrap reproducibility without mutating global numpy state
  • Override bootstrap() in BaseDRLearner to pass per-iteration seeds into fit() for deterministic cross-fitting
  • Apply same fix to fit_predict(return_ci=True) path

Tests:

  • Regression test confirming no TypeError
  • Reproducibility check (same seed → identical CI bounds)
  • Global RNG state preservation check
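The seeding pattern the changes describe (a local RNG that also threads per-iteration seeds into downstream fitting) can be sketched as follows. This is a hypothetical illustration, not CausalML's actual code; `bootstrap_ate_ci` and `estimate_fn` are invented names for the sketch.

```python
import numpy as np

def bootstrap_ate_ci(y, treatment, estimate_fn, n_bootstraps=100, alpha=0.05, seed=None):
    """Bootstrap CI for an ATE estimate using a local RNG (illustrative sketch)."""
    # Local Generator: seeding here leaves the global np.random state untouched.
    rng = np.random.default_rng(seed)
    n = len(y)
    estimates = []
    for _ in range(n_bootstraps):
        # Derive a per-iteration seed from the local RNG so that downstream
        # fitting (e.g. cross-fitting inside fit()) can also be deterministic.
        iter_seed = int(rng.integers(0, 2**32 - 1))
        idx = np.random.default_rng(iter_seed).choice(n, size=n, replace=True)
        estimates.append(estimate_fn(y[idx], treatment[idx], seed=iter_seed))
    lower = np.percentile(estimates, 100 * alpha / 2)
    upper = np.percentile(estimates, 100 * (1 - alpha / 2))
    return float(lower), float(upper)
```

With this structure, calling the function twice with the same seed yields identical CI bounds, and the global `np.random` state is never touched, which is exactly what the reproducibility and state-preservation tests above check for.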

Commits:

  • Remove stray seed=seed argument passed to bootstrap(), which did not accept it, causing a TypeError when bootstrap_ci=True.
  • Add local RNG-based seeding for reproducible bootstrap CIs in both estimate_ate() and fit_predict() without mutating global numpy state; override bootstrap() in BaseDRLearner to thread per-iteration seeds into fit() for deterministic cross-fitting.
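The "without mutating global numpy state" point can be shown with a minimal numpy snippet contrasting the legacy global seeding API with a local Generator (the variable names are illustrative):

```python
import numpy as np

# np.random.seed mutates module-level global state: every subsequent
# np.random.* call anywhere in the process is affected.
np.random.seed(42)
global_draw = np.random.randint(0, 10, size=3)

# A local Generator is isolated: seeding it and drawing from it leaves
# the global np.random state exactly as it was.
rng = np.random.default_rng(42)
local_draw = rng.integers(0, 10, size=3)
```

This is why the fix seeds a `default_rng` instance instead of calling `np.random.seed`: user code relying on the global stream is unaffected by the learner's bootstrap.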
Review thread on tests/test_meta_learners.py (outdated)

@jeongyoonlee (Collaborator) left a comment:
LGTM

@jeongyoonlee jeongyoonlee merged commit b55f475 into uber:master Mar 6, 2026
7 checks passed
@mohsinm-dev mohsinm-dev deleted the fix/857-drlearner-bootstrap-seed branch March 8, 2026 07:39

Linked issue closed by this merge: "seed parameter where it is not requested" (#857)