
Intermediate checkpointing for sequential calibration#1152

Open
sugunav14 wants to merge 5 commits into main from svelury/seq-calib-save-restore

Conversation


@sugunav14 sugunav14 commented Mar 31, 2026

What does this PR do?

Type of change: ?

Usage

# Add a code snippet demonstrating how to use this

Testing

Before your PR is "Ready for review"

Make sure you read and follow Contributor guidelines and your commits are signed (git commit -s -S).

Make sure you read and follow the Security Best Practices (e.g. avoiding hardcoded trust_remote_code=True, torch.load(..., weights_only=False), pickle, etc.).
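The pickle warning above can be made concrete. The sketch below is illustrative only, not code from this PR: it shows how unpickling untrusted data can execute arbitrary code via `__reduce__`, which is why raw `pickle` and `torch.load(..., weights_only=False)` are flagged in the guidelines.

```python
import json
import pickle

# Illustrative only: any class can run arbitrary code at load time
# through __reduce__, so unpickling untrusted files is unsafe.
class Demo:
    def __reduce__(self):
        # pickle calls this when loading: here it is a harmless print,
        # but it could be any callable, e.g. os.system.
        return (print, ("code executed during unpickling",))

payload = pickle.dumps(Demo())
result = pickle.loads(payload)  # runs print() as a side effect of loading

# json, by contrast, reconstructs plain data only and cannot execute code.
data = json.loads('{"amax": 2.0}')
```

This is the reason safe-loading paths (e.g. `weights_only=True`) restrict deserialization to plain tensors and containers rather than arbitrary Python objects.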

  • Is this change backward compatible?: ✅ / ❌ / N/A
  • If you copied code from any other sources or added a new PIP dependency, did you follow guidance in CONTRIBUTING.md: ✅ / ❌ / N/A
  • Did you write any new necessary tests?: ✅ / ❌ / N/A
  • Did you update Changelog?: ✅ / ❌ / N/A

Additional Information

Summary by CodeRabbit

Release Notes

  • New Features
  • Sequential quantization calibration now supports checkpoint save and resume, preserving calibration progress so runs can continue from an intermediate checkpoint after an interruption or restart.
    • New configuration fields control checkpoint directory location and automatic save intervals for flexible checkpointing strategies.
    • HuggingFace models now support seamless checkpoint persistence during sequential calibration.
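The save/resume behavior described in the release notes can be sketched as a generic interval-based checkpointing loop. The example below is a minimal, self-contained illustration of that pattern in plain Python; the function name `sequential_calibrate`, the `calib_state.json` file, and the per-layer "amax" bookkeeping are hypothetical stand-ins, not the actual ModelOpt API introduced by this PR.

```python
import json
from pathlib import Path

def sequential_calibrate(layers, checkpoint_dir, save_interval=2):
    """Hypothetical sketch: calibrate layers one at a time, saving a
    checkpoint every `save_interval` layers and resuming if one exists."""
    ckpt = Path(checkpoint_dir) / "calib_state.json"
    state = {"next_layer": 0, "amax": {}}
    if ckpt.exists():
        # Resume from the intermediate checkpoint rather than restarting.
        state = json.loads(ckpt.read_text())
    for i in range(state["next_layer"], len(layers)):
        name, activations = layers[i]
        # Stand-in for per-layer calibration: record the max activation.
        state["amax"][name] = max(activations)
        state["next_layer"] = i + 1
        if state["next_layer"] % save_interval == 0:
            ckpt.write_text(json.dumps(state))
    ckpt.write_text(json.dumps(state))  # persist the final state
    return state["amax"]
```

If the process dies mid-run, calling the function again with the same `checkpoint_dir` skips layers already calibrated, which is the interruption/restart scenario the configuration fields above (checkpoint directory, save interval) are meant to control.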

Signed-off-by: Suguna Velury <178320438+sugunav14@users.noreply.github.com>

copy-pr-bot bot commented Mar 31, 2026

Auto-sync is disabled for draft pull requests in this repository. Workflows must be run manually.

Contributors can view more details about this message here.


coderabbitai bot commented Mar 31, 2026

Note

Currently processing new changes in this PR. This may take a few minutes, please wait...

⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 7693e81d-61c3-4338-9c0f-67d1b730e51b

📥 Commits

Reviewing files that changed from the base of the PR and between ada1e26 and b9035fb.

📒 Files selected for processing (8)
  • modelopt/torch/quantization/config.py
  • modelopt/torch/quantization/conversion.py
  • modelopt/torch/quantization/mode.py
  • modelopt/torch/quantization/model_calib.py
  • modelopt/torch/quantization/plugins/huggingface.py
  • modelopt/torch/quantization/utils/activation_collector.py
  • modelopt/torch/quantization/utils/checkpoint.py
  • tests/unit/torch/quantization/test_sequential_calibrate.py

Comment @coderabbitai help to get the list of available commands and usage tips.

Tip

You can disable sequence diagrams in the walkthrough by turning off the reviews.sequence_diagrams setting.


github-actions bot commented Apr 1, 2026

PR Preview Action v1.8.1


🚀 View preview at
https://NVIDIA.github.io/Model-Optimizer/pr-preview/pr-1152/

Built to branch gh-pages at 2026-04-01 21:28 UTC.
Preview will be ready when the GitHub Pages deployment is complete.


codecov bot commented Apr 1, 2026

Codecov Report

❌ Patch coverage is 93.84615% with 8 lines in your changes missing coverage. Please review.
✅ Project coverage is 70.31%. Comparing base (ada1e26) to head (0792fab).
⚠️ Report is 5 commits behind head on main.

Files with missing lines                               | Patch % | Lines
...t/torch/quantization/utils/activation_collector.py | 92.42%  | 5 Missing ⚠️
modelopt/torch/quantization/conversion.py             | 66.66%  | 2 Missing ⚠️
modelopt/torch/quantization/model_calib.py            | 92.85%  | 1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #1152      +/-   ##
==========================================
+ Coverage   70.18%   70.31%   +0.13%     
==========================================
  Files         230      231       +1     
  Lines       26080    26213     +133     
==========================================
+ Hits        18304    18432     +128     
- Misses       7776     7781       +5     

☔ View full report in Codecov by Sentry.
@sugunav14 sugunav14 marked this pull request as ready for review April 1, 2026 21:24
@sugunav14 sugunav14 requested a review from a team as a code owner April 1, 2026 21:24