
Remove CUDA-specific early exit from XPU test#3192

Open
Silv3S wants to merge 2 commits into intel:main from Silv3S:torch_xpu_amp

Conversation


@Silv3S Silv3S commented Mar 26, 2026

Fixes #2509

From amp docs:

torch.cuda.amp.autocast(args...) and torch.cpu.amp.autocast(args...) are deprecated.
Please use torch.amp.autocast("cuda", args...) or torch.amp.autocast("cpu", args...) instead.

This PR fixes test_pickle_gradscaler_xpu, which tries to use torch.xpu.amp (a module that doesn't exist) and also takes a CUDA-specific early exit. We should drop this check and proceed with the test body itself, which passes.

The second test from the issue doesn't require any work: test_grad_scaler_deprecated_warning_xpu checks whether a deprecation warning is raised, but that check doesn't make sense for XPU, since XPU never implemented the deprecated torch.xpu.amp module.


Copilot AI left a comment


Pull request overview

Removes an XPU-specific assertion path in the GradScaler pickling test that referenced torch.xpu.amp (which doesn’t exist), aligning the XPU test behavior with the non-CUDA path and preventing AttributeError failures.

Changes:

  • Drop the XPU-only torch.xpu.amp.common.amp_definitely_not_available() check in test_pickle_gradscaler.
  • Let XPU follow the existing non-CUDA assertion path (a.is_enabled() expected true).


@Silv3S Silv3S added the labels disable_e2e, disable_distributed, disable_accelerate, and disable_transformers on Mar 26, 2026


Development

Successfully merging this pull request may close these issues.

[upstream_ut] AttributeError: module 'torch.xpu' has no attribute 'amp'

3 participants