Commit a859c10
Replace apex clip_grad_norm_ with PyTorch native in dints templates
apex.contrib.clip_grad.clip_grad_norm_ crashes on PyTorch >=2.10 with
"RuntimeError: Cannot access data pointer of Tensor that doesn't have
storage" because apex's multi_tensor_applier cannot handle tensors with
lazy/functional storage introduced in newer PyTorch versions.
The existing try/except only caught ModuleNotFoundError (apex not
installed), not the runtime crash when apex is installed but incompatible.
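The fragile import pattern looked roughly like this (a reconstruction for illustration; the exact template code is not captured in this diff view). The fallback branch is never reached when apex imports successfully but then crashes at call time:

```python
# Sketch of the old pattern: the except clause only covers import failure,
# not apex raising RuntimeError later when clip_grad_norm_ is actually called.
try:
    from apex.contrib.clip_grad import clip_grad_norm_
except ModuleNotFoundError:
    # Fallback is only taken when apex is absent, not when it is broken.
    from torch.nn.utils import clip_grad_norm_
```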
torch.nn.utils.clip_grad_norm_ handles all tensor types correctly and
is the standard approach. The apex version offered marginal performance
gains that are not worth the compatibility breakage.
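The replacement call is the standard PyTorch API. A minimal usage sketch (the model here is illustrative, not the dints template code): `clip_grad_norm_` rescales all gradients in place so their combined norm does not exceed `max_norm`, and returns the total norm computed before clipping.

```python
import torch
from torch.nn.utils import clip_grad_norm_

# Illustrative model and backward pass (not from the dints templates).
model = torch.nn.Linear(4, 2)
loss = model(torch.randn(8, 4)).sum()
loss.backward()

# Clip the combined gradient norm to 1.0; returns the pre-clip total norm.
total_norm = clip_grad_norm_(model.parameters(), max_norm=1.0)
```

After this call the combined gradient norm of `model.parameters()` is at most 1.0 (within floating-point tolerance), regardless of tensor storage layout.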
Fixes Project-MONAI/MONAI#87371
parent 21ed8e5 commit a859c10
2 files changed
Lines changed: 2 additions & 8 deletions
(Diff content not captured in this view: in each of the two template files, a four-line block was replaced with a single line, matching the 2 additions and 8 deletions above.)