fix(fabric): forward set_epoch to underlying sampler in DistributedSamplerWrapper (#21456)
The `DistributedSamplerWrapper` now forwards `set_epoch()` calls to the underlying sampler if it supports the method. This fix is generic: it works for any sampler subclass that implements `set_epoch()`, not just specific implementations. This matters for samplers that use the epoch for shuffling or other epoch-dependent behavior in distributed training.

Fixes #21454

Co-authored-by: Deependu <[email protected]>
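A minimal sketch of the forwarding behavior the commit describes. The classes below are simplified stand-ins, not Lightning's actual `DistributedSamplerWrapper`; the point is the generic `hasattr` check that forwards `set_epoch()` to any wrapped sampler that implements it:

```python
class EpochAwareSampler:
    """Stand-in for any sampler whose iteration order depends on the epoch."""

    def __init__(self, data):
        self.data = data
        self.epoch = 0

    def set_epoch(self, epoch):
        self.epoch = epoch

    def __iter__(self):
        # Rotate the data by the epoch number to emulate epoch-dependent shuffling.
        shift = self.epoch % len(self.data)
        return iter(self.data[shift:] + self.data[:shift])


class DistributedSamplerWrapper:
    """Simplified wrapper illustrating the fix: forward set_epoch() to the
    underlying sampler if (and only if) that sampler defines the method."""

    def __init__(self, sampler):
        self.sampler = sampler
        self.epoch = 0

    def set_epoch(self, epoch):
        self.epoch = epoch
        # Generic forwarding: works for any sampler subclass with set_epoch(),
        # so epoch-dependent shuffling stays in sync under distributed training.
        if hasattr(self.sampler, "set_epoch"):
            self.sampler.set_epoch(epoch)

    def __iter__(self):
        return iter(self.sampler)


wrapper = DistributedSamplerWrapper(EpochAwareSampler([0, 1, 2, 3]))
wrapper.set_epoch(1)
print(list(wrapper))  # epoch 1 -> rotated order [1, 2, 3, 0]
```

Without the forwarding step, the wrapper would advance its own epoch while the inner sampler stayed stuck at epoch 0, silently repeating the same order every epoch.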
Commit: f0b1c5274552b607d7a4b22216deaf910ffc2aba
Parent: bddc253
Author: littlebullGit
Committed by: Luca Antiga <[email protected]> on 1/30/2026, 1:51:21 PM