fix(fabric): forward set_epoch to underlying sampler in DistributedSamplerWrapper (#21456)
The DistributedSamplerWrapper now forwards set_epoch() calls to the underlying sampler when that sampler implements the method. The fix is generic: it works for any sampler subclass that implements set_epoch(), not only specific implementations. This matters for samplers that use the epoch for shuffling or other epoch-dependent behavior in distributed training.

Fixes #21454

Co-authored-by: Deependu <[email protected]>
littlebullGit committed 79a39c04d37434f2234d9b518a145854d7c1e642 (parent: aa0ee0d)
Committed by GitHub <[email protected]> on 1/14/2026, 11:07:48 AM