Pretrain, finetune ANY AI model of ANY size on 1 or 10,000+ GPUs with zero code changes.

fix: FSDP mixed precision initializes params in fp32, not bf16/fp16 (#21586)

Bhimraj Yadav committed
6c9267845343c3e72814e23f4a8dbbe7350861ae
Parent: a13361d
Committed by GitHub <noreply@github.com> on 3/17/2026, 9:20:11 AM
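The commit title indicates that under FSDP mixed precision, parameters should end up in the configured low-precision dtype (bf16/fp16) rather than fp32. As context only, here is a minimal sketch of PyTorch's `MixedPrecision` policy for FSDP; the dtype choices are illustrative assumptions, not the contents of the patch itself:

```python
import torch
from torch.distributed.fsdp import MixedPrecision

# Illustrative bf16 policy (not the actual code changed by this commit).
# The commit title concerns params being initialized in fp32 even though
# a policy like this requests bf16.
bf16_policy = MixedPrecision(
    param_dtype=torch.bfloat16,   # dtype for parameters in forward/backward
    reduce_dtype=torch.bfloat16,  # dtype for gradient reduction
    buffer_dtype=torch.bfloat16,  # dtype for buffers (e.g. norm statistics)
)
```

Such a policy is passed to `FullyShardedDataParallel(..., mixed_precision=bf16_policy)`; this is a config fragment and assumes a distributed setup elsewhere.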