Commit history for tests/plugins/test_deepspeed_plugin.py

June 16, 2021
- Fix Special Tests (#7841) by Sean Naren
- [fix] Enable manual optimization DeepSpeed (#7970) by thomas chaton
May 22, 2021
- refactor accelerator teardown -> training type plugin teardown (#7579) by shuyingsunshine21
May 11, 2021
- Reduce log output size in special tests (#7481) by Adrian Wälchli
May 7, 2021
- Fix DeepSpeedPlugin with IterableDataset (#7362) by Leonard Lausen
April 19, 2021
- Set `DistributedSampler` seed if `seed_everything` was called (#7024) by Adrian Wälchli
April 12, 2021
- [Feat] DeepSpeed single file saving (#6900) by Sean Naren
March 30, 2021
- DeepSpeed ZeRO Update (#6546) by thomas chaton
March 18, 2021
- [Fix] Move init dist connection into the setup function (#6506) by Sean Naren
March 4, 2021
- missing tests default_root_dir=tmpdir (#6314) by Jirka Borovec
March 2, 2021
- Add fairscale & deepspeed to skipif 4/n (#6281) by Kaushik B
- Refactor: runif for spec 6/6 (#6307) by Jirka Borovec
- Refactor: skipif for AMPs 3/n (#6293) by Jirka Borovec
- Refactor: skipif for Windows 2/n (#6268) by Jirka Borovec
- Refactor: skipif for multi - gpus 1/n (#6266) by Jirka Borovec
February 21, 2021
- Collapse 2 DeepSpeed tests (#6108) by Carlos Mocholí
- Expose DeepSpeed FP16 parameters due to loss instability (#6115) by Sean Naren
- Enable ZeRO tests for CI, fix to/half function calls (#6070) by Sean Naren
February 18, 2021
- rename accelerator_backend -> accelerator (#6034) by Adrian Wälchli
February 17, 2021
- DeepSpeed Integration (#5954) by Sean Naren