PyTorch Distributed Training (Part 3: DistributedDataParallel)

`DistributedDataParallel` is PyTorch's interface for distributed data-parallel training. You wrap an existing model like this:

```python
model = torch.nn.parallel.DistributedDataParallel(
    model,
    device_ids=[args.local_rank],
    output_device=args.local_rank,
)
```
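To see the wrapper in context, here is a minimal end-to-end sketch. Note the assumptions: `args.local_rank` in the snippet above comes from a launcher such as `torchrun`, which spawns one process per GPU; to keep this example self-contained and runnable without GPUs or a launcher, it instead sets up a single-process "world" on CPU with the `gloo` backend, and the toy `Linear` model stands in for a real network.

```python
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel


def run_ddp_step():
    # Single-process process group on CPU (gloo backend), for illustration.
    # Under torchrun, rank/world_size come from the environment instead.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group("gloo", rank=0, world_size=1)

    model = torch.nn.Linear(10, 1)  # toy model; a real network goes here
    # On CPU, DDP is constructed without device_ids/output_device;
    # on GPU you would pass device_ids=[local_rank] as shown above.
    ddp_model = DistributedDataParallel(model)

    optimizer = torch.optim.SGD(ddp_model.parameters(), lr=0.01)
    inputs = torch.randn(4, 10)
    targets = torch.randn(4, 1)

    loss = torch.nn.functional.mse_loss(ddp_model(inputs), targets)
    loss.backward()   # DDP all-reduces gradients across ranks here
    optimizer.step()

    dist.destroy_process_group()
    return loss.item()


if __name__ == "__main__":
    print(run_ddp_step())
```

With multiple GPUs you would launch this script via `torchrun --nproc_per_node=N`, move the model to `local_rank`'s device before wrapping, and use a `DistributedSampler` so each rank sees a distinct shard of the data.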