PyTorch parallel computing: nn.parallel.replicate, scatter, gather, parallel_apply

import torch
import torch.nn as nn

class DataParallelModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.block1 = nn.Linear(10, 20)

    def forward(self, x):
        x = self.block1(x)
        return x
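The four primitives in the title are the building blocks that nn.DataParallel uses internally: replicate copies a module onto each device, scatter splits the input batch across devices, parallel_apply runs each replica on its chunk, and gather concatenates the results back onto one device. The sketch below (a hypothetical helper named manual_data_parallel, not from the original post) chains them together, with a plain single-device call as a fallback when no GPU is present:

```python
import torch
import torch.nn as nn

def manual_data_parallel(module, x, device_ids):
    # replicate: copy the module's parameters/buffers onto each listed GPU
    replicas = nn.parallel.replicate(module, device_ids)
    # scatter: split the batch (along dim 0) into one chunk per GPU
    inputs = nn.parallel.scatter(x, device_ids)
    # parallel_apply: run each replica on its own chunk, one thread per device
    outputs = nn.parallel.parallel_apply(replicas[: len(inputs)], inputs)
    # gather: concatenate the per-device outputs back onto the first device
    return nn.parallel.gather(outputs, device_ids[0])

model = nn.Linear(10, 20)
x = torch.randn(16, 10)

if torch.cuda.is_available():
    device_ids = list(range(torch.cuda.device_count()))
    y = manual_data_parallel(model.cuda(), x.cuda(), device_ids)
else:
    # single-device fallback: just call the module directly
    y = model(x)

print(y.shape)  # torch.Size([16, 20])
```

Chaining these four calls by hand is exactly what `nn.DataParallel(model)(x)` does in its forward pass, so this decomposition is mainly useful when you need to customize one of the steps.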