Notes on torch.Tensor.copy_(), torch.Tensor.detach(), and torch.Tensor.clone()

Reference: copy_(src, non_blocking=False) → Tensor
Reference: detach()
Reference: clone() → Tensor
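
In short: copy_() copies the elements of src into the calling tensor in place; detach() returns a tensor that shares the same storage but is cut off from the autograd graph (requires_grad=False, no grad_fn); clone() returns a copy in new memory that stays connected to the graph. A minimal sketch of the three calls side by side (the variable names here are mine, not from the experiments below):

import torch

src = torch.randn(3, requires_grad=True)
dst = torch.zeros(3)          # a leaf that does not require grad

dst.copy_(src)                # in-place copy; dst now has grad_fn=<CopyBackwards>
d = src.detach()              # same storage as src, requires_grad=False, no grad_fn
c = src.clone()               # new storage, grad_fn=<CloneBackward>, still linked to src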

Code experiment 1:

Microsoft Windows [Version 10.0.18363.1256]
(c) 2019 Microsoft Corporation. All rights reserved.

C:\Users\chenxuqi>conda activate ssd4pytorch1_2_0

(ssd4pytorch1_2_0) C:\Users\chenxuqi>python
Python 3.7.7 (default, May  6 2020, 11:45:54) [MSC v.1916 64 bit (AMD64)] :: Anaconda, Inc. on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> torch.manual_seed(seed=20200910)
<torch._C.Generator object at 0x000002421C30D330>
>>>
>>> x = torch.randn(3, requires_grad=True)
>>> y = torch.randn(3, requires_grad=True)
>>> x
tensor([ 0.2824, -0.3715,  0.9088], requires_grad=True)
>>> y
tensor([-1.7601, -0.1806,  2.0937], requires_grad=True)
>>> y.copy_(x)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
RuntimeError: a leaf Variable that requires grad has been used in an in-place operation.
>>>
>>> y = torch.randn(3, requires_grad=False)
>>> y
tensor([ 1.0406, -1.7651,  1.1216])
>>> x
tensor([ 0.2824, -0.3715,  0.9088], requires_grad=True)
>>> y
tensor([ 1.0406, -1.7651,  1.1216])
>>> y.copy_(x)
tensor([ 0.2824, -0.3715,  0.9088], grad_fn=<CopyBackwards>)
>>> y
tensor([ 0.2824, -0.3715,  0.9088], grad_fn=<CopyBackwards>)
>>> x
tensor([ 0.2824, -0.3715,  0.9088], requires_grad=True)
>>>
>>>
Microsoft Windows [Version 10.0.18363.1256]
(c) 2019 Microsoft Corporation. All rights reserved.

C:\Users\chenxuqi>conda activate ssd4pytorch1_2_0

(ssd4pytorch1_2_0) C:\Users\chenxuqi>python
Python 3.7.7 (default, May  6 2020, 11:45:54) [MSC v.1916 64 bit (AMD64)] :: Anaconda, Inc. on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> torch.manual_seed(seed=20200910)
<torch._C.Generator object at 0x000001DE002BD330>
>>>
>>> xt = torch.randn(3, requires_grad=True)
>>> xf = torch.randn(3, requires_grad=False)
>>> yt = torch.randn(3, requires_grad=True)
>>> yf = torch.randn(3, requires_grad=False)
>>> xt
tensor([ 0.2824, -0.3715,  0.9088], requires_grad=True)
>>> xf
tensor([-1.7601, -0.1806,  2.0937])
>>> yt
tensor([ 1.0406, -1.7651,  1.1216], requires_grad=True)
>>> yf
tensor([0.8440, 0.1783, 0.6859])
>>>
>>> yt.copy_(xt)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
RuntimeError: a leaf Variable that requires grad has been used in an in-place operation.
>>>
>>> yf.copy_(xf)
tensor([-1.7601, -0.1806,  2.0937])
>>>
>>> yf
tensor([-1.7601, -0.1806,  2.0937])
>>>
>>>
>>>
>>>
>>>
>>>
>>> torch.manual_seed(seed=20200910)
<torch._C.Generator object at 0x000001DE002BD330>
>>> xt = torch.randn(3, requires_grad=True)
>>> xf = torch.randn(3, requires_grad=False)
>>> yt = torch.randn(3, requires_grad=True)
>>> yf = torch.randn(3, requires_grad=False)
>>> xt
tensor([ 0.2824, -0.3715,  0.9088], requires_grad=True)
>>> xf
tensor([-1.7601, -0.1806,  2.0937])
>>> yt
tensor([ 1.0406, -1.7651,  1.1216], requires_grad=True)
>>> yf
tensor([0.8440, 0.1783, 0.6859])
>>>
>>> yf.copy_(xt)
tensor([ 0.2824, -0.3715,  0.9088], grad_fn=<CopyBackwards>)
>>>
>>> yt.copy_(xf)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
RuntimeError: a leaf Variable that requires grad has been used in an in-place operation.
>>>
>>>
>>>
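
What experiment 1 shows: copy_() refuses to write into a leaf tensor that requires grad (the RuntimeError above); copying between two tensors that do not require grad is a plain value copy; and copying a requires_grad source into a destination that does not require grad is recorded by autograd, so the destination ends up with grad_fn=<CopyBackwards>. A minimal sketch of that last case, with the gradient flowing back to the source (names are mine, not from the session above):

import torch

src = torch.randn(3, requires_grad=True)
dst = torch.zeros(3)          # leaf without requires_grad, so the in-place copy is legal

dst.copy_(src)                # dst now carries grad_fn=<CopyBackwards>
dst.sum().backward()          # the copy is differentiable with respect to src
print(src.grad)               # tensor([1., 1., 1.])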

Code experiment 2:

Microsoft Windows [Version 10.0.18363.1256]
(c) 2019 Microsoft Corporation. All rights reserved.

C:\Users\chenxuqi>conda activate ssd4pytorch1_2_0

(ssd4pytorch1_2_0) C:\Users\chenxuqi>python
Python 3.7.7 (default, May  6 2020, 11:45:54) [MSC v.1916 64 bit (AMD64)] :: Anaconda, Inc. on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> torch.manual_seed(seed=20200910)
<torch._C.Generator object at 0x0000018DA812D330>
>>>
>>> x = torch.randn(3, requires_grad=True)
>>> y = torch.randn(3, requires_grad=True)
>>> x
tensor([ 0.2824, -0.3715,  0.9088], requires_grad=True)
>>> y
tensor([-1.7601, -0.1806,  2.0937], requires_grad=True)
>>> y.detach()
tensor([-1.7601, -0.1806,  2.0937])
>>> y
tensor([-1.7601, -0.1806,  2.0937], requires_grad=True)
>>>
>>> y+1
tensor([-0.7601,  0.8194,  3.0937], grad_fn=<AddBackward0>)
>>>
>>> (y+1).detach()
tensor([-0.7601,  0.8194,  3.0937])
>>>
>>>
>>> y = torch.randn(3, requires_grad=False)
>>> y
tensor([ 1.0406, -1.7651,  1.1216])
>>> y+1
tensor([ 2.0406, -0.7651,  2.1216])
>>> y.detach()
tensor([ 1.0406, -1.7651,  1.1216])
>>> (y+1).detach()
tensor([ 2.0406, -0.7651,  2.1216])
>>>
>>>
>>> x
tensor([ 0.2824, -0.3715,  0.9088], requires_grad=True)
>>> y
tensor([ 1.0406, -1.7651,  1.1216])
>>> x.detach_()
tensor([ 0.2824, -0.3715,  0.9088])
>>> y.detach_()
tensor([ 1.0406, -1.7651,  1.1216])
>>>
>>>
>>>
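
What experiment 2 shows: detach() returns a tensor with requires_grad=False and no grad_fn, whether or not the input required grad, and it leaves the original tensor unchanged; detach_() does the same thing in place on the tensor itself. The detached tensor shares storage with the tensor it was detached from, so it is not a copy. A minimal sketch of the gradient stopping at detach() (names are mine):

import torch

x = torch.randn(3, requires_grad=True)
y = (x * 2).detach()          # cut from the graph: requires_grad=False, no grad_fn
z = x * 2                     # kept in the graph for comparison

z.sum().backward()            # fine: the gradient flows back to x
print(x.grad)                 # tensor([2., 2., 2.])
# y.sum().backward()          # would raise: y does not require grad and has no grad_fn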

Code experiment 3:

Microsoft Windows [Version 10.0.18363.1256]
(c) 2019 Microsoft Corporation. All rights reserved.

C:\Users\chenxuqi>conda activate ssd4pytorch1_2_0

(ssd4pytorch1_2_0) C:\Users\chenxuqi>python
Python 3.7.7 (default, May  6 2020, 11:45:54) [MSC v.1916 64 bit (AMD64)] :: Anaconda, Inc. on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> torch.manual_seed(seed=20200910)
<torch._C.Generator object at 0x0000016F955DD330>
>>>
>>>
>>> x = torch.randn(3, requires_grad=True)
>>> y = torch.randn(3, requires_grad=False)
>>>
>>> x
tensor([ 0.2824, -0.3715,  0.9088], requires_grad=True)
>>> y
tensor([-1.7601, -0.1806,  2.0937])
>>>
>>> x.clone()
tensor([ 0.2824, -0.3715,  0.9088], grad_fn=<CloneBackward>)
>>> y.clone()
tensor([-1.7601, -0.1806,  2.0937])
>>>
>>> (x+1).clone()
tensor([1.2824, 0.6285, 1.9088], grad_fn=<CloneBackward>)
>>>
>>> (y+1).clone()
tensor([-0.7601,  0.8194,  3.0937])
>>>
>>>
>>>
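
What experiment 3 shows: clone() always allocates new memory; when the source requires grad (or is produced from a tensor that does), the clone carries grad_fn=<CloneBackward> and gradients flow back through it, while cloning a tensor that does not require grad simply yields a plain copy. A minimal sketch (names are mine):

import torch

x = torch.randn(3, requires_grad=True)
c = x.clone()                         # new storage, grad_fn=<CloneBackward>
print(c.data_ptr() == x.data_ptr())   # False: clone() does not share memory with x

c.sum().backward()                    # the gradient flows back through the clone
print(x.grad)                         # tensor([1., 1., 1.])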

