
Code Notes 17: The difference between requires_grad_() and requires_grad in PyTorch

2022-06-20 01:35:58



Problem

Thanks to PyCharm, I never expected to run into a problem like this. I just wanted to check the requires_grad attribute of the tensors inside some BatchNorm2d layers, and I picked the name from the completion popup, only to discover that requires_grad_() and requires_grad behave completely differently.

Code

requires_grad

    # read-only check: print the flag on each BatchNorm2d's affine parameters
    for m in net.modules():
        if isinstance(m, nn.BatchNorm2d):
            print(m, m.weight.requires_grad, m.bias.requires_grad)

With requires_grad this prints:

BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True) False False
BatchNorm2d(2048, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True) False False
BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True) True True
BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True) True True
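
(For context, a mixed state like this, some BatchNorm2d layers frozen and others trainable, typically comes from freezing part of a backbone for finetuning. A hypothetical sketch of how it can arise; the resnet50 backbone and stage names are my assumption, not from the original post:)

    import torch.nn as nn
    from torchvision.models import resnet50  # assumed backbone, for illustration only

    net = resnet50()
    # freeze the early stages; later stages stay trainable, which yields
    # the mixed False/True pattern printed above
    for name, p in net.named_parameters():
        if name.startswith(("conv1", "bn1", "layer1", "layer2")):
            p.requires_grad = False  # plain attribute assignment, in place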

requires_grad_()

    # note the trailing parentheses: this CALLS Tensor.requires_grad_()
    for m in net.modules():
        if isinstance(m, nn.BatchNorm2d):
            print(m, m.weight.requires_grad_(), m.bias.requires_grad_())

This time the code prints each tensor's values along with its requires_grad attribute:

tensor([2.3888e-01, 2.9136e-01, 3.1615e-01, 2.7122e-01, 2.1731e-01, 3.0903e-01,
        2.2937e-01, 2.3086e-01, 2.1129e-01, 2.8054e-01, 1.9923e-01, 3.1894e-01,
        1.7955e-01, 1.1246e-08, 1.9704e-01, 2.0996e-01, 2.4317e-01, 2.1697e-01,
        1.9415e-01, 3.1569e-01, 1.9648e-01, 2.3214e-01, 2.1962e-01, 2.1633e-01,
        2.4357e-01, 2.9683e-01, 2.3852e-01, 2.1162e-01, 1.4492e-01, 2.9388e-01,
        2.2911e-01, 9.2716e-02, 4.3334e-01, 2.0782e-01, 2.7990e-01, 3.5804e-01,
        2.9315e-01, 2.5306e-01, 2.4210e-01, 2.1755e-01, 3.8645e-01, 2.1003e-01,
        3.6805e-01, 3.3724e-01, 5.0826e-01, 1.9341e-01, 2.3914e-01, 2.6652e-01,
        3.9020e-01, 1.9840e-01, 2.1694e-01, 2.6666e-01, 4.9806e-01, 2.3553e-01,
        2.1349e-01, 2.5951e-01, 2.3547e-01, 1.7579e-01, 4.5354e-01, 1.7102e-01,
        2.4903e-01, 2.5148e-01, 3.8020e-01, 1.9665e-01], requires_grad=True)
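
But look closer: every parameter now reports requires_grad=True, even the ones that printed False a moment ago. That is because requires_grad_() does not just read the flag; it sets it (to True by default) before returning the tensor. Re-running the first, read-only loop would confirm that the frozen layers have been silently un-frozen:

    # after the previous loop, every BatchNorm2d reports True True,
    # including the layers that originally printed False False
    for m in net.modules():
        if isinstance(m, nn.BatchNorm2d):
            print(m, m.weight.requires_grad, m.bias.requires_grad)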

Solution

So I went and read the source code and docstring of requires_grad_():

    def requires_grad_(self: T, requires_grad: bool = True) -> T:
        r"""Change if autograd should record operations on parameters in this
        module.

        This method sets the parameters' :attr:`requires_grad` attributes
        in-place.

        This method is helpful for freezing part of the module for finetuning
        or training parts of a model individually (e.g., GAN training).

        See :ref:`locally-disable-grad-doc` for a comparison between
        `.requires_grad_()` and several similar mechanisms that may be confused with it.

        Args:
            requires_grad (bool): whether autograd should record operations on
                                  parameters in this module. Default: ``True``.

        Returns:
            Module: self
        """
        for p in self.parameters():
            p.requires_grad_(requires_grad)
        return self

OK, so it turns out to be a method. The source above is actually nn.Module.requires_grad_(), which walks self.parameters() and calls the tensor-level version on each of them. What I invoked on m.weight was Tensor.requires_grad_(), which sets that tensor's requires_grad attribute in place (True by default) and returns the tensor itself. That return value is why print() displayed the full tensor.
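
A quick self-contained check of this behavior (the variable names are mine; the semantics match the standard Tensor.requires_grad_ documentation):

    import torch

    t = torch.zeros(3)        # leaf tensor; requires_grad defaults to False
    print(t.requires_grad)    # False: reading the attribute has no side effect

    out = t.requires_grad_()  # in place: sets the flag to True and returns t
    print(out is t)           # True: the very same tensor object
    print(t.requires_grad)    # True: the flag was mutated by the call

    t.requires_grad_(False)   # pass False to turn tracking back off
    print(t.requires_grad)    # False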

So what about requires_grad, then?

It is simply an attribute defined on the Tensor class. Reading it has no side effects; it just reports whether autograd records operations on that tensor.
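
So the inspection I originally wanted needs no parentheses at all, and freezing can likewise be done by plain assignment on a leaf tensor. A minimal sketch:

    import torch.nn as nn

    bn = nn.BatchNorm2d(4)
    print(bn.weight.requires_grad)   # True: reading the attribute, no mutation

    bn.weight.requires_grad = False  # plain assignment is allowed on leaf tensors
    print(bn.weight.requires_grad)   # False: the parameter is now frozen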

Source: https://www.cnblogs.com/HumbleHater/p/16391935.html
