
resnet18代码实现

2021-11-24 19:01:13 · Views: 297 · Source: Internet



ResNet's defining feature is the residual connection, which effectively mitigates vanishing gradients: differentiating x + f(x) with respect to x gives 1 + f'(x), so the gradient always contains a constant term of 1 and cannot shrink to zero through the shortcut path. A practical takeaway for engineering is that the basic building block of modern convolutional networks is a small module of convolution + batch normalization + activation; the remaining hyperparameters are largely the result of tuning. The ResNet18 implementation in this post is based on Paddle 2.2.
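The 1 + f'(x) claim can be checked numerically without any deep-learning framework. The sketch below uses a central finite difference; the function `f` is an arbitrary stand-in for the residual branch, not part of the original code:

```python
# Numerically verify that d/dx [x + f(x)] = 1 + f'(x):
# the gradient through a residual connection keeps a constant term of 1.

def f(x):
    # arbitrary stand-in for the residual branch
    return 0.1 * x ** 3

def residual(x):
    return x + f(x)

def grad(fn, x, h=1e-5):
    # central finite-difference approximation of the derivative
    return (fn(x + h) - fn(x - h)) / (2 * h)

x = 2.0
lhs = grad(residual, x)   # d/dx [x + f(x)]
rhs = 1.0 + grad(f, x)    # 1 + f'(x)
print(abs(lhs - rhs) < 1e-6)  # the two agree
```

At x = 2 both sides evaluate to roughly 1 + 0.3 * 4 = 2.2; even if f'(x) were near zero, the derivative of the residual path would stay near 1.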

import paddle
import paddle.nn as nn

#paddle.set_device('cpu')
class Identity(nn.Layer):
    def __init__(self):
        super().__init__()

    def forward(self, x):
        return x


class Block(nn.Layer):
    def __init__(self, in_dim, out_dim, stride):
        super().__init__()
        # main path: two 3x3 convs, each followed by BN; ReLU is shared
        self.conv1 = nn.Conv2D(in_dim, out_dim, 3, stride, 1)
        self.bn1 = nn.BatchNorm2D(out_dim)
        self.conv2 = nn.Conv2D(out_dim, out_dim, 3, 1, 1)
        self.bn2 = nn.BatchNorm2D(out_dim)
        self.relu = nn.ReLU()
        # projection shortcut (1x1 conv, as in the original ResNet) whenever
        # the spatial size or channel count changes; identity otherwise
        if stride != 1 or in_dim != out_dim:
            downsample = []
            downsample.append(nn.Conv2D(in_dim, out_dim, 1, stride))
            downsample.append(nn.BatchNorm2D(out_dim))
            self.downsample = nn.Sequential(*downsample)
        else:
            self.downsample = Identity()
    def forward(self, x):
        # residual path: conv-bn-relu-conv-bn, then add the shortcut
        h = x
        out = self.conv1(x)
        out = self.bn1(out)
        out = self.relu(out)
        out = self.conv2(out)
        out = self.bn2(out)
        identity = self.downsample(h)
        out += identity
        out = self.relu(out)
        return out

class ResNet18(nn.Layer):
    def __init__(self, in_dim=64, num_classes=1000):
        super().__init__()
        # stem: 7x7 stride-2 conv (padding 3), BN, ReLU, 3x3 stride-2 max pool
        self.in_dim = in_dim
        self.conv = nn.Conv2D(3, 64, 7, 2, 3)
        self.norm = nn.BatchNorm2D(64)
        self.relu = nn.ReLU()
        self.max_pooling = nn.MaxPool2D(kernel_size=3, stride=2, padding=1)
        self.layer1 = self._make_layer(64, 2, 1)
        self.layer2 = self._make_layer(128, 2, 2)
        self.layer3 = self._make_layer(256, 2, 2)
        self.layer4 = self._make_layer(512, 2, 2)
        self.avgpool = nn.AdaptiveAvgPool2D((1, 1))
        self.fc = nn.Linear(512, num_classes)


    def _make_layer(self, out_dim, n_blocks, stride):
        # the first block applies the stage's stride (and any channel change);
        # the remaining n_blocks - 1 blocks keep stride 1
        layers = []
        layers.append(Block(self.in_dim, out_dim, stride))
        self.in_dim = out_dim
        for _ in range(1, n_blocks):
            layers.append(Block(self.in_dim, out_dim, 1))
        return nn.Sequential(*layers)

    def forward(self, x):
        # stem -> four residual stages -> global average pool -> classifier
        x = self.conv(x)
        x = self.norm(x)
        x = self.relu(x)
        x = self.max_pooling(x)
        x = self.layer1(x)
        x = self.layer2(x)
        x = self.layer3(x)
        x = self.layer4(x)
        x = self.avgpool(x)
        x = paddle.flatten(x, 1)
        x = self.fc(x)

        return x
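As a sanity check on the stage strides above, the spatial size at each step can be traced with the standard output-size formula, out = floor((in + 2*pad - kernel) / stride) + 1. This sketch assumes the conventional 224x224 ImageNet input and standard padding (3 for the stem conv, 1 elsewhere); only the first conv of each stage strides, and stride-1 3x3 convs with padding 1 preserve the size, so one step per stage suffices:

```python
def conv_out(size, kernel, stride, pad):
    # standard formula for convolution / pooling output size
    return (size + 2 * pad - kernel) // stride + 1

size = 224
size = conv_out(size, 7, 2, 3)   # stem conv  -> 112
size = conv_out(size, 3, 2, 1)   # max pool   -> 56
for stride in (1, 2, 2, 2):      # layer1..layer4
    size = conv_out(size, 3, stride, 1)
print(size)  # 7, the usual ResNet18 final feature-map size
```

The adaptive average pool then reduces this 7x7 map to 1x1 regardless of the input resolution, which is why the smaller 32x32 input in `main()` below also works.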


def main():
    model = ResNet18()
    print(model)
    x = paddle.randn([2, 3, 32, 32])
    out = model(x)
    print(out.shape)

if __name__ == "__main__":
    main()

Source: https://blog.csdn.net/lanmengyiyu/article/details/121521099
