This post covers the purpose of ResNet (Residual Network) along with example code; hopefully it provides a useful reference for developers working through similar problems.
Table of Contents
- Preface
- Purpose of ResNet (Residual Network)
- ResNet residual block examples
  - TensorFlow version
  - PyTorch version
- Summary
Preface
Batch Normalization (BN) helps with the vanishing-gradient problem that appears as networks get deeper, but once the layer count grows large enough, BN alone is no longer sufficient and training very deep plain networks still degrades. ResNet was proposed to address exactly this, and the rest of this post takes a closer look at the network.
Purpose of ResNet (Residual Network)
ResNet (Residual Network) is a deep neural network architecture proposed by Microsoft Research.
Its defining feature is the residual block, which uses skip connections to counter the vanishing- and exploding-gradient problems that arise when training very deep networks.
This structure lets the network learn an identity mapping easily, which in turn makes very deep networks much easier to train.
The basic building block of ResNet is the residual block, in which the input is added directly to the output through a skip connection.
This design helps information flow through the network and keeps gradients from shrinking excessively as they propagate, so training is more stable.
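To make the mechanism concrete, here is a minimal sketch of what a skip connection computes (illustrative PyTorch, not from the original post): the block outputs F(x) + x, so if the convolutional branch learns F(x) ≈ 0, the block reduces to an identity mapping.

```python
import torch
import torch.nn as nn

x = torch.randn(1, 64, 56, 56)                         # an input feature map
branch = nn.Conv2d(64, 64, kernel_size=3, padding=1)   # stands in for F(x)
y = branch(x) + x                                      # skip connection: output = F(x) + x
print(y.shape)                                         # torch.Size([1, 64, 56, 56])
```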
ResNet residual block examples
TensorFlow version
```python
import tensorflow as tf
from tensorflow.keras.layers import Input, Conv2D, BatchNormalization, ReLU, Add

def residual_block(x, filters, kernel_size=3, stride=1):
    # Main path: conv -> BN -> ReLU -> conv -> BN
    y = Conv2D(filters, kernel_size=kernel_size, strides=stride, padding='same')(x)
    y = BatchNormalization()(y)
    y = ReLU()(y)
    y = Conv2D(filters, kernel_size=kernel_size, padding='same')(y)
    y = BatchNormalization()(y)

    # Skip connection: project the input with a 1x1 conv when the shape changes
    if stride != 1 or x.shape[-1] != filters:
        x = Conv2D(filters, kernel_size=1, strides=stride, padding='same')(x)

    # Element-wise addition, then activation
    out = Add()([x, y])
    out = ReLU()(out)
    return out

# Build a simple ResNet-style model
input_tensor = Input(shape=(224, 224, 3))
x = Conv2D(64, kernel_size=7, strides=2, padding='same')(input_tensor)
x = BatchNormalization()(x)
x = ReLU()(x)

x = residual_block(x, filters=64)
x = residual_block(x, filters=64)
x = residual_block(x, filters=128, stride=2)
x = residual_block(x, filters=128)
x = residual_block(x, filters=256, stride=2)
x = residual_block(x, filters=256)
x = residual_block(x, filters=512, stride=2)
x = residual_block(x, filters=512)

output = tf.keras.layers.GlobalAveragePooling2D()(x)
model = tf.keras.Model(inputs=input_tensor, outputs=output)
model.summary()
```
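The model above ends at the pooled features; in practice you would attach a task-specific head. A minimal sketch, assuming a hypothetical 10-class classification task (the Dense head and optimizer choice here are illustrative, not part of the original example):

```python
from tensorflow.keras.layers import Dense

# Hypothetical 10-class head on top of the pooled features
logits = Dense(10, activation='softmax')(output)
clf = tf.keras.Model(inputs=input_tensor, outputs=logits)
clf.compile(optimizer='adam', loss='sparse_categorical_crossentropy')
print(clf(tf.random.normal((2, 224, 224, 3))).shape)  # dummy forward pass: (2, 10)
```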
In the code above, the residual_block function defines a simple residual block, and the model is built by stacking these blocks.
This is only a small example; real-world ResNets are typically much deeper.
PyTorch version
```python
import torch
import torch.nn as nn

class BasicBlock(nn.Module):
    def __init__(self, in_channels, out_channels, stride=1):
        super(BasicBlock, self).__init__()
        self.conv1 = nn.Conv2d(in_channels, out_channels, kernel_size=3,
                               stride=stride, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(out_channels)
        self.relu = nn.ReLU(inplace=True)
        self.conv2 = nn.Conv2d(out_channels, out_channels, kernel_size=3,
                               stride=1, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_channels)

        # Skip connection: identity by default, 1x1 projection when the
        # spatial size or channel count changes
        self.shortcut = nn.Sequential()
        if stride != 1 or in_channels != out_channels:
            self.shortcut = nn.Sequential(
                nn.Conv2d(in_channels, out_channels, kernel_size=1,
                          stride=stride, bias=False),
                nn.BatchNorm2d(out_channels)
            )

    def forward(self, x):
        residual = x
        out = self.conv1(x)
        out = self.bn1(out)
        out = self.relu(out)
        out = self.conv2(out)
        out = self.bn2(out)
        out += self.shortcut(residual)
        out = self.relu(out)
        return out

class ResNet(nn.Module):
    def __init__(self, block, layers, num_classes=1000):
        super(ResNet, self).__init__()
        self.in_channels = 64
        self.conv1 = nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3, bias=False)
        self.bn1 = nn.BatchNorm2d(64)
        self.relu = nn.ReLU(inplace=True)
        self.maxpool = nn.MaxPool2d(kernel_size=3, stride=2, padding=1)
        self.layer1 = self._make_layer(block, 64, layers[0])
        self.layer2 = self._make_layer(block, 128, layers[1], stride=2)
        self.layer3 = self._make_layer(block, 256, layers[2], stride=2)
        self.layer4 = self._make_layer(block, 512, layers[3], stride=2)
        self.avgpool = nn.AdaptiveAvgPool2d((1, 1))
        self.fc = nn.Linear(512, num_classes)

    def _make_layer(self, block, out_channels, blocks, stride=1):
        # The first block in a stage may downsample; the rest keep the shape
        layers = []
        layers.append(block(self.in_channels, out_channels, stride))
        self.in_channels = out_channels
        for _ in range(1, blocks):
            layers.append(block(out_channels, out_channels))
        return nn.Sequential(*layers)

    def forward(self, x):
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.maxpool(out)
        out = self.layer1(out)
        out = self.layer2(out)
        out = self.layer3(out)
        out = self.layer4(out)
        out = self.avgpool(out)
        out = torch.flatten(out, 1)
        return self.fc(out)
```
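As a quick smoke test (this part is illustrative and not from the original post), the classes above can be assembled into a ResNet-18-style network and run on a dummy batch:

```python
# ResNet-18-style configuration built from the classes defined above
model = ResNet(BasicBlock, [2, 2, 2, 2], num_classes=1000)
x = torch.randn(2, 3, 224, 224)   # batch of two 224x224 RGB images
logits = model(x)
print(logits.shape)               # torch.Size([2, 1000])
```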
Summary
In practice, you can choose a ResNet of a different depth depending on the needs of the task and the compute budget, as the sketch below shows.
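For reference, the BasicBlock variants in the original ResNet paper differ only in how many blocks go into each of the four stages; a sketch using the ResNet class defined above (ResNet-50 and deeper use a different Bottleneck block, which is not defined here):

```python
# Block counts per stage, as in the original paper (He et al., 2015)
resnet18 = ResNet(BasicBlock, [2, 2, 2, 2])   # 18 layers
resnet34 = ResNet(BasicBlock, [3, 4, 6, 3])   # 34 layers
```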
That concludes this post on the purpose and code of ResNet (Residual Network); hopefully it proves helpful.