Dilated Residual Networks

2023-11-10 19:08
Tags: networks, residual, dilated

This post introduces Dilated Residual Networks (DRN), which keep spatial resolution in the later stages of a ResNet by replacing subsampling with dilated convolutions, and walks through the model definition from the official implementation.


The full code is available at: https://github.com/fyu/drn
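Before reading the model definition, it helps to recall what dilation buys: a 3x3 convolution with dilation d samples its nine taps d pixels apart, so it covers a (2d+1)x(2d+1) window with no extra parameters, and with padding=d the spatial size is unchanged. A minimal sketch (my own illustration, not taken from the repo):

import torch
import torch.nn as nn

x = torch.randn(1, 16, 56, 56)
conv_d1 = nn.Conv2d(16, 16, kernel_size=3, padding=1, dilation=1)
conv_d2 = nn.Conv2d(16, 16, kernel_size=3, padding=2, dilation=2)
print(conv_d1(x).shape)  # torch.Size([1, 16, 56, 56])
print(conv_d2(x).shape)  # torch.Size([1, 16, 56, 56]): same size, wider receptive field

This is how DRN keeps an output stride of 8 instead of ResNet's 32: the last two downsampling stages are replaced by stages with dilation 2 and 4. With that in mind, here is the model definition, reformatted from drn.py in the repo: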

import math
import torch.nn as nn

# drn.py aliases this; swap in a synchronized BN for multi-GPU training.
BatchNorm = nn.BatchNorm2d


class DRN(nn.Module):
    """Dilated Residual Network (arch 'C' or 'D').

    `block` is BasicBlock or Bottleneck (defined elsewhere in drn.py);
    `layers` gives the number of blocks in each of the 8 stages. Stages
    5-8 use dilation instead of further striding, so the output stride
    stays at 8. forward() is omitted here; see the full drn.py.
    """

    def __init__(self, block, layers, num_classes=1000,
                 channels=(16, 32, 64, 128, 256, 512, 512, 512),
                 out_map=False, out_middle=False, pool_size=28, arch='D'):
        super(DRN, self).__init__()
        self.inplanes = channels[0]
        self.out_map = out_map
        self.out_dim = channels[-1]
        self.out_middle = out_middle
        self.arch = arch

        if arch == 'C':
            self.conv1 = nn.Conv2d(3, channels[0], kernel_size=7, stride=1,
                                   padding=3, bias=False)
            self.bn1 = BatchNorm(channels[0])
            self.relu = nn.ReLU(inplace=True)

            self.layer1 = self._make_layer(
                BasicBlock, channels[0], layers[0], stride=1)
            self.layer2 = self._make_layer(
                BasicBlock, channels[1], layers[1], stride=2)
        elif arch == 'D':
            self.layer0 = nn.Sequential(
                nn.Conv2d(3, channels[0], kernel_size=7, stride=1, padding=3,
                          bias=False),
                BatchNorm(channels[0]),
                nn.ReLU(inplace=True)
            )

            self.layer1 = self._make_conv_layers(
                channels[0], layers[0], stride=1)
            self.layer2 = self._make_conv_layers(
                channels[1], layers[1], stride=2)

        # Stages 3-4 downsample as in a plain ResNet; stages 5-6 keep the
        # resolution and grow the receptive field with dilation 2 and 4.
        self.layer3 = self._make_layer(block, channels[2], layers[2], stride=2)
        self.layer4 = self._make_layer(block, channels[3], layers[3], stride=2)
        self.layer5 = self._make_layer(block, channels[4], layers[4],
                                       dilation=2, new_level=False)
        self.layer6 = None if layers[5] == 0 else \
            self._make_layer(block, channels[5], layers[5], dilation=4,
                             new_level=False)

        # Stages 7-8 are the "degridding" layers: dilation decays back to 1.
        # Arch 'C' uses non-residual BasicBlocks, arch 'D' plain conv stacks.
        if arch == 'C':
            self.layer7 = None if layers[6] == 0 else \
                self._make_layer(BasicBlock, channels[6], layers[6],
                                 dilation=2, new_level=False, residual=False)
            self.layer8 = None if layers[7] == 0 else \
                self._make_layer(BasicBlock, channels[7], layers[7],
                                 dilation=1, new_level=False, residual=False)
        elif arch == 'D':
            self.layer7 = None if layers[6] == 0 else \
                self._make_conv_layers(channels[6], layers[6], dilation=2)
            self.layer8 = None if layers[7] == 0 else \
                self._make_conv_layers(channels[7], layers[7], dilation=1)

        if num_classes > 0:
            self.avgpool = nn.AvgPool2d(pool_size)
            # A 1x1 convolution acts as the classifier so the network can
            # also emit a dense class map (out_map=True) for segmentation.
            self.fc = nn.Conv2d(self.out_dim, num_classes, kernel_size=1,
                                stride=1, padding=0, bias=True)
        for m in self.modules():
            if isinstance(m, nn.Conv2d):
                n = m.kernel_size[0] * m.kernel_size[1] * m.out_channels
                m.weight.data.normal_(0, math.sqrt(2. / n))  # He init
            elif isinstance(m, BatchNorm):
                m.weight.data.fill_(1)
                m.bias.data.zero_()

    def _make_layer(self, block, planes, blocks, stride=1, dilation=1,
                    new_level=True, residual=True):
        assert dilation == 1 or dilation % 2 == 0
        downsample = None
        if stride != 1 or self.inplanes != planes * block.expansion:
            downsample = nn.Sequential(
                nn.Conv2d(self.inplanes, planes * block.expansion,
                          kernel_size=1, stride=stride, bias=False),
                BatchNorm(planes * block.expansion),
            )

        layers = list()
        # When entering a new dilation level, the first block ramps up from
        # half the target dilation to smooth the transition.
        layers.append(block(
            self.inplanes, planes, stride, downsample,
            dilation=(1, 1) if dilation == 1 else (
                dilation // 2 if new_level else dilation, dilation),
            residual=residual))
        self.inplanes = planes * block.expansion
        for i in range(1, blocks):
            layers.append(block(self.inplanes, planes, residual=residual,
                                dilation=(dilation, dilation)))

        return nn.Sequential(*layers)

    def _make_conv_layers(self, channels, convs, stride=1, dilation=1):
        modules = []
        for i in range(convs):
            modules.extend([
                nn.Conv2d(self.inplanes, channels, kernel_size=3,
                          stride=stride if i == 0 else 1,
                          padding=dilation, bias=False, dilation=dilation),
                BatchNorm(channels),
                nn.ReLU(inplace=True)])
            self.inplanes = channels
        return nn.Sequential(*modules)
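The class above expects a residual block type that is defined elsewhere in drn.py. The condensed BasicBlock below is consistent with how _make_layer calls it (the repo's version also provides a Bottleneck), and the layer counts [1, 1, 2, 2, 2, 2, 1, 1] correspond, if I read the repo correctly, to its DRN-D-22 variant. Since forward() is not shown in the excerpt, this sanity-check sketch chains the stages by hand and reuses nn and BatchNorm from above:

import torch


class BasicBlock(nn.Module):
    # Condensed from drn.py: a two-conv residual block whose convolutions
    # can be dilated, and whose skip connection can be disabled.
    expansion = 1

    def __init__(self, inplanes, planes, stride=1, downsample=None,
                 dilation=(1, 1), residual=True):
        super(BasicBlock, self).__init__()
        self.conv1 = nn.Conv2d(inplanes, planes, kernel_size=3, stride=stride,
                               padding=dilation[0], dilation=dilation[0],
                               bias=False)
        self.bn1 = BatchNorm(planes)
        self.relu = nn.ReLU(inplace=True)
        self.conv2 = nn.Conv2d(planes, planes, kernel_size=3,
                               padding=dilation[1], dilation=dilation[1],
                               bias=False)
        self.bn2 = BatchNorm(planes)
        self.downsample = downsample
        self.residual = residual

    def forward(self, x):
        identity = x
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        if self.downsample is not None:
            identity = self.downsample(x)
        if self.residual:
            out = out + identity
        return self.relu(out)


model = DRN(BasicBlock, [1, 1, 2, 2, 2, 2, 1, 1], num_classes=1000, arch='D')
x = torch.randn(1, 3, 224, 224)
x = model.layer0(x)
for stage in (model.layer1, model.layer2, model.layer3, model.layer4,
              model.layer5, model.layer6, model.layer7, model.layer8):
    if stage is not None:
        x = stage(x)
print(x.shape)  # torch.Size([1, 512, 28, 28]): output stride 8, not 32
logits = model.fc(model.avgpool(x)).view(x.size(0), -1)
print(logits.shape)  # torch.Size([1, 1000])

Note that the feature map handed to the classifier is 28x28 (hence pool_size=28 by default), eight times larger in each dimension than a standard ResNet's 7x7, which is what makes DRN directly useful for dense prediction.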

That concludes this look at Dilated Residual Networks; I hope the annotated model definition is useful.



