
In PyTorch, named_children, named_modules, and named_parameters are three different methods for inspecting a neural network model's components and parameters. Their roles and differences are as follows (a short comparison sketch follows the list):

  • named_parameters: recursively lists every parameter name together with its tensor
  • named_modules: recursively lists every submodule; the first entry returned is the model itself
  • named_children: lists only the model's first-level submodules, without recursing any deeper
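As a quick illustration of the difference in scope, the minimal sketch below (an addition, not part of the original write-up) simply counts how many entries each generator yields for a resnet18 model:

from torchvision.models import resnet18

model = resnet18()

# named_children stops at the first level, while named_modules recurses
# into every nested submodule (and also yields the model itself as entry "").
print('children  :', len(list(model.named_children())))
print('modules   :', len(list(model.named_modules())))
print('parameters:', len(list(model.named_parameters())))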

1. named_children:

  • named_children returns a generator that yields (name, module) pairs for every direct submodule of the model.
  • It covers only one level of submodules and does not recurse into deeper levels.
  • It is typically used to iterate over the model's direct submodules in order to inspect or operate on them.

Example:

from torchvision.models import resnet18

model = resnet18()
# Print each layer name and its module
# Note that the named_children method only returns the first-level submodules
for name, layer in model.named_children():
    print(name.ljust(10), '-->', type(layer))

Output:

conv1      --> <class 'torch.nn.modules.conv.Conv2d'>
bn1        --> <class 'torch.nn.modules.batchnorm.BatchNorm2d'>
relu       --> <class 'torch.nn.modules.activation.ReLU'>
maxpool    --> <class 'torch.nn.modules.pooling.MaxPool2d'>
layer1     --> <class 'torch.nn.modules.container.Sequential'>
layer2     --> <class 'torch.nn.modules.container.Sequential'>
layer3     --> <class 'torch.nn.modules.container.Sequential'>
layer4     --> <class 'torch.nn.modules.container.Sequential'>
avgpool    --> <class 'torch.nn.modules.pooling.AdaptiveAvgPool2d'>
fc         --> <class 'torch.nn.modules.linear.Linear'>
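Because named_children walks only the top level, it is convenient for rebuilding the network up to a given block. The following sketch is an illustrative use (not part of the original article): it keeps every top-level child except the final fc layer to obtain a 512-dimensional feature extractor:

import torch
import torch.nn as nn
from torchvision.models import resnet18

model = resnet18()

# Collect all direct children except the classification head 'fc'.
feature_layers = [layer for name, layer in model.named_children() if name != 'fc']
feature_extractor = nn.Sequential(*feature_layers)

x = torch.randn(1, 3, 224, 224)            # dummy input batch
features = feature_extractor(x).flatten(1)
print(features.shape)                      # torch.Size([1, 512])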

2. named_modules:

  • named_modules returns a generator that yields (name, module) pairs for every module in the model, including submodules of submodules.
  • It traverses the entire model recursively and returns the name of and a reference to every module.
  • It is useful whenever you need to operate on or inspect all of the model's modules, regardless of how deeply they are nested.

Example:

from torchvision.models import resnet18

model = resnet18()
# The first entry is the model itself
for name, layer in model.named_modules():
    print(name.ljust(15), '-->', type(layer))

Output:

                --> <class 'torchvision.models.resnet.ResNet'>
conv1           --> <class 'torch.nn.modules.conv.Conv2d'>
bn1             --> <class 'torch.nn.modules.batchnorm.BatchNorm2d'>
relu            --> <class 'torch.nn.modules.activation.ReLU'>
maxpool         --> <class 'torch.nn.modules.pooling.MaxPool2d'>
layer1          --> <class 'torch.nn.modules.container.Sequential'>
layer1.0        --> <class 'torchvision.models.resnet.BasicBlock'>
layer1.0.conv1  --> <class 'torch.nn.modules.conv.Conv2d'>
layer1.0.bn1    --> <class 'torch.nn.modules.batchnorm.BatchNorm2d'>
layer1.0.relu   --> <class 'torch.nn.modules.activation.ReLU'>
layer1.0.conv2  --> <class 'torch.nn.modules.conv.Conv2d'>
layer1.0.bn2    --> <class 'torch.nn.modules.batchnorm.BatchNorm2d'>
layer1.1        --> <class 'torchvision.models.resnet.BasicBlock'>
layer1.1.conv1  --> <class 'torch.nn.modules.conv.Conv2d'>
layer1.1.bn1    --> <class 'torch.nn.modules.batchnorm.BatchNorm2d'>
layer1.1.relu   --> <class 'torch.nn.modules.activation.ReLU'>
layer1.1.conv2  --> <class 'torch.nn.modules.conv.Conv2d'>
layer1.1.bn2    --> <class 'torch.nn.modules.batchnorm.BatchNorm2d'>
layer2          --> <class 'torch.nn.modules.container.Sequential'>
layer2.0        --> <class 'torchvision.models.resnet.BasicBlock'>
layer2.0.conv1  --> <class 'torch.nn.modules.conv.Conv2d'>
layer2.0.bn1    --> <class 'torch.nn.modules.batchnorm.BatchNorm2d'>
layer2.0.relu   --> <class 'torch.nn.modules.activation.ReLU'>
layer2.0.conv2  --> <class 'torch.nn.modules.conv.Conv2d'>
layer2.0.bn2    --> <class 'torch.nn.modules.batchnorm.BatchNorm2d'>
layer2.0.downsample --> <class 'torch.nn.modules.container.Sequential'>
layer2.0.downsample.0 --> <class 'torch.nn.modules.conv.Conv2d'>
layer2.0.downsample.1 --> <class 'torch.nn.modules.batchnorm.BatchNorm2d'>
layer2.1        --> <class 'torchvision.models.resnet.BasicBlock'>
layer2.1.conv1  --> <class 'torch.nn.modules.conv.Conv2d'>
layer2.1.bn1    --> <class 'torch.nn.modules.batchnorm.BatchNorm2d'>
layer2.1.relu   --> <class 'torch.nn.modules.activation.ReLU'>
layer2.1.conv2  --> <class 'torch.nn.modules.conv.Conv2d'>
layer2.1.bn2    --> <class 'torch.nn.modules.batchnorm.BatchNorm2d'>
layer3          --> <class 'torch.nn.modules.container.Sequential'>
layer3.0        --> <class 'torchvision.models.resnet.BasicBlock'>
layer3.0.conv1  --> <class 'torch.nn.modules.conv.Conv2d'>
layer3.0.bn1    --> <class 'torch.nn.modules.batchnorm.BatchNorm2d'>
layer3.0.relu   --> <class 'torch.nn.modules.activation.ReLU'>
layer3.0.conv2  --> <class 'torch.nn.modules.conv.Conv2d'>
layer3.0.bn2    --> <class 'torch.nn.modules.batchnorm.BatchNorm2d'>
layer3.0.downsample --> <class 'torch.nn.modules.container.Sequential'>
layer3.0.downsample.0 --> <class 'torch.nn.modules.conv.Conv2d'>
layer3.0.downsample.1 --> <class 'torch.nn.modules.batchnorm.BatchNorm2d'>
layer3.1        --> <class 'torchvision.models.resnet.BasicBlock'>
layer3.1.conv1  --> <class 'torch.nn.modules.conv.Conv2d'>
layer3.1.bn1    --> <class 'torch.nn.modules.batchnorm.BatchNorm2d'>
layer3.1.relu   --> <class 'torch.nn.modules.activation.ReLU'>
layer3.1.conv2  --> <class 'torch.nn.modules.conv.Conv2d'>
layer3.1.bn2    --> <class 'torch.nn.modules.batchnorm.BatchNorm2d'>
layer4          --> <class 'torch.nn.modules.container.Sequential'>
layer4.0        --> <class 'torchvision.models.resnet.BasicBlock'>
layer4.0.conv1  --> <class 'torch.nn.modules.conv.Conv2d'>
layer4.0.bn1    --> <class 'torch.nn.modules.batchnorm.BatchNorm2d'>
layer4.0.relu   --> <class 'torch.nn.modules.activation.ReLU'>
layer4.0.conv2  --> <class 'torch.nn.modules.conv.Conv2d'>
layer4.0.bn2    --> <class 'torch.nn.modules.batchnorm.BatchNorm2d'>
layer4.0.downsample --> <class 'torch.nn.modules.container.Sequential'>
layer4.0.downsample.0 --> <class 'torch.nn.modules.conv.Conv2d'>
layer4.0.downsample.1 --> <class 'torch.nn.modules.batchnorm.BatchNorm2d'>
layer4.1        --> <class 'torchvision.models.resnet.BasicBlock'>
layer4.1.conv1  --> <class 'torch.nn.modules.conv.Conv2d'>
layer4.1.bn1    --> <class 'torch.nn.modules.batchnorm.BatchNorm2d'>
layer4.1.relu   --> <class 'torch.nn.modules.activation.ReLU'>
layer4.1.conv2  --> <class 'torch.nn.modules.conv.Conv2d'>
layer4.1.bn2    --> <class 'torch.nn.modules.batchnorm.BatchNorm2d'>
avgpool         --> <class 'torch.nn.modules.pooling.AdaptiveAvgPool2d'>
fc              --> <class 'torch.nn.modules.linear.Linear'>
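A typical use of this recursive traversal is to locate every module of a given type, no matter how deeply it is nested. The sketch below is an illustrative addition: it switches every BatchNorm2d layer to eval mode, a common trick when fine-tuning with very small batch sizes:

import torch.nn as nn
from torchvision.models import resnet18

model = resnet18()

# named_modules yields nested entries such as 'layer1.0.bn1',
# so every BatchNorm2d in the network is reached here.
for name, module in model.named_modules():
    if isinstance(module, nn.BatchNorm2d):
        module.eval()
        print('frozen running stats:', name)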

3. named_parameters:

  • named_parameters returns a generator that yields (name, parameter) pairs for every parameter in the model.
  • It traverses the model recursively and returns the name and tensor of every trainable parameter.
  • It is used to access and inspect the model's parameters, for example when printing them, saving a model, or loading one.

Example:

from torchvision.models import resnet18

model = resnet18()
# Print each parameter name and the shape of its tensor
for name, param in model.named_parameters():
    print(name.ljust(25), '-->', param.shape)

Output:

conv1.weight              --> torch.Size([64, 3, 7, 7])
bn1.weight                --> torch.Size([64])
bn1.bias                  --> torch.Size([64])
layer1.0.conv1.weight     --> torch.Size([64, 64, 3, 3])
layer1.0.bn1.weight       --> torch.Size([64])
layer1.0.bn1.bias         --> torch.Size([64])
layer1.0.conv2.weight     --> torch.Size([64, 64, 3, 3])
layer1.0.bn2.weight       --> torch.Size([64])
layer1.0.bn2.bias         --> torch.Size([64])
layer1.1.conv1.weight     --> torch.Size([64, 64, 3, 3])
layer1.1.bn1.weight       --> torch.Size([64])
layer1.1.bn1.bias         --> torch.Size([64])
layer1.1.conv2.weight     --> torch.Size([64, 64, 3, 3])
layer1.1.bn2.weight       --> torch.Size([64])
layer1.1.bn2.bias         --> torch.Size([64])
layer2.0.conv1.weight     --> torch.Size([128, 64, 3, 3])
layer2.0.bn1.weight       --> torch.Size([128])
layer2.0.bn1.bias         --> torch.Size([128])
layer2.0.conv2.weight     --> torch.Size([128, 128, 3, 3])
layer2.0.bn2.weight       --> torch.Size([128])
layer2.0.bn2.bias         --> torch.Size([128])
layer2.0.downsample.0.weight --> torch.Size([128, 64, 1, 1])
layer2.0.downsample.1.weight --> torch.Size([128])
layer2.0.downsample.1.bias --> torch.Size([128])
layer2.1.conv1.weight     --> torch.Size([128, 128, 3, 3])
layer2.1.bn1.weight       --> torch.Size([128])
layer2.1.bn1.bias         --> torch.Size([128])
layer2.1.conv2.weight     --> torch.Size([128, 128, 3, 3])
layer2.1.bn2.weight       --> torch.Size([128])
layer2.1.bn2.bias         --> torch.Size([128])
layer3.0.conv1.weight     --> torch.Size([256, 128, 3, 3])
layer3.0.bn1.weight       --> torch.Size([256])
layer3.0.bn1.bias         --> torch.Size([256])
layer3.0.conv2.weight     --> torch.Size([256, 256, 3, 3])
layer3.0.bn2.weight       --> torch.Size([256])
layer3.0.bn2.bias         --> torch.Size([256])
layer3.0.downsample.0.weight --> torch.Size([256, 128, 1, 1])
layer3.0.downsample.1.weight --> torch.Size([256])
layer3.0.downsample.1.bias --> torch.Size([256])
layer3.1.conv1.weight     --> torch.Size([256, 256, 3, 3])
layer3.1.bn1.weight       --> torch.Size([256])
layer3.1.bn1.bias         --> torch.Size([256])
layer3.1.conv2.weight     --> torch.Size([256, 256, 3, 3])
layer3.1.bn2.weight       --> torch.Size([256])
layer3.1.bn2.bias         --> torch.Size([256])
layer4.0.conv1.weight     --> torch.Size([512, 256, 3, 3])
layer4.0.bn1.weight       --> torch.Size([512])
layer4.0.bn1.bias         --> torch.Size([512])
layer4.0.conv2.weight     --> torch.Size([512, 512, 3, 3])
layer4.0.bn2.weight       --> torch.Size([512])
layer4.0.bn2.bias         --> torch.Size([512])
layer4.0.downsample.0.weight --> torch.Size([512, 256, 1, 1])
layer4.0.downsample.1.weight --> torch.Size([512])
layer4.0.downsample.1.bias --> torch.Size([512])
layer4.1.conv1.weight     --> torch.Size([512, 512, 3, 3])
layer4.1.bn1.weight       --> torch.Size([512])
layer4.1.bn1.bias         --> torch.Size([512])
layer4.1.conv2.weight     --> torch.Size([512, 512, 3, 3])
layer4.1.bn2.weight       --> torch.Size([512])
layer4.1.bn2.bias         --> torch.Size([512])
fc.weight                 --> torch.Size([1000, 512])
fc.bias                   --> torch.Size([1000])
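Because the parameter names encode the module path, they are handy for selectively freezing parts of the network. The sketch below is an illustrative addition (not from the original article): it disables gradients for everything except the final fc layer, a common transfer-learning setup:

from torchvision.models import resnet18

model = resnet18()

# Freeze every parameter whose name does not start with 'fc',
# leaving only the classification head trainable.
for name, param in model.named_parameters():
    param.requires_grad = name.startswith('fc')

trainable = [n for n, p in model.named_parameters() if p.requires_grad]
print(trainable)  # ['fc.weight', 'fc.bias']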

To summarize: named_children retrieves the model's direct submodules, named_modules retrieves every module in the model (including nested submodules), and named_parameters retrieves all of the model's parameters. These methods are very useful when debugging, analyzing, and optimizing models.
