2021-05-16

Original post by Hi0703, last modified 2021-05-17 03:03:01.
1. On NCCL, CUDA, etc.

A summary of methods for training large neural networks: https://blog.csdn.net/xixiaoyaoww/article/details/104645796/

2. Difference between running on CPU and GPU

For details, see the article "pytorch中model.to(device)和map_location=device的区别" (the difference between model.to(device) and map_location=device in PyTorch); a short sketch is given below.
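A minimal sketch of the distinction (the checkpoint path 'model.pth' and the toy module below are placeholders, not taken from the article): map_location tells torch.load on which device to deserialize the tensors stored in the file, while model.to(device) moves an already-constructed module's parameters and buffers.

import torch
import torch.nn as nn

device = torch.device('cuda:0' if torch.cuda.is_available() else 'cpu')

# map_location only affects torch.load: it decides where the tensors saved in
# the .pth file end up (e.g. remap GPU-saved weights onto a CPU-only machine).
ckpt = torch.load('model.pth', map_location=device)  # placeholder path

# model.to(device) moves the parameters/buffers of a module that has already
# been built; it has nothing to do with how the checkpoint file is read.
model = nn.Linear(8, 2)  # stand-in module for illustration
model.to(device)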

3. Checking GPU memory usage on the server

Link: https://zhuanlan.zhihu.com/p/266586826 (a quick sketch follows)
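As a quick sketch (not from the linked article), GPU memory usage can be checked either with nvidia-smi or from inside PyTorch:

import subprocess
import torch

# Driver-level view per GPU (the same numbers nvidia-smi prints interactively).
print(subprocess.run(
    ['nvidia-smi', '--query-gpu=index,memory.used,memory.total', '--format=csv'],
    capture_output=True, text=True).stdout)

# Memory held by the current PyTorch process on GPU 0.
if torch.cuda.is_available():
    print(torch.cuda.memory_allocated(0) / 1024 ** 2, 'MiB allocated')
    print(torch.cuda.memory_reserved(0) / 1024 ** 2, 'MiB reserved by the caching allocator')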

4. The current problem: the node names shown by the tool I used last time are not in the same form as in the code, and I have not found an article that explains each node; without knowing the names, they cannot be used in the code.

The corresponding bug is a KeyError.

Article: "pytorch中保存的模型文件.pth深入解析" (an in-depth look at the .pth model files PyTorch saves)

Link: https://blog.csdn.net/qq_27825451/article/details/100773473?utm_term=读取pth文件&utm_medium=distribute.pc_aggpage_search_result.none-task-blog-2~all~sobaiduweb~default-4-100773473&spm=3001.4430

Following that article, I added a snippet that prints every node in the .pth file.

The code is as follows:

# Inspecting a pretrained .pth checkpoint
import torch

pthfile = r'/home/xx/CrowdDet/tools/data/model/rcnn_fpn_baseline.pth'  # faster_rcnn_ckpt.pth
net = torch.load(pthfile, map_location=torch.device('cuda:0'))
# net = torch.load(pthfile, map_location=torch.device('cpu'))

print(type(net))  # the checkpoint is a plain dict
print(len(net))   # length 3, i.e. three key-value pairs

for k in net.keys():
    print(k)      # the three keys: epoch, state_dict, optimizer

print('\n')

# print(net["model"])  # returns an OrderedDict object
for key, value in net["state_dict"].items():
    print(key, value.size(), sep="   ")

print('\n' + 'state_dict打印完毕' + '\n')

The shell output at the end looks like this:

<class 'dict'>
3
epoch
state_dict
optimizer


resnet50.conv1.weight   torch.Size([64, 3, 7, 7])
resnet50.bn1.weight   torch.Size([64])
resnet50.bn1.bias   torch.Size([64])
resnet50.bn1.running_mean   torch.Size([64])
resnet50.bn1.running_var   torch.Size([64])
resnet50.layer1.0.downsample.0.weight   torch.Size([256, 64, 1, 1])
resnet50.layer1.0.downsample.1.weight   torch.Size([256])
resnet50.layer1.0.downsample.1.bias   torch.Size([256])
resnet50.layer1.0.downsample.1.running_mean   torch.Size([256])
resnet50.layer1.0.downsample.1.running_var   torch.Size([256])
resnet50.layer1.0.conv1.weight   torch.Size([64, 64, 1, 1])
resnet50.layer1.0.bn1.weight   torch.Size([64])
resnet50.layer1.0.bn1.bias   torch.Size([64])
resnet50.layer1.0.bn1.running_mean   torch.Size([64])
resnet50.layer1.0.bn1.running_var   torch.Size([64])
resnet50.layer1.0.conv2.weight   torch.Size([64, 64, 3, 3])
resnet50.layer1.0.bn2.weight   torch.Size([64])
resnet50.layer1.0.bn2.bias   torch.Size([64])
resnet50.layer1.0.bn2.running_mean   torch.Size([64])
resnet50.layer1.0.bn2.running_var   torch.Size([64])
resnet50.layer1.0.conv3.weight   torch.Size([256, 64, 1, 1])
resnet50.layer1.0.bn3.weight   torch.Size([256])
resnet50.layer1.0.bn3.bias   torch.Size([256])
resnet50.layer1.0.bn3.running_mean   torch.Size([256])
resnet50.layer1.0.bn3.running_var   torch.Size([256])
resnet50.layer1.1.conv1.weight   torch.Size([64, 256, 1, 1])
resnet50.layer1.1.bn1.weight   torch.Size([64])
resnet50.layer1.1.bn1.bias   torch.Size([64])
resnet50.layer1.1.bn1.running_mean   torch.Size([64])
resnet50.layer1.1.bn1.running_var   torch.Size([64])
resnet50.layer1.1.conv2.weight   torch.Size([64, 64, 3, 3])
resnet50.layer1.1.bn2.weight   torch.Size([64])
resnet50.layer1.1.bn2.bias   torch.Size([64])
resnet50.layer1.1.bn2.running_mean   torch.Size([64])
resnet50.layer1.1.bn2.running_var   torch.Size([64])
resnet50.layer1.1.conv3.weight   torch.Size([256, 64, 1, 1])
resnet50.layer1.1.bn3.weight   torch.Size([256])
resnet50.layer1.1.bn3.bias   torch.Size([256])
resnet50.layer1.1.bn3.running_mean   torch.Size([256])
resnet50.layer1.1.bn3.running_var   torch.Size([256])
resnet50.layer1.2.conv1.weight   torch.Size([64, 256, 1, 1])
resnet50.layer1.2.bn1.weight   torch.Size([64])
resnet50.layer1.2.bn1.bias   torch.Size([64])
resnet50.layer1.2.bn1.running_mean   torch.Size([64])
resnet50.layer1.2.bn1.running_var   torch.Size([64])
resnet50.layer1.2.conv2.weight   torch.Size([64, 64, 3, 3])
resnet50.layer1.2.bn2.weight   torch.Size([64])
resnet50.layer1.2.bn2.bias   torch.Size([64])
resnet50.layer1.2.bn2.running_mean   torch.Size([64])
resnet50.layer1.2.bn2.running_var   torch.Size([64])
resnet50.layer1.2.conv3.weight   torch.Size([256, 64, 1, 1])
resnet50.layer1.2.bn3.weight   torch.Size([256])
resnet50.layer1.2.bn3.bias   torch.Size([256])
resnet50.layer1.2.bn3.running_mean   torch.Size([256])
resnet50.layer1.2.bn3.running_var   torch.Size([256])
resnet50.layer2.0.downsample.0.weight   torch.Size([512, 256, 1, 1])
resnet50.layer2.0.downsample.1.weight   torch.Size([512])
resnet50.layer2.0.downsample.1.bias   torch.Size([512])
resnet50.layer2.0.downsample.1.running_mean   torch.Size([512])
resnet50.layer2.0.downsample.1.running_var   torch.Size([512])
resnet50.layer2.0.conv1.weight   torch.Size([128, 256, 1, 1])
resnet50.layer2.0.bn1.weight   torch.Size([128])
resnet50.layer2.0.bn1.bias   torch.Size([128])
resnet50.layer2.0.bn1.running_mean   torch.Size([128])
resnet50.layer2.0.bn1.running_var   torch.Size([128])
resnet50.layer2.0.conv2.weight   torch.Size([128, 128, 3, 3])
resnet50.layer2.0.bn2.weight   torch.Size([128])
resnet50.layer2.0.bn2.bias   torch.Size([128])
resnet50.layer2.0.bn2.running_mean   torch.Size([128])
resnet50.layer2.0.bn2.running_var   torch.Size([128])
resnet50.layer2.0.conv3.weight   torch.Size([512, 128, 1, 1])
resnet50.layer2.0.bn3.weight   torch.Size([512])
resnet50.layer2.0.bn3.bias   torch.Size([512])
resnet50.layer2.0.bn3.running_mean   torch.Size([512])
resnet50.layer2.0.bn3.running_var   torch.Size([512])
resnet50.layer2.1.conv1.weight   torch.Size([128, 512, 1, 1])
resnet50.layer2.1.bn1.weight   torch.Size([128])
resnet50.layer2.1.bn1.bias   torch.Size([128])
resnet50.layer2.1.bn1.running_mean   torch.Size([128])
resnet50.layer2.1.bn1.running_var   torch.Size([128])
resnet50.layer2.1.conv2.weight   torch.Size([128, 128, 3, 3])
resnet50.layer2.1.bn2.weight   torch.Size([128])
resnet50.layer2.1.bn2.bias   torch.Size([128])
resnet50.layer2.1.bn2.running_mean   torch.Size([128])
resnet50.layer2.1.bn2.running_var   torch.Size([128])
resnet50.layer2.1.conv3.weight   torch.Size([512, 128, 1, 1])
resnet50.layer2.1.bn3.weight   torch.Size([512])
resnet50.layer2.1.bn3.bias   torch.Size([512])
resnet50.layer2.1.bn3.running_mean   torch.Size([512])
resnet50.layer2.1.bn3.running_var   torch.Size([512])
resnet50.layer2.2.conv1.weight   torch.Size([128, 512, 1, 1])
resnet50.layer2.2.bn1.weight   torch.Size([128])
resnet50.layer2.2.bn1.bias   torch.Size([128])
resnet50.layer2.2.bn1.running_mean   torch.Size([128])
resnet50.layer2.2.bn1.running_var   torch.Size([128])
resnet50.layer2.2.conv2.weight   torch.Size([128, 128, 3, 3])
resnet50.layer2.2.bn2.weight   torch.Size([128])
resnet50.layer2.2.bn2.bias   torch.Size([128])
resnet50.layer2.2.bn2.running_mean   torch.Size([128])
resnet50.layer2.2.bn2.running_var   torch.Size([128])
resnet50.layer2.2.conv3.weight   torch.Size([512, 128, 1, 1])
resnet50.layer2.2.bn3.weight   torch.Size([512])
resnet50.layer2.2.bn3.bias   torch.Size([512])
resnet50.layer2.2.bn3.running_mean   torch.Size([512])
resnet50.layer2.2.bn3.running_var   torch.Size([512])
resnet50.layer2.3.conv1.weight   torch.Size([128, 512, 1, 1])
resnet50.layer2.3.bn1.weight   torch.Size([128])
resnet50.layer2.3.bn1.bias   torch.Size([128])
resnet50.layer2.3.bn1.running_mean   torch.Size([128])
resnet50.layer2.3.bn1.running_var   torch.Size([128])
resnet50.layer2.3.conv2.weight   torch.Size([128, 128, 3, 3])
resnet50.layer2.3.bn2.weight   torch.Size([128])
resnet50.layer2.3.bn2.bias   torch.Size([128])
resnet50.layer2.3.bn2.running_mean   torch.Size([128])
resnet50.layer2.3.bn2.running_var   torch.Size([128])
resnet50.layer2.3.conv3.weight   torch.Size([512, 128, 1, 1])
resnet50.layer2.3.bn3.weight   torch.Size([512])
resnet50.layer2.3.bn3.bias   torch.Size([512])
resnet50.layer2.3.bn3.running_mean   torch.Size([512])
resnet50.layer2.3.bn3.running_var   torch.Size([512])
resnet50.layer3.0.downsample.0.weight   torch.Size([1024, 512, 1, 1])
resnet50.layer3.0.downsample.1.weight   torch.Size([1024])
resnet50.layer3.0.downsample.1.bias   torch.Size([1024])
resnet50.layer3.0.downsample.1.running_mean   torch.Size([1024])
resnet50.layer3.0.downsample.1.running_var   torch.Size([1024])
resnet50.layer3.0.conv1.weight   torch.Size([256, 512, 1, 1])
resnet50.layer3.0.bn1.weight   torch.Size([256])
resnet50.layer3.0.bn1.bias   torch.Size([256])
resnet50.layer3.0.bn1.running_mean   torch.Size([256])
resnet50.layer3.0.bn1.running_var   torch.Size([256])
resnet50.layer3.0.conv2.weight   torch.Size([256, 256, 3, 3])
resnet50.layer3.0.bn2.weight   torch.Size([256])
resnet50.layer3.0.bn2.bias   torch.Size([256])
resnet50.layer3.0.bn2.running_mean   torch.Size([256])
resnet50.layer3.0.bn2.running_var   torch.Size([256])
resnet50.layer3.0.conv3.weight   torch.Size([1024, 256, 1, 1])
resnet50.layer3.0.bn3.weight   torch.Size([1024])
resnet50.layer3.0.bn3.bias   torch.Size([1024])
resnet50.layer3.0.bn3.running_mean   torch.Size([1024])
resnet50.layer3.0.bn3.running_var   torch.Size([1024])
resnet50.layer3.1.conv1.weight   torch.Size([256, 1024, 1, 1])
resnet50.layer3.1.bn1.weight   torch.Size([256])
resnet50.layer3.1.bn1.bias   torch.Size([256])
resnet50.layer3.1.bn1.running_mean   torch.Size([256])
resnet50.layer3.1.bn1.running_var   torch.Size([256])
resnet50.layer3.1.conv2.weight   torch.Size([256, 256, 3, 3])
resnet50.layer3.1.bn2.weight   torch.Size([256])
resnet50.layer3.1.bn2.bias   torch.Size([256])
resnet50.layer3.1.bn2.running_mean   torch.Size([256])
resnet50.layer3.1.bn2.running_var   torch.Size([256])
resnet50.layer3.1.conv3.weight   torch.Size([1024, 256, 1, 1])
resnet50.layer3.1.bn3.weight   torch.Size([1024])
resnet50.layer3.1.bn3.bias   torch.Size([1024])
resnet50.layer3.1.bn3.running_mean   torch.Size([1024])
resnet50.layer3.1.bn3.running_var   torch.Size([1024])
resnet50.layer3.2.conv1.weight   torch.Size([256, 1024, 1, 1])
resnet50.layer3.2.bn1.weight   torch.Size([256])
resnet50.layer3.2.bn1.bias   torch.Size([256])
resnet50.layer3.2.bn1.running_mean   torch.Size([256])
resnet50.layer3.2.bn1.running_var   torch.Size([256])
resnet50.layer3.2.conv2.weight   torch.Size([256, 256, 3, 3])
resnet50.layer3.2.bn2.weight   torch.Size([256])
resnet50.layer3.2.bn2.bias   torch.Size([256])
resnet50.layer3.2.bn2.running_mean   torch.Size([256])
resnet50.layer3.2.bn2.running_var   torch.Size([256])
resnet50.layer3.2.conv3.weight   torch.Size([1024, 256, 1, 1])
resnet50.layer3.2.bn3.weight   torch.Size([1024])
resnet50.layer3.2.bn3.bias   torch.Size([1024])
resnet50.layer3.2.bn3.running_mean   torch.Size([1024])
resnet50.layer3.2.bn3.running_var   torch.Size([1024])
resnet50.layer3.3.conv1.weight   torch.Size([256, 1024, 1, 1])
resnet50.layer3.3.bn1.weight   torch.Size([256])
resnet50.layer3.3.bn1.bias   torch.Size([256])
resnet50.layer3.3.bn1.running_mean   torch.Size([256])
resnet50.layer3.3.bn1.running_var   torch.Size([256])
resnet50.layer3.3.conv2.weight   torch.Size([256, 256, 3, 3])
resnet50.layer3.3.bn2.weight   torch.Size([256])
resnet50.layer3.3.bn2.bias   torch.Size([256])
resnet50.layer3.3.bn2.running_mean   torch.Size([256])
resnet50.layer3.3.bn2.running_var   torch.Size([256])
resnet50.layer3.3.conv3.weight   torch.Size([1024, 256, 1, 1])
resnet50.layer3.3.bn3.weight   torch.Size([1024])
resnet50.layer3.3.bn3.bias   torch.Size([1024])
resnet50.layer3.3.bn3.running_mean   torch.Size([1024])
resnet50.layer3.3.bn3.running_var   torch.Size([1024])
resnet50.layer3.4.conv1.weight   torch.Size([256, 1024, 1, 1])
resnet50.layer3.4.bn1.weight   torch.Size([256])
resnet50.layer3.4.bn1.bias   torch.Size([256])
resnet50.layer3.4.bn1.running_mean   torch.Size([256])
resnet50.layer3.4.bn1.running_var   torch.Size([256])
resnet50.layer3.4.conv2.weight   torch.Size([256, 256, 3, 3])
resnet50.layer3.4.bn2.weight   torch.Size([256])
resnet50.layer3.4.bn2.bias   torch.Size([256])
resnet50.layer3.4.bn2.running_mean   torch.Size([256])
resnet50.layer3.4.bn2.running_var   torch.Size([256])
resnet50.layer3.4.conv3.weight   torch.Size([1024, 256, 1, 1])
resnet50.layer3.4.bn3.weight   torch.Size([1024])
resnet50.layer3.4.bn3.bias   torch.Size([1024])
resnet50.layer3.4.bn3.running_mean   torch.Size([1024])
resnet50.layer3.4.bn3.running_var   torch.Size([1024])
resnet50.layer3.5.conv1.weight   torch.Size([256, 1024, 1, 1])
resnet50.layer3.5.bn1.weight   torch.Size([256])
resnet50.layer3.5.bn1.bias   torch.Size([256])
resnet50.layer3.5.bn1.running_mean   torch.Size([256])
resnet50.layer3.5.bn1.running_var   torch.Size([256])
resnet50.layer3.5.conv2.weight   torch.Size([256, 256, 3, 3])
resnet50.layer3.5.bn2.weight   torch.Size([256])
resnet50.layer3.5.bn2.bias   torch.Size([256])
resnet50.layer3.5.bn2.running_mean   torch.Size([256])
resnet50.layer3.5.bn2.running_var   torch.Size([256])
resnet50.layer3.5.conv3.weight   torch.Size([1024, 256, 1, 1])
resnet50.layer3.5.bn3.weight   torch.Size([1024])
resnet50.layer3.5.bn3.bias   torch.Size([1024])
resnet50.layer3.5.bn3.running_mean   torch.Size([1024])
resnet50.layer3.5.bn3.running_var   torch.Size([1024])
resnet50.layer4.0.downsample.0.weight   torch.Size([2048, 1024, 1, 1])
resnet50.layer4.0.downsample.1.weight   torch.Size([2048])
resnet50.layer4.0.downsample.1.bias   torch.Size([2048])
resnet50.layer4.0.downsample.1.running_mean   torch.Size([2048])
resnet50.layer4.0.downsample.1.running_var   torch.Size([2048])
resnet50.layer4.0.conv1.weight   torch.Size([512, 1024, 1, 1])
resnet50.layer4.0.bn1.weight   torch.Size([512])
resnet50.layer4.0.bn1.bias   torch.Size([512])
resnet50.layer4.0.bn1.running_mean   torch.Size([512])
resnet50.layer4.0.bn1.running_var   torch.Size([512])
resnet50.layer4.0.conv2.weight   torch.Size([512, 512, 3, 3])
resnet50.layer4.0.bn2.weight   torch.Size([512])
resnet50.layer4.0.bn2.bias   torch.Size([512])
resnet50.layer4.0.bn2.running_mean   torch.Size([512])
resnet50.layer4.0.bn2.running_var   torch.Size([512])
resnet50.layer4.0.conv3.weight   torch.Size([2048, 512, 1, 1])
resnet50.layer4.0.bn3.weight   torch.Size([2048])
resnet50.layer4.0.bn3.bias   torch.Size([2048])
resnet50.layer4.0.bn3.running_mean   torch.Size([2048])
resnet50.layer4.0.bn3.running_var   torch.Size([2048])
resnet50.layer4.1.conv1.weight   torch.Size([512, 2048, 1, 1])
resnet50.layer4.1.bn1.weight   torch.Size([512])
resnet50.layer4.1.bn1.bias   torch.Size([512])
resnet50.layer4.1.bn1.running_mean   torch.Size([512])
resnet50.layer4.1.bn1.running_var   torch.Size([512])
resnet50.layer4.1.conv2.weight   torch.Size([512, 512, 3, 3])
resnet50.layer4.1.bn2.weight   torch.Size([512])
resnet50.layer4.1.bn2.bias   torch.Size([512])
resnet50.layer4.1.bn2.running_mean   torch.Size([512])
resnet50.layer4.1.bn2.running_var   torch.Size([512])
resnet50.layer4.1.conv3.weight   torch.Size([2048, 512, 1, 1])
resnet50.layer4.1.bn3.weight   torch.Size([2048])
resnet50.layer4.1.bn3.bias   torch.Size([2048])
resnet50.layer4.1.bn3.running_mean   torch.Size([2048])
resnet50.layer4.1.bn3.running_var   torch.Size([2048])
resnet50.layer4.2.conv1.weight   torch.Size([512, 2048, 1, 1])
resnet50.layer4.2.bn1.weight   torch.Size([512])
resnet50.layer4.2.bn1.bias   torch.Size([512])
resnet50.layer4.2.bn1.running_mean   torch.Size([512])
resnet50.layer4.2.bn1.running_var   torch.Size([512])
resnet50.layer4.2.conv2.weight   torch.Size([512, 512, 3, 3])
resnet50.layer4.2.bn2.weight   torch.Size([512])
resnet50.layer4.2.bn2.bias   torch.Size([512])
resnet50.layer4.2.bn2.running_mean   torch.Size([512])
resnet50.layer4.2.bn2.running_var   torch.Size([512])
resnet50.layer4.2.conv3.weight   torch.Size([2048, 512, 1, 1])
resnet50.layer4.2.bn3.weight   torch.Size([2048])
resnet50.layer4.2.bn3.bias   torch.Size([2048])
resnet50.layer4.2.bn3.running_mean   torch.Size([2048])
resnet50.layer4.2.bn3.running_var   torch.Size([2048])
FPN.lateral_convs.0.weight   torch.Size([256, 2048, 1, 1])
FPN.lateral_convs.0.bias   torch.Size([256])
FPN.lateral_convs.1.weight   torch.Size([256, 1024, 1, 1])
FPN.lateral_convs.1.bias   torch.Size([256])
FPN.lateral_convs.2.weight   torch.Size([256, 512, 1, 1])
FPN.lateral_convs.2.bias   torch.Size([256])
FPN.lateral_convs.3.weight   torch.Size([256, 256, 1, 1])
FPN.lateral_convs.3.bias   torch.Size([256])
FPN.output_convs.0.weight   torch.Size([256, 256, 3, 3])
FPN.output_convs.0.bias   torch.Size([256])
FPN.output_convs.1.weight   torch.Size([256, 256, 3, 3])
FPN.output_convs.1.bias   torch.Size([256])
FPN.output_convs.2.weight   torch.Size([256, 256, 3, 3])
FPN.output_convs.2.bias   torch.Size([256])
FPN.output_convs.3.weight   torch.Size([256, 256, 3, 3])
FPN.output_convs.3.bias   torch.Size([256])
FPN.bottom_up.conv1.weight   torch.Size([64, 3, 7, 7])
FPN.bottom_up.bn1.weight   torch.Size([64])
FPN.bottom_up.bn1.bias   torch.Size([64])
FPN.bottom_up.bn1.running_mean   torch.Size([64])
FPN.bottom_up.bn1.running_var   torch.Size([64])
FPN.bottom_up.layer1.0.downsample.0.weight   torch.Size([256, 64, 1, 1])
FPN.bottom_up.layer1.0.downsample.1.weight   torch.Size([256])
FPN.bottom_up.layer1.0.downsample.1.bias   torch.Size([256])
FPN.bottom_up.layer1.0.downsample.1.running_mean   torch.Size([256])
FPN.bottom_up.layer1.0.downsample.1.running_var   torch.Size([256])
FPN.bottom_up.layer1.0.conv1.weight   torch.Size([64, 64, 1, 1])
FPN.bottom_up.layer1.0.bn1.weight   torch.Size([64])
FPN.bottom_up.layer1.0.bn1.bias   torch.Size([64])
FPN.bottom_up.layer1.0.bn1.running_mean   torch.Size([64])
FPN.bottom_up.layer1.0.bn1.running_var   torch.Size([64])
FPN.bottom_up.layer1.0.conv2.weight   torch.Size([64, 64, 3, 3])
FPN.bottom_up.layer1.0.bn2.weight   torch.Size([64])
FPN.bottom_up.layer1.0.bn2.bias   torch.Size([64])
FPN.bottom_up.layer1.0.bn2.running_mean   torch.Size([64])
FPN.bottom_up.layer1.0.bn2.running_var   torch.Size([64])
FPN.bottom_up.layer1.0.conv3.weight   torch.Size([256, 64, 1, 1])
FPN.bottom_up.layer1.0.bn3.weight   torch.Size([256])
FPN.bottom_up.layer1.0.bn3.bias   torch.Size([256])
FPN.bottom_up.layer1.0.bn3.running_mean   torch.Size([256])
FPN.bottom_up.layer1.0.bn3.running_var   torch.Size([256])
FPN.bottom_up.layer1.1.conv1.weight   torch.Size([64, 256, 1, 1])
FPN.bottom_up.layer1.1.bn1.weight   torch.Size([64])
FPN.bottom_up.layer1.1.bn1.bias   torch.Size([64])
FPN.bottom_up.layer1.1.bn1.running_mean   torch.Size([64])
FPN.bottom_up.layer1.1.bn1.running_var   torch.Size([64])
FPN.bottom_up.layer1.1.conv2.weight   torch.Size([64, 64, 3, 3])
FPN.bottom_up.layer1.1.bn2.weight   torch.Size([64])
FPN.bottom_up.layer1.1.bn2.bias   torch.Size([64])
FPN.bottom_up.layer1.1.bn2.running_mean   torch.Size([64])
FPN.bottom_up.layer1.1.bn2.running_var   torch.Size([64])
FPN.bottom_up.layer1.1.conv3.weight   torch.Size([256, 64, 1, 1])
FPN.bottom_up.layer1.1.bn3.weight   torch.Size([256])
FPN.bottom_up.layer1.1.bn3.bias   torch.Size([256])
FPN.bottom_up.layer1.1.bn3.running_mean   torch.Size([256])
FPN.bottom_up.layer1.1.bn3.running_var   torch.Size([256])
FPN.bottom_up.layer1.2.conv1.weight   torch.Size([64, 256, 1, 1])
FPN.bottom_up.layer1.2.bn1.weight   torch.Size([64])
FPN.bottom_up.layer1.2.bn1.bias   torch.Size([64])
FPN.bottom_up.layer1.2.bn1.running_mean   torch.Size([64])
FPN.bottom_up.layer1.2.bn1.running_var   torch.Size([64])
FPN.bottom_up.layer1.2.conv2.weight   torch.Size([64, 64, 3, 3])
FPN.bottom_up.layer1.2.bn2.weight   torch.Size([64])
FPN.bottom_up.layer1.2.bn2.bias   torch.Size([64])
FPN.bottom_up.layer1.2.bn2.running_mean   torch.Size([64])
FPN.bottom_up.layer1.2.bn2.running_var   torch.Size([64])
FPN.bottom_up.layer1.2.conv3.weight   torch.Size([256, 64, 1, 1])
FPN.bottom_up.layer1.2.bn3.weight   torch.Size([256])
FPN.bottom_up.layer1.2.bn3.bias   torch.Size([256])
FPN.bottom_up.layer1.2.bn3.running_mean   torch.Size([256])
FPN.bottom_up.layer1.2.bn3.running_var   torch.Size([256])
FPN.bottom_up.layer2.0.downsample.0.weight   torch.Size([512, 256, 1, 1])
FPN.bottom_up.layer2.0.downsample.1.weight   torch.Size([512])
FPN.bottom_up.layer2.0.downsample.1.bias   torch.Size([512])
FPN.bottom_up.layer2.0.downsample.1.running_mean   torch.Size([512])
FPN.bottom_up.layer2.0.downsample.1.running_var   torch.Size([512])
FPN.bottom_up.layer2.0.conv1.weight   torch.Size([128, 256, 1, 1])
FPN.bottom_up.layer2.0.bn1.weight   torch.Size([128])
FPN.bottom_up.layer2.0.bn1.bias   torch.Size([128])
FPN.bottom_up.layer2.0.bn1.running_mean   torch.Size([128])
FPN.bottom_up.layer2.0.bn1.running_var   torch.Size([128])
FPN.bottom_up.layer2.0.conv2.weight   torch.Size([128, 128, 3, 3])
FPN.bottom_up.layer2.0.bn2.weight   torch.Size([128])
FPN.bottom_up.layer2.0.bn2.bias   torch.Size([128])
FPN.bottom_up.layer2.0.bn2.running_mean   torch.Size([128])
FPN.bottom_up.layer2.0.bn2.running_var   torch.Size([128])
FPN.bottom_up.layer2.0.conv3.weight   torch.Size([512, 128, 1, 1])
FPN.bottom_up.layer2.0.bn3.weight   torch.Size([512])
FPN.bottom_up.layer2.0.bn3.bias   torch.Size([512])
FPN.bottom_up.layer2.0.bn3.running_mean   torch.Size([512])
FPN.bottom_up.layer2.0.bn3.running_var   torch.Size([512])
FPN.bottom_up.layer2.1.conv1.weight   torch.Size([128, 512, 1, 1])
FPN.bottom_up.layer2.1.bn1.weight   torch.Size([128])
FPN.bottom_up.layer2.1.bn1.bias   torch.Size([128])
FPN.bottom_up.layer2.1.bn1.running_mean   torch.Size([128])
FPN.bottom_up.layer2.1.bn1.running_var   torch.Size([128])
FPN.bottom_up.layer2.1.conv2.weight   torch.Size([128, 128, 3, 3])
FPN.bottom_up.layer2.1.bn2.weight   torch.Size([128])
FPN.bottom_up.layer2.1.bn2.bias   torch.Size([128])
FPN.bottom_up.layer2.1.bn2.running_mean   torch.Size([128])
FPN.bottom_up.layer2.1.bn2.running_var   torch.Size([128])
FPN.bottom_up.layer2.1.conv3.weight   torch.Size([512, 128, 1, 1])
FPN.bottom_up.layer2.1.bn3.weight   torch.Size([512])
FPN.bottom_up.layer2.1.bn3.bias   torch.Size([512])
FPN.bottom_up.layer2.1.bn3.running_mean   torch.Size([512])
FPN.bottom_up.layer2.1.bn3.running_var   torch.Size([512])
FPN.bottom_up.layer2.2.conv1.weight   torch.Size([128, 512, 1, 1])
FPN.bottom_up.layer2.2.bn1.weight   torch.Size([128])
FPN.bottom_up.layer2.2.bn1.bias   torch.Size([128])
FPN.bottom_up.layer2.2.bn1.running_mean   torch.Size([128])
FPN.bottom_up.layer2.2.bn1.running_var   torch.Size([128])
FPN.bottom_up.layer2.2.conv2.weight   torch.Size([128, 128, 3, 3])
FPN.bottom_up.layer2.2.bn2.weight   torch.Size([128])
FPN.bottom_up.layer2.2.bn2.bias   torch.Size([128])
FPN.bottom_up.layer2.2.bn2.running_mean   torch.Size([128])
FPN.bottom_up.layer2.2.bn2.running_var   torch.Size([128])
FPN.bottom_up.layer2.2.conv3.weight   torch.Size([512, 128, 1, 1])
FPN.bottom_up.layer2.2.bn3.weight   torch.Size([512])
FPN.bottom_up.layer2.2.bn3.bias   torch.Size([512])
FPN.bottom_up.layer2.2.bn3.running_mean   torch.Size([512])
FPN.bottom_up.layer2.2.bn3.running_var   torch.Size([512])
FPN.bottom_up.layer2.3.conv1.weight   torch.Size([128, 512, 1, 1])
FPN.bottom_up.layer2.3.bn1.weight   torch.Size([128])
FPN.bottom_up.layer2.3.bn1.bias   torch.Size([128])
FPN.bottom_up.layer2.3.bn1.running_mean   torch.Size([128])
FPN.bottom_up.layer2.3.bn1.running_var   torch.Size([128])
FPN.bottom_up.layer2.3.conv2.weight   torch.Size([128, 128, 3, 3])
FPN.bottom_up.layer2.3.bn2.weight   torch.Size([128])
FPN.bottom_up.layer2.3.bn2.bias   torch.Size([128])
FPN.bottom_up.layer2.3.bn2.running_mean   torch.Size([128])
FPN.bottom_up.layer2.3.bn2.running_var   torch.Size([128])
FPN.bottom_up.layer2.3.conv3.weight   torch.Size([512, 128, 1, 1])
FPN.bottom_up.layer2.3.bn3.weight   torch.Size([512])
FPN.bottom_up.layer2.3.bn3.bias   torch.Size([512])
FPN.bottom_up.layer2.3.bn3.running_mean   torch.Size([512])
FPN.bottom_up.layer2.3.bn3.running_var   torch.Size([512])
FPN.bottom_up.layer3.0.downsample.0.weight   torch.Size([1024, 512, 1, 1])
FPN.bottom_up.layer3.0.downsample.1.weight   torch.Size([1024])
FPN.bottom_up.layer3.0.downsample.1.bias   torch.Size([1024])
FPN.bottom_up.layer3.0.downsample.1.running_mean   torch.Size([1024])
FPN.bottom_up.layer3.0.downsample.1.running_var   torch.Size([1024])
FPN.bottom_up.layer3.0.conv1.weight   torch.Size([256, 512, 1, 1])
FPN.bottom_up.layer3.0.bn1.weight   torch.Size([256])
FPN.bottom_up.layer3.0.bn1.bias   torch.Size([256])
FPN.bottom_up.layer3.0.bn1.running_mean   torch.Size([256])
FPN.bottom_up.layer3.0.bn1.running_var   torch.Size([256])
FPN.bottom_up.layer3.0.conv2.weight   torch.Size([256, 256, 3, 3])
FPN.bottom_up.layer3.0.bn2.weight   torch.Size([256])
FPN.bottom_up.layer3.0.bn2.bias   torch.Size([256])
FPN.bottom_up.layer3.0.bn2.running_mean   torch.Size([256])
FPN.bottom_up.layer3.0.bn2.running_var   torch.Size([256])
FPN.bottom_up.layer3.0.conv3.weight   torch.Size([1024, 256, 1, 1])
FPN.bottom_up.layer3.0.bn3.weight   torch.Size([1024])
FPN.bottom_up.layer3.0.bn3.bias   torch.Size([1024])
FPN.bottom_up.layer3.0.bn3.running_mean   torch.Size([1024])
FPN.bottom_up.layer3.0.bn3.running_var   torch.Size([1024])
FPN.bottom_up.layer3.1.conv1.weight   torch.Size([256, 1024, 1, 1])
FPN.bottom_up.layer3.1.bn1.weight   torch.Size([256])
FPN.bottom_up.layer3.1.bn1.bias   torch.Size([256])
FPN.bottom_up.layer3.1.bn1.running_mean   torch.Size([256])
FPN.bottom_up.layer3.1.bn1.running_var   torch.Size([256])
FPN.bottom_up.layer3.1.conv2.weight   torch.Size([256, 256, 3, 3])
FPN.bottom_up.layer3.1.bn2.weight   torch.Size([256])
FPN.bottom_up.layer3.1.bn2.bias   torch.Size([256])
FPN.bottom_up.layer3.1.bn2.running_mean   torch.Size([256])
FPN.bottom_up.layer3.1.bn2.running_var   torch.Size([256])
FPN.bottom_up.layer3.1.conv3.weight   torch.Size([1024, 256, 1, 1])
FPN.bottom_up.layer3.1.bn3.weight   torch.Size([1024])
FPN.bottom_up.layer3.1.bn3.bias   torch.Size([1024])
FPN.bottom_up.layer3.1.bn3.running_mean   torch.Size([1024])
FPN.bottom_up.layer3.1.bn3.running_var   torch.Size([1024])
FPN.bottom_up.layer3.2.conv1.weight   torch.Size([256, 1024, 1, 1])
FPN.bottom_up.layer3.2.bn1.weight   torch.Size([256])
FPN.bottom_up.layer3.2.bn1.bias   torch.Size([256])
FPN.bottom_up.layer3.2.bn1.running_mean   torch.Size([256])
FPN.bottom_up.layer3.2.bn1.running_var   torch.Size([256])
FPN.bottom_up.layer3.2.conv2.weight   torch.Size([256, 256, 3, 3])
FPN.bottom_up.layer3.2.bn2.weight   torch.Size([256])
FPN.bottom_up.layer3.2.bn2.bias   torch.Size([256])
FPN.bottom_up.layer3.2.bn2.running_mean   torch.Size([256])
FPN.bottom_up.layer3.2.bn2.running_var   torch.Size([256])
FPN.bottom_up.layer3.2.conv3.weight   torch.Size([1024, 256, 1, 1])
FPN.bottom_up.layer3.2.bn3.weight   torch.Size([1024])
FPN.bottom_up.layer3.2.bn3.bias   torch.Size([1024])
FPN.bottom_up.layer3.2.bn3.running_mean   torch.Size([1024])
FPN.bottom_up.layer3.2.bn3.running_var   torch.Size([1024])
FPN.bottom_up.layer3.3.conv1.weight   torch.Size([256, 1024, 1, 1])
FPN.bottom_up.layer3.3.bn1.weight   torch.Size([256])
FPN.bottom_up.layer3.3.bn1.bias   torch.Size([256])
FPN.bottom_up.layer3.3.bn1.running_mean   torch.Size([256])
FPN.bottom_up.layer3.3.bn1.running_var   torch.Size([256])
FPN.bottom_up.layer3.3.conv2.weight   torch.Size([256, 256, 3, 3])
FPN.bottom_up.layer3.3.bn2.weight   torch.Size([256])
FPN.bottom_up.layer3.3.bn2.bias   torch.Size([256])
FPN.bottom_up.layer3.3.bn2.running_mean   torch.Size([256])
FPN.bottom_up.layer3.3.bn2.running_var   torch.Size([256])
FPN.bottom_up.layer3.3.conv3.weight   torch.Size([1024, 256, 1, 1])
FPN.bottom_up.layer3.3.bn3.weight   torch.Size([1024])
FPN.bottom_up.layer3.3.bn3.bias   torch.Size([1024])
FPN.bottom_up.layer3.3.bn3.running_mean   torch.Size([1024])
FPN.bottom_up.layer3.3.bn3.running_var   torch.Size([1024])
FPN.bottom_up.layer3.4.conv1.weight   torch.Size([256, 1024, 1, 1])
FPN.bottom_up.layer3.4.bn1.weight   torch.Size([256])
FPN.bottom_up.layer3.4.bn1.bias   torch.Size([256])
FPN.bottom_up.layer3.4.bn1.running_mean   torch.Size([256])
FPN.bottom_up.layer3.4.bn1.running_var   torch.Size([256])
FPN.bottom_up.layer3.4.conv2.weight   torch.Size([256, 256, 3, 3])
FPN.bottom_up.layer3.4.bn2.weight   torch.Size([256])
FPN.bottom_up.layer3.4.bn2.bias   torch.Size([256])
FPN.bottom_up.layer3.4.bn2.running_mean   torch.Size([256])
FPN.bottom_up.layer3.4.bn2.running_var   torch.Size([256])
FPN.bottom_up.layer3.4.conv3.weight   torch.Size([1024, 256, 1, 1])
FPN.bottom_up.layer3.4.bn3.weight   torch.Size([1024])
FPN.bottom_up.layer3.4.bn3.bias   torch.Size([1024])
FPN.bottom_up.layer3.4.bn3.running_mean   torch.Size([1024])
FPN.bottom_up.layer3.4.bn3.running_var   torch.Size([1024])
FPN.bottom_up.layer3.5.conv1.weight   torch.Size([256, 1024, 1, 1])
FPN.bottom_up.layer3.5.bn1.weight   torch.Size([256])
FPN.bottom_up.layer3.5.bn1.bias   torch.Size([256])
FPN.bottom_up.layer3.5.bn1.running_mean   torch.Size([256])
FPN.bottom_up.layer3.5.bn1.running_var   torch.Size([256])
FPN.bottom_up.layer3.5.conv2.weight   torch.Size([256, 256, 3, 3])
FPN.bottom_up.layer3.5.bn2.weight   torch.Size([256])
FPN.bottom_up.layer3.5.bn2.bias   torch.Size([256])
FPN.bottom_up.layer3.5.bn2.running_mean   torch.Size([256])
FPN.bottom_up.layer3.5.bn2.running_var   torch.Size([256])
FPN.bottom_up.layer3.5.conv3.weight   torch.Size([1024, 256, 1, 1])
FPN.bottom_up.layer3.5.bn3.weight   torch.Size([1024])
FPN.bottom_up.layer3.5.bn3.bias   torch.Size([1024])
FPN.bottom_up.layer3.5.bn3.running_mean   torch.Size([1024])
FPN.bottom_up.layer3.5.bn3.running_var   torch.Size([1024])
FPN.bottom_up.layer4.0.downsample.0.weight   torch.Size([2048, 1024, 1, 1])
FPN.bottom_up.layer4.0.downsample.1.weight   torch.Size([2048])
FPN.bottom_up.layer4.0.downsample.1.bias   torch.Size([2048])
FPN.bottom_up.layer4.0.downsample.1.running_mean   torch.Size([2048])
FPN.bottom_up.layer4.0.downsample.1.running_var   torch.Size([2048])
FPN.bottom_up.layer4.0.conv1.weight   torch.Size([512, 1024, 1, 1])
FPN.bottom_up.layer4.0.bn1.weight   torch.Size([512])
FPN.bottom_up.layer4.0.bn1.bias   torch.Size([512])
FPN.bottom_up.layer4.0.bn1.running_mean   torch.Size([512])
FPN.bottom_up.layer4.0.bn1.running_var   torch.Size([512])
FPN.bottom_up.layer4.0.conv2.weight   torch.Size([512, 512, 3, 3])
FPN.bottom_up.layer4.0.bn2.weight   torch.Size([512])
FPN.bottom_up.layer4.0.bn2.bias   torch.Size([512])
FPN.bottom_up.layer4.0.bn2.running_mean   torch.Size([512])
FPN.bottom_up.layer4.0.bn2.running_var   torch.Size([512])
FPN.bottom_up.layer4.0.conv3.weight   torch.Size([2048, 512, 1, 1])
FPN.bottom_up.layer4.0.bn3.weight   torch.Size([2048])
FPN.bottom_up.layer4.0.bn3.bias   torch.Size([2048])
FPN.bottom_up.layer4.0.bn3.running_mean   torch.Size([2048])
FPN.bottom_up.layer4.0.bn3.running_var   torch.Size([2048])
FPN.bottom_up.layer4.1.conv1.weight   torch.Size([512, 2048, 1, 1])
FPN.bottom_up.layer4.1.bn1.weight   torch.Size([512])
FPN.bottom_up.layer4.1.bn1.bias   torch.Size([512])
FPN.bottom_up.layer4.1.bn1.running_mean   torch.Size([512])
FPN.bottom_up.layer4.1.bn1.running_var   torch.Size([512])
FPN.bottom_up.layer4.1.conv2.weight   torch.Size([512, 512, 3, 3])
FPN.bottom_up.layer4.1.bn2.weight   torch.Size([512])
FPN.bottom_up.layer4.1.bn2.bias   torch.Size([512])
FPN.bottom_up.layer4.1.bn2.running_mean   torch.Size([512])
FPN.bottom_up.layer4.1.bn2.running_var   torch.Size([512])
FPN.bottom_up.layer4.1.conv3.weight   torch.Size([2048, 512, 1, 1])
FPN.bottom_up.layer4.1.bn3.weight   torch.Size([2048])
FPN.bottom_up.layer4.1.bn3.bias   torch.Size([2048])
FPN.bottom_up.layer4.1.bn3.running_mean   torch.Size([2048])
FPN.bottom_up.layer4.1.bn3.running_var   torch.Size([2048])
FPN.bottom_up.layer4.2.conv1.weight   torch.Size([512, 2048, 1, 1])
FPN.bottom_up.layer4.2.bn1.weight   torch.Size([512])
FPN.bottom_up.layer4.2.bn1.bias   torch.Size([512])
FPN.bottom_up.layer4.2.bn1.running_mean   torch.Size([512])
FPN.bottom_up.layer4.2.bn1.running_var   torch.Size([512])
FPN.bottom_up.layer4.2.conv2.weight   torch.Size([512, 512, 3, 3])
FPN.bottom_up.layer4.2.bn2.weight   torch.Size([512])
FPN.bottom_up.layer4.2.bn2.bias   torch.Size([512])
FPN.bottom_up.layer4.2.bn2.running_mean   torch.Size([512])
FPN.bottom_up.layer4.2.bn2.running_var   torch.Size([512])
FPN.bottom_up.layer4.2.conv3.weight   torch.Size([2048, 512, 1, 1])
FPN.bottom_up.layer4.2.bn3.weight   torch.Size([2048])
FPN.bottom_up.layer4.2.bn3.bias   torch.Size([2048])
FPN.bottom_up.layer4.2.bn3.running_mean   torch.Size([2048])
FPN.bottom_up.layer4.2.bn3.running_var   torch.Size([2048])
RPN.rpn_conv.weight   torch.Size([256, 256, 3, 3])
RPN.rpn_conv.bias   torch.Size([256])
RPN.rpn_cls_score.weight   torch.Size([6, 256, 1, 1])
RPN.rpn_cls_score.bias   torch.Size([6])
RPN.rpn_bbox_offsets.weight   torch.Size([12, 256, 1, 1])
RPN.rpn_bbox_offsets.bias   torch.Size([12])
RCNN.fc1.weight   torch.Size([1024, 12544])
RCNN.fc1.bias   torch.Size([1024])
RCNN.fc2.weight   torch.Size([1024, 1024])
RCNN.fc2.bias   torch.Size([1024])
RCNN.pred_cls.weight   torch.Size([2, 1024])
RCNN.pred_cls.bias   torch.Size([2])
RCNN.pred_delta.weight   torch.Size([8, 1024])
RCNN.pred_delta.bias   torch.Size([8])

state_dict打印完毕

The remaining question is that I don't understand why

del backbone_dict['state_dict']['fc.weight']

is wrong and raises a KeyError.

After changing it to del backbone_dict['state_dict']['RCNN.fc1.weight'] the error goes away, but printing afterwards shows no change. Does del only delete the key temporarily? Or maybe I still haven't understood what the original code meant to delete, so I'm not sure how best to change it...
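For reference, a minimal sketch of what del does here (assuming the goal is simply to drop a head weight such as RCNN.fc1.weight before reusing the checkpoint; the output filename is hypothetical): del 'fc.weight' raises a KeyError because no key with that exact name exists in this state_dict (the printout above shows every key carries a module prefix), and deleting a key only modifies the dict loaded into memory; the .pth file on disk stays unchanged unless the modified dict is written back with torch.save.

import torch

pthfile = r'/home/xx/CrowdDet/tools/data/model/rcnn_fpn_baseline.pth'
backbone_dict = torch.load(pthfile, map_location='cpu')
state = backbone_dict['state_dict']

key = 'RCNN.fc1.weight'   # 'fc.weight' is absent from this checkpoint, hence the KeyError
state.pop(key, None)      # like del, but silent if the key is missing
print(key in state)       # False: the key really is gone from the in-memory dict

# The deletion is not persistent until the dict is written back out.
torch.save(backbone_dict, 'rcnn_fpn_baseline_nohead.pth')  # hypothetical output name

Whether removing that key is what the original code intended still depends on how the checkpoint is consumed afterwards.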


Original statement: this article was published in the Tencent Cloud Developer Community with the author's authorization and may not be reproduced without permission.

For infringement concerns, please contact cloudcommunity@tencent.com for removal.
