
Implementing the Transfer Learning Model DANN

Author: 里克贝斯
Published: 2021-09-08 11:05:06
Column: 图灵技术域

Individual differences in EEG signals lead to the poor generalization ability of EEG-based affective models. Transfer learning, as we introduced in the class, can eliminate the subject differences and achieve appreciable improvement in recognition performance. In this assignment, you are asked to build and evaluate a cross-subject affective model using Domain-Adversarial Neural Networks (DANN) with the SEED dataset.

You are required to apply leave-one-subject-out cross validation to classify different emotions with the DANN model and to compare its results with a baseline model (you may choose the baseline model on your own). Under the leave-one-subject-out configuration, an affective model is trained for each subject, with that subject as the target domain and the remaining subjects as the source domain. In the end there should be five DANN models, one per subject, and you should report both the individual recognition accuracies and the mean recognition accuracy.
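The leave-one-subject-out protocol above can be sketched as a simple loop. This is a minimal outline, not the author's code: `subject_data`/`subject_labels` are hypothetical per-subject arrays, and `train_dann`/`evaluate` stand in for whatever training and evaluation routines you implement.

```python
import numpy as np

def loso_accuracies(subject_data, subject_labels, train_dann, evaluate):
    """Leave-one-subject-out: each subject in turn is the target domain."""
    accuracies = []
    for target in range(len(subject_data)):
        # The held-out subject is the target domain (labels unseen in training) ...
        target_X, target_y = subject_data[target], subject_labels[target]
        # ... and all remaining subjects are pooled as the source domain.
        source_X = np.concatenate([x for i, x in enumerate(subject_data) if i != target])
        source_y = np.concatenate([y for i, y in enumerate(subject_labels) if i != target])
        model = train_dann(source_X, source_y, target_X)
        accuracies.append(evaluate(model, target_X, target_y))
    return accuracies, float(np.mean(accuracies))
```

Reporting both the per-subject list and the mean matches the assignment's requirement of individual and average recognition accuracy.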

Here are some suggestions for parameter settings. The feature extractor has 2 layers, both with 128 nodes. The label predictor and domain discriminator have 3 layers with 64, 64, and C nodes, respectively, where C is the number of emotion classes to be classified.

Python
# Name: DANN_1
# Author: Reacubeth
# Time: 2021/4/22 19:39
# Mail: noverfitting@gmail.com
# Site: www.omegaxyz.com
# *_*coding:utf-8 *_*
 
from torch import nn
import torch
 
 
class ReversalLayer(torch.autograd.Function):
    """Gradient Reversal Layer (GRL): identity in the forward pass,
    multiplies the incoming gradient by -alpha in the backward pass."""
 
    @staticmethod
    def forward(ctx, x, alpha):
        ctx.alpha = alpha
        return x.view_as(x)
 
    @staticmethod
    def backward(ctx, grad_output):
        output = grad_output.neg() * ctx.alpha
        return output, None
 
 
class DANN(nn.Module):
    def __init__(self, input_dim, hid_dim_1, hid_dim_2, class_num, domain_num):
        super(DANN, self).__init__()
        self.feature_extractor = nn.Sequential(nn.Linear(input_dim, hid_dim_1 * 2),
                                               nn.ReLU(),
                                               nn.Linear(hid_dim_1 * 2, hid_dim_1),
                                               nn.ReLU(),
                                               )
 
        self.classifier = nn.Sequential(nn.Linear(hid_dim_1, hid_dim_2),
                                        nn.ReLU(),
                                        nn.Linear(hid_dim_2, class_num),
                                        # nn.Softmax(),
                                        )
 
        self.domain_classifier = nn.Sequential(nn.Linear(hid_dim_1, hid_dim_2),
                                               nn.ReLU(),
                                               nn.Linear(hid_dim_2, domain_num),
                                               # nn.Softmax(),
                                               )
 
    def forward(self, X, alpha):
        feature = self.feature_extractor(X)
        class_res = self.classifier(feature)
        # Reverse the gradient flowing from the domain classifier back
        # into the feature extractor, scaled by alpha.
        feature2 = ReversalLayer.apply(feature, alpha)
        domain_res = self.domain_classifier(feature2)
        return feature, class_res, domain_res
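A single optimization step for the model above might look like the sketch below. This is not the author's training code: the mini-batch names (`source_x`, `source_y`, `target_x`), the choice of loss weighting, and the use of the alpha schedule 2/(1 + exp(-10p)) - 1 (with p the training progress in [0, 1], as in the original DANN paper) are all assumptions. Because the GRL already negates the discriminator's gradient, the two losses are simply summed.

```python
import numpy as np
import torch
from torch import nn

def train_step(model, optimizer, source_x, source_y, target_x, p):
    """One DANN step; domain label 0 = source batch, 1 = target batch."""
    # Alpha schedule: ramps the adversarial signal up as training progresses.
    alpha = 2.0 / (1.0 + np.exp(-10.0 * p)) - 1.0
    criterion = nn.CrossEntropyLoss()
    optimizer.zero_grad()
    _, class_res, src_domain_res = model(source_x, alpha)
    _, _, tgt_domain_res = model(target_x, alpha)  # target labels are never used
    label_loss = criterion(class_res, source_y)
    domain_res = torch.cat([src_domain_res, tgt_domain_res])
    domain_y = torch.cat([torch.zeros(len(source_x), dtype=torch.long),
                          torch.ones(len(target_x), dtype=torch.long)])
    domain_loss = criterion(domain_res, domain_y)
    # The GRL flips the discriminator gradient, so a plain sum suffices.
    loss = label_loss + domain_loss
    loss.backward()
    optimizer.step()
    return loss.item()
```

With `domain_num=2` in the DANN constructor, the discriminator distinguishes source from target batches; iterating `train_step` over paired source/target mini-batches while advancing p from 0 to 1 completes the adversarial training loop.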

Originally published 2021-09-05 on the author's personal site/blog.
