Individual differences in EEG signals lead to poor generalization of EEG-based affective models across subjects. Transfer learning, as introduced in class, can reduce these subject differences and yield an appreciable improvement in recognition performance. In this assignment, you are asked to build and evaluate a cross-subject affective model using Domain-Adversarial Neural Networks (DANN) on the SEED dataset.
You are required to apply leave-one-subject-out cross-validation to classify emotions with the DANN model and to compare its results against a baseline model (you may choose the baseline on your own). Under the leave-one-subject-out configuration, one affective model is trained per subject, with that subject held out as the target domain and the remaining subjects serving as the source domain. In the end there should be five DANN models, one per subject, and you should report both the individual recognition accuracies and the mean recognition accuracy.
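For reference, here is a minimal sketch of how the leave-one-subject-out splits could be generated. The helper name loso_splits, the toy data dictionary, and the assumptions of 5 subjects, 310-dimensional differential-entropy features, and 3 emotion classes are illustrative placeholders; adapt them to however you actually load the SEED features.

import numpy as np

def loso_splits(subject_ids):
    """Yield (source_ids, target_id) pairs for leave-one-subject-out cross-validation."""
    for target_id in subject_ids:
        source_ids = [s for s in subject_ids if s != target_id]
        yield source_ids, target_id

# Toy stand-in for per-subject SEED data: {subject_id: (features, labels)}.
# 310 = 62 channels x 5 frequency bands of DE features and 3 emotion classes
# are assumed here; replace this with your actual feature-loading code.
rng = np.random.default_rng(0)
data = {s: (rng.normal(size=(100, 310)), rng.integers(0, 3, size=100)) for s in range(5)}

for source_ids, target_id in loso_splits(sorted(data)):
    X_src = np.concatenate([data[s][0] for s in source_ids])
    y_src = np.concatenate([data[s][1] for s in source_ids])
    X_tgt, y_tgt = data[target_id]
    print(f"target subject {target_id}: {len(X_src)} source samples, {len(X_tgt)} target samples")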
Here are some suggested parameter settings. The feature extractor has 2 layers, each with 128 nodes. The label predictor and the domain discriminator each have 3 layers with node numbers of 64, 64, and C, where C is the number of output classes (the number of emotion classes for the label predictor, and the number of domains for the domain discriminator).
# Name: DANN_1
# Author: Reacubeth
# Time: 2021/4/22 19:39
# Mail: noverfitting@gmail.com
# Site: www.omegaxyz.com
# *_*coding:utf-8 *_*
import torch
from torch import nn


class ReversalLayer(torch.autograd.Function):
    """Gradient reversal layer: identity in the forward pass,
    multiplies the gradient by -alpha in the backward pass."""

    @staticmethod
    def forward(ctx, x, alpha):
        ctx.alpha = alpha
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Reverse the gradient flowing back into the feature extractor
        # and scale it by alpha; alpha itself needs no gradient.
        output = grad_output.neg() * ctx.alpha
        return output, None


class DANN(nn.Module):
    def __init__(self, input_dim, hid_dim_1, hid_dim_2, class_num, domain_num):
        super(DANN, self).__init__()
        # Shared feature extractor.
        self.feature_extractor = nn.Sequential(
            nn.Linear(input_dim, hid_dim_1 * 2),
            nn.ReLU(),
            nn.Linear(hid_dim_1 * 2, hid_dim_1),
            nn.ReLU(),
        )
        # Label predictor (emotion classifier); raw logits are returned,
        # so pair it with nn.CrossEntropyLoss instead of adding a Softmax.
        self.classifier = nn.Sequential(
            nn.Linear(hid_dim_1, hid_dim_2),
            nn.ReLU(),
            nn.Linear(hid_dim_2, class_num),
        )
        # Domain discriminator, fed through the gradient reversal layer.
        self.domain_classifier = nn.Sequential(
            nn.Linear(hid_dim_1, hid_dim_2),
            nn.ReLU(),
            nn.Linear(hid_dim_2, domain_num),
        )

    def forward(self, X, alpha):
        feature = self.feature_extractor(X)
        class_res = self.classifier(feature)
        # Reverse gradients before the domain branch so the extractor
        # learns domain-invariant features.
        feature2 = ReversalLayer.apply(feature, alpha)
        domain_res = self.domain_classifier(feature2)
        return feature, class_res, domain_res
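Below is a hedged sketch of how a single leave-one-subject-out fold could be trained and evaluated with the model above, using the suggested hidden sizes (128 for the feature extractor, 64 for the predictor and discriminator). It assumes the DANN class above is in scope; the 310-dimensional input, full-batch updates, binary source/target domain labels, and the specific alpha ramp-up schedule are assumptions rather than requirements of the assignment, and a mini-batch loader or per-subject domain labels would work just as well.

import math
import torch
from torch import nn

def train_dann_fold(X_src, y_src, X_tgt, y_tgt, input_dim=310, class_num=3,
                    epochs=200, lr=1e-3):
    """Train a DANN on the pooled source subjects and evaluate on the target subject.

    X_src, X_tgt are float tensors of features; y_src, y_tgt are long tensors of labels.
    """
    model = DANN(input_dim, hid_dim_1=128, hid_dim_2=64,
                 class_num=class_num, domain_num=2)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.CrossEntropyLoss()

    # Domain labels: 0 for the pooled source subjects, 1 for the target subject.
    d_src = torch.zeros(len(X_src), dtype=torch.long)
    d_tgt = torch.ones(len(X_tgt), dtype=torch.long)

    for epoch in range(epochs):
        # Ramp the gradient-reversal strength from 0 towards 1 over training,
        # following the schedule commonly used with DANN.
        p = epoch / max(epochs - 1, 1)
        alpha = 2.0 / (1.0 + math.exp(-10.0 * p)) - 1.0

        model.train()
        optimizer.zero_grad()
        _, class_src, domain_src = model(X_src, alpha)
        _, _, domain_tgt = model(X_tgt, alpha)
        # Emotion loss on labeled source data plus domain losses on both domains.
        loss = (criterion(class_src, y_src)
                + criterion(domain_src, d_src)
                + criterion(domain_tgt, d_tgt))
        loss.backward()
        optimizer.step()

    # Evaluate emotion classification accuracy on the held-out target subject.
    model.eval()
    with torch.no_grad():
        _, class_tgt, _ = model(X_tgt, 0.0)
        acc = (class_tgt.argmax(dim=1) == y_tgt).float().mean().item()
    return model, acc

Repeating this for every fold of the split loop and averaging the returned accuracies gives the mean recognition accuracy to report alongside the per-subject results.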