Classification of Grasp-and-Lift EEG using GoogLeNet

A grasp-and-lift (GAL) action is the hand movement of lifting an object, holding it for a few seconds, and completing the action by returning the object to its original position. EEG is one of the common ways to study the relationship between brain activity and GAL actions, and because this relationship is not yet fully understood, further research is needed. Given the lack of low-cost, practical prosthetic devices for patients with neurological disease, and the low classification accuracy caused by the large number of events and the low signal-to-noise ratio (SNR) of EEG, GAL EEG signal processing can contribute substantially to prosthetic development by providing input to brain-computer interface (BCI) devices. This research therefore presents a convolutional neural network (CNN)-based deep learning method to classify EEG signals into six GAL classes. The main objective is to develop GAL event classification based on a pretrained CNN (GoogLeNet) and to evaluate its performance in terms of accuracy, sensitivity and specificity. Six electrodes associated with motor movement (C3, CZ, C4, P3, PZ and P4) were selected during pre-processing. A one-versus-rest scheme and a two-class Common Spatial Pattern (CSP) filter were used to maximise the variance difference between the two classes. The CSP features extracted from each electrode were converted into grayscale scalograms using a sliding-window method, and three grayscale scalograms were concatenated to form an RGB scalogram. One classifier was trained per class, and classification performance was computed by feeding test data into the trained networks. The average testing accuracy, specificity and sensitivity over the six classes were 93.85%, 96.5% and 91% respectively.
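To make the CSP stage concrete, below is a minimal two-class Common Spatial Pattern sketch in Python (NumPy/SciPy only). It is not taken from the dissertation; the function names, array shapes and the number of retained filter pairs are assumptions for illustration, and in the pipeline described above the filtered signals would subsequently be converted into scalogram images rather than used directly.

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(epochs_event, epochs_rest, n_pairs=3):
    """Two-class CSP spatial filters for a one-versus-rest problem.

    epochs_* : arrays of shape (n_trials, n_channels, n_samples),
               e.g. one GAL event class versus the pooled remaining data.
    Returns  : filters of shape (2 * n_pairs, n_channels); projecting the
               EEG through them maximises the variance of one class while
               minimising that of the other.
    """
    def mean_cov(epochs):
        # Trace-normalised channel covariance, averaged over trials.
        covs = [np.cov(trial) for trial in epochs]
        return np.mean([c / np.trace(c) for c in covs], axis=0)

    c_event, c_rest = mean_cov(epochs_event), mean_cov(epochs_rest)
    # Generalised eigenvalue problem: c_event w = lambda (c_event + c_rest) w.
    # The extreme eigenvalues give the largest variance ratio between classes.
    eigvals, eigvecs = eigh(c_event, c_event + c_rest)
    order = np.argsort(eigvals)
    keep = np.concatenate([order[:n_pairs], order[-n_pairs:]])
    return eigvecs[:, keep].T

def apply_csp(epochs, filters):
    """Project epochs (n_trials, n_channels, n_samples) through the CSP
    filters, giving surrogate channels whose time courses can then be
    turned into scalograms."""
    return np.einsum('fc,tcs->tfs', filters, epochs)
```

In a one-versus-rest setup this would be called once per event, for example filters = csp_filters(X[y == 1], X[y == 0]) for each of the six GAL classes in turn (X and y are hypothetical epoch and label arrays).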
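The reported accuracy, sensitivity and specificity are standard binary-classification measures; the small helper below (again an illustrative sketch, not code from the report) shows how they could be computed for each one-versus-rest classifier before averaging over the six classes.

```python
import numpy as np

def binary_metrics(y_true, y_pred):
    """Accuracy, sensitivity (true-positive rate) and specificity
    (true-negative rate) for one one-versus-rest classifier.
    Labels: 1 = target GAL event, 0 = rest."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    accuracy = (tp + tn) / len(y_true)
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    specificity = tn / (tn + fp) if (tn + fp) else 0.0
    return accuracy, sensitivity, specificity
```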


Main Author: Ong, Zhong Yi
Format: Final Year Project
Language: English
Institution: Universiti Teknologi Petronas
Record Id: utp-utpedia.20200
Published: IRC 2019
Online Access: http://utpedia.utp.edu.my/20200/1/Final%20Dissertation.pdf
http://utpedia.utp.edu.my/20200/
Citation: Ong, Zhong Yi (2019) Classification of Grasp-and-Lift EEG using GoogLeNet. IRC, Universiti Teknologi PETRONAS. (Submitted)
Collection: UTPedia