Keyphrase Generation with Recurrent Neural Network and Attention Model

There is currently an information overload in digital media, with millions of articles being published every day. This raises concerns about how these articles can be documented and catalogued for preservation, and how summaries can be produced for readers who want a quick understanding of their contents, since writing a summary for every article by hand takes a long time. Automation is therefore intended to streamline the entire process. The purpose of this study is to examine the effectiveness of a model that uses an attention mechanism compared with one that does not. The baseline model will not include any attention mechanism, whether Bahdanau or Luong; we intend to evaluate quantitatively whether the attention mechanism alone is capable of producing better results, measured by F1-score, precision, and recall. The research will first evaluate the original model, which has no attention mechanism, obtain its F1-score, precision, and recall, and tabulate the results. The attention mechanism will then be incorporated into the model, the modified model will be tested again, and the new results will be tabulated. Three datasets will be used: the Gigaword dataset and two datasets generated for this study, one featuring Coronavirus News and one featuring Oil and Gas News.
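The study evaluates generated keyphrases with precision, recall, and F1-score. Below is a minimal sketch of how such scores could be computed for a single document under an exact-match assumption; the function and variable names are illustrative and are not taken from the project.

# Hypothetical exact-match scoring of predicted keyphrases against
# reference keyphrases. The matching strategy (lowercased,
# whitespace-normalised exact match) is an assumption, not the
# project's documented choice.
def evaluate_keyphrases(predicted, gold):
    def normalise(phrases):
        return {" ".join(p.lower().split()) for p in phrases}

    pred_set, gold_set = normalise(predicted), normalise(gold)

    true_positives = len(pred_set & gold_set)
    precision = true_positives / len(pred_set) if pred_set else 0.0
    recall = true_positives / len(gold_set) if gold_set else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

if __name__ == "__main__":
    pred = ["attention model", "keyphrase generation", "oil prices"]
    ref = ["keyphrase generation", "attention mechanism", "recurrent neural network"]
    print(evaluate_keyphrases(pred, ref))  # (0.333..., 0.333..., 0.333...)

Corpus-level figures are then usually obtained by averaging these per-document scores over the test set.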

Main Author: Shemar, Jay Anil Singh
Format: Final Year Project
Language: English
Institution: Universiti Teknologi Petronas
Record Id / ISBN-0: utp-utpedia.21001
Published: IRC 2020
Subjects: Q Science (General)
Online Access: http://utpedia.utp.edu.my/21001/1/19716_Jay%20Anil%20Singh%20Shemar.pdf
http://utpedia.utp.edu.my/21001/
id utp-utpedia.21001
recordtype eprints
spelling utp-utpedia.21001 2021-09-13T14:54:38Z http://utpedia.utp.edu.my/21001/ Keyphrase Generation with Recurrent Neural Network and Attention Model Shemar, Jay Anil Singh Q Science (General) There is currently an information overload in digital media, with millions of articles being published every day. This raises concerns about how these articles can be documented and catalogued for preservation, and how summaries can be produced for readers who want a quick understanding of their contents, since writing a summary for every article by hand takes a long time. Automation is therefore intended to streamline the entire process. The purpose of this study is to examine the effectiveness of a model that uses an attention mechanism compared with one that does not. The baseline model will not include any attention mechanism, whether Bahdanau or Luong; we intend to evaluate quantitatively whether the attention mechanism alone is capable of producing better results, measured by F1-score, precision, and recall. The research will first evaluate the original model, which has no attention mechanism, obtain its F1-score, precision, and recall, and tabulate the results. The attention mechanism will then be incorporated into the model, the modified model will be tested again, and the new results will be tabulated. Three datasets will be used: the Gigaword dataset and two datasets generated for this study, one featuring Coronavirus News and one featuring Oil and Gas News. IRC 2020-05 Final Year Project NonPeerReviewed application/pdf en http://utpedia.utp.edu.my/21001/1/19716_Jay%20Anil%20Singh%20Shemar.pdf Shemar, Jay Anil Singh (2020) Keyphrase Generation with Recurrent Neural Network and Attention Model. IRC, Universiti Teknologi PETRONAS. (Submitted)
institution Universiti Teknologi Petronas
collection UTPedia
language English
topic Q Science (General)
spellingShingle Q Science (General)
Shemar, Jay Anil Singh
Keyphrase Generation with Recurrent Neural Network and Attention Model
description There is currently an information overload in digital media, with millions of articles being published every day. This raises concerns about how these articles can be documented and catalogued for preservation, and how summaries can be produced for readers who want a quick understanding of their contents, since writing a summary for every article by hand takes a long time. Automation is therefore intended to streamline the entire process. The purpose of this study is to examine the effectiveness of a model that uses an attention mechanism compared with one that does not. The baseline model will not include any attention mechanism, whether Bahdanau or Luong; we intend to evaluate quantitatively whether the attention mechanism alone is capable of producing better results, measured by F1-score, precision, and recall. The research will first evaluate the original model, which has no attention mechanism, obtain its F1-score, precision, and recall, and tabulate the results. The attention mechanism will then be incorporated into the model, the modified model will be tested again, and the new results will be tabulated. Three datasets will be used: the Gigaword dataset and two datasets generated for this study, one featuring Coronavirus News and one featuring Oil and Gas News.
format Final Year Project
author Shemar, Jay Anil Singh
author_sort Shemar, Jay Anil Singh
title Keyphrase Generation with Recurrent Neural Network and Attention Model
title_short Keyphrase Generation with Recurrent Neural Network and Attention Model
title_full Keyphrase Generation with Recurrent Neural Network and Attention Model
title_fullStr Keyphrase Generation with Recurrent Neural Network and Attention Model
title_full_unstemmed Keyphrase Generation with Recurrent Neural Network and Attention Model
title_sort keyphrase generation with recurrent neural network and attention model
publisher IRC
publishDate 2020
url http://utpedia.utp.edu.my/21001/1/19716_Jay%20Anil%20Singh%20Shemar.pdf
http://utpedia.utp.edu.my/21001/
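The description in this record proposes extending a recurrent encoder-decoder with a Bahdanau or Luong attention mechanism. The sketch below illustrates Bahdanau-style additive attention over a set of encoder hidden states; all shapes, weight matrices, and names are illustrative assumptions and do not reflect the project's implementation.

import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def bahdanau_attention(decoder_state, encoder_states, W_dec, W_enc, v):
    # Additive scoring: score_i = v . tanh(W_dec s + W_enc h_i)
    # decoder_state:  (hidden,)          current decoder hidden state s
    # encoder_states: (src_len, hidden)  one hidden state h_i per source token
    # W_dec, W_enc:   (attn, hidden)     learned projections (hypothetical)
    # v:              (attn,)            learned scoring vector (hypothetical)
    scores = np.tanh(encoder_states @ W_enc.T + decoder_state @ W_dec.T) @ v
    weights = softmax(scores)            # distribution over source positions
    context = weights @ encoder_states   # weighted sum of encoder states
    return weights, context

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    hidden, attn, src_len = 8, 6, 5
    weights, context = bahdanau_attention(
        decoder_state=rng.normal(size=hidden),
        encoder_states=rng.normal(size=(src_len, hidden)),
        W_dec=rng.normal(size=(attn, hidden)),
        W_enc=rng.normal(size=(attn, hidden)),
        v=rng.normal(size=attn),
    )
    print(weights.sum(), context.shape)  # ~1.0 (weights sum to one), (8,)

The context vector is what the decoder consumes at each generation step; Luong-style attention differs mainly in the scoring function, using a dot product or a bilinear form instead of the additive tanh score.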