Transformer-based multi-hop question generation

dc.contributor.author: Emerson, John Robert
dc.contributor.author: University of Lethbridge. Faculty of Arts and Science
dc.contributor.supervisor: Chali, Yllias
dc.date.accessioned: 2022-11-07T18:56:14Z
dc.date.available: 2022-11-07T18:56:14Z
dc.date.issued: 2022
dc.degree.level: Masters
dc.description.abstract: Question generation is the inverse task of question answering: given an input context and, optionally, an answer, the goal is to generate a relevant and fluent natural language question. Although recent work on question generation has seen success using sequence-to-sequence models, question generation models must handle increasingly complex input contexts to produce increasingly detailed questions. Multi-hop question generation is a more challenging task that aims to generate questions by connecting multiple facts from multiple input contexts. In this work, we apply a transformer model to the task of multi-hop question generation without using any sentence-level supporting-fact information. We employ techniques that have proven effective in single-hop question generation, including a copy mechanism and placeholder tokens. We evaluate our model's performance on the HotpotQA dataset using automated evaluation metrics, including BLEU, ROUGE, and METEOR, and show an improvement over previous work.
dc.identifier.uri: https://hdl.handle.net/10133/6378
dc.language.iso: en_US
dc.proquest.subject: 0984
dc.proquest.subject: 0800
dc.proquest.subject: 0723
dc.proquestyes: Yes
dc.publisher: Lethbridge, Alta. : University of Lethbridge, Dept. of Mathematics and Computer Science
dc.publisher.department: Department of Mathematics and Computer Science
dc.publisher.faculty: Arts and Science
dc.relation.ispartofseries: Thesis (University of Lethbridge. Faculty of Arts and Science)
dc.subject: natural language processing
dc.subject: question generation
dc.subject: machine learning
dc.subject: artificial intelligence
dc.subject: Natural language processing (Computer science)
dc.subject: Computational linguistics
dc.subject: Machine learning
dc.subject: Artificial intelligence
dc.subject: Dissertations, Academic
dc.title: Transformer-based multi-hop question generation
dc.type: Thesis
Files
Original bundle
Name: EMERSON_JOHN_MSC_2022.pdf
Size: 1.4 MB
Format: Adobe Portable Document Format
License bundle
Name: license.txt
Size: 3.25 KB
Format: Item-specific license agreed upon to submission