Publication Date

Spring 2023

Degree Type

Master's Project

Degree Name

Master of Science (MS)

Department

Computer Science

First Advisor

Ching-seh Wu

Second Advisor

Robert Chun

Third Advisor

William Andreopoulos

Keywords

Context-aware, Graph encoders, Machine learning, Machine translation, Recurrent neural networks, Transformers

Abstract

Machine translation has its roots in the domain of textual processing and focuses on the use of computer software to translate sentences between languages. Neural machine translation follows the same idea and integrates machine learning with the help of neural networks. Various techniques are being explored by researchers and are widely used by Google Translate, Bing Microsoft Translator, Deep Translator, etc. However, these neural machine translation techniques do not incorporate the context of the sentences and are driven only by phrase or sentence structure. This report explores neural machine translation techniques dedicated to context-aware translation. It also provides insights into the potential of neural networks, the types of machine translation techniques, and the architectures used for machine translation.

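For illustration, the encoder-decoder-attention architecture referred to here can be sketched in a few lines. The following is a minimal, hypothetical example assuming PyTorch; the layer sizes, class names, and dot-product attention variant are chosen for clarity and are not the project's actual configuration:

    import torch
    import torch.nn as nn

    class Encoder(nn.Module):
        def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)

        def forward(self, src):
            # src: (batch, src_len) -> outputs: (batch, src_len, hid_dim)
            outputs, hidden = self.rnn(self.embed(src))
            return outputs, hidden

    class AttnDecoder(nn.Module):
        def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
            self.out = nn.Linear(hid_dim * 2, vocab_size)

        def forward(self, tgt_token, hidden, enc_outputs):
            # One decoding step: embed the previous target token.
            emb = self.embed(tgt_token)                 # (batch, 1, emb_dim)
            dec_out, hidden = self.rnn(emb, hidden)     # (batch, 1, hid_dim)
            # Dot-product attention over all encoder states.
            scores = torch.bmm(dec_out, enc_outputs.transpose(1, 2))
            weights = torch.softmax(scores, dim=-1)     # (batch, 1, src_len)
            context = torch.bmm(weights, enc_outputs)   # (batch, 1, hid_dim)
            # Predict the next target token from decoder state + context.
            logits = self.out(torch.cat([dec_out, context], dim=-1))
            return logits, hidden

    # Toy usage: encode a source batch, then run one decoding step.
    enc, dec = Encoder(8000), AttnDecoder(8000)
    src = torch.randint(0, 8000, (2, 10))       # hypothetical source token ids
    enc_out, hidden = enc(src)
    tok = torch.zeros(2, 1, dtype=torch.long)   # start-of-sentence token
    logits, hidden = dec(tok, hidden, enc_out)

The attention step is what allows the decoder to weight different source positions at each output token, which is the mechanism that context-aware and transformer models build on.
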
This project presents an improved approach to neural machine translation in which both the source-language and target-language data are preprocessed using NLP techniques and trained with encoder-decoder-attention models, combining deep learning models to produce more accurate translations. Graph-based models were evaluated alongside Recurrent Neural Networks on a German-English dataset, and the results were compared with a context-aware model to understand the future of neural machine translation. The results of this research project show that transformers predict better than the context-aware and graph-encoder models, with a BLEU score of m.
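Translation quality in experiments of this kind is typically reported with BLEU. As a hedged illustration only (the sentences below are toy examples, not outputs from the project's models), NLTK's sentence_bleu can score a candidate translation against a reference:

    from nltk.translate.bleu_score import sentence_bleu

    # Tokenized reference translation(s) and a candidate system output.
    reference = [["the", "cat", "sits", "on", "the", "mat"]]
    candidate = ["the", "cat", "sits", "on", "a", "mat"]

    # Default weights average 1- through 4-gram precision equally.
    score = sentence_bleu(reference, candidate)
    print(f"BLEU: {score:.3f}")

Higher BLEU indicates greater n-gram overlap with the reference, which is how the transformer, context-aware, and graph-encoder models are compared above.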
