- Type of Document: M.Sc. Thesis
- Language: Farsi
- Document No: 53734 (19)
- University: Sharif University of Technology
- Department: Computer Engineering
- Advisor(s): Soleymani Baghshah, Mahdieh
- Abstract:
- Given the ever-increasing use of text generation and analysis in the field of Natural Language Processing, Language Modeling and Masked Language Modeling have recently become among the most frequent tasks. In addition, because rich datasets and computational resources are scarce among researchers, many pretrained models such as BERT have been proposed; these models can be fine-tuned on other datasets for downstream tasks. Although such Transformer-based deep neural networks perform very well on many problems, they still fall short on a few tasks. Furthermore, structured data such as graphs have recently been used extensively in Natural Language Processing, and researchers have exploited their rich information in various tasks. For instance, Knowledge Graphs provide rich information about the relations between world entities and have been used in tasks such as text generation, text classification, and language modeling. Considering the rich information of Knowledge Graphs and of other graphs derived from textual corpora, as well as the shortcomings of BERT, this project proposes several methods that produce enriched word embeddings by combining the global information in a static multi-graph (consisting of a Knowledge Graph together with graphs built from TF-IDF and PMI values) with the pretrained knowledge in the BERT model. The information in this multi-graph is extracted using Relational Graph Convolutional Networks (R-GCN) and Graph Attention Networks (GAT), and a multi-head attention-based extension is also proposed for analyzing a dynamic graph derived from the input sentences. Finally, the BERT baseline, other related models, and different versions of the proposed model are evaluated qualitatively and quantitatively using the Hits@1, Hits@5, and Perplexity metrics. The reported results show that the variant combining the Relational Graph Convolutional Network with attention-based analysis of the dynamic sentence graph is generally superior. (An illustrative sketch of this graph-BERT fusion is given after the keyword list below.)
- Keywords:
- Graph Neural Network ; Knowledge Graph ; Graph Attention Networks ; Graph-Based Embedding ; Masked Language Modeling ; Graph Convolutional Networks
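To make the architecture described in the abstract more concrete, the following is a minimal, illustrative sketch in PyTorch, not the thesis implementation: a simplified relational graph convolution over a static multi-graph produces node embeddings, and each contextual token embedding (standing in for a BERT output) attends to those nodes through multi-head attention before a masked-language-modeling head. All class and variable names (SimpleRGCNLayer, GraphEnrichedMLM, and so on) are hypothetical, and the toy tensors merely stand in for real BERT outputs and graph data.

```python
# Illustrative sketch only; hypothetical names, not the thesis code.
import torch
import torch.nn as nn

class SimpleRGCNLayer(nn.Module):
    """One relational graph convolution step: a separate weight per relation type."""
    def __init__(self, dim, num_relations):
        super().__init__()
        self.rel_weights = nn.ModuleList(
            [nn.Linear(dim, dim, bias=False) for _ in range(num_relations)])
        self.self_weight = nn.Linear(dim, dim, bias=False)

    def forward(self, node_feats, adjacency_per_relation):
        # adjacency_per_relation: list of (num_nodes, num_nodes) row-normalized matrices
        out = self.self_weight(node_feats)
        for adj, lin in zip(adjacency_per_relation, self.rel_weights):
            out = out + adj @ lin(node_feats)
        return torch.relu(out)

class GraphEnrichedMLM(nn.Module):
    """Tokens attend to graph node embeddings; the fused states feed an MLM head."""
    def __init__(self, dim, num_relations, vocab_size):
        super().__init__()
        self.gnn = SimpleRGCNLayer(dim, num_relations)
        self.cross_attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        self.mlm_head = nn.Linear(dim, vocab_size)

    def forward(self, token_states, node_feats, adjacency_per_relation):
        # token_states: (batch, seq_len, dim) contextual embeddings, e.g. BERT output
        nodes = self.gnn(node_feats, adjacency_per_relation)          # (num_nodes, dim)
        nodes = nodes.unsqueeze(0).expand(token_states.size(0), -1, -1)
        graph_context, _ = self.cross_attn(token_states, nodes, nodes)
        fused = token_states + graph_context                          # residual fusion
        return self.mlm_head(fused)                                   # per-token logits

# Toy usage with random tensors standing in for BERT outputs and graph data.
dim, num_rel, vocab, num_nodes = 64, 3, 1000, 10
model = GraphEnrichedMLM(dim, num_rel, vocab)
tokens = torch.randn(2, 12, dim)
nodes = torch.randn(num_nodes, dim)
adjs = [torch.eye(num_nodes) for _ in range(num_rel)]
logits = model(tokens, nodes, adjs)   # shape (2, 12, vocab)
```

The residual fusion of token states and graph context shown here is only one plausible way to combine the two information sources; the thesis itself may combine the BERT and graph representations differently.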
Table of Contents
- 1 Introduction
- 2 Related Work
- 3 Proposed Method
- 4 Implementation, Experiments, and Evaluation
- 5 Conclusion and Future Work
- References
- Farsi-to-English Glossary
- English-to-Farsi Glossary
- Abbreviations