- Type of Document: M.Sc. Thesis
- Language: Farsi
- Document No: 53956 (19)
- University: Sharif University of Technology
- Department: Computer Engineering
- Advisor(s): Sameti, Hossein; Motahari, Abolfazl
- Abstract:
- Following the dramatic changes brought about by deep learning methods in Natural Language Processing, the Transformer architecture became popular. Building on it, the BERT language model was introduced and achieved state-of-the-art results on many language processing tasks, marking a turning point in the field of Natural Language Processing. The cross-lingual research line, motivated by building a common representation space for language units (e.g., words and sentences) across more than one language, has also seen remarkable improvements. However, for languages distant from English, such as Persian or Arabic, the performance of these methods was not clear. In this work, we apply novel methods to transfer the learning of the English BERT model to Persian and some languages close to Persian, with as few changes to the base model as possible. The proposed model is used as a solution for several language processing tasks, including sentiment analysis, news classification, and named entity recognition. The results show that the proposed method is effective for transferring the BERT language model to other languages with no pre-training.
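
  To make the transfer idea concrete, below is a minimal sketch (Python, HuggingFace Transformers) of one plausible reading of such an approach: swap the English vocabulary of a pretrained BERT for a Persian one and re-learn only the embedding layer, keeping the pretrained Transformer body frozen so that changes to the base model stay minimal. The model and tokenizer names (`bert-base-uncased`, `HooshvareLab/bert-fa-base-uncased`) are illustrative stand-ins, not resources named by the thesis.

  ```python
  # Sketch of a lightweight cross-lingual transfer of English BERT to Persian:
  # replace the vocabulary, re-learn only the lexical (embedding) level, and
  # freeze the Transformer body. This is one common recipe from the literature,
  # not necessarily the thesis's exact procedure.
  from transformers import BertForMaskedLM, BertTokenizerFast

  # Pretrained English BERT whose body we want to reuse unchanged.
  model = BertForMaskedLM.from_pretrained("bert-base-uncased")

  # Stand-in Persian WordPiece tokenizer (any Persian tokenizer would do here).
  fa_tokenizer = BertTokenizerFast.from_pretrained("HooshvareLab/bert-fa-base-uncased")

  # Resize the embedding matrix to the Persian vocabulary and re-initialize it;
  # the old English rows no longer correspond to the new token ids.
  model.resize_token_embeddings(len(fa_tokenizer))
  model.get_input_embeddings().weight.data.normal_(mean=0.0, std=0.02)

  # Train only the embeddings and the MLM head; the Transformer body stays frozen,
  # so the "base model" is changed as little as possible.
  for name, param in model.named_parameters():
      param.requires_grad = name.startswith(("bert.embeddings", "cls.predictions"))

  trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
  print(f"trainable parameters: {trainable:,}")  # embeddings + MLM head only
  ```

  In such a recipe, the re-initialized embeddings would be trained on Persian text with the usual masked-language-modelling objective, after which the adapted model is fine-tuned on the downstream tasks listed in the abstract (sentiment analysis, news classification, named entity recognition).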
- Keywords:
- Natural Language Processing ; Bidirectional Encoder Representations from Transformers (BERT) Model ; Cross Lingual Speaker Adaptation ; Pretrained Models ; Transfer Learning ; Sentiment Analysis