Natural Language Processing with Transformers

  • Leading institution: Athena Research Center

  • EC: 4

General Description

Natural Language Processing with Transformers merges computational linguistics, rule-based modeling of human language, and contemporary machine learning techniques. Throughout this course, doctoral candidates study the fusion of linguistic theory with statistical machine learning and deep learning models, focusing on the capabilities of the transformer architecture and the application of modern large language models to various types of texts in the financial domain. Candidates will explore the intricacies of transformer architectures such as BERT, GPT, and their variants, mastering their application in a wide range of advanced NLP tasks, from sentiment analysis to language generation. Additionally, candidates will gain hands-on experience leveraging transformers for tasks such as text classification, named entity recognition, and machine translation. After completing the course, candidates will be able to harness transformer-based large language models to process and generate texts in the financial domain.