
Natural Language Processing with Transformers

  • Leading institution: Athena Research Center

  • EC: 4

General Description

Natural Language Processing with Transformers merges computational linguistics, rule-based modeling of human language, and contemporary machine learning techniques. Throughout this course, doctoral candidates study how linguistic theories combine with statistical machine learning and deep learning models, focusing on the attention mechanisms at the core of transformers and on applying modern large language models to various types of texts in the financial domain. Candidates explore transformer architectures such as BERT, GPT, and their variants, and learn to apply them to a wide range of advanced NLP tasks, from sentiment analysis to language generation. They also gain hands-on experience using transformers for tasks such as text classification, named entity recognition, and machine translation. After completing the course, candidates will be able to harness transformer-based large language models to process and generate texts in the financial domain.
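To illustrate the transformer mechanism the course centers on, here is a minimal sketch of scaled dot-product self-attention in NumPy. The function name and the toy token matrix are illustrative, not part of the course material; real transformer models (BERT, GPT) use many such attention heads with learned projection matrices.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core transformer mechanism: each query attends to every key,
    and the outputs are attention-weighted sums of the values."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V, weights

# Toy example: 3 tokens with embedding dimension 4; using the same
# matrix for queries, keys, and values gives self-attention.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(X, X, X)
# out has one contextualized vector per token; each row of w sums to 1.
```

In production work, candidates would use a library such as Hugging Face Transformers rather than hand-rolled attention, but the arithmetic above is what runs inside every layer of those models.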


Funded by the European Union. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or Horizon Europe: Marie Skłodowska-Curie Actions. Neither the European Union nor the granting authority can be held responsible for them. This project has received funding from the Horizon Europe research and innovation programme under the Marie Skłodowska-Curie Grant Agreement No. 101119635

Follow us

  • Wikipedia
  • LinkedIn

© 2023-2024
