The European High Performance Computing Joint Undertaking (EuroHPC JU)


Natural language processing plays an increasingly important part in our digital lives, but it works best for high-resource languages because of the large amounts of data needed to train models.


With the introduction of the transformer neural network (Vaswani et al. 2017) and the subsequent transformer language models (LMs) such as GPT (Radford et al. 2018) and BERT (Devlin et al. 2018), the pre-training and fine-tuning paradigm has become the new standard.