Let's explore the Transformer together
Description
A session from the seminar series Les Rencontres de Statistique Appliquée
Brainpower Meets Machine Power: Chat-GPT synergies with social scientists
Kamel Smaïli (Lorraine University - Loria)
In this presentation, we will provide a detailed explanation of the Transformer architecture, which forms the foundation for state-of-the-art generative AI systems.
We will delve into the intricacies of both the encoder and the decoder, emphasizing the significance of positional encoding and multi-head attention, and enrich the presentation with illustrative examples. Regarding the decoder, we will highlight the distinction between the two types of multi-head attention it employs: masked self-attention over previously generated tokens and encoder-decoder (cross) attention over the encoder's output.
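As a reading aid for these ideas (not taken from the talk itself), here is a minimal NumPy sketch of two of the ingredients named above: sinusoidal positional encoding and scaled dot-product attention, with a `causal` flag that switches between the decoder's masked self-attention and the unmasked form used in the encoder and in cross-attention. The function names and shapes are illustrative assumptions, not the speaker's code.

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding: even dimensions use sin, odd use cos."""
    pos = np.arange(seq_len)[:, None]        # (seq_len, 1)
    i = np.arange(d_model)[None, :]          # (1, d_model)
    angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])
    pe[:, 1::2] = np.cos(angles[:, 1::2])
    return pe

def attention(q, k, v, causal=False):
    """Scaled dot-product attention, the core of each multi-head attention layer.

    causal=False: the unmasked form used in the encoder and in the decoder's
    encoder-decoder (cross) attention. causal=True: the decoder's masked
    self-attention, where each position attends only to itself and earlier ones.
    """
    d_k = q.shape[-1]
    scores = q @ k.swapaxes(-2, -1) / np.sqrt(d_k)
    if causal:
        # Mask out (set to -inf, effectively) all strictly-future positions.
        future = np.triu(np.ones(scores.shape[-2:], dtype=bool), k=1)
        scores = np.where(future, -1e9, scores)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ v

# Toy usage: 5 tokens with model dimension 8.
x = np.random.randn(5, 8) + positional_encoding(5, 8)
out = attention(x, x, x, causal=True)   # decoder-style masked self-attention
```

In a real Transformer this attention is computed in several heads in parallel over learned projections of q, k, and v; the single-head version above keeps only the part needed to see the masked versus unmasked distinction.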
In the same collection
- Integrating environmental impact of AI in a data center (Paul Gay)
- From GPT to ChatGPT: General Principles and Issues Raised by OpenAI's Large Language Models (Didier Schwab)
- The origins of large language models: an historical perspective on computational language modeling (Benoît Crabbé)