Course/Seminar
Record
Credits
Yolhan Mannes (Speaker)
Terms of use
Standard intellectual property law
DOI : 10.60527/4b78-b957
Cite this resource:
Yolhan Mannes. GroupeCalcul. (2023, July 12). Exploring Deep Learning through Flux.jl. [Video]. Canal-U. https://doi.org/10.60527/4b78-b957. (Accessed 14 January 2025)

Exploring Deep Learning through Flux.jl

Recorded: 12 July 2023 - Published online: 6 January 2025
Description

Flux.jl stands out as a premier Julia package for deep learning, encompassing a wide array of established deep learning techniques such as convolutions, attention mechanisms, and recurrent layers, to name a few. This session will probe the foundational mechanisms of Flux.jl, emphasizing its three critical components:

  • Decomposition of complex nested structures (Functors.jl)
  • Automatic reverse differentiation (Zygote.jl)
  • Descent-based minimization methods (Optimisers.jl)
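A minimal sketch, not taken from the talk itself, of how these three packages fit together; it assumes recent versions of Functors.jl, Zygote.jl, and Optimisers.jl, and the `Affine` struct is a hypothetical example:

```julia
using Functors, Zygote, Optimisers

# A nested parameter structure; @functor tells Functors.jl how to traverse it.
struct Affine
    W
    b
end
@functor Affine

model = Affine(randn(2, 2), zeros(2))

# Zygote.jl computes reverse-mode gradients through the nested structure.
x = randn(2)
grads, = Zygote.gradient(m -> sum(abs2, m.W * x .+ m.b), model)

# Optimisers.jl walks the same structure (via Functors.jl) and applies
# a descent rule to every parameter leaf.
state = Optimisers.setup(Optimisers.Descent(0.1), model)
state, model = Optimisers.update(state, model, grads)
```

Flux.jl itself layers convenience on top of exactly this pipeline: structure traversal, reverse-mode differentiation, then a gradient-descent update.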

Throughout the discussion, we will examine how some of the built-in layers are structured and learn how to formulate custom ones. Additionally, a significant part of the presentation will be dedicated to datasets, particularly how to integrate datasets from Python and R into deep learning workflows in Flux.jl.
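As a hedged illustration of what defining a custom layer can look like (the specific layers covered in the session are not detailed here), a minimal layer in the style of Flux's built-in `Dense`; the name `MyDense` is hypothetical:

```julia
using Flux

# A custom affine layer; Flux.@layer registers its fields as trainable.
struct MyDense
    W
    b
end
MyDense(in::Integer, out::Integer) =
    MyDense(randn(Float32, out, in), zeros(Float32, out))

# Making the struct callable is all Flux needs for the forward pass.
(a::MyDense)(x) = a.W * x .+ a.b

Flux.@layer MyDense

# The custom layer composes with built-in machinery like any other.
model = Chain(MyDense(3, 4), x -> relu.(x), MyDense(4, 1))
y = model(rand(Float32, 3, 8))
```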

Finally, we will analyze several well-known test cases, such as MLPs for function approximation, CNNs for image classification, and Transformers for text comprehension and translation tasks.
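The first of these test cases can be sketched as follows: a small MLP trained to approximate a function, using Flux's explicit-gradient training API. The target function and hyperparameters below are illustrative, not those used in the talk:

```julia
using Flux

# Approximate sin(x) on [-π, π] with a small MLP.
xs = reshape(collect(Float32, range(-π, π; length=256)), 1, :)
ys = sin.(xs)

model = Chain(Dense(1 => 32, tanh), Dense(32 => 1))
opt_state = Flux.setup(Adam(1e-2), model)

for epoch in 1:500
    # Compute gradients of the mean-squared error w.r.t. the model...
    grads = Flux.gradient(m -> Flux.mse(m(xs), ys), model)
    # ...and apply an Adam update to every parameter.
    Flux.update!(opt_state, model, grads[1])
end
```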
