Exploring Deep Learning through Flux.jl
Description
Flux.jl is one of the leading Julia packages for deep learning, providing a wide range of established building blocks such as convolutional, attention, and recurrent layers. This session will examine the foundational mechanisms of Flux.jl, emphasizing its three core components:
- Decomposition of complex nested structures (Functors.jl)
- Automatic reverse differentiation (Zygote.jl)
- Descent-based minimization methods (Optimisers.jl)
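A minimal sketch of how these three components cooperate in a single training step, assuming a recent Flux.jl (0.14+ API); the model, data, and learning rate below are illustrative, not from the session itself:

```julia
using Flux  # pulls in Functors.jl, Zygote.jl, and Optimisers.jl

# A toy model: Functors.jl is what lets Flux walk this nested structure
model = Chain(Dense(2 => 8, relu), Dense(8 => 1))

x = rand(Float32, 2, 16)   # batch of 16 two-dimensional inputs
y = rand(Float32, 1, 16)   # matching targets

# Zygote.jl computes the gradient of the loss w.r.t. every parameter
loss(m, x, y) = Flux.mse(m(x), y)
grads = Flux.gradient(m -> loss(m, x, y), model)

# Optimisers.jl applies a descent rule to the decomposed parameters
opt_state = Flux.setup(Adam(1f-3), model)
Flux.update!(opt_state, model, grads[1])
```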
Along the way, we will examine how some of the built-in layers are structured and learn how to write custom ones. A significant part of the presentation will also be devoted to datasets, in particular how to import and use datasets from Python and R in Flux.jl deep learning workflows.
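As a hedged sketch of the custom-layer idea: a layer in Flux is just a callable struct whose fields are registered with Functors.jl, here via `Flux.@layer` (older Flux versions use `Functors.@functor`). The `Scale` layer below is a hypothetical example, not one of the session's layers:

```julia
using Flux

# Hypothetical custom layer: element-wise multiplication by a learned vector
struct Scale{T}
    s::T
end

# Registration makes the field `s` visible to gradients and optimisers
Flux.@layer Scale

# Layers are callable structs: calling the layer applies it to the input
(l::Scale)(x) = l.s .* x

# Custom layers compose with built-in ones inside a Chain
model = Chain(Dense(3 => 4), Scale(ones(Float32, 4)))
```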
Finally, we will analyze several well-known test cases, such as MLPs for approximation problems, CNNs for image classification, and Transformers for text comprehension and translation tasks.
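To make the first of those test cases concrete, a tiny MLP can approximate a one-dimensional function; this is a minimal sketch under assumed hyperparameters (network size, learning rate, epoch count), not the session's actual example:

```julia
using Flux

# Sample the target function sin on [-π, π]; Flux expects
# features-by-batch matrices, hence the 1×N reshape
xs = reshape(collect(Float32, range(-π, π; length=256)), 1, :)
ys = sin.(xs)

# A small MLP: one hidden layer of 16 tanh units
model = Chain(Dense(1 => 16, tanh), Dense(16 => 1))
opt_state = Flux.setup(Adam(1f-2), model)

# Plain gradient-descent training loop on the full batch
for epoch in 1:500
    grads = Flux.gradient(m -> Flux.mse(m(xs), ys), model)
    Flux.update!(opt_state, model, grads[1])
end
```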