Attention, Transformers, and LLMs: a hands-on introduction in PyTorch
Tags: ai, deep-learning, machine-learning, neural-networks, pytorch
Landing Page
Preparing data for LLM training
Small Language Models: an introduction to autoregressive language modeling
Attention is all you need
Other LLM Topics
This workshop focuses on the fundamentals of attention and the transformer architecture, so that you can understand how LLMs work and use them in your own projects.
Category: learning
Skill Level