
Code with the Author of Build an LLM (From Scratch) by Sebastian Raschka

   Author: Baturi   |   27 May 2025

Free Download Code with the Author of Build an LLM (From Scratch) by Sebastian Raschka
Released 5/2025
By Sebastian Raschka
MP4 | Video: h264, 1280x720 | Audio: AAC, 44.1 kHz, 2 Ch
Genre: eLearning | Language: English | Duration: 13h 35m | Size: 2.72 GB


Master the inner workings of large language models like GPT through hands-on coding sessions led by bestselling author Sebastian Raschka. These companion videos to Build a Large Language Model (From Scratch) walk you through a real-world implementation, with each session ending in a "test yourself" challenge to solidify your skills and deepen your understanding. Minimal illustrative code sketches of several techniques from the course follow the table of contents below.
Table of contents
Chapter 1. Python Environment Setup
Chapter 2. Tokenizing text
Chapter 2. Converting tokens into token IDs
Chapter 2. Adding special context tokens
Chapter 2. Byte pair encoding
Chapter 2. Data sampling with a sliding window
Chapter 2. Creating token embeddings
Chapter 2. Encoding word positions
Chapter 3. A simple self-attention mechanism without trainable weights | Part 1
Chapter 3. A simple self-attention mechanism without trainable weights | Part 2
Chapter 3. Computing the attention weights step by step
Chapter 3. Implementing a compact self-attention Python class
Chapter 3. Applying a causal attention mask
Chapter 3. Masking additional attention weights with dropout
Chapter 3. Implementing a compact causal self-attention class
Chapter 3. Stacking multiple single-head attention layers
Chapter 3. Implementing multi-head attention with weight splits
Chapter 4. Coding an LLM architecture
Chapter 4. Normalizing activations with layer normalization
Chapter 4. Implementing a feed forward network with GELU activations
Chapter 4. Adding shortcut connections
Chapter 4. Connecting attention and linear layers in a transformer block
Chapter 4. Coding the GPT model
Chapter 4. Generating text
Chapter 5. Using GPT to generate text
Chapter 5. Calculating the text generation loss: cross entropy and perplexity
Chapter 5. Calculating the training and validation set losses
Chapter 5. Training an LLM
Chapter 5. Decoding strategies to control randomness
Chapter 5. Temperature scaling
Chapter 5. Top-k sampling
Chapter 5. Modifying the text generation function
Chapter 5. Loading and saving model weights in PyTorch
Chapter 5. Loading pretrained weights from OpenAI
Chapter 6. Preparing the dataset
Chapter 6. Creating data loaders
Chapter 6. Initializing a model with pretrained weights
Chapter 6. Adding a classification head
Chapter 6. Calculating the classification loss and accuracy
Chapter 6. Fine-tuning the model on supervised data
Chapter 6. Using the LLM as a spam classifier
Chapter 7. Preparing a dataset for supervised instruction fine-tuning
Chapter 7. Organizing data into training batches
Chapter 7. Creating data loaders for an instruction dataset
Chapter 7. Loading a pretrained LLM
Chapter 7. Fine-tuning the LLM on instruction data
Chapter 7. Extracting and saving responses
Chapter 7. Evaluating the fine-tuned LLM
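
To give a taste of the Chapter 2 material, here is a minimal sketch, assuming the tiktoken and PyTorch libraries, of byte pair encoding plus sliding-window data sampling into (input, target) pairs. The class name, sample text, and hyperparameters are illustrative, not the course's exact code.

# Minimal sketch: BPE tokenization with tiktoken and sliding-window sampling
# of (input, target) pairs, where the target is the input shifted by one token.
import tiktoken
import torch
from torch.utils.data import Dataset, DataLoader

class SlidingWindowDataset(Dataset):          # illustrative name
    def __init__(self, text, tokenizer, max_length=4, stride=4):
        token_ids = tokenizer.encode(text, allowed_special={"<|endoftext|>"})
        self.inputs, self.targets = [], []
        # Slide a window over the token IDs in steps of `stride`.
        for i in range(0, len(token_ids) - max_length, stride):
            self.inputs.append(torch.tensor(token_ids[i:i + max_length]))
            self.targets.append(torch.tensor(token_ids[i + 1:i + max_length + 1]))

    def __len__(self):
        return len(self.inputs)

    def __getitem__(self, idx):
        return self.inputs[idx], self.targets[idx]

tokenizer = tiktoken.get_encoding("gpt2")     # GPT-2 byte pair encoding
text = "In the heart of the city stood the old library, a relic from a forgotten era."
loader = DataLoader(SlidingWindowDataset(text, tokenizer), batch_size=2, shuffle=False)
inputs, targets = next(iter(loader))
print(inputs.shape, targets.shape)            # e.g. torch.Size([2, 4]) torch.Size([2, 4])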
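
For the Chapter 3 sessions, the sketch below shows one way to write a compact causal self-attention class in PyTorch: scaled dot-product attention, an upper-triangular causal mask, and dropout on the attention weights. Names and dimensions are assumptions for illustration, not the course's exact implementation.

# Minimal sketch: compact causal self-attention with trainable Q/K/V projections.
import torch
import torch.nn as nn

class CausalSelfAttention(nn.Module):
    def __init__(self, d_in, d_out, context_length, dropout=0.0, qkv_bias=False):
        super().__init__()
        self.W_query = nn.Linear(d_in, d_out, bias=qkv_bias)
        self.W_key   = nn.Linear(d_in, d_out, bias=qkv_bias)
        self.W_value = nn.Linear(d_in, d_out, bias=qkv_bias)
        self.dropout = nn.Dropout(dropout)
        # Upper-triangular mask hides "future" tokens (causal masking).
        self.register_buffer(
            "mask",
            torch.triu(torch.ones(context_length, context_length, dtype=torch.bool), diagonal=1),
        )

    def forward(self, x):                                    # x: (batch, num_tokens, d_in)
        num_tokens = x.shape[1]
        queries, keys, values = self.W_query(x), self.W_key(x), self.W_value(x)
        scores = queries @ keys.transpose(1, 2)              # (batch, num_tokens, num_tokens)
        scores = scores.masked_fill(self.mask[:num_tokens, :num_tokens], float("-inf"))
        weights = torch.softmax(scores / keys.shape[-1] ** 0.5, dim=-1)
        weights = self.dropout(weights)                      # dropout on the attention weights
        return weights @ values                              # (batch, num_tokens, d_out)

torch.manual_seed(123)
attn = CausalSelfAttention(d_in=3, d_out=2, context_length=6)
x = torch.rand(2, 6, 3)        # batch of 2 sequences, 6 tokens, 3-dim embeddings
print(attn(x).shape)           # torch.Size([2, 6, 2])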
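
For Chapter 4, the following sketch combines layer normalization, a GELU feed-forward network, and shortcut connections into a transformer block. It leans on PyTorch's built-in nn.MultiheadAttention as a stand-in for the attention module built by hand in Chapter 3, and the hyperparameters are chosen only for illustration.

# Minimal sketch: a pre-LayerNorm transformer block with a GELU feed-forward
# network and shortcut (residual) connections around both sub-layers.
import torch
import torch.nn as nn

class FeedForward(nn.Module):
    def __init__(self, emb_dim):
        super().__init__()
        # Expand by 4x, apply GELU, project back -- the standard GPT-style FFN.
        self.layers = nn.Sequential(
            nn.Linear(emb_dim, 4 * emb_dim),
            nn.GELU(),
            nn.Linear(4 * emb_dim, emb_dim),
        )

    def forward(self, x):
        return self.layers(x)

class TransformerBlock(nn.Module):
    def __init__(self, emb_dim, num_heads, context_length, dropout=0.1):
        super().__init__()
        self.norm1 = nn.LayerNorm(emb_dim)
        self.norm2 = nn.LayerNorm(emb_dim)
        self.attn = nn.MultiheadAttention(emb_dim, num_heads, dropout=dropout, batch_first=True)
        self.ff = FeedForward(emb_dim)
        self.drop = nn.Dropout(dropout)
        # Boolean causal mask: True entries are positions attention may not use.
        mask = torch.triu(torch.ones(context_length, context_length, dtype=torch.bool), diagonal=1)
        self.register_buffer("mask", mask)

    def forward(self, x):                                    # x: (batch, num_tokens, emb_dim)
        num_tokens = x.shape[1]
        h = self.norm1(x)                                    # pre-LayerNorm
        attn_out, _ = self.attn(h, h, h, attn_mask=self.mask[:num_tokens, :num_tokens],
                                need_weights=False)
        x = x + self.drop(attn_out)                          # shortcut around attention
        x = x + self.drop(self.ff(self.norm2(x)))            # shortcut around the feed-forward net
        return x

block = TransformerBlock(emb_dim=64, num_heads=4, context_length=16)
print(block(torch.rand(2, 16, 64)).shape)                    # torch.Size([2, 16, 64])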
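
For the Chapter 5 decoding sessions, this last sketch applies temperature scaling and top-k sampling to a vector of made-up next-token logits. The helper function is hypothetical and only meant to show how the two controls interact.

# Minimal sketch: temperature scaling and top-k sampling over next-token logits.
import torch

def sample_next_token(logits, temperature=1.0, top_k=None):
    # logits: 1-D tensor of next-token scores over the vocabulary.
    if top_k is not None:
        top_logits, _ = torch.topk(logits, top_k)
        # Mask out everything below the k-th largest logit.
        logits = torch.where(logits < top_logits[-1], torch.tensor(float("-inf")), logits)
    if temperature > 0:
        probs = torch.softmax(logits / temperature, dim=-1)  # higher temperature = more random
        return torch.multinomial(probs, num_samples=1)
    return torch.argmax(logits, dim=-1, keepdim=True)        # temperature 0: greedy decoding

torch.manual_seed(123)
logits = torch.tensor([4.51, 0.89, -1.90, 6.75, 1.63, -1.62, -1.89, 6.28, 1.79])  # toy values
print(sample_next_token(logits, temperature=1.4, top_k=3))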




