Udemy - Deep Learning for NLP - Part 2

      Author: Baturi   |   12 August 2021   |   comments: 0



MP4 | Video: h264, 1280x720 | Audio: AAC, 44.1 KHz
Language: English | Size: 1.19 GB | Duration: 2h 52m
What you'll learn


Deep Learning for Natural Language Processing
Encoder-decoder models, Attention models, ELMo
GLUE, Transformers, GPT, BERT
DL for NLP
Requirements
Basics of machine learning
Recurrent Models: RNNs, LSTMs, GRUs and variants
Multi-Layered Perceptrons (MLPs)
Description
This course is part of the "Deep Learning for NLP" series. In this course, I will introduce concepts like encoder-decoder attention models, ELMo, GLUE, Transformers, GPT, and BERT. These concepts form the basis for a good understanding of advanced deep learning models for modern Natural Language Processing.
The course consists of two main sections as follows.
In the first section, I will talk about encoder-decoder models in the context of machine translation and how a beam search decoder works. Next, I will introduce the concept of encoder-decoder attention. Further, I will elaborate on different types of attention, such as global attention, local attention, hierarchical attention, and attention for sentence pairs using CNNs as well as LSTMs. We will also talk about attention visualization. Finally, we will discuss ELMo, which uses recurrent models to compute context-sensitive word embeddings.
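To give a flavor of the encoder-decoder attention covered in this section, here is a minimal NumPy sketch of global (dot-product) attention: the decoder state is scored against every encoder state, the scores are normalized with a softmax, and the result is a weighted context vector. This is an illustrative sketch only, not code from the course; the function names and shapes are assumptions for the example.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def global_attention(decoder_state, encoder_states):
    """Global dot-product attention (illustrative sketch).

    decoder_state:  (d,)   current decoder hidden state
    encoder_states: (T, d) all encoder hidden states
    Returns the context vector (d,) and the attention weights (T,).
    """
    scores = encoder_states @ decoder_state   # (T,) alignment scores
    weights = softmax(scores)                 # normalize into a distribution
    context = weights @ encoder_states        # weighted sum of encoder states
    return context, weights

rng = np.random.default_rng(0)
enc = rng.normal(size=(5, 8))   # 5 source positions, hidden size 8
dec = rng.normal(size=(8,))
context, weights = global_attention(dec, enc)
```

The attention weights form a probability distribution over source positions, which is exactly what attention visualization plots as a heatmap.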
In the second section, I will cover the various tasks that make up the GLUE benchmark, as well as other benchmark NLP datasets across tasks. Then we will begin our modern NLP journey by understanding the different parts of an encoder-decoder Transformer model. We will delve into the details of Transformers, covering concepts like self-attention, multi-head attention, positional embeddings, residual connections, and masked attention. After that, I will talk about the two most popular Transformer models: GPT and BERT. In the GPT part, we will discuss how GPT is trained and the differences between variants like GPT-2 and GPT-3. In the BERT part, we will discuss how BERT differs from GPT and how it is pretrained using the masked language modeling and next sentence prediction tasks. We will also briefly cover finetuning BERT and multilingual BERT.
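The self-attention and masked attention ideas mentioned above can be sketched in a few lines of NumPy: each position builds query, key, and value vectors, scores itself against all positions, and (in the causal/masked variant used by decoders and GPT) is blocked from attending to future positions. Again, this is a hedged sketch for intuition, not the course's implementation; the weight matrices and shapes here are made up for the example.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def self_attention(X, Wq, Wk, Wv, causal=False):
    """Scaled dot-product self-attention over a sequence X of shape (T, d).

    causal=True applies masked attention: position t may only attend
    to positions <= t, as in a Transformer decoder or GPT.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)            # (T, T) similarity scores
    if causal:
        T = scores.shape[0]
        mask = np.triu(np.ones((T, T), dtype=bool), k=1)
        scores = np.where(mask, -1e9, scores)  # block attention to the future
    return softmax(scores) @ V                 # weighted sum of values

rng = np.random.default_rng(1)
T, d = 4, 6
X = rng.normal(size=(T, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv, causal=True)
```

With the causal mask, the first position can only attend to itself, so its output is just its own value vector; multi-head attention simply runs several such attentions in parallel on split projections and concatenates the results.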
Who this course is for:
Beginners in deep learning
Python developers interested in data science concepts

Homepage
https://www.udemy.com/course/ahol-dl4nlp2/







DISCLAIMER
None of the files shown here are hosted or transmitted by this server. The links are provided solely by this site's users. The administrator of our site cannot be held responsible for what its users post, or any other actions of its users. You may not use this site to distribute or download any material when you do not have the legal rights to do so. It is your own responsibility to adhere to these terms.

Copyright © 2018 - 2023 Dl4All. All rights reserved.