Unit 01: Getting an Idea of NLP and its Applications
Module 01: Introduction to NLP | 00:03:00
Module 02: By the End of This Section | 00:01:00
Module 03: Installation | 00:04:00
Module 04: Tips | 00:01:00
Module 05: U – Tokenization | 00:01:00
Module 06: P – Tokenization | 00:02:00
Module 07: U – Stemming | 00:02:00
Module 08: P – Stemming | 00:05:00
Module 09: U – Lemmatization | 00:02:00
Module 10: P – Lemmatization | 00:03:00
Module 11: U – Chunks | 00:02:00
Module 12: P – Chunks | 00:05:00
Module 13: U – Bag of Words | 00:04:00
Module 14: P – Bag of Words | 00:04:00
Module 15: U – Category Predictor | 00:05:00
Module 16: P – Category Predictor | 00:06:00
Module 17: U – Gender Identifier | 00:01:00
Module 18: P – Gender Identifier | 00:08:00
Module 19: U – Sentiment Analyzer | 00:02:00
Module 20: P – Sentiment Analyzer | 00:07:00
Module 21: U – Topic Modeling | 00:03:00
Module 22: P – Topic Modeling | 00:06:00
Module 23: Summary | 00:01:00
Unit 02: Feature Engineering
Module 01: Introduction | 00:02:00
Module 02: One-Hot Encoding | 00:02:00
Module 03: Count Vectorizer | 00:04:00
Module 04: N-grams | 00:04:00
Module 05: Hash Vectorizing | 00:02:00
Module 06: Word Embedding | 00:11:00
Module 07: FastText | 00:04:00
Unit 03: Dealing with Corpora and WordNet
Module 01: Introduction | 00:01:00
Module 02: In-built Corpora | 00:06:00
Module 03: External Corpora | 00:08:00
Module 04: Corpora & Frequency Distribution | 00:07:00
Module 05: Frequency Distribution | 00:06:00
Module 06: WordNet | 00:06:00
Module 07: WordNet with Hyponyms and Hypernyms | 00:07:00
Module 08: The Average According to WordNet | 00:07:00
Unit 04: Create your Vocabulary for any NLP Model
Module 01: Introduction and Challenges | 00:08:00
Module 02: Building your Vocabulary Part 01 | 00:02:00
Module 03: Building your Vocabulary Part 02 | 00:03:00
Module 04: Building your Vocabulary Part 03 | 00:07:00
Module 05: Building your Vocabulary Part 04 | 00:12:00
Module 06: Building your Vocabulary Part 05 | 00:06:00
Module 07: Tokenization Dot Product | 00:03:00
Module 08: Similarity Using the Dot Product | 00:03:00
Module 09: Reducing Dimensions of your Vocabulary Using Token Improvement | 00:02:00
Module 10: Reducing Dimensions of your Vocabulary Using N-grams | 00:10:00
Module 11: Reducing Dimensions of your Vocabulary Using Normalizing | 00:10:00
Module 12: Reducing Dimensions of your Vocabulary Using Case Normalization | 00:05:00
Module 13: When to Use Stemming and Lemmatization? | 00:04:00
Module 14: Sentiment Analysis Overview | 00:05:00
Module 15: Two Approaches for Sentiment Analysis | 00:03:00
Module 16: Sentiment Analysis Using a Rule-Based Approach | 00:05:00
Module 17: Sentiment Analysis Using Machine Learning – 1 | 00:10:00
Module 18: Sentiment Analysis Using Machine Learning – 2 | 00:04:00
Module 19: Summary | 00:01:00
Unit 05: Word2Vec in Detail and What Is Going On Under the Hood
Module 01: Introduction | 00:04:00
Module 02: Bag of Words in Detail | 00:14:00
Module 03: Vectorizing | 00:08:00
Module 04: Vectorizing and Cosine Similarity | 00:10:00
Module 05: Topic Modeling in Detail | 00:16:00
Module 06: Making Your Vectors Better Reflect the Meaning, or Topic, of the Document | 00:10:00
Module 07: Sklearn in Brief | 00:03:00
Module 08: Summary | 00:02:00
Unit 06: Find and Represent the Meaning or Topic of Natural Language Text
Module 01: Keyword Search vs Semantic Search | 00:04:00
Module 02: Problems in TF-IDF That Lead to Semantic Search | 00:10:00
Module 03: Transform TF-IDF Vectors to Topic Vectors Under the Hood | 00:11:00
Assignment
Assignment – U&P AI – Natural Language Processing (NLP) with Python | 00:00:00