Natural Language Processing with Probabilistic Models

Description

In Course 2 of the Natural Language Processing Specialization, you will:

a) Create a simple auto-correct algorithm using minimum edit distance and dynamic programming (a brief sketch of this technique follows the description),
b) Apply the Viterbi Algorithm for part-of-speech (POS) tagging, which is vital for computational linguistics,
c) Write a better auto-complete algorithm using an N-gram language model, and
d) Write your own Word2Vec model that uses a neural network to compute word embeddings using a continuous bag-of-words (CBOW) model.
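
As context for item (a), here is a minimal sketch of minimum edit distance computed with dynamic programming. It is illustrative only and not taken from the course materials; the function name min_edit_distance and the costs (1 for insert and delete, 2 for replace) are assumptions that follow a common convention.

def min_edit_distance(source: str, target: str,
                      ins_cost: int = 1, del_cost: int = 1, rep_cost: int = 2) -> int:
    """Return the minimum edit distance between source and target."""
    m, n = len(source), len(target)
    # D[i][j] = cost of transforming source[:i] into target[:j]
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        D[i][0] = D[i - 1][0] + del_cost          # delete every source character
    for j in range(1, n + 1):
        D[0][j] = D[0][j - 1] + ins_cost          # insert every target character
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = 0 if source[i - 1] == target[j - 1] else rep_cost
            D[i][j] = min(D[i - 1][j] + del_cost,     # delete
                          D[i][j - 1] + ins_cost,     # insert
                          D[i - 1][j - 1] + sub)      # match or replace
    return D[m][n]

if __name__ == "__main__":
    print(min_edit_distance("play", "stay"))  # 4: two replacements at cost 2 each

With these assumed costs, min_edit_distance("play", "stay") returns 4, since two characters must be replaced; an auto-correct system would rank candidate corrections by this distance.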


Career Relevance by Data Role

The techniques and tools covered in Natural Language Processing with Probabilistic Models are most similar to the requirements found in Data Scientist job advertisements.

[Chart: Similarity Scores (Out of 100) across data roles]

Learning Sequence

Natural Language Processing with Probabilistic Models is part of one structured learning path:

Natural Language Processing Specialization (Coursera, DeepLearning.AI): 4 courses, 4 months