
Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization

Description

In the second course of the Deep Learning Specialization, you will open the deep learning black box to understand the processes that drive performance and generate good results systematically.

By the end, you will know best practices for setting up train, development, and test sets and for analyzing bias/variance when building deep learning applications; be able to use standard neural network techniques such as initialization, L2 and dropout regularization, hyperparameter tuning, batch normalization, and gradient checking; implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check their convergence; and implement a neural network in TensorFlow.
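
To make the topics concrete, here is a minimal illustrative sketch (not taken from the course materials) that combines several of the techniques named above using TensorFlow's Keras API: He initialization, L2 and dropout regularization, batch normalization, mini-batch training, and the Adam optimizer. The toy data and all hyperparameter values are hypothetical, chosen only for demonstration.

    # Illustrative sketch only: a small binary classifier combining techniques
    # covered in the course (initialization, L2 + dropout regularization,
    # batch normalization, mini-batch gradient descent, Adam).
    import numpy as np
    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(
            64,
            activation="relu",
            kernel_initializer="he_normal",                      # He initialization
            kernel_regularizer=tf.keras.regularizers.l2(1e-4),   # L2 regularization
        ),
        tf.keras.layers.BatchNormalization(),                    # batch normalization
        tf.keras.layers.Dropout(0.5),                            # dropout regularization
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])

    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),  # Adam optimizer
        loss="binary_crossentropy",
        metrics=["accuracy"],
    )

    # Hypothetical toy data; batch_size sets the mini-batch size for
    # mini-batch gradient descent.
    X = np.random.rand(256, 20).astype("float32")
    y = np.random.randint(0, 2, size=(256, 1))
    model.fit(X, y, epochs=5, batch_size=32, verbose=0)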

This resource is offered by an affiliate partner. If you pay for training, we may earn a commission to support this site.

Career Relevance by Data Role

The techniques and tools covered in Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization are most similar to the requirements found in Data Scientist job advertisements.

Similarity Scores (Out of 100)

[Chart of similarity scores by data role not reproduced in this text version.]

Learning Sequence

Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization is part of one structured learning path.

Deep Learning (DeepLearning.AI, on Coursera): 5 courses, 5 months