End-to-end: automated hyperparameter tuning for deep neural networks

In this video, I show you how to automatically perform #HyperparameterOptimization for a #NeuralNetwork using Optuna. This is an end-to-end walkthrough: I pick a problem, design a neural network in #PyTorch, and then use Optuna to find the optimal number of layers, dropout, learning rate, and other hyperparameters.
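To give a sense of what the tuning step looks like, here is a minimal, self-contained sketch of an Optuna objective that searches over network depth, hidden sizes, dropout, and learning rate for a PyTorch model. The parameter names, search ranges, feature/target counts, and the random stand-in data are my assumptions for illustration, not the exact code from the video:

```python
import optuna
import torch
import torch.nn as nn

# Assumed shapes for the MoA tabular data (875 features, 206 scored targets).
NUM_FEATURES, NUM_TARGETS = 875, 206


def build_model(trial):
    # Let Optuna suggest the depth, width, and dropout of the network.
    num_layers = trial.suggest_int("num_layers", 1, 5)
    dropout = trial.suggest_float("dropout", 0.1, 0.5)
    layers, in_size = [], NUM_FEATURES
    for i in range(num_layers):
        hidden = trial.suggest_int(f"hidden_size_{i}", 16, 1024)
        layers += [
            nn.Linear(in_size, hidden),
            nn.BatchNorm1d(hidden),
            nn.Dropout(dropout),
            nn.ReLU(),
        ]
        in_size = hidden
    layers.append(nn.Linear(in_size, NUM_TARGETS))
    return nn.Sequential(*layers)


def objective(trial):
    # Random tensors stand in for the real cross-validation folds.
    x = torch.randn(256, NUM_FEATURES)
    y = torch.randint(0, 2, (256, NUM_TARGETS)).float()

    model = build_model(trial)
    lr = trial.suggest_float("learning_rate", 1e-5, 1e-2, log=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()  # multi-label targets

    model.train()
    for _ in range(5):  # a few steps; the real loop trains over all folds
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
    return loss.item()  # the value Optuna minimizes


study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=20)
print(study.best_params)
```

The real objective trains over the cross-validation folds and returns the mean validation loss; random tensors and a handful of optimizer steps stand in here so the sketch runs on its own.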

The dataset used in this video can be found here: https://www.kaggle.com/c/lish-moa
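The video builds a small Dataset class for this tabular data. Here is a minimal sketch, assuming the features and targets have already been loaded as NumPy arrays; the class name and the dict-style return are my assumptions, not necessarily the exact code on screen:

```python
# A sketch of a tabular dataset wrapper; "MoaDataset" and the dict keys
# "x"/"y" are illustrative names.
import torch


class MoaDataset(torch.utils.data.Dataset):
    def __init__(self, features, targets):
        self.features = features  # numpy array, shape (n_samples, n_features)
        self.targets = targets    # numpy array, shape (n_samples, n_targets)

    def __len__(self):
        return self.features.shape[0]

    def __getitem__(self, idx):
        # One row of features with its multi-label targets, as float tensors.
        return {
            "x": torch.tensor(self.features[idx, :], dtype=torch.float),
            "y": torch.tensor(self.targets[idx, :], dtype=torch.float),
        }
```

A torch.utils.data.DataLoader can then batch these dictionaries directly during training.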

Subscribe and like the video so I stay motivated to make great videos like this. 🙂

00:00 Introduction
01:56 Dataset class
06:17 Start with train.py
08:19 Cross-validation folds (see the sketch after this chapter list)
13:38 Read data
24:10 Engine
29:48 Model
35:10 Add model and engine to training
43:05 Optuna
49:02 Start tuning with Optuna
52:50 Training, suggestions and outro
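For the "Cross-validation folds" chapter, the usual pattern is to shuffle the training data and add a kfold column up front. A minimal sketch, assuming the MoA targets file name and a plain KFold split; since the MoA targets are multi-label, a stratified multi-label splitter such as MultilabelStratifiedKFold from the iterative-stratification package is a common alternative:

```python
# A sketch of fold creation; the file names and the plain KFold splitter
# are assumptions, not necessarily what the video uses.
import pandas as pd
from sklearn import model_selection

df = pd.read_csv("train_targets_scored.csv")  # assumed MoA targets file
df["kfold"] = -1
df = df.sample(frac=1, random_state=42).reset_index(drop=True)  # shuffle rows

kf = model_selection.KFold(n_splits=5)
for fold, (_, valid_idx) in enumerate(kf.split(X=df)):
    df.loc[valid_idx, "kfold"] = fold  # mark each row's validation fold

df.to_csv("train_folds.csv", index=False)
```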

To purchase my book Approaching (Almost) Any Machine Learning problem, visit: https://bit.ly/buyaaml

Follow me on:
Twitter: https://twitter.com/abhi1thakur
LinkedIn: https://www.linkedin.com/in/abhi1thakur/
Kaggle: https://kaggle.com/abhishek

If you find this video helpful, please connect with me and share it with your friends and family.