Gradient descent, how neural networks learn | Chapter 2, Deep Learning

Enjoy these videos? Consider sharing one or two.
Help fund future projects: https://www.patreon.com/3blue1brown
Special thanks to these supporters: http://3b1b.co/nn2-thanks
Written/interactive form of this series: https://www.3blue1brown.com/topics/neural-networks

This video was supported by Amplify Partners.
Founders of early-stage ML startups: Amplify Partners would love to hear from you at [email protected]

For more information, I highly recommend Michael Nielsen's book
http://neuralnetworksanddeeplearning.com/
The book walks through the code behind the example in these videos, which you can find here:
https://github.com/mnielsen/neural-networks-and-deep-learning
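To give a flavor of the technique the video and book cover, here is a minimal, illustrative sketch of gradient descent on a one-variable cost function (this is a toy example, not the book's network code; the function and learning rate are chosen for illustration):

```python
# Minimize the toy cost C(w) = (w - 3)^2 by repeatedly
# stepping against the gradient dC/dw = 2 * (w - 3).
def gradient_descent(grad, w=0.0, lr=0.1, steps=100):
    """Take `steps` gradient steps of size `lr` starting from `w`."""
    for _ in range(steps):
        w -= lr * grad(w)  # move downhill: opposite the gradient
    return w

w_min = gradient_descent(lambda w: 2 * (w - 3))
print(w_min)  # converges toward the minimum at w = 3
```

Training a neural network applies the same update rule, just with millions of weights at once and a gradient computed by backpropagation (the subject of the next chapter).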

MNIST database:
http://yann.lecun.com/exdb/mnist/

Also check out Chris Olah's blog:
http://colah.github.io/
His post on neural networks and topology is particularly nice, but honestly all the stuff there is great.

And if you like that, you'll *love* the publications at Distill:
https://distill.pub/

For more videos, Welch Labs also has some great series on machine learning:
https://youtu.be/i8D90DkCLhI
https://youtu.be/bxe2T-V8XRs

"But I have already voraciously consumed the works of Nielsen, Olah, and Welch!"

If you find this video helpful, please consider sharing it with friends and family.