Improving accuracy using Hyper-parameter tuning

In machine learning, hyperparameter optimization or tuning is the problem of choosing a set of optimal hyperparameters for a learning algorithm. A hyperparameter is a parameter whose value is set before training and controls the learning process. By contrast, the values of other parameters (such as the node weights of a neural network) are learned from the data during training.
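The linked notebook walks through this on the wine-quality dataset. As a rough sketch of the idea only (assuming scikit-learn's GridSearchCV and a RandomForestClassifier with a small illustrative grid, which may differ from the model and search space used in the video), tuning could look like this:

# Minimal sketch of hyper-parameter tuning with GridSearchCV.
# Assumptions: the red wine quality CSV linked below is saved locally, and the
# model/grid here are purely illustrative, not necessarily the video's choices.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# The original UCI file is semicolon-separated; change sep if your copy uses commas.
df = pd.read_csv("winequality-red.csv", sep=";")
X = df.drop(columns=["quality"])
y = df["quality"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Hyperparameters are fixed before training; GridSearchCV tries every
# combination in param_grid with 5-fold cross-validation and keeps the best.
param_grid = {
    "n_estimators": [100, 300],
    "max_depth": [None, 10, 20],
    "min_samples_split": [2, 5],
}
search = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid,
    cv=5,
    scoring="accuracy",
    n_jobs=-1,
)
search.fit(X_train, y_train)

print("Best hyperparameters:", search.best_params_)
print("Cross-validated accuracy:", search.best_score_)
print("Test accuracy:", search.score(X_test, y_test))

When the grid is too large to try every combination, RandomizedSearchCV is a drop-in alternative that samples a fixed number of settings instead.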

For collaboration, sponsors and projects: [email protected]
Keep supporting me so the channel can reach 25,000 subscribers.

Important links:
Datasets: https://github.com/akpythonyt/Datasets/blob/main/winequality-red.csv
Code:
https://github.com/akpythonyt/ML-algorithms/blob/main/Hyper%20parameter%20tuning.ipynb
Social contacts:
Ask your questions on Instagram: arun.codes
Telegram: https://t.me/s/akpython
Twitter: https://twitter.com/Ak_Python

Codes & Donations:

GitHub: https://github.com/akpythonyt
If you are impressed by my work, buy me a cup of coffee:
https://www.buymeacoffee.com/akpython

Music by:
cold. by Sakura Hz https://soundcloud.com/sakurahertz
Creative Commons — Attribution 3.0 Unported — CC BY 3.0
Free download/stream: http://bit.ly/chill-sakuraHz
Music promoted by Audio Library https://youtu.be/pF2tXC1pXNo

Thank you!

Disclaimer:
All videos, songs, images, and pictures used in this video are the property of their respective owners; neither I nor this channel claims any rights to them.

Copyright Disclaimer: Under Section 107 of the Copyright Act 1976, allowance is made for "fair use" for purposes such as criticism, comment, news reporting, teaching, scholarship, education, and research. Fair use is a use permitted by copyright statute that might otherwise be infringing.

If you find this video helpful, please share it with your friends and family.