Activation functions (step, sigmoid, tanh, ReLU, leaky ReLU) are essential for building a nonlinear model for a given problem. In this video we cover the different activation functions used while building a neural network, along with their pros and cons:
1) Step
2) Sigmoid
3) tanh
4) ReLU (rectified linear unit)
5) Leaky ReLU
We will also write Python code to implement these functions and see how they behave for sample inputs.
Github link for code in this tutorial: github.com/codebasics/deep-le...
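As a quick preview, here is a minimal sketch of these five activation functions in plain Python (the code in the video and the linked GitHub repo may differ, e.g. by using NumPy; the 0.1 leak factor for leaky ReLU is an illustrative choice):

```python
import math

def step(x):
    # Binary threshold: outputs 1 for inputs >= 0, else 0 (not differentiable at 0)
    return 1 if x >= 0 else 0

def sigmoid(x):
    # Squashes input into (0, 1); gradients vanish for large |x|
    return 1 / (1 + math.exp(-x))

def tanh(x):
    # Squashes input into (-1, 1); zero-centered, unlike sigmoid
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))

def relu(x):
    # Passes positive values through unchanged, zeroes out negatives
    return max(0, x)

def leaky_relu(x, alpha=0.1):
    # Like ReLU, but allows a small slope for negative inputs to avoid "dead" neurons
    return x if x > 0 else alpha * x

# See how each function behaves on a few sample inputs
for x in (-2, 0, 3):
    print(x, step(x), round(sigmoid(x), 3), round(tanh(x), 3), relu(x), leaky_relu(x))
```

Note how ReLU outputs exactly 0 for any negative input, while leaky ReLU keeps a small negative value; this difference is what mitigates the dying-ReLU problem discussed in the video.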
Do you want to learn technology from me? Check codebasics.io/?... for my affordable video courses.
🔖 Hashtags 🔖
#activationfunction #activationfunctionneuralnetwork #neuralnetwork #deeplearning
Next video: • Derivatives | Deep Lea...
Previous video: • Neural Network For Han...
Deep learning playlist: • Deep Learning With Ten...
Machine learning playlist: kzfaq.info?list...
Prerequisites for this series:
1: Python tutorials (first 16 videos): kzfaq.info?list...
2: Pandas tutorials (first 8 videos): • Pandas Tutorial (Data ...
3: Machine learning playlist (first 16 videos): kzfaq.info?list...
🌎 My Website For Video Courses: codebasics.io/?...
Need help building software or data analytics and AI solutions? My company www.atliq.com/ can help. Click on the Contact button on that website.
#️⃣ Social Media #️⃣
🔗 Discord: / discord
📸 Dhaval's Personal Instagram: / dhavalsays
📸 Instagram: / codebasicshub
🔊 Facebook: / codebasicshub
📝 Linkedin (Personal): / dhavalsays
📝 Linkedin (Codebasics): / codebasics
📱 Twitter: / codebasicshub
🔗 Patreon: www.patreon.com/codebasics?fa...