We've all dealt with activation functions while working with neural nets.
- Sigmoid
- Tanh
- ReLU & Leaky ReLU
- GELU
Ever wondered why they are so important❓🤔 Let me explain it to... 👇 https://t.co/Oqfd09QeIc
— Akshay 🚀 (@akshay_pachaar) Mar 1, 2024
from Twitter https://twitter.com/akshay_pachaar
March 01, 2024 at 12:51PM
via IFTTT
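For quick reference, here is a minimal NumPy sketch of the activation functions named in the tweet (it is not taken from the linked thread). The GELU uses the common tanh approximation, and the leaky-ReLU slope `alpha=0.01` is just a conventional default.

```python
import numpy as np

# Why activations matter: without a nonlinearity between layers, a stack of
# linear layers collapses into a single linear map.

def sigmoid(x):
    """Squashes inputs to (0, 1); classic choice for gates/probabilities."""
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    """Zero-centered squashing to (-1, 1)."""
    return np.tanh(x)

def relu(x):
    """max(0, x): cheap and sparse; a common default in modern nets."""
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    """Like ReLU but keeps a small slope for negative inputs ('dead unit' fix).
    alpha=0.01 is an assumed, conventional default."""
    return np.where(x > 0, x, alpha * x)

def gelu(x):
    """GELU via the widely used tanh approximation."""
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

if __name__ == "__main__":
    x = np.linspace(-3, 3, 7)
    for fn in (sigmoid, tanh, relu, leaky_relu, gelu):
        print(f"{fn.__name__:>10}: {np.round(fn(x), 3)}")
```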