Wide Neural Networks of Any Depth Evolve as Linear Models Under Gradient Descent https://t.co/zmHhklDdit <--- this should blow your mind a bit!! Also holds for convolutional networks, batch norm, ... Also, closed form for test predictions resulting from gradient descent training. pic.twitter.com/sRlaODOynq
— Jascha (@jaschasd) February 19, 2019
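The "closed form for test predictions" claim can be sketched in plain NumPy: for a model that is linear in its parameters (which is what a sufficiently wide network becomes under gradient descent, per the paper), gradient-flow training on squared loss has an analytic solution in terms of the empirical tangent kernel. The random feature map, data sizes, and hyperparameters below are invented for illustration only; the paper itself works with genuinely wide networks rather than frozen random features.

```python
import numpy as np

# Hedged sketch of the "closed form for test predictions" idea:
# for a model that is LINEAR in its trainable parameters, gradient-flow
# training on squared loss can be solved analytically via the tangent kernel.
# Everything here (feature map, data, eta, t) is made up for illustration.

rng = np.random.default_rng(0)

# Frozen random feature map phi(x): the model f(x) = phi(x) @ w is
# linear in the trainable parameters w (a stand-in for a wide net).
D, P = 3, 200
W0 = rng.normal(size=(D, P)) / np.sqrt(D)
phi = lambda X: np.tanh(X @ W0)

X_train = rng.normal(size=(10, D))
y_train = rng.normal(size=(10,))
X_test = rng.normal(size=(4, D))

Phi_tr, Phi_te = phi(X_train), phi(X_test)

# Empirical tangent kernel of this model: Theta(x, x') = phi(x) . phi(x')
K_tt = Phi_tr @ Phi_tr.T   # train-train block
K_st = Phi_te @ Phi_tr.T   # test-train block

# Gradient flow dw/dt = -eta * grad, starting from w(0) = 0, gives at time t:
#   f_t(X_test) = K_st K_tt^{-1} (I - exp(-eta * t * K_tt)) y
eta, t = 0.1, 50.0
evals, evecs = np.linalg.eigh(K_tt)
decay = (evecs * (1.0 - np.exp(-eta * t * evals))) @ evecs.T
f_test_closed = K_st @ np.linalg.solve(K_tt, decay @ y_train)

# Sanity check: integrate the same gradient flow with small Euler steps
# and compare the resulting test predictions to the closed form.
w = np.zeros(P)
steps = 20000
dt = eta * t / steps
for _ in range(steps):
    w -= dt * (Phi_tr.T @ (Phi_tr @ w - y_train))
f_test_gd = Phi_te @ w

print(np.max(np.abs(f_test_closed - f_test_gd)))  # should be tiny
```

The point of the paper is that this linear-model solution also describes *actual* wide networks: the tangent kernel stays approximately frozen at its initialization value throughout training.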