Transformer’s attention mechanism can be linked to other cool ideas in AI
— hardmaru (@hardmaru) September 26, 2020
- Indirect Encoding in Neuroevolution: https://t.co/G740mhjBv4
- Hopfield Networks with continuous states: https://t.co/FL8PimjVo9
- Graph Neural Networks with multi-head attention: https://t.co/PACMnKT50F
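The Hopfield connection is perhaps the easiest to see concretely: the retrieval update of a Hopfield network with continuous states is the attention formula, with the current state as the query and the stored patterns acting as both keys and values. A minimal NumPy sketch (the patterns, the inverse temperature `beta`, and the starting state are illustrative choices, not from the linked work):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Stored patterns, one per row; beta is the inverse temperature.
X = np.array([[1.0, -1.0, 1.0],
              [-1.0, 1.0, 1.0]])
beta = 4.0

def hopfield_update(xi):
    # Continuous-state Hopfield update: xi_new = X^T softmax(beta * X xi).
    # Structurally this is attention with query = xi, keys = values = X.
    return X.T @ softmax(beta * (X @ xi))

xi = np.array([0.9, -0.8, 1.1])  # noisy query near the first stored pattern
for _ in range(3):
    xi = hopfield_update(xi)
# xi converges toward the first stored pattern [1, -1, 1]
```

With a large enough `beta`, the softmax concentrates on the best-matching pattern and a few updates retrieve it almost exactly, which is the "one-step retrieval" behavior the continuous-state formulation is known for.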