3 important updates from last week: 1. @AIatMeta’s surprise Saturday release of the Llama 4 herd: It sparked initial hype, quickly followed by widespread criticism. While the mixture-of-experts (MoE) architecture allows for massive parameter counts, users report underwhelming https://t.co/N00mjZvwK3
— TuringPost (@TheTuringPost) Apr 9, 2025
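The tweet's point about MoE "allowing for massive parameter counts" rests on one idea: total parameters and per-token compute are decoupled, because a router activates only a few experts per token. Here is a minimal sketch of a top-k routed MoE feed-forward layer in PyTorch; it is a generic illustration, not Llama 4's actual implementation, and all class names, dimensions, and expert counts below are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    """Illustrative top-k mixture-of-experts feed-forward layer (not Llama 4's code)."""
    def __init__(self, d_model=64, d_ff=256, n_experts=8, k=2):
        super().__init__()
        self.k = k
        # Every expert contributes to the total parameter count,
        # but only k of them run for any given token.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )
        self.router = nn.Linear(d_model, n_experts)  # learned gating network

    def forward(self, x):  # x: (tokens, d_model)
        logits = self.router(x)
        weights, idx = logits.topk(self.k, dim=-1)   # choose k experts per token
        weights = F.softmax(weights, dim=-1)         # normalize the k gate scores
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e in range(len(self.experts)):
                mask = idx[:, slot] == e             # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot, None] * self.experts[e](x[mask])
        return out

moe = TopKMoE()
total_params = sum(p.numel() for p in moe.parameters())
y = moe(torch.randn(4, 64))  # only 2 of the 8 expert FFNs execute per token
print(total_params, y.shape)
```

With 8 experts and k=2, the layer stores roughly 4x the parameters it applies to each token, which is why MoE models can advertise very large total counts while keeping inference cost closer to a much smaller dense model.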