Ted Chiang has written one of the most profound articles I've read on explaining LLMs. He mentions that understanding and compression are two sides of the same coin. 🪙 And interestingly, when we're dealing with predicting words, lossy compression looks smarter than lossless… https://t.co/vpWVPdZcb3 https://t.co/Bm44f2xk3I
— Zain Hasan (@ZainHasan6) Dec 10, 2023
from Twitter https://twitter.com/ZainHasan6
December 10, 2023 at 07:37PM
via IFTTT
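To make the lossy-versus-lossless contrast concrete, here is a minimal Python sketch of my own (an illustration under my assumptions, not code from the tweet or from Chiang's essay): zlib stands in for lossless compression, which reproduces the text byte for byte, while a toy bigram model stands in for the kind of lossy, statistical compression an LLM performs, keeping only word-to-word tendencies and regenerating a plausible gist rather than an exact quote.

```python
# Illustrative sketch only (not from the tweet or Chiang's essay):
# lossless compression reproduces text exactly, while a lossy "compression"
# that keeps only next-word statistics regenerates an approximate gist.
import zlib
from collections import Counter, defaultdict

text = ("understanding and compression are two sides of the same coin "
        "because a model that truly understands text can regenerate it "
        "from far fewer bits than the text itself")

# Lossless: every byte is preserved; decompression is exact but "rote".
compressed = zlib.compress(text.encode())
assert zlib.decompress(compressed).decode() == text

# Lossy: keep only bigram counts (a toy stand-in for an LLM's learned
# statistics), then regenerate by always picking the most likely next word.
bigrams = defaultdict(Counter)
words = text.split()
for prev, nxt in zip(words, words[1:]):
    bigrams[prev][nxt] += 1

def regenerate(start: str, length: int = 15) -> str:
    """Greedily regenerate text from the bigram statistics alone."""
    out = [start]
    for _ in range(length - 1):
        options = bigrams.get(out[-1])
        if not options:
            break
        out.append(options.most_common(1)[0][0])
    return " ".join(out)

print(f"lossless: {len(compressed)} bytes, exact reconstruction")
print(f"lossy model output: {regenerate('understanding')}")
```

The lossless path can only hand back what it stored; the lossy path produces fluent-looking text it was never given verbatim, which is why, for word prediction, it can come across as the "smarter" of the two.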