Falcon 180B is out 🤯
- 180B params
- Trained on 3.5 trillion tokens + 7 million GPU hours
- Quality on par with PaLM 2; outperforms Llama 2 and GPT-3.5 across 13 benchmarks
- 4-bit and 8-bit precision with similar quality

Demo: https://t.co/JCMYid7VYn
Blog: https://t.co/vb8ppRHgJ5
— Omar Sanseviero (@osanseviero) Sep 6, 2023
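The 8-bit/4-bit point is the practical one for most readers. Below is a minimal sketch (not from the announcement) of loading the model in 8-bit with transformers + bitsandbytes, assuming you have accepted the gated `tiiuae/falcon-180B` license on the Hugging Face Hub and have enough GPU memory to hold the quantized weights (~180 GB at 8-bit).

```python
# Minimal sketch: 8-bit loading of Falcon 180B with transformers + bitsandbytes.
# Assumes Hub access to the gated checkpoint and sufficient GPU memory.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-180B"  # gated repo; requires license acceptance

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    load_in_8bit=True,   # bitsandbytes 8-bit quantization
    device_map="auto",   # shard layers across the available GPUs
)

prompt = "Falcon 180B was trained on"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Swapping `load_in_8bit=True` for `load_in_4bit=True` gives the 4-bit variant mentioned in the tweet, roughly halving the memory footprint again.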