GPT4All: A new 7B LLM based on LLaMA. The model is trained on 800k completions from GPT-3.5-Turbo. They released:
• The code
• Training data
• Model checkpoints w/ LoRA weights
• 4-bit quantized weights for CPU inference
Now THAT is open-source. https://t.co/3YhsVDlDqQ
— Mark Tenenholtz (@marktenenholtz) Mar 28, 2023
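For context on what "checkpoints w/ LoRA weights" means in practice, here is a minimal sketch of loading a LLaMA-7B base model and applying a LoRA adapter with Hugging Face transformers and PEFT. The paths `path/to/llama-7b` and `path/to/gpt4all-lora` are placeholders, not the project's official repo IDs, and this is an illustrative pattern rather than GPT4All's documented workflow.

```python
# Sketch: apply a released LoRA adapter to a LLaMA-7B base model.
# Paths below are assumptions standing in for wherever you keep the weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "path/to/llama-7b"       # assumed: local LLaMA-7B base weights
lora_id = "path/to/gpt4all-lora"   # assumed: directory with the LoRA adapter

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.float16)
model = PeftModel.from_pretrained(base, lora_id)  # attach adapter without merging

prompt = "Explain LoRA fine-tuning in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The 4-bit quantized weights mentioned in the tweet serve a different path: they are meant for CPU-only inference with a lightweight local runner, trading some precision for a much smaller memory footprint than the float16 setup shown above.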