TIL: you only need 4x H100 80GB GPUs to fine-tune @AIatMeta Llama 3.1 405B using Q-LoRA with packing and a sequence length of 2048. https://t.co/isXeL3VOTB
— Philipp Schmid (@_philschmid) Aug 22, 2024
from Twitter https://twitter.com/_philschmid
August 22, 2024 at 07:31PM
via IFTTT
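
For context, below is a minimal sketch of what such a Q-LoRA fine-tuning setup could look like with the Hugging Face stack (transformers, peft, trl, using their APIs as of mid-2024). The dataset, LoRA rank, learning rate, and other hyperparameters are illustrative placeholders, not the exact recipe behind the tweet; fitting the 405B model on 4x H100 80GB additionally relies on FSDP sharding via an accelerate config, assumed here to live in a hypothetical fsdp_qlora.yaml.

import torch
from datasets import load_dataset
from peft import LoraConfig
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from trl import SFTConfig, SFTTrainer

model_id = "meta-llama/Meta-Llama-3.1-405B"  # gated repo, requires approved access

# 4-bit NF4 quantization (the "Q" in Q-LoRA); bf16 compute and bf16 quant storage
# so the frozen quantized weights can be sharded across GPUs with FSDP.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_quant_storage=torch.bfloat16,
)

# Only the low-rank adapters on the attention/MLP projections are trained;
# the 405B base weights stay frozen in 4-bit. Rank and alpha are placeholders.
peft_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    torch_dtype=torch.bfloat16,
    use_cache=False,  # KV cache is unused during training with checkpointing
)

# Placeholder instruction dataset with a plain "text" column; swap in your own data.
dataset = load_dataset("timdettmers/openassistant-guanaco", split="train")

args = SFTConfig(
    output_dir="llama-3.1-405b-qlora",
    dataset_text_field="text",
    max_seq_length=2048,   # sequence length from the tweet
    packing=True,          # pack short samples into full 2048-token blocks
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,
    gradient_checkpointing=True,
    gradient_checkpointing_kwargs={"use_reentrant": False},
    learning_rate=2e-4,
    num_train_epochs=1,
    bf16=True,
    logging_steps=10,
)

trainer = SFTTrainer(
    model=model,
    args=args,
    train_dataset=dataset,
    peft_config=peft_config,
    tokenizer=tokenizer,
)
trainer.train()

To actually spread the quantized weights over the four GPUs, the script would be launched with something like "accelerate launch --config_file fsdp_qlora.yaml train.py" rather than plain python, so FSDP shards the model across devices during training.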