Proud to announce: Mixtral 8x7B -- Mixtral of Experts
- Free to use under the Apache 2.0 license
- Outperforms Llama 2 70B with 6x faster inference
- Matches or outperforms GPT-3.5
- Masters English, French, Italian, German, and Spanish
- seq_len = 32K

https://t.co/mRMQYtxmx6 1/N https://t.co/XMK8sA1KLH https://t.co/BnLzE7fXzR
— Devendra Chaplot @ #NeurIPS2023 (@dchaplot) Dec 11, 2023
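The "8x7B / Mixtral of Experts" naming refers to a sparse mixture-of-experts architecture: each layer holds several expert feed-forward networks, and a learned router activates only the top-2 experts per token, which is why inference can be much faster than a dense 70B model. A minimal toy sketch of top-2 MoE routing (toy dimensions and NumPy throughout; this is illustrative, not Mixtral's actual implementation):

```python
import numpy as np

# Toy sparse mixture-of-experts layer: 8 expert weight matrices and a
# router that selects the top-2 experts per token. Dimensions are toy
# values, not Mixtral's real sizes.
rng = np.random.default_rng(0)
n_experts, d_model, n_tokens, top_k = 8, 16, 4, 2

W_router = rng.standard_normal((d_model, n_experts))
experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]

def moe_layer(x):
    """x: (n_tokens, d_model) -> (n_tokens, d_model)."""
    logits = x @ W_router                          # (n_tokens, n_experts)
    top = np.argsort(logits, axis=-1)[:, -top_k:]  # top-2 expert ids per token
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        # Softmax over only the selected experts' router logits.
        sel = logits[t, top[t]]
        w = np.exp(sel - sel.max())
        w /= w.sum()
        # Weighted sum of the chosen experts' outputs; the other
        # experts are never evaluated for this token.
        for weight, e in zip(w, top[t]):
            out[t] += weight * (x[t] @ experts[e])
    return out

x = rng.standard_normal((n_tokens, d_model))
y = moe_layer(x)
```

Because only 2 of the 8 experts run per token, the compute per token is a fraction of the total parameter count, which is the mechanism behind the faster-than-dense inference claim above.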
from Twitter https://twitter.com/dchaplot