Meta has made its last major AI announcement of the year, according to an Instagram post from CEO Mark Zuckerberg announcing the release of Llama 3.3, the company's final generative AI model before Llama 4.
The tech giant says Llama 3.3 70B delivers performance comparable to its much larger 405B model while being easier and more cost-efficient to run, making it potentially more accessible to creators and developers.
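For developers curious what "easier to run" looks like in practice, the sketch below shows one common way to load an instruction-tuned Llama model through the Hugging Face Transformers library. The repository name and hardware assumptions are illustrative, not confirmed by Meta's announcement; a 70B model still typically needs multiple high-memory GPUs (or quantization) even in half precision.

```python
# Minimal sketch: loading a Llama 3.3 70B instruct model via Hugging Face
# Transformers. The model ID below is an assumption; check Meta's official
# model card for the exact repository name and license terms.
import torch
from transformers import pipeline

MODEL_ID = "meta-llama/Llama-3.3-70B-Instruct"  # assumed repository name

generator = pipeline(
    "text-generation",
    model=MODEL_ID,
    torch_dtype=torch.bfloat16,  # half precision roughly halves memory use
    device_map="auto",           # spread the 70B weights across available GPUs
)

messages = [
    {"role": "system", "content": "You are a concise assistant."},
    {"role": "user", "content": "Summarize what changed in Llama 3.3."},
]

output = generator(messages, max_new_tokens=200)
# The pipeline returns the full chat history; the last message is the reply.
print(output[0]["generated_text"][-1]["content"])
```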
"By leveraging the latest advancements in post-training techniques including online preference optimization, this model improves core performance at a significantly lower cost," writes Meta's Vice President of Generative AI at Meta Ahmad Al-Dahle.
A benchmark chart Al-Dahle shared alongside his post on X shows Llama 3.3 70B outperforming competing generative AI models, including Google's Gemini 1.5 Pro, OpenAI's GPT-4o, and Amazon's newly released Nova Pro.
Meta says the new model is designed to deliver improvements in areas including math, general knowledge, instruction following, and app use.
Llama models have been downloaded over 650 million times, according to Zuckerberg. In addition, Meta's AI assistant, which is powered by Llama models, has almost 600 million monthly active users, making it a contender for the most-used AI assistant on the market.
Meta says it is building a massive AI data center in Louisiana, which it plans to use to train future versions of Llama, marking a significant commitment to the competitive AI space. During the company's Q4 earnings call, Zuckerberg said Meta will need 10x more computing power to train Llama 4, and the company has partnered with chipmaker Nvidia to meet that goal.