In an interview with The Verge, Mark Zuckerberg, CEO of Meta, acknowledged that artificial general intelligence (AGI) is the direction Meta’s research is heading, falling in line with the stated goals of OpenAI and Google. Meta recently restructured its AI teams, folding its responsible AI research team into its generative AI department.
As for what artificial general intelligence is, Zuckerberg can’t really define it: “I don’t have a one-sentence, pithy definition. You can quibble about if general intelligence is akin to human-level intelligence, or is it like human-plus, or is it some far-future super intelligence. But to me, the important part is actually the breadth of it, which is that intelligence has all these different capabilities where you have to be able to reason and have intuition.”
See? Crystal clear. A simpler working definition: AGI refers to machines that can learn and reason across a broad range of domains at or beyond the level of the human mind.
Just in case you don't want to click over to the other sites, Big Zuck update
– Open sourcing will continue
– Currently training LLama 3
– AI + Metaverse
– Will have 350,000 H100s and ~600k H100 equivalents of compute 🤯
– Ideal AI form factor is 🕶️ pic.twitter.com/xJSi7yVzXe
— Alex Volkov (Thursd/AI) (@altryne) January 18, 2024
In the interview, Zuckerberg acknowledges talent as one of the key limiting factors in AI research. “We’ve come to this view that, in order to build the products that we want to build, we need to build for general intelligence. I think that’s important to convey because a lot of the best researchers want to work on the more ambitious problems.”
“We’re used to there being pretty intense talent wars,” he says. “But there are different dynamics here with multiple companies going for the same profile, [and] a lot of VCs and folks throwing money at different projects, making it easy for people to start different things externally.”
One thing Zuckerberg isn’t worried about losing out on, though, is computing power. AI development and research demand an exceptionally high level of compute, and Meta is prepared to meet the challenge with over 340,000 H100 GPUs from Nvidia, which has emerged as a leader in AI chips.
“We have built up the capacity to do this at a scale that may be larger than any other individual company,” said Zuckerberg.
As Meta sets out to develop Llama 3, it hopes to continue its trend of what Zuckerberg calls “responsible open sourcing.” He acknowledges that Llama 2 was not a leading AI model but wants Llama 3 to be: “Our ambition is to build things that are at the state of the art and eventually the leading models in the industry.”
Featured image credit: Julio Lopez/Pexels