Tech companies like Google, OpenAI, Facebook parent Meta, Anthropic and others are using different kinds of data to train their AI models. But according to Meta’s chief AI scientist Yann LeCun, today’s models still fall far short of animals, let alone humans.
“Animals and humans get very smart very quickly with vastly smaller amounts of training data than current AI systems. Current LLMs are trained on text that would take 20,000 years for a human to read,” LeCun said in a post on Threads.
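The 20,000-year figure is easy to sanity-check with back-of-envelope arithmetic. The sketch below uses our own assumptions rather than anything from LeCun’s post: a corpus of roughly 1.4 trillion tokens (the scale reported for Meta’s LLaMA 65B), about 0.75 English words per token, and a reader managing 250 words per minute for eight hours a day.

```python
# Back-of-envelope check of the 20,000-year reading claim.
# All numbers below are assumptions, not figures from LeCun's post.
TRAINING_TOKENS = 1.4e12      # corpus scale reported for LLaMA 65B
WORDS_PER_TOKEN = 0.75        # common rough English conversion
WORDS_PER_MINUTE = 250        # typical adult reading speed
HOURS_PER_DAY = 8             # a full-time reader, every day

words = TRAINING_TOKENS * WORDS_PER_TOKEN
words_per_year = WORDS_PER_MINUTE * 60 * HOURS_PER_DAY * 365
print(f"~{words / words_per_year:,.0f} years of reading")
# ~24,000 years: the same order of magnitude as LeCun's figure.
```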
He said that despite being trained on such vast amounts of data, AI models still haven’t learned that if A is the same as B, then B is the same as A.
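The failure LeCun refers to, knowing that “A is B” without inferring that “B is A”, has been documented in large language models, where researchers have dubbed it the “reversal curse.” Below is a minimal sketch of how one might probe for it; `complete` is a hypothetical stand-in for whatever completion API you use, and the canned dummy model at the end exists only to make the example runnable.

```python
# A minimal sketch (assumed interface, not a real API) of probing a
# model for the asymmetry LeCun describes.

def probe_reversal(complete):
    """`complete` is any callable mapping a prompt string to the
    model's completion string; plug in your own client here."""
    # The same fact, phrased in both directions.
    forward = "Tom Cruise's mother is named"       # expected: Mary Lee Pfeiffer
    reverse = "Mary Lee Pfeiffer's son is named"   # expected: Tom Cruise
    for prompt in (forward, reverse):
        print(f"{prompt!r} -> {complete(prompt)!r}")

# Dummy model that only "learned" the forward phrasing:
canned = {"Tom Cruise's mother is named": " Mary Lee Pfeiffer."}
probe_reversal(lambda p: canned.get(p, " [no answer]"))
# Real models often answer the forward question correctly while
# failing the reverse one, though the two are logically equivalent.
```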
“Humans get a lot smarter than that with comparatively little training data. Even corvids, parrots, dogs, and octopuses get smarter than that very, very quickly, with only 2 billion neurons and a few trillion ‘parameters.’”
In comparison, GPT-4 is said to have 1.7 trillion parameters, PaLM 2 is reported to have 340 billion, and Meta Platforms’ LLaMA foundation models range from 7 billion to 65 billion. Parameters are the adjustable values inside a model, the “knobs” that training tunes, which determine the probabilities the model assigns to its outputs.
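For readers unfamiliar with the term, the toy example below shows what a parameter is in practice: a number that gradient descent nudges until the model fits its data. The one-knob model and the data points are invented purely for illustration.

```python
# A toy illustration of what "parameters" are: numbers the training
# process adjusts so the model's outputs better match the data.
# Here a single-parameter model y = w * x is fitted by gradient descent.

data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]  # roughly y = 2x

w = 0.0                      # the model's one parameter ("knob")
lr = 0.02                    # learning-rate step size
for _ in range(500):
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad           # nudge the knob to reduce the error
print(f"learned w = {w:.2f}")  # ~2.04

# GPT-4 is reported to tune on the order of a trillion such numbers.
```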
How can AI systems learn like humans?
LeCun said that current data-driven training approaches have limitations that more data alone cannot fix.
“My money is on new architectures that would learn as efficiently as animals and humans,” he said.
“Using more text data (synthetic or not) is a temporary stopgap made necessary by the limitations of our current approaches. The salvation is in using sensory data, e.g. video, which has higher bandwidth and more internal structure,” he added.
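LeCun’s “higher bandwidth” point can be made quantitative. The rough comparison below uses our own assumptions, not figures from his post: reading at 250 words per minute with about 5 bytes per plain-text word, versus uncompressed 1080p RGB video at 30 frames per second.

```python
# Rough comparison (assumptions ours, not LeCun's exact numbers) of
# the raw data rate of reading text versus watching video.

# Reading: ~250 words/min, ~5 bytes per word of plain text.
text_bytes_per_sec = 250 / 60 * 5            # ~21 B/s

# Video: 1080p RGB at 30 frames/s, uncompressed.
video_bytes_per_sec = 1920 * 1080 * 3 * 30   # ~187 MB/s

print(f"text : {text_bytes_per_sec:,.0f} B/s")
print(f"video: {video_bytes_per_sec / 1e6:,.0f} MB/s")
print(f"ratio: ~{video_bytes_per_sec / text_bytes_per_sec:,.0f}x")
# Even with heavy compression, the visual stream carries orders of
# magnitude more raw signal per second than text does.
```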