

When we asked GPT-3, an extremely powerful and popular artificial intelligence language system, whether you would be more likely to use a paper map or a stone to fan life into the coals for a barbecue, it preferred the stone.
To smooth your wrinkled skirt, would you grab a warm thermos or a hairpin? GPT-3 suggested the hairpin.
And if you need to cover your hair to work at a fast food restaurant, which would work better, a paper sandwich wrapper or a hamburger bun? GPT-3 went for the bun.
Why does GPT-3 make these choices when most people would choose the alternative? Because GPT-3 does not understand language as humans do.
ChatGPT’s disembodied words
One of us is a psychology researcher who more than 20 years ago presented a series of scenarios like those above to test the understanding of a computer model of language from that era. The model did not accurately choose between using rocks and maps to fan coals, while humans did so easily.
The other of us is a doctoral student in cognitive science who was part of a team of researchers that more recently used the same scenarios to test GPT-3. Although GPT-3 was better than the older model, it was significantly worse than humans. It got all three of the scenarios above exactly wrong.
GPT-3, the engine that powered the initial version of ChatGPT, learns language by noting, from a trillion instances, which words tend to follow which other words. The strong statistical regularities in language sequences allow GPT-3 to learn a great deal about language, and this sequential knowledge often allows ChatGPT to produce reasonable sentences, essays, poems and computer code.
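To make the idea concrete at a miniature scale, here is a toy sketch in Python (our illustration, not GPT-3's actual architecture, training data or scale): it simply counts which words follow which other words in a short text and uses those counts to guess the next word.

```python
from collections import Counter, defaultdict

# Toy "training corpus" for illustration only.
corpus = "the cook fanned the coals with a map and wrapped the sandwich in paper".split()

# Count how often each word follows each other word.
follow_counts = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    follow_counts[current_word][next_word] += 1

def most_likely_next(word):
    # Predict the word most often seen after `word` in the training text.
    counts = follow_counts[word]
    return counts.most_common(1)[0][0] if counts else None

print(most_likely_next("the"))  # whichever word followed "the" most often
```

GPT-3's model is vastly more sophisticated than this word-pair tally, but the signal it learns from is the same kind of word-sequence statistics.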
Although GPT-3 is extremely good at learning the rules of what follows what in human language, it doesn't have the foggiest idea what any of those words mean to a human being. And how could it?
Humans are biological entities that evolved with bodies that need to operate in the physical and social worlds to get things done. Language is a tool that helps people do that. GPT-3 is an artificial software system that predicts the next word. It does not need to accomplish anything with those predictions in the real world.
I guess, so I’m not artificially smart
The meaning of a word or sentence is intimately related to the human body: people's abilities to act, to perceive and to have emotions. Human cognition is empowered by being embodied. People's understanding of a term like "paper sandwich wrapper," for example, includes the wrapper's look, its feel, its weight and, consequently, how we can use it: to wrap a sandwich. People's understanding also includes how someone can use it for the myriad other opportunities it affords, such as scrunching it into a ball for a game of hoops or covering one's hair.
All of these uses arise because of the nature and needs of human bodies: people have hands that can fold paper, a head of hair that is about the same size as a sandwich wrapper, and a need to be employed and therefore to follow rules like covering one's hair. That is, people understand how to make use of things in ways that are not captured in language-use statistics.
Your body shapes your mind.
GPT-3, its successor GPT-4 and their cousins Bard, Chinchilla and LLaMA do not have bodies, and so they cannot determine, on their own, which objects are foldable, or the many other properties that the psychologist J.J. Gibson called affordances. Given people's hands and arms, paper maps afford fanning a flame, and a thermos affords rolling out wrinkles.
Without arms and hands, let alone the need to wear unwrinkled clothes to a job, GPT-3 cannot determine these affordances. It can only fake them if it has run across something similar in the stream of words on the internet.
Will a large language model AI ever understand language the way humans do? In our view, not without having a humanlike body, senses, purposes and ways of life.
Toward an AI that senses the world
GPT-4 was trained on both images and text, which allows it to learn statistical relationships between words and pixels. While we can't run our original analysis on GPT-4 because it doesn't currently output the probability it assigns to words, when we asked GPT-4 the three questions, it answered them correctly. This could be due to the model's learning from previous inputs, or to its increased size and visual input.
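For readers curious what "the probability it assigns to words" looks like in practice, here is a rough sketch of that kind of analysis using the open-source Hugging Face transformers library and the small GPT-2 model; the library, model and prompt here are our own illustrative choices, not the exact setup used in the study.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# Small open model used purely for illustration; the study itself tested GPT-3.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

def completion_logprob(prompt, completion):
    """Sum of log-probabilities the model assigns to `completion` after `prompt`."""
    prompt_ids = tokenizer(prompt, return_tensors="pt").input_ids
    full_ids = tokenizer(prompt + completion, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(full_ids).logits
    log_probs = torch.log_softmax(logits, dim=-1)
    total = 0.0
    # Score only the completion tokens, each predicted from the preceding context.
    for pos in range(prompt_ids.shape[1], full_ids.shape[1]):
        token_id = full_ids[0, pos]
        total += log_probs[0, pos - 1, token_id].item()
    return total

prompt = "To fan life into the coals, he reached for the"
for option in [" paper map", " stone"]:
    print(option, completion_logprob(prompt, option))
```

A model that exposes these scores lets researchers check directly which of two options it considers more likely; GPT-4, as the article notes, does not currently provide them.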
However, you can still construct new examples to trip it up by thinking of objects with surprising affordances that the model has probably never encountered. For example, GPT-4 says that a cup with the bottom cut off would be better for holding water than a light bulb with the bottom cut off.
A model with access to images might be something like a child learning language, and the world, from television: it's easier than learning from radio alone, but humanlike understanding will require the crucial opportunity to interact with the world.
Recent research has taken this approach, training language models to generate physics simulations, interact with physical environments and even generate robotic action plans. Embodied language understanding may still be a long way off, but these kinds of multisensory interactive projects are crucial steps on the way there.
ChatGPT is a fantastic tool that will undoubtedly be used for good, and not-so-good, purposes. But don't be fooled into thinking that it understands the words it produces, let alone that it's sentient.
Arthur Glenberg, Professor Emeritus of Psychology, Arizona State University, and Cameron Robert Jones, PhD Student in Cognitive Science, University of California, San Diego
This article is republished from The Conversation under a Creative Commons license. Read the original article.