The misfires of AI — what about chatbot hallucinations?

Some of the first widely reported instances of chatbot “hallucination” surfaced in April 2023. The phenomenon occurs when a chatbot confidently generates information that simply isn’t true, and the problem is getting worse. Schools, universities, and businesses are trying to figure out how to fix the issue before it gets too big. In truth, it already has.

Many of those who experiment with OpenAI’s ChatGPT, Google’s Bard, and the like first recognized the issue when Ben Zimmer of The Wall Street Journal wrote about it. In Zimmer’s own words:

For instance, I asked Bard about “argumentative diphthongization,” a phrase that I just made up. Not only did it produce five paragraphs elucidating this spurious phenomenon, the chatbot told me the term was “first coined by the linguist Hans Jakobsen in 1922.” Needless to say, there has never been a prominent linguist named Hans Jakobsen (although a Danish gymnast with that name did compete in the 1920 Olympics).

AI researchers call this issue “hallucination.” Can machines actually become unhinged and deviate from reality? Apparently so. In an interview on CBS’s “60 Minutes,” Google CEO Sundar Pichai, who recognizes the problem of AI hallucination all too well, acknowledged that no one has been able to solve hallucination yet and that every AI model has the issue.

Here is the interview — listen closely.

The Interesting Subject of Hallucination in Neural Machine Translation

The link above is an open review of the very problem now plaguing the industry, written by several scholars working with Google AI in 2018. Why are we only hearing about it in the last few months, and why is it worsening?

CNN said it this way, “Before artificial intelligence can take over the world, it has to solve one problem. The bots are hallucinating. AI-powered tools like ChatGPT have mesmerized us with their ability to produce authoritative, human-sounding responses to seemingly any prompt.”

Yesterday, Wired stated, “Chatbot hallucinations are poisoning web search.” It really is difficult to tell the difference unless you already know the truth. The artificial intelligence delivers the information in the most confident tones, and the response seems true, until you search further and discover that it is not.

Are these AI hallucinations preventable? The worst chatbot liars are those that profess medical knowledge. Imagine a parent without medical training, like most of us. Picture this: late one night, you have a sick child, and you ask a chatbot whether you should give the kid a little Tylenol or take them to the emergency room. The bot instructs you erroneously, and your child is harmed. Most of us can spot the problems with this scenario.

PubMed, the U.S. government’s official biomedical database, has responded to chatbot scientific writing here. Even the government is a little concerned.

Let’s hope the chatbots get an overhaul soon.

Deanna was an editor at ReadWrite until early 2024. Previously she worked as Editor in Chief for Startup Grind, Editor in Chief for Calendar, and editor at Entrepreneur Media, and she has more than 20 years of experience in content management and content development.
