When A.I. Hallucinates

By Leonard Zwelling

A story in the Business section of The New York Times (p. B6) on June 23 caught my eye. It’s hysterical, but scary.

A man was suing Avianca Airlines because a serving cart had injured his knee on a flight in August of 2019. The airline claimed that the statute of limitations had expired. The man’s attorneys responded with a list of more than six cases arguing that the presiding judge ought to let the case go to trial.

Avianca’s lawyers were “unable to locate most of the case law” cited by the plaintiff’s counsel in the brief. And with good reason.

It seems the 10-page brief by the plaintiff’s lawyers was generated by ChatGPT and was filled with made-up case law.

Obviously, the presiding judge ruled for the airline, especially given that the defendant’s original argument that the statute of limitations had expired was correct. The judge also made the plaintiff’s lawyers write to the judges cited in their hallucinatory brief. The judge said “its legal analysis is gibberish…the summary of the case’s procedural history is difficult to follow and borders on nonsensical.”

I guess the plaintiff’s lawyers not only cheated by using A.I. to generate their brief, but also failed to read what the computer program had spit out for them.

To me this is a warning, as is the plot of Class of ’09 on Hulu, about the FBI’s future use of A.I. to stop criminals BEFORE they commit crimes. It sounds like Minority Report on steroids.

The reason I worry about the common use of A.I. to generate work products of all kinds is that there is a presumption of accuracy that is not borne out by experiences like this one. I keep putting my name into ChatGPT, and it keeps getting my resume wrong, including where I went to school and where I trained. For goodness’ sake, everyone at my golf course knows I went to Duke. Why doesn’t ChatGPT?

The worry is obvious. Are we going to use A.I. to diagnose patients from their blood work, or are doctors still going to examine patients and, goodness knows, actually talk to them in the fifteen minutes the corporate leaders of medicine allow for a patient visit?

Will juries become things of the past? Just take the factual evidence gathered by the police and the counterarguments from the defense, and let the computer decide guilt or innocence.

And who is going to decide on your next prescription, your doctor or ChatGPT?

You get my drift.

A.I. may be fabulous technology, and it will, no doubt, aid humans in solving a host of otherwise insoluble dilemmas, BUT the human factor in critical decisions cannot be bypassed, as this legal case shows. Sure, the plaintiff’s attorneys should be ashamed. They should also be disbarred, but that’s up to the oversight mavens of the legal profession.

But if your doctor prescribes a drug recommended by ChatGPT and you have a fatal adverse reaction, who are your heirs going to sue? The doctor or the computer?

My kids will call me old-fashioned, but maybe not. They too are subject to the tyranny of A.I. when approaching large companies for assistance. I personally am tired of having to wade through a menu of bot-generated phone responses when it would take 15 seconds for a human to answer my question. But the bots cost less.

This story is a parable of what might happen in the future. ChatGPT is simply not ready for prime time. Humans are.

Dr. Zwelling’s new novel, Conflict of Interest: Money Drives Medicine and People Die, is available at:

barnesandnoble.com,

on Amazon if you search using the title and subtitle,

and directly from the publisher, Dorrance, at: https://bookstore.dorrancepublishing.com/conflict-of-interest-money-drives-medicine-and-people-die-pb/m

2 thoughts on “When A.I. Hallucinates”

  1. Remember the “A” in AI is “artificial,” with MANY meanings in various dictionaries and thesauruses: dummy, fake, false, fraudulent, forged, fictitious. Most things “artificial” are NOT as good as something genuine created by human intelligence and craft.
