Thoughts Brewing Blog

BizTech Q&A 5: Hallucinations

Written by Damien Griffin | Nov 3, 2024 10:00:00 AM

Answering your questions about business and technology


Question

Someone told me to watch out for ChatGPT hallucinations. What does that mean, and what can I do about it?

~ Nancy

Answer

Hi Nancy,

Thank you for the question.

Sometimes ChatGPT (and other AI tools) will generate output that sounds true but is not. These confident-sounding but inaccurate responses are called hallucinations.

You can cross-check important information against reliable sources (websites, people, publications, etc.) to make sure it is accurate. If the information isn't that important (a random story, for example), you likely won't notice or care if there are inaccuracies.

You can also resubmit your prompt to see if you get a better response. This tends to work better if you point out the specific information that was incorrect.

Let us know if you have any other questions.

Damien