<div class="statcounter"><a title="Web Analytics" href="https://statcounter.com/" target="_blank"><img class="statcounter" src="https://c.statcounter.com/12795394/0/d64e9537/1/" alt="Web Analytics" referrerPolicy="no-referrer-when-downgrade">

BizTech Q&A 5: Hallucinations

BizTech Q&A

Answering your questions about business and technology

 

Question

Someone told me to watch out for ChatGPT hallucinations.  What does that mean and what can I do about it?

~ Nancy

Answer

Hi Nancy,

Thank you for the question.

Sometimes ChatGPT (and other AI tools) will generate output that sounds confident and plausible but is actually false or fabricated. These are called hallucinations.

You can cross-check important facts against reliable sources (reputable websites, publications, people who know the subject, etc.) to make sure they are accurate. If the output doesn't need to be factual (a made-up story, for example), then inaccuracies matter much less.

You can also resubmit your prompt and see if you get a better response. This tends to work best if you point out the specific information that was incorrect.
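If you happen to use ChatGPT through the OpenAI API rather than the website, the same idea applies: keep the conversation going and send a follow-up message that flags the error. Here's a minimal sketch of that pattern (the model name, the example question, and the wording of the correction are just placeholders, not specific recommendations):

```python
# Minimal sketch: re-prompting with a correction via the OpenAI Python SDK.
# Assumes the `openai` package is installed and OPENAI_API_KEY is set.
from openai import OpenAI

client = OpenAI()

messages = [
    {"role": "user", "content": "When was the Brooklyn Bridge completed?"},
]

first = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
answer = first.choices[0].message.content
print("First answer:", answer)

# If the answer looks wrong, don't start over -- keep the conversation and
# point out the problem, so the model has something concrete to correct.
messages.append({"role": "assistant", "content": answer})
messages.append({
    "role": "user",
    "content": "That date doesn't match what I'm seeing in other sources. "
               "Please double-check it and tell me where the date comes from.",
})

second = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print("Revised answer:", second.choices[0].message.content)
```

Even with a correction like this, it's still worth verifying the revised answer against an outside source before relying on it.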

Let us know if you have any other questions.

Damien


Send Us Your Questions!

Do you have a question about business or technology? Post it below in the "Comments" or submit it via email using the "Ask Us" button.
