6. It can be confidently wrong.
A chatbot picks words that sound right. Most of the time, sounding right and being right go together. But sometimes they do not.
This is called hallucination — the chatbot produces something that reads well and sounds certain but is partly or completely made up. It might invent a book that does not exist, cite a fake statistic, or mix up details.
The tricky part: a wrong answer looks exactly like a right one. Same confidence. Same polish. No warning label.
Picture two responses side by side, identical in look and tone. One is correct: accurate and verifiable. The other is made up: it cites a book that does not exist.
A chatbot sounds just as sure when it is wrong as when it is right.