When I first heard the term "hallucinations" applied to AI way back, I thought, whoa! As I quickly learned what they were, it made sense: errors where an AI generates false or misleading information that appears plausible but isn't grounded in real data or facts.
Of course, my thoughts immediately turned to human hallucinations, particularly the drug-induced kind. Yes, those could be false or misleading too. But in most cases, in my experience, they were just freaking weird.
I’m glad that, between AI and me, it’s the one with the hallucinations.