The AI mashed together information that didn't belong together in that context and returned something incorrect. It was wrong, but it did not invent anything.
“Hallucinates”
AI doesn't do that either. That is another example of trying to make AI sound as if it has human-like reasoning and intent, instead of the pattern-matching weighted randomizer that it is.
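(For what "weighted randomizer" means concretely: an LLM scores candidate next tokens and then samples from that distribution, so a plausible-but-wrong continuation can win the draw. A minimal sketch, with made-up toy scores; the softmax-and-sample step is the standard decoding approach, everything else here is illustrative:)

```python
import math
import random

def sample_next_token(logits: dict[str, float], temperature: float = 1.0) -> str:
    """Pick the next token by weighted random choice over model scores."""
    # Softmax: turn raw scores into a probability distribution.
    scaled = {tok: s / temperature for tok, s in logits.items()}
    max_s = max(scaled.values())  # subtract the max for numerical stability
    weights = [math.exp(s - max_s) for s in scaled.values()]
    total = sum(weights)
    probs = [w / total for w in weights]
    # The "weighted randomizer": likely tokens win most often, but not always.
    return random.choices(list(scaled.keys()), weights=probs, k=1)[0]

# Hypothetical scores a model might assign after "The capital of Australia is"
logits = {"Canberra": 4.0, "Sydney": 3.2, "Melbourne": 1.5}
print(sample_next_token(logits))  # usually "Canberra", sometimes "Sydney"
```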
Sure, it's not hallucinating in the sense that a human does. But that is the industry term, and it has been in use in the study of neural networks since 1995:
https://en.wikipedia.org/wiki/Hallucination_(artificial_intelligence)