Personally seen this behavior a few times in real life, often with worrying implications. Generously I’d like to believe these people use extruded text as a place to start thinking from, but in practice it seems to me that they tend to use extruded text as a thought-terminating behavior.
IRL, I find it kind of insulting, especially if I’m talking to people who should know better or if they hand me extruded stuff instead of the work they were supposed to do.
Online it’s just sort of harmless reply-guy stuff usually.
Many people simply straight-up believe LLMs to be genie-like figures, as they are advertised and written about in the “tech” rags. That bums me out sort of in the same way really uncritical religiosity bums me out.
HBU?
Ffs, I had one of those at work.
One day, we bought a new water sampler. The thing is pretty complex and requires a licensed technician from the manufacturer to come and commission it.
Since I was overseeing the installation and would later be the person responsible for connecting it to our industrial network, I had quite a few questions about the device, some of them very specific.
I swear the guy couldn’t give me even the most basic answers about the device without asking ChatGPT. And at a certain point, I had to answer one question myself by reading the manual (which I downloaded on the spot, because the guy didn’t have a paper copy of it), because ChatGPT couldn’t give him an answer. This guy was someone the company making the water sampler had hired as an “expert”, mind you.
Assuming you were in meatspace with this person, I am curious, did they like… open GPT mid-convo with you to ask it? Or say “brb”?