Personally, I've seen this behavior a few times in real life, often with worrying implications. Generously, I'd like to believe these people use extruded text as a place to start thinking from, but in practice it seems to me that they tend to use extruded text as a thought-terminating behavior.

IRL, I find it kind of insulting, especially if I'm talking to people who should know better, or if they hand me extruded stuff instead of the work they were supposed to do.

Online it's usually just harmless reply-guy stuff.

Many people simply straight-up believe LLMs to be genie-like figures, as they are advertised and written about in the "tech" rags. That bums me out, sort of in the same way really uncritical religiosity bums me out.

HBU?

  • stabby_cicada@slrpnk.net · 14 days ago

    The thing is, I think, anybody who wants an answer from ChatGPT can ask ChatGPT themselves, or just Google it and get the AI answer there. People ask questions on social media because they want answers from real people.

    Replying to a Lemmy post with a ChatGPT generated answer is like replying with a link to a Google search page. It implies the question isn’t worth discussing - that it’s so simple the user should have asked ChatGPT instead of posting it. I agree with the OP - it’s frankly a little insulting.