• Quatity_Control@lemm.ee
    1 year ago

    Keep in mind ChatGPT is a language model. It’s designed specifically to simulate sounding like a human. It does that… Okay. It doesn’t understand the information or concepts it is using. It just sounds like it does. It can’t reliably do basic maths and doesn’t try or need to. It just needs to talk about it in a believably conversational way.

    The brain does far more than process information. And ChatGPT doesn’t even really do that.

    • lloram239@feddit.de
      1 year ago

      Okay. It doesn’t understand the information or concepts it is using.

      That’s just utter nonsense. ChatGPT, by every definition of the word, very much understands a lot of what it is talking about. People complaining about ChatGPT not “understanding” seem to have a hard time grasping how insanely difficult it is to produce natural-language answers, and how much of the context you need to understand to do so successfully.

      It can’t reliably do basic maths

      Neither can many humans, but my $5 calculator is great at it. There are without a doubt a lot of things that ChatGPT can’t do, sometimes fundamentally so, like math. It can’t do loops, and it doesn’t even get to see the digits of the numbers it should calculate with, so it’s not a terribly big surprise that it can’t do math very well. The English language, and a whole bunch of others, on the other hand, it understands surprisingly well.
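      The point about digits is worth spelling out. Here is a toy sketch of why a language model may never see individual digits: the vocabulary below is invented for illustration (real models use learned byte-pair encodings with tens of thousands of entries), but the greedy longest-match behavior is representative.

```python
# Toy illustration: a greedy longest-match tokenizer with an invented
# vocabulary, mimicking how byte-pair encodings can split a number into
# opaque multi-digit chunks instead of single digits.
VOCAB = {"123", "45", "12", "34", "1", "2", "3", "4", "5", "+", "="}

def tokenize(text: str) -> list[str]:
    """Greedily match the longest vocabulary entry at each position."""
    tokens = []
    i = 0
    while i < len(text):
        for length in range(len(text) - i, 0, -1):
            piece = text[i:i + length]
            if piece in VOCAB:
                tokens.append(piece)
                i += length
                break
        else:
            raise ValueError(f"no token for {text[i]!r}")
    return tokens

print(tokenize("12345"))  # ['123', '45'] — the model sees two chunks, not five digits
```

      To a model working on those tokens, "12345" is two arbitrary symbols, which makes digit-by-digit arithmetic awkward in a way it isn't for a calculator.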

      Basically, if you want to complain about ChatGPT, complain about things it actually gets wrong; saying “it doesn’t understand” just makes you sound like a parrot, and not even a clever one.

      • Quatity_Control@lemm.ee
        1 year ago

        While it’s humorous how personally you are taking critiques of ChatGPT, it is unfortunate that you are also demonstrating a fundamental lack of basic understanding of how ChatGPT works. Because of that, you have inflated what you believe ChatGPT is doing.

        Even when it gets basic maths wrong repeatedly. Because I can tell it 2+2=5 and it will agree with me. Conversationally. Since it has no concept of what 2+2=5 means.

        Even though it has no memory of previous conversations, you believe it somehow retains understanding of concepts it discusses.

        Even though it searches the internet for the knowledge to answer questions, which is why it can cite sources that don’t exist or don’t support its claims, clearly demonstrating a fundamental lack of understanding of the concept in question, or even of the concept of citing sources.

        Even though it was literally trained by humans telling it which three of the five answers it gave to every calibration question were the most correct conversational responses, you still believe it actually possesses intelligence above that of any human, who can have a conversation without making any of these mistakes.

        I clearly rate ChatGPT’s “intelligence” as remarkably low, even non-existent. I must also concede that in this situation it is smarter than at least one human I am aware of.