• jws_shadotak@sh.itjust.works
    11 months ago

    ChatGPT consistently makes up shit. It’s difficult to tell when something is made up because, as a language model, it’s built to sound confident, like a person stating a fact they actually know.

    It knows how to talk like a subject matter expert because that’s what gets publicized most, and thus that’s what it’s trained on, but it doesn’t always know the facts needed to answer a question. It makes shit up to fill the gap and then presents it articulately, but it’s wrong.