• tabular@lemmy.world · 1 day ago

    “hallucination refers to the generation of plausible-sounding but factually incorrect or nonsensical information”

    Is an output a hallucination when the training data behind it included factually incorrect information? Suppose my input is “is the world flat” and an LLM, allegedly, accurately reproduces a flat-earther’s writings saying it is.