• jol@discuss.tchncs.de
    3 days ago

    I know, and I accept that. You can’t just tell an LLM not to hallucinate. I also wouldn’t trust that trust score at all. If there’s one thing LLMs are worse at than accuracy, it’s math.