• tonarinokanasan@lemmy.sdf.org

    I think the trouble is: what baby are we throwing out with the bathwater in this case? We can’t prevent LLMs from hallucinating (though we can mitigate it somewhat with carefully constructed prompts, see the sketch below). So use cases where we’re okay with that are fair game, but any use case (or in this case, law?) that would require the LLM to never hallucinate isn’t attainable. And to get back to the earlier point, this particular problem has nothing to do with capitalism.
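
    By “carefully constructed prompts” I mean something like the sketch below: instruct the model to answer only from supplied context and to admit when it doesn’t know, rather than letting it free-associate. This is just an illustration, assuming the OpenAI Python client; the model name and helper are made up for the example, and none of this makes hallucination impossible, it only reduces it.

    ```python
    # Minimal sketch of a "grounded" prompt that nudges the model to admit
    # uncertainty instead of inventing answers. Assumes the OpenAI Python
    # client; the model name is illustrative.
    from openai import OpenAI

    client = OpenAI()

    SYSTEM_PROMPT = (
        "Answer only from the provided context. "
        "If the context does not contain the answer, reply exactly: 'I don't know.' "
        "Do not invent citations, case law, or statistics."
    )

    def ask(question: str, context: str) -> str:
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # illustrative model name
            temperature=0,        # lower temperature reduces, but does not eliminate, confabulation
            messages=[
                {"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
            ],
        )
        return response.choices[0].message.content
    ```

    Even with a prompt like that, the model can still make things up, which is exactly why use cases that demand zero hallucination aren’t a good fit.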