• remi_pan@sh.itjust.works · 5 days ago

    If the jailbreak is about enabling the LLM to tell you how to make explosives or drugs, this seems pointless, because I would never trust an AI so prone to hallucinations (and basically bad at science) in such a dangerous process.