No, really, those are the magic words

  • ignirtoq@fedia.io · 30 points · 5 days ago

    Part of the reason that this jailbreak worked is that the Windows keys, a mix of Home, Pro, and Enterprise keys, had been trained into the model, Figueroa told The Register.

    Isn’t that the whole point? They’re using prompting tricks to tease out the training data. This has been done several times with copyrighted written works. That’s the only reasonable way ChatGPT could produce valid Windows keys. What would be the alternative? That ChatGPT somehow reverse-engineered the algorithm for generating valid Windows product keys?

    • SheeEttin@lemmy.zip · 9 points · 4 days ago

      The alternative would be that it generated a string of characters that looked like a key.

      It’s also possible that it generated a random key that was actually valid, though this is far less likely.
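
      To make that distinction concrete, here is a minimal Python sketch. It only checks whether a string matches the familiar 25-character product-key format; the character set is an assumption (the commonly documented Windows key alphabet), and passing the check says nothing about whether the key would actually activate.

      ```python
      import re

      # Retail Windows keys are five dash-separated groups of five characters
      # (XXXXX-XXXXX-XXXXX-XXXXX-XXXXX). The alphabet below is the commonly
      # documented Windows key alphabet, used here as an assumption for
      # illustration, not an official specification.
      KEY_ALPHABET = "BCDFGHJKMPQRTVWXY2346789"
      KEY_FORMAT = re.compile(rf"^(?:[{KEY_ALPHABET}]{{5}}-){{4}}[{KEY_ALPHABET}]{{5}}$")

      def looks_like_key(candidate: str) -> bool:
          """True if the string merely matches the product-key *format*.

          This says nothing about validity: only Microsoft's activation
          checks can confirm a key is genuine, which is why a model that
          emits format-correct strings is not necessarily leaking real keys.
          """
          return bool(KEY_FORMAT.match(candidate.strip().upper()))

      print(looks_like_key("BCDFG-HJKMP-QRTVW-XY234-67892"))  # True  (format only)
      print(looks_like_key("ABCDE-12345-ABCDE-12345-ABCDE"))  # False (wrong alphabet)
      ```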