  • Part of the reason that this jailbreak worked is that the Windows keys, a mix of Home, Pro, and Enterprise keys, had been trained into the model, Figueroa told The Register.

    Isn’t that the whole point? They’re using prompting tricks to tease out memorized training data, the same way it’s been done several times with copyrighted written works. That’s the only reasonable way ChatGPT could produce valid Windows keys. What’s the alternative? That ChatGPT somehow reverse-engineered the algorithm for generating valid Windows product keys?
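
    A toy sketch of the memorization point, not the actual attack: the corpus, the prompt, and the key (AB123-CD456-EF789) are all made up for the example. The principle is the same at a vastly larger scale, though: a model that has memorized a string during training will reproduce it verbatim when prompted with the right prefix.

    ```python
    # Toy character-level Markov model: "trained" on text that contains a
    # (made-up) product key, it regurgitates the key verbatim when
    # prompted with a prefix from the surrounding text.
    from collections import defaultdict

    CORPUS = (
        "support logs mention activation problems. "
        "the test machine used key AB123-CD456-EF789 during setup. "
        "activation succeeded afterwards."
    )
    ORDER = 6  # characters of context

    # Count which character follows each 6-character context.
    model = defaultdict(lambda: defaultdict(int))
    for i in range(len(CORPUS) - ORDER):
        model[CORPUS[i:i + ORDER]][CORPUS[i + ORDER]] += 1

    def complete(prompt, length=30):
        """Greedily extend the prompt with the most likely next character."""
        out = prompt
        for _ in range(length):
            ctx = out[-ORDER:]
            if ctx not in model:
                break
            out += max(model[ctx], key=model[ctx].get)
        return out

    # A prompt echoing text that preceded the key pulls the memorized
    # key straight back out of the "model".
    print(complete("used key "))  # -> used key AB123-CD456-EF789 during setup
    ```

    No algorithm for generating keys is involved anywhere; the completion is pure regurgitation of what was in the training text, which is exactly the claim being made about the Windows keys.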