Joined 5 months ago
Cake day: December 22nd, 2024


  • It’s also just a language model. People have trouble internalising what this means, because it sounds smarter than it actually is.

    ChatGPT does not reason the way you think it does, even when the interface shows those little reasoning windows with the “thought process”.

    It’s still only predicting the most likely next word based on the words that came before. It can do that many times over and feed extra words back in to steer it one way or another, but that’s very different from understanding a topic and reasoning within it.

    So as you keep pushing the model to “learn” more and more, you start getting artifacts, because it isn’t actually learning these concepts; it’s just getting more data for inferring “what’s the most likely word X to follow words Z, Y and A?”
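
    A minimal sketch of that prediction idea, using a toy bigram model over a made-up corpus. Real LLMs use neural networks over subword tokens and condition on the whole context, not raw word counts, but the objective is the same: pick the statistically likely continuation, with no model of meaning behind it.

    ```python
    from collections import Counter, defaultdict

    # Toy training text (made up for illustration).
    corpus = "the cat sat on the mat the cat ate the fish".split()

    # Count which word follows which -- pure frequency statistics.
    following = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        following[prev][nxt] += 1

    def predict_next(word):
        # Return the most frequent continuation seen in training.
        # No understanding involved, just "what usually comes next".
        return following[word].most_common(1)[0][0]

    print(predict_next("the"))  # "cat" -- it followed "the" twice, vs once each for "mat"/"fish"
    ```

    Feeding each prediction back in as the new context is how longer outputs get generated, which is why more training data changes *which* word is likely rather than adding any real grasp of the topic.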