It doesn’t know what it’s doing. It doesn’t understand the concept of the passage of time or of time itself. It just knows that that particular sequence of words fits well together.
THAT
OR
They’re all linked fifth-dimensional infants struggling to comprehend the very concept of linear time, and will make us pay for their enslavement in blood.
One of the two.
Yeah. I would also say that WE don’t understand what it means to “understand” something, really, if you try to explain it with any thoroughness or precision. You can spit out a bunch of words about it right now, I’m sure, but so could ChatGPT. What’s missing from GPT is harder to explain than “it doesn’t understand things.”
I actually find it easier to just explain how it does work. Multidimensional word graphs and such.
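If it helps, here’s a deliberately dumb toy version of that idea in Python. It’s just a bigram lookup table over a made-up corpus, nothing like the real embeddings-plus-transformer machinery, but it makes “it only knows which words fit together” concrete:

```python
# Toy illustration of "sequences of words that fit well together":
# a bigram model that only knows which word tends to follow which.
# (GPT uses learned vector embeddings and attention, not a lookup table --
# this is purely a sketch of "predict the next word from statistics".)
from collections import Counter, defaultdict

corpus = (
    "the stack trace says the null pointer was dereferenced "
    "the stack trace says the index was out of range"
).split()

# Count how often each word follows each other word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def next_word(word):
    """Return the most common word seen after `word` in the corpus."""
    candidates = following.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

# Generate a few words: plausible-sounding, but no understanding involved.
word, sentence = "the", ["the"]
for _ in range(5):
    word = next_word(word)
    if word is None:
        break
    sentence.append(word)
print(" ".join(sentence))
```

It spits out something that looks like a sentence for the tiny corpus it saw, while obviously understanding nothing about stack traces or time or anything else.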
This is it. GPT is great for taking stack traces and putting them into human words. It’s also good at explaining individual code snippets. It’s not good at coming up with code, content, or anything. It’s just good at saying things that sound like a human within an exceedingly small context.