we love google (and LLMs)

A further fact check showed that THE IMAGE HAS BEEN EDITED

No man, what you’re saying is fundamentally philosophical. You didn’t say anything about the Chinese room or epistemology, but those are the things you’re implicitly talking about.

You might as well say humans are fancy predictive muscle movement. Sight, sound and touch come in, movement comes out, tuned by natural selection. You’d have about as much of a scientific leg to stand on. I mean, it’s not wrong, but it is one opinion on the nature of knowledge and consciousness among many.

I didn’t bring up the Chinese room because it doesn’t matter.

We know how ChatGPT works on the inside. It’s not a Chinese room. Attributing intent or understanding to it is anthropomorphizing a machine.

You can make a basic robot that turns its wheels on when a light sensor detects a certain amount of light. The robot will look like it flees when you shine a light at it, but it has no capacity to know what light is or why it should flee from it. Its behavior will be nearly identical to a cockroach’s, yet it has no reason for acting like a cockroach.
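For illustration, the robot’s entire “behaviour” fits in a few lines. This is a made-up sketch (the threshold, sensor values, and function names are invented for the example), but it shows why the “fleeing” is just one hard-coded rule:

```python
# Minimal sketch of the hypothetical light-fleeing robot described above.
LIGHT_THRESHOLD = 600  # arbitrary sensor value chosen for this example

def drive_wheels(forward: bool) -> None:
    """Stand-in for motor control; a real robot would drive actual motors."""
    print("wheels spinning" if forward else "wheels stopped")

def control_loop(sensor_readings) -> None:
    for reading in sensor_readings:
        # The robot has no concept of "light" or "danger" -- it only
        # compares a number against a fixed threshold.
        drive_wheels(forward=(reading > LIGHT_THRESHOLD))

# Simulated sensor values: shining a torch at the robot pushes readings up.
control_loop([120, 150, 800, 950, 300])
```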

A cockroach can adapt its behavior based on its environment; the hypothetical robot cannot.

ChatGPT is much like this robot: it has no capacity to adapt in real time or to learn.
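To make the analogy concrete, here’s a toy next-token predictor built from a hard-coded probability table. It is not how ChatGPT is actually implemented (real LLMs use transformer networks), just a sketch of the point being argued: the “model” is frozen, and chatting with it never updates it.

```python
import random

# Toy "frozen model": a fixed bigram table standing in for trained weights.
# Invented for illustration; the key property is that it never changes.
FROZEN_MODEL = {
    "the": {"cat": 0.5, "dog": 0.5},
    "cat": {"sat": 0.7, "ran": 0.3},
    "dog": {"ran": 0.6, "sat": 0.4},
    "sat": {"down": 1.0},
    "ran": {"away": 1.0},
}

def next_token(prev: str) -> str:
    """Sample the next word from the fixed table -- no learning happens here."""
    choices = FROZEN_MODEL.get(prev, {"<end>": 1.0})
    words, probs = zip(*choices.items())
    return random.choices(words, weights=probs)[0]

def generate(prompt: str, max_len: int = 5) -> str:
    tokens = prompt.split()
    for _ in range(max_len):
        tok = next_token(tokens[-1])
        if tok == "<end>":
            break
        tokens.append(tok)
    # Nothing in FROZEN_MODEL was modified: it behaves the same way
    # tomorrow as it does today, no matter what you say to it.
    return " ".join(tokens)

print(generate("the"))
```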
