A nice place to discuss rumors, happenings, innovations, and challenges in the technology sphere. We also welcome discussions on the intersections of technology and society. If it’s technological news or discussion of technology, it probably belongs here.
Remember the overriding ethos on Beehaw: Be(e) Nice. Each user you encounter here is a person, and should be treated with kindness (even if they’re wrong, or use a Linux distro you don’t like). Personal attacks will not be tolerated.
It’s not conscious or self-aware. It’s just putting words together that don’t necessarily have any meaning. It can simulate language, but meaning is a lot more complex than putting the right words in the right places.
I’d also be VERY surprised if it isn’t harvesting people’s data in the exact way you’ve described.
That’s correct: its whole experience is limited to a text prompt of roughly 2,000 words (which includes your questions as well as its previous answers). Everything else is a static model with a bit of randomness sprinkled in so it doesn’t just repeat itself. It doesn’t learn. It doesn’t have long-term memory. Every new conversation starts from scratch (there’s a rough sketch of what that looks like below).
User data might be used to fine-tune future models, but it has no relevance for the current one.
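To make that concrete, here’s a minimal sketch of what a “conversation” with a stateless model looks like from the client side. `query_model` is a hypothetical stand-in for whatever chat API is actually being called, and `MAX_CONTEXT_CHARS` is a made-up placeholder for the real token limit; the only point is that the entire transcript has to be re-sent on every turn, because the model itself keeps nothing between requests.

```python
# Sketch only: query_model() and MAX_CONTEXT_CHARS are placeholders,
# not a real API. The model keeps no state between calls; the "memory"
# is just the transcript the client chooses to resend each time.

MAX_CONTEXT_CHARS = 8000  # stand-in for the real context/token limit


def query_model(prompt: str) -> str:
    """Hypothetical stand-in for a real chat-completion API call."""
    return "(model reply would go here)"


def chat() -> None:
    history: list[str] = []
    while True:
        user_msg = input("You: ")
        history.append(f"User: {user_msg}")

        # The model only ever sees this single text blob. If it grows past
        # the context limit, the oldest turns simply fall out of view; the
        # model doesn't "forget" them, it just never sees them again.
        prompt = "\n".join(history)[-MAX_CONTEXT_CHARS:]

        answer = query_model(prompt)
        history.append(f"Assistant: {answer}")
        print("Bot:", answer)


if __name__ == "__main__":
    chat()
```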
This is just wrong, despite being frequently parroted. It obviously understands a lot; having even a short conversation with it should make that very clear. You can’t generate language without understanding the meaning; people have tried before and never got very far. The only problem is that its understanding is limited to language: it doesn’t know how language relates to other sensory inputs (GPT-4 has a bit of image handling built in, but that’s still a work in progress). So don’t ask it to draw pictures or graphs; the results won’t be any good.
That said, it’s surprising how much knowledge it can extract just from text alone.
You don’t need to be surprised; their ToS states pretty plainly that anything you write to ChatGPT will be used to train it.
Nothing you write in that chat is private.