A nice place to discuss rumors, happenings, innovations, and challenges in the technology sphere. We also welcome discussions on the intersections of technology and society. If it’s technological news or discussion of technology, it probably belongs here.
Remember the overriding ethos on Beehaw: Be(e) Nice. Each user you encounter here is a person, and should be treated with kindness (even if they’re wrong, or use a Linux distro you don’t like). Personal attacks will not be tolerated.
This community’s icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.
AI tools get shoved down your throat everywhere nowadays, whether you want them or not, and whether they’re useful or not.
Please don’t call them AI. They are “Large Language Models” (or “Spicy Autocorrect” if you want to be cheeky).
Copilot is no more “intelligent” than Clippy from Microsoft Office in 1997. It just appears to be to people who also have low intelligence.
The distinction is irrelevant, and “AI” is what businesses and normal folks call this stuff. Just like the age-old arguments that the media should say “cyber criminals” instead of “hackers”, or that the “cloud” is just other people’s computers. LLM, GNU/spicy-autocorrect, whatever. To the populace it’s all “AI”.
https://mastodon.social/@sdw/112203918268779518
Actually Indians.
People who don’t understand how LLMs work aren’t necessarily of low intelligence.
Don’t get ignorance and intelligence mixed up. People of low intelligence do that.
Ehhhh, if you have expertise in ANY field outside of like programming, you can easily test various models and see that they produce a lot of crap. That doesn’t require you to understand how LLMs work exactly.
It’s not just text-generating AI, like those transformer models, but also image classifiers and generators, time series predictors, and a bunch of other stuff.
But yes, even though you seem not to like it, it is AI.
I can’t share that experience.
That’s a bit condescending, don’t you think?