Microsoft’s LinkedIn will update its User Agreement next month with a warning that it may show users generative AI content that’s inaccurate or misleading.
[…]
The relevant passage, which takes effect on November 20, 2024, reads:
Generative AI Features: By using the Services, you may interact with features we offer that automate content generation for you. The content that is generated might be inaccurate, incomplete, delayed, misleading or not suitable for your purposes. Please review and edit such content before sharing with others. Like all content you share on our Services, you are responsible for ensuring it complies with our Professional Community Policies, including not sharing misleading information.
In short, LinkedIn will provide features that generate content automatically, but that content may be inaccurate. Users are expected to review and correct any false information before sharing it, because LinkedIn won't accept responsibility for the consequences.
At this point, if you’re not double-checking something produced by your AI tool of choice, it absolutely is your fault. It’s no secret that these applications were trained on garbage.
Morally speaking, I'd blame both sides here: Microsoft/LinkedIn for shoving generative "AI" where it doesn't belong, and users gullible enough to take the output at face value.
“We will provide you with a tool to emit garbage and a platform to share content. If you put the two together, you are liable.”
Attractive nuisance much? Is it too much to ask that they label it a garbage generator instead of "AI"? Why does honesty always have to take a back seat?
Because then tech would have to admit it's moving into a period of stability rather than a period of constant growth.
The big companies and startups need to prove they've still got "revolutionary" potential; otherwise, stock values start to drop. And lower stock values mean smaller bonuses for leadership.
The only thing I use AI for is generating character art for tabletop portraits, and when the well is sufficiently poisoned I will probably go back to Pinterest.