The risk, explains Yuste, is that the same tools that, in medicine, can help improve people's lives can also end up violating the information stored in the brain. "Although the roadmap is beneficial, these technologies are neutral and can be used for better or worse," he notes. This isn't only about securing personal data such as shopping habits, a home address, or which political party one supports; it also involves things as intimate as memories and thoughts. And, in the not-so-distant future, even the subconscious.
I think "shopping habits" already includes subconscious thoughts. Advertisers know when you'll quit a brand before you do.
The title made me think this was a "sentient AI" argument, but I'm glad to see it's not. Human neurorights are exactly what I think we need to be thinking about.
We also need a fix for entrenched classes in society. Why has the smallest fraction of the population hoarded almost all of the benefits of humanity's advancements over the past 50 years? It's unconscionable.
Not actually reading the article, though, because I can't easily get past the cookie confirmation.
Obligatory mention of Target figuring out when people were pregnant and sending them related marketing, including to teenagers, and the dad who got mad when they sent it to his daughter.
Btw, I recently learned that "subconscious" is considered outdated; the preferred term is "preconscious."