A nice place to discuss rumors, happenings, innovations, and challenges in the technology sphere. We also welcome discussions on the intersections of technology and society. If it’s technological news or discussion of technology, it probably belongs here.
Remember the overriding ethos on Beehaw: Be(e) Nice. Each user you encounter here is a person, and should be treated with kindness (even if they’re wrong, or use a Linux distro you don’t like). Personal attacks will not be tolerated.
This community’s icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.
Oh no, the A.I. identified someone as a drug trafficker, and the police pulled that person over on suspicion of being a drug trafficker, and found out that he was indeed a drug trafficker, and now he’s upset he got caught by a robot dragnet.
I don’t think drugs should be criminalized, but are we supposed to be upset that A.I. is going to finally help parse data and solve crimes?
This time they were right, because it was indeed a drug dealer, but just look at what it took to get this data:
“in this case it was used to examine the driving patterns of anyone passing one of Westchester County’s 480 cameras over a two-year period.”
“the AI determined that Zayas’ car was on a journey typical of a drug trafficker. According to a Department of Justice prosecutor filing, it made nine trips from Massachusetts to different parts of New York between October 2020 and August 2021 following routes known to be used by narcotics pushers and for conspicuously short stays.”
So apparently making long trips with short stays is now enough justification to be searched by police. And if they can extrapolate that into “this guy’s a dealer,” how much other data and how many other possible extrapolations got caught in the crossfire of all those cameras? How long until someone in power decides selling some of that info to corporations is a good way to line state/government/their own pockets?
Maybe we should place cameras in everyone’s house and listening bugs in every single phone? Criminality solved? Or, hear me out: the real criminals will adapt and find new and novel ways, while the common citizen is kept in line with fines for even the smallest offense.
Then the police state will want to escalate the tools again with even more oppressive technology. Good thing we’re spending so many resources continuing to bully normal citizens into generating cash flow from fines. Money and resources well spent?
Or maybe the world needs some actually intelligent people who can find the root causes of criminal behavior and restructure society to improve well-being and opportunity, so people want to belong and maintain it rather than feeling like the system is rigged against them and they have to cheat to survive.
This is literally 1984. We shouldn’t applaud when they propose a surveillance state.
Hyperbole.
There’s a difference between using AI as a tool and using it as a solution. Though knowing how this society works, it’ll start off as a tool like it is now, and soon enough the higher-ups will wonder why humans are even necessary in the process, especially when they need to be paid. Against everyone else’s objections they’ll get rid of the human verification part and use only AI, and when things go wrong the people in charge will go “who could have seen it coming?”
Wasn’t there recently news of the suicide hotline that replaced its staff with generative AI chatbots, only for the bot to start giving dangerous responses just days later?
Sounds like oversight, transparency, and regulation are in order. There’s no putting this genie back in the bottle, unfortunately.
Yeah you’re right, helping people order lunch is literally 1984.
Uh oh, the things you are buying look pretty suspicious, we are going to wiretap your house to make sure you are not doing any no-nos.
I really dislike people using technology to analyze my habits; I prefer just stumbling onto things I like because they were around the places I usually look. Yes, I also don’t like algorithmic content: it just makes people try to appease the algorithm, which means less effort goes into the thing itself.
I think people get worried about concepts like pre-crime with this.
Worry, what…that you’ll get pulled over for suspicions while being innocent, and then the cops would be forced to find out you’re innocent?
Yeah, I can see that being anywhere from inconvenient to downright dangerous depending on the cop. Personally, I think it has the potential to do more good than bad.
It’s just a shame the police have such a hard on for drugs when there’s so much worse stuff going on out there.
Big “if you’re innocent, you’ve got nothing to hide” energy. I can be innocent and still not want unnecessary interactions with police, and still value my privacy.
And a certain percentage of innocent people are found guilty. You don’t see how expanding the arrests of innocent people is a bad thing? It has the potential to ruin lives.
What about innocent until proven guilty?
Sir, we have proof you look like a drug dealer, you have to prove you are not one
Wait until you grow up a bit and learn about probable cause, or about people getting pulled over for nothing. This tech changes nothing about your idea of “innocent until proven guilty” versus what actually happens daily.
I think that’s exactly the point. The current situation is already bad, tools that reinforce the bad part of the system shouldn’t be accepted.
It’s not the point. You guys are discussing pipe dreams and impossible scenarios. I’m just trying to be pragmatic about what’s happening.
Did you miss the part in the article about this tech being run by private companies? Or how it’s so seamless it can be installed on ANY existing camera system? No upgrades to hardware required.
This surveillance genie is already out of the bottle, so just hoping it will go away, or be made suddenly illegal in a country that has had a hard-on for surveilling its own people since at least 9/11, is foolish.
So the best we can do is hope for proper laws regulating and controlling it, so it doesn’t turn into all of those evil things everyone always wants to jump to first.
ACAB, sure, but recognize that police work is also one of the areas that could really benefit from AI technology: police are constantly flooded with information from all sorts of sources, and it leads to ridiculous backlogs that actually affect society.
So yeah, this tech, like all tech, has the potential to do great harm to society if not reined in, but it also has the potential to help find your child after they’ve been abducted, or locate your wife or mom after someone has attacked them.
It could do legitimate good in society, too, if used correctly.
Acknowledging this is a dragnet, a practice generally considered unconstitutional since the 1950s, actually illustrates pretty well why people are upset about it. Even if it would result in more easy prosecutions for cops, it doesn’t change that it’s mass surveillance and an unconstitutional practice.
I guess that will be up to the courts to decide, not us.
This pattern might indicate drugs. Or adultery, which isn’t illegal. It could be a straight job such as a mobile MRI technician. It might be a landlord.
In short, this is likely to affect innocent people. It’s like if you’ve got a name that happens to be on the no-fly list: your travel is fucked and you haven’t done anything wrong.