A federal bill that will require Google and Meta to pay media outlets for news content that they share or otherwise repurpose on their platforms is set to become law.
Yes, agreed, but the problem is far wider than just Canadian news sources. The solution is not to try to tax them, because they will just disengage and make the problem worse. I don't have a good solution right now, but if we were to pass regulation, it should target automated recommendation systems (AI) rather than trying to make people pay to link to things.
AI recommendation systems are even worse. They learn from people's behaviour, and not everyone on the internet is trustworthy, so these algorithms pick up bad habits. It probably only takes a troll farm of around 1,000 people to manipulate a machine learning algorithm.
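To make the scale of that concrete, here's a minimal toy sketch (not any real platform's algorithm; every number in it is an illustrative assumption, including the size of the organic audience and the hypothetical 1,000-account troll farm). It shows how a recommender that just ranks by raw engagement can be dominated by a small coordinated group whose activity is concentrated, while organic activity is spread thin.

```python
# Toy sketch of engagement-based ranking being swamped by coordinated accounts.
# All figures are made-up assumptions for illustration only.
import random

random.seed(0)

# Organic engagement: 100,000 users spread their clicks across 200 items,
# so each item ends up with roughly 500 interactions.
ITEMS = [f"item_{i}" for i in range(200)]
engagement = {item: 0 for item in ITEMS}
for _ in range(100_000):
    engagement[random.choice(ITEMS)] += 1

# Coordinated manipulation: a hypothetical 1,000-account troll farm, each
# account interacting ~50 times with one target item -> 50,000 extra signals.
TARGET = "item_0"
engagement[TARGET] += 1_000 * 50

# The "recommendation": top items by raw engagement count.
top = sorted(engagement, key=engagement.get, reverse=True)[:5]
print("Top recommendations:", top)
print(f"{TARGET} engagement: {engagement[TARGET]}, "
      f"typical organic item: ~{100_000 // len(ITEMS)}")
```

Real systems use many more signals than raw counts, but the basic lever the comment describes is the same: concentrated, coordinated engagement from a modest number of accounts can outweigh diffuse organic behaviour in whatever the model learns from.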
Who can devote 1,000+ people to manipulating the machine learning algorithms that recommend content? Countries interested in disinformation campaigns to destabilize their adversaries. Hence C-11… we need Canadians involved in those systems, otherwise it's just going to be Russia, China, or whoever else is dedicated enough to poisoning the data that the machine learning algorithms learn from.