This week in data – 11 October: Each week, we compile the best stories in data. Get up to speed on this week in data without having to search for it.
Can broken AI actually protect your privacy?
Hear us out: the same flaws inherent in AI that cause it to, say, mistake an item in a picture for something else, or misidentify someone via facial recognition, could also be used to protect privacy. How? Wired has the answer: researchers found that by adding just a few random examples as “noise”, they can stop AI from identifying specific users.
Sounds great, except for the fact that these “adversarial examples” can themselves be identified and compensated for. Well, at least the theory is interesting.
“Brendan Dolan-Gavitt, a computer scientist at NYU’s Tandon School of Engineering…warns that they’re fighting the tide of machine-learning research: The vast majority of academics working on machine learning see adversarial examples as a problem to solve, rather than a mechanism to exploit.”
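For the curious, here’s a minimal sketch of the idea behind adversarial noise. This is a toy stand-in, not the researchers’ actual method: we use a tiny hand-written linear classifier (real facial-recognition systems are deep networks), and nudge the input a small amount against the model’s gradient so its decision flips.

```python
# Toy linear "classifier": score(x) = sum(w_i * x_i) + b.
# Positive score -> "match"; negative score -> "no match".
# (Illustrative only; the Wired story concerns real face-recognition models.)
w = [0.8, -0.5, 0.3]
b = 0.1

def sign(v):
    return 1.0 if v > 0 else (-1.0 if v < 0 else 0.0)

def score(x):
    return sum(wi * xi for wi, xi in zip(w, x)) + b

x = [1.0, 0.2, 0.5]          # original input: the model says "match"

# Adversarial perturbation: step each feature a small, bounded amount
# in the direction that lowers the score. For a linear model, the
# gradient of score(x) with respect to x is simply w.
eps = 0.7
x_adv = [xi - eps * sign(wi) for xi, wi in zip(x, w)]

print(score(x))      # positive: identified
print(score(x_adv))  # negative: the small nudge flips the decision
```

The key point the researchers exploit is that the perturbation is small relative to the input, yet the model’s output changes completely, which is exactly the “mistake” being repurposed as a privacy shield.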
Google brings self-driving cars to Los Angeles
Well, technically it’s Waymo, the Alphabet subsidiary. But you get the idea. It’s going to use the cars to map the city and start building out a network. The city of Los Angeles has put a huge amount of work into modernizing and sharing its transport data, so proponents of a more integrated transport grid will welcome this.
“Critics of the department’s MDS program have raised concerns that the location data could be used by law enforcement to track specific individuals.”
“But supporters argue it can be expanded to incorporate more than just e-scooters. Autonomous vehicles, for instance, could eventually fall within the purview of MDS, but that will be up to LADOT and city officials to determine.”
Australia and United States to explore data sharing deal
Best buds…kinda. The two countries are now exploring a data sharing deal that would help justice departments in either country find evidence related to whether the FBI overstepped legal bounds when investigating President Donald Trump and his ties to Russia.
“Mueller’s investigation was triggered in part when a top Australian diplomat, Alexander Downer, was allegedly told by Trump campaign adviser George Papadopoulos that Russia had damaging information about Hillary Clinton.”
Shouldn’t we ask patients about health data sharing?
Sounds like a good idea, right? Health industry experts Nathan Bays and Jarrett Lewis address the recent craze over health data on Medium, saying that although doctors are going ga-ga, patients aren’t as keen. If this type of sharing is inevitable, they argue, patients need to be brought along on the journey too.
“Americans are also less hopeful about the long-term result of medical data sharing. Just one-half of Americans believe providers will be able to identify disease outbreaks faster and only 4 in 10 believe there will be an increase in medical breakthroughs.”
The DOJ wants Facebook to abandon encryption
Yikes. Just as more companies are moving to end-to-end encryption in order to wash their hands of the possibility of handing data over to authorities, the US Department of Justice is sending Facebook a warning: let us in, or else. Privacy might be a good marketing tactic, but it’s coming at a big political cost for tech giants.
“The letter argues Facebook should “enable law enforcement to obtain lawful access to content in a readable and usable format,” effectively providing authorities with backdoor access to messaging across WhatsApp, Instagram and Messenger. It will ask Facebook to work with governments to ensure that’s the case.”
That’s our wrap for this week. Thanks for reading – we hope you found it entertaining and informative. We’d love to hear your thoughts on these articles and anything else data related! Email us anytime at firstname.lastname@example.org!
Until next week,
Team Data Republic