Plus, live facial recognition updates and the week’s A.I. research
Tuesday should have been a win for privacy advocates. Washington enacted SB 6280, becoming the first state in the country to pass a facial recognition bill, one that outlines how the government can and cannot use the technology.
But a closer look reveals the bill’s flaws. The law does little to curtail government use of facial recognition, instead setting up basic transparency and accountability mechanisms for when the state does decide to deploy dystopian real-time surveillance.
The bill also has little impact on the commercial development or sale of facial recognition technology: it doesn’t limit sales to law enforcement, or even hold companies responsible for the outcomes of their algorithms.
It’s no surprise then that the bill was championed by Microsoft in public and behind closed doors. In fact, the bill was literally sponsored by State Senator Joe Nguyen, who is currently employed as a program manager by Microsoft.
I’m not a fan of the phrase “Let that sink in,” but you really do have to take a minute and think about that one.
The ACLU of Washington wrote a strong rebuttal of the law, which is set to take effect in July, saying that anything short of a facial recognition ban will not safeguard civil liberties.
“Alternative regulations supported by big tech companies and opposed by impacted communities do not provide adequate protections — in fact, they threaten to legitimize the infrastructural expansion of powerful face surveillance technology,” ACLU project manager Jennifer Lee wrote. “This is why we strongly opposed SB 6280, which purports to put safeguards around the use of facial recognition technology but does just the opposite.”
The transparency and accountability measures are better than nothing. The new law requires state and local government agencies to alert the public about what facial recognition algorithms they are buying, and to train their staff adequately to use the technology. Companies selling the technology to the government now need to open their algorithms to independent testing. State or local prosecutors relying on facial recognition must disclose that use to defendants before trial.
But when it comes to real-time surveillance, the new law says police must now get a warrant except in the broad case of “exigent circumstances,” effectively enshrining the capability of real-time facial recognition in state law.
Meanwhile, body camera maker Wolfcom is pitching live facial recognition to police departments. “People are always afraid of something new, but there’s no stopping technology. We can either ignore it and other people develop it, or we can understand it’s here to stay … and try to steer its path toward the force of good,” Wolfcom CEO Peter Onruang said.
Onruang said concerns about constant police surveillance were overblown.
“I know there are fears out there that people will get shot for facial recognition. That’s not what it will be designed for … It’s only meant to help an officer realize there’s a possibility (they’re interacting with a wanted or missing person),” he said.
Thankfully, no technology has ever been used for something other than what it was designed for.
And now, the best part: Let’s take a look at some new A.I. research from this week.
Detecting deepfakes and other synthetic media will always be a cat-and-mouse game. Here, researchers show that it’s possible to fool “deepfake detectors” with fake images.
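The trick behind results like this is the adversarial example. As a toy sketch only — not the paper’s method — here is the fast-gradient-sign idea applied to a hand-built linear “detector”; every weight and input below is invented for illustration:

```python
import numpy as np

# A made-up linear detector: weights are hypothetical, chosen only so
# the example is deterministic and easy to check.
w = np.linspace(-1.0, 1.0, 16)

def fake_score(x):
    """Probability the detector assigns to the input being 'fake'."""
    return 1.0 / (1.0 + np.exp(-x @ w))

x = 0.5 * w                         # an input the detector flags as fake
print(fake_score(x))                # > 0.5: flagged

# Nudge each feature against the sign of the score's gradient (which,
# for a linear model, is just w). A small, structured perturbation is
# enough to push the score below the detector's threshold.
eps = 1.0
x_adv = x - eps * np.sign(w)
print(fake_score(x_adv))            # < 0.5: the same detector now says "real"
```

Real detectors are deep networks, not linear models, but the arms race works the same way: whatever gradient the detector exposes, a forger can step against it.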
Uber’s self-driving car unit has found that algorithms analyzing 3D lidar data can be tricked into missing objects around them. That also means a car relying on lidar to see other cars on the road can be fooled into catastrophic accidents.
This paper seemingly gives self-driving cars the ability to predict the future, by analyzing the movement of objects around them and trying to guess their trajectory.
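The paper itself uses learned models, but the underlying task can be conveyed with a deliberately crude baseline: fit a constant-velocity motion model to a tracked object’s recent positions and extrapolate it forward. Everything here is illustrative, not the paper’s approach:

```python
def predict_trajectory(positions, steps):
    """Extrapolate future (x, y) positions from equally spaced observations,
    assuming the object keeps its most recent velocity (constant-velocity model)."""
    (x0, y0), (x1, y1) = positions[-2], positions[-1]
    vx, vy = x1 - x0, y1 - y0          # velocity from the last two frames
    return [(x1 + vx * k, y1 + vy * k) for k in range(1, steps + 1)]

# A car observed moving 2 units per frame along x:
print(predict_trajectory([(0, 0), (2, 0), (4, 0)], steps=3))
# → [(6, 0), (8, 0), (10, 0)]
```

The hard part, and the reason this is research, is everything the baseline ignores: turns, braking, occlusion, and the interactions between road users.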
Facebook is researching the ability to create highly detailed 3D models of people from just a single image. The obvious use case is in Facebook’s virtual reality software for its Oculus headsets.
Computer scientists and art historians try to build a deep learning algorithm that can find popular Christian iconography, like Mary and Gabriel, in thousands of historical works of art.