This Tool Could Protect Your Photos From Facial Recognition

In recent years, companies have been prowling the web for public photos associated with people's names, which they can use to build enormous databases of faces and improve their facial-recognition systems, adding to a growing sense that personal privacy is being lost, bit by digital bit.

A start-up called Clearview AI, for example, scraped billions of online photos to build a tool for the police that could lead them from a face to a Facebook account, revealing a person's identity.

Now researchers are trying to foil those systems. A team of computer engineers at the University of Chicago has developed a tool that disguises photos with pixel-level changes that confuse facial recognition systems.

Named Fawkes in honor of the Guy Fawkes mask favored by protesters worldwide, the software was made available to developers on the researchers' website last month. After being discovered on Hacker News, it has been downloaded more than 50,000 times. The researchers are working on a free app version for noncoders, which they hope to make available soon.

The software is not meant to be just a one-off tool for privacy-loving individuals. If deployed across millions of photos, it would be a broadside against facial recognition systems, poisoning the accuracy of the so-called data sets they gather from the web.

"Our goal is to make Clearview go away," said Ben Zhao, a professor of computer science at the University of Chicago.

Fawkes converts an image, or "cloaks" it in the researchers' parlance, by subtly altering some of the features that facial recognition systems depend on when they construct a person's face print. In a research paper, reported earlier by OneZero, the team describes "cloaking" photos of the actress Gwyneth Paltrow using the actor Patrick Dempsey's face, so that a system learning what Ms. Paltrow looks like based on those photos would start associating her with some of the features of Mr. Dempsey's face.

The changes, usually subtle and not perceptible to the naked eye, would prevent the system from recognizing Ms. Paltrow when presented with a real, uncloaked photo of her. In testing, the researchers were able to fool facial recognition systems from Amazon, Microsoft and the Chinese tech company Megvii.

The team says it plans to tweak the software so that it will no longer subtly change the gender of users.

The other issue is that my experiment wasn’t what the tool was designed to do, so Shawn Shan, a Ph.D. student at the University of Chicago who is one of the creators of the Fawkes software, made the changes to my photos as extreme as possible to ensure that it worked. Fawkes isn’t intended to keep a facial recognition system like Facebook’s from recognizing someone in a single photo. It’s trying to more broadly corrupt facial recognition systems, performing an algorithmic attack called data poisoning.
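The underlying idea, nudging an image's machine-readable features toward another identity while keeping the pixel changes small, can be sketched in toy form. The example below is illustrative only: it stands in a random linear map for a real face-recognition feature extractor, and the names `toy_embedding` and `cloak` and all the parameters are invented for this sketch, not part of Fawkes itself.

```python
import numpy as np

def toy_embedding(image, proj):
    # Hypothetical stand-in for a face-recognition feature extractor:
    # a fixed linear map from pixel values to a feature vector.
    return proj @ image.ravel()

def cloak(image, target_image, proj, budget=0.05, steps=200, lr=0.01):
    """Toy cloaking: push `image`'s features toward `target_image`'s
    while keeping every per-pixel change within `budget`."""
    target_feat = toy_embedding(target_image, proj)
    cloaked = image.copy()
    for _ in range(steps):
        feat = toy_embedding(cloaked, proj)
        # Gradient of ||feat - target_feat||^2 with respect to the pixels.
        grad = 2.0 * (proj.T @ (feat - target_feat))
        cloaked -= lr * grad.reshape(image.shape)
        # Project back into the small "imperceptibility" box around the
        # original image, and keep pixel values valid.
        cloaked = np.clip(cloaked, image - budget, image + budget)
        cloaked = np.clip(cloaked, 0.0, 1.0)
    return cloaked

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    proj = rng.normal(size=(16, 64)) / 8.0   # toy 64-pixel -> 16-dim extractor
    original = rng.random((8, 8))            # stand-in "Paltrow" photo
    target = rng.random((8, 8))              # stand-in "Dempsey" photo
    cloaked = cloak(original, target, proj)
    d_before = np.linalg.norm(toy_embedding(original, proj) - toy_embedding(target, proj))
    d_after = np.linalg.norm(toy_embedding(cloaked, proj) - toy_embedding(target, proj))
    print(f"max pixel change: {np.max(np.abs(cloaked - original)):.3f}")
    print(f"feature distance to target: {d_before:.3f} -> {d_after:.3f}")
```

In the toy run the cloaked image stays within the pixel budget of the original while its feature vector moves measurably closer to the target identity, which is the property a scraper's training set would then inherit.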

The researchers said that, ideally, people would start cloaking all the images they upload. That would mean a company like Clearview that scrapes those photos wouldn’t be able to create a functioning database, because an unidentified photo of you from the real world wouldn’t match the template of you that Clearview would have built over time from your online photos.

But Clearview’s chief executive, Hoan Ton-That, ran a version of my Facebook experiment on the Clearview app and said the technology did not interfere with his system. In fact, he said his company could use images cloaked by Fawkes to improve its ability to make sense of altered images.

“There are billions of unmodified photos on the internet, all on different domain names,” Mr. Ton-That said. “In practice, it’s almost certainly too late to perfect a technology like Fawkes and deploy it at scale.”

“I personally think that no matter which approach you use, you lose,” said Emily Wenger, a Ph.D. student who helped create Fawkes. “You can have these technological solutions, but it’s a cat-and-mouse game. And you can have a law, but there will always be illegal actors.”

Ms. Wenger thinks “a two-prong approach” is needed, where individuals have technological tools and a privacy law to protect themselves.

Elizabeth Joh, a law professor at the University of California, Davis, has written about tools like Fawkes as “privacy protests,” where individuals want to thwart surveillance but not for criminal reasons. She has repeatedly seen what she called a “tired rubric” of surveillance, then countersurveillance and then anti-countersurveillance, as new monitoring technologies are introduced.

“People are feeling a sense of privacy exhaustion,” Ms. Joh said. “There are too many ways that our conventional sense of privacy is being exploited in real life and online.”

For Fawkes to have an immediate effect, we would need all the photos of ourselves that we have already posted to be cloaked overnight. That could happen if a huge platform that maintains an enormous number of online images decided to roll out Fawkes systemwide.

A platform like Facebook adopting Fawkes would prevent a future Clearview from scraping its users’ images to identify them. “They could say, ‘Give us your real photos, we’ll cloak them, and then we’ll share them with the world so you’ll be protected,’” Mr. Zhao said.

Jay Nancarrow, a Facebook spokesman, did not rule out that possibility when asked for comment. “As part of our efforts to protect people’s privacy, we have a dedicated team exploring this type of technology and other methods of preventing photo misuse,” Mr. Nancarrow said.

“I’m actually interning on that exact team at Facebook right now,” said the Fawkes co-creator Mr. Shan.

