Apple Launched Child Safety Features, per Tim Hardwick

More than a dozen well-known cybersecurity experts blasted Apple's and the European Union's plans to check people's phones for illegal content on Thursday, branding the initiatives as dangerous and ineffective tactics that would encourage government surveillance.

In a 46-page paper, the experts said that Apple's proposal to detect photographs of child sexual abuse on iPhones, and a proposal put forth by European Union members to detect images of child sexual abuse and terrorist imagery on encrypted devices in Europe, both relied on "hazardous technologies."

Resisting attempts to eavesdrop on and influence law-abiding persons “should be a national-security priority,” the researchers stated.

A technology known as client-side scanning would enable Apple, or potentially law enforcement agencies in Europe, to detect images of child sexual abuse on a person's phone by scanning photographs as they are uploaded to Apple's iCloud storage service.

When Apple announced the upcoming technology in August, it said a so-called fingerprint of each image would be matched against a repository of documented child sexual abuse material to look for potential matches.
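
To make the mechanism concrete, here is a loose, purely illustrative sketch of fingerprint matching in Python. It uses a toy "average hash" and an in-memory set of known fingerprints; the function names, the exact-match lookup, and the sample data are all invented for illustration and are not Apple's actual NeuralHash or CSAM-detection pipeline.

```python
# Toy sketch: fingerprint an image and check it against a database of known
# fingerprints. Invented for illustration; NOT Apple's real system.

def average_hash(pixels):
    """Fingerprint a small grayscale image (rows of 0-255 values):
    one bit per pixel, set when the pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | int(p > mean)
    return bits

# Hypothetical repository of fingerprints of documented abuse imagery.
reference_image = [
    [ 10, 200,  30, 220],
    [240,  15, 230,  20],
    [ 25, 210,  35, 215],
    [225,  40, 235,  45],
]
known_fingerprints = {average_hash(reference_image)}

def flagged_on_upload(uploaded_image):
    """Client-side check: does the upload's fingerprint appear in the database?"""
    return average_hash(uploaded_image) in known_fingerprints

print(flagged_on_upload(reference_image))  # True: an exact copy is matched
```

Real perceptual-hash systems are designed to tolerate small differences such as resizing or recompression rather than requiring exact copies, which is part of what makes the privacy and accuracy trade-offs contentious.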

The proposal, however, infuriated privacy advocates and stoked fears that the technology could erode online privacy and potentially be used by authoritarian governments to track down political opponents and other targets.

The experts predict that a proposal to allow such photo scanning in the European Union could come as early as this year.

They said they were making their findings public now to alert the European Union to the risks of its proposal, and because the "extension of the monitoring capabilities of the state truly is passing a red line," according to Ross Anderson, a member of the group and professor of security engineering at the University of Cambridge.

In addition to the privacy concerns, the researchers said their analysis showed the technology was not effective at identifying images of child sexual abuse. Within days of Apple's announcement, they said, people had pointed out ways to evade detection by slightly editing the images.
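
The evasion argument is easy to see with the same kind of toy fingerprint: a small, targeted edit can flip bits of a naive hash so an exact lookup no longer matches. The sketch below is invented for illustration (the toy average hash and the specific pixel tweak are assumptions) and says nothing about how robust Apple's actual hash is.

```python
# Toy demonstration: a small edit changes a naive image fingerprint.

def average_hash_bits(pixels):
    """Per-pixel bits: 1 if the pixel is brighter than the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return sum(x != y for x, y in zip(a, b))

original = [
    [ 10, 200,  30, 220],
    [240,  15, 230,  20],
    [ 25, 210,  35, 215],
    [225,  40, 235,  45],
]

# Brighten a single dark pixel enough to cross the brightness threshold.
tweaked = [row[:] for row in original]
tweaked[0][0] = 210

h_orig, h_tweaked = average_hash_bits(original), average_hash_bits(tweaked)
print(hamming(h_orig, h_tweaked))  # 1: the fingerprints no longer match exactly
```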

Susan Landau, a professor of cybersecurity and policy at Tufts University, warned that the technology permits "scanning of a personal private gadget without any probable grounds for something illicit being done." She called it extremely risky, saying it poses a threat to privacy, public safety, business, and national security.
