Apple, one of the leading companies in artificial intelligence, has drawn widespread criticism from privacy advocates. This time, the controversy centres on a plan to automatically scan users’ personal photos stored in iCloud against a database of known child sexual abuse imagery.
In August 2021, Apple announced a proposal to check images stored in users’ iCloud accounts for known child sexual abuse material (CSAM).
The proposal was intended to let Apple flag potentially harmful and abusive content while protecting user privacy and without disclosing any other information about users’ photos.
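Mechanically, the proposal centred on comparing fingerprints of photos against a database of known abuse imagery, rather than inspecting photo content directly. The sketch below illustrates that general hash-list matching idea only; it uses an ordinary SHA-256 digest and a made-up hash list purely for illustration, not Apple’s actual NeuralHash perceptual hashing or its threshold-based reporting scheme.

```python
import hashlib
from pathlib import Path

# Hypothetical fingerprints of known prohibited images (illustrative only).
# Apple's real system used perceptual hashes supplied by child-safety
# organisations, not plain SHA-256 digests of file bytes.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(path: Path) -> str:
    """Return a SHA-256 digest of the file's bytes (a stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_matches(photo_dir: Path) -> list[Path]:
    """Return the photos whose fingerprints appear in the known-hash list."""
    return [p for p in photo_dir.glob("*.jpg") if fingerprint(p) in KNOWN_HASHES]

if __name__ == "__main__":
    for match in flag_matches(Path("photos")):
        print(f"flagged: {match}")
```

The key privacy argument made at the time was that only matches against the known-hash list would ever be surfaced; nothing else about a user’s library would be disclosed.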
However, it quickly received strong criticism from privacy and security experts and digital rights groups, who warned that the surveillance capability could create privacy and security risks for iCloud users worldwide.
In September 2021, Apple announced that it would pause the feature’s rollout in order to gather feedback and make improvements before delivering the child safety features, indicating that a launch was still on the way.
Apple has now said the CSAM-detection tool for iCloud Photos has been discontinued in light of the feedback and recommendations it received.
Last week, Apple revealed that it is concentrating its anti-CSAM efforts and investments on its “Communication Safety” capabilities, which launched in December 2021 after being announced in August 2021. Parents and other caregivers can opt in to the protections through family iCloud accounts.
“After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021,” the company shared in a statement.
The company shared the update on its CSAM plans on December 7, alongside the news that it is significantly expanding its end-to-end encryption options for iCloud, including protection for backups and photos stored on the cloud service.
“Potential child exploitation can be interrupted before it happens by providing opt-in tools for parents to help protect their children from unsafe communications,” the company added.
“Apple is dedicated to developing innovative privacy-preserving solutions to combat Child Sexual Abuse Material and protect children while addressing the unique privacy needs of personal communications and data storage.”
In the end, Apple abandoned development of its CSAM-detection technology, which critics argued could harm users’ privacy, and focused its efforts on other child safety solutions instead.