Every moronic offensive against personal freedoms needs an excuse. We need to fight the terrorists, they say – and wiretap the globe. We need to search for illegal porn, they say – and get into your phone.
Apple wants to scan iPhones in the United States for photos of child abuse. The tech giant is developing special software for this purpose that will sound the alarm if it thinks it has encountered illegal images. A team of human reviewers then decides whether the police should be involved.
Apple says the program will compare photos on devices such as the iPhone and iPad against images of child abuse in a database maintained by the National Center for Missing and Exploited Children. The system would have a margin of error of less than “one in one trillion”. The company says it won’t get information about users’ photos until they have a collection of known abuse material in their iCloud account.
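To make that matching logic concrete, here is a minimal sketch of threshold-based hash matching under simplified assumptions. The hash function, the database contents, and the threshold value are all illustrative stand-ins, not Apple’s actual system (which reportedly uses a perceptual hash called NeuralHash together with cryptographic threshold techniques, so near-duplicate images still match and no single match reveals anything).

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Stand-in hash. A real system would use a perceptual hash so that
    resized or re-encoded copies of the same image still match; SHA-256
    is used here only to keep the sketch self-contained."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of known-image hashes (a stand-in for the NCMEC list).
KNOWN_HASHES = {image_hash(b"example-known-image")}

# Illustrative threshold: flag an account only after this many matches,
# mirroring the article's point that a single photo is never enough.
MATCH_THRESHOLD = 30

def count_matches(photo_library: list[bytes]) -> int:
    """Count how many photos in a library match the known-hash database."""
    return sum(1 for img in photo_library if image_hash(img) in KNOWN_HASHES)

def should_flag_account(photo_library: list[bytes]) -> bool:
    """Escalate to human review only once a whole collection of matches
    accumulates, which is what keeps the claimed false-positive rate low."""
    return count_matches(photo_library) >= MATCH_THRESHOLD
```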
The tech company announced further measures. For example, the digital assistant Siri can intervene when users try to search for abuse material. There will also be an option in the Messages app that can protect children when they try to send or receive sexually explicit images. Such images can then be blurred, and parents can receive a warning when the content is sent or viewed.
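A hypothetical sketch of that Messages flow follows: an on-device classifier scores an incoming image, explicit images are blurred before viewing, and parents are optionally notified. The classifier call, account fields, and policy values here are assumptions for illustration; Apple has not published the feature’s internals.

```python
from dataclasses import dataclass

@dataclass
class ChildAccount:
    """Hypothetical account settings controlled by a parent."""
    parental_notifications: bool

def is_sexually_explicit(image_bytes: bytes) -> bool:
    """Stand-in for an on-device ML classifier; always False in this sketch."""
    return False

def handle_incoming_image(image: bytes, account: ChildAccount) -> dict:
    """Decide what happens to an image sent to a child's account."""
    if not is_sexually_explicit(image):
        return {"action": "show"}
    # Blur the image and warn the child before they can view it.
    decision = {"action": "blur_and_warn"}
    # Notify parents only if they opted in to warnings.
    if account.parental_notifications:
        decision["notify_parents"] = True
    return decision
```

As described in the announcement, this analysis would run on the device itself; the sketch only models the decision logic, not where it executes.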
The Financial Times reported on the initiative earlier and described it as a compromise: Apple is trying to strike a balance between protecting customer privacy and supporting governments that demand more help in the hunt for criminals. Some experts are uncomfortable with it. “This will break the dam,” security researcher Matthew Green of Johns Hopkins University predicted in the newspaper. “Governments are going to demand this from everyone.”
Researchers noted that such a program could also be adapted to detect other content, such as images of beheadings by extremists or footage of anti-government protests. Critics fear that other tech companies may be pressured to monitor data on phones in a similar way.