Checklist 252: Leaking, Spying, and Cheating
This week on The Checklist:
- Another stalkerware app is leaking data
- Critics say “I told you so” about iOS CSAM scanning
- Are students using Live Text to cheat?
Insecure apps for insecure people
This week, TechCrunch broke a story about yet another stalkerware company that has exposed private data through poor security practices. According to the piece:
The private phone data of hundreds of thousands of people are at risk. Call records, text messages, photos, browsing history, precise geolocations and call recordings can all be pulled from a person’s phone because of a security issue in widely used consumer-grade spyware.
TechCrunch isn’t saying exactly which stalkerware vendor has been leaking data, because they don’t want to make it any easier for bad guys to get their hands on people’s information.
This isn’t the first time that a commercial spyware company has accidentally exposed the data that they collect. mSpy (the company we discussed last week in connection with its Google Ads shenanigans) was found to have leaked millions of records back in 2018.
Unfortunately, this kind of thing seems to be par for the course for stalkerware companies. One industry expert on tech-enabled abuse, Eva Galperin of Electronic Frontier Foundation, remarked:
I am disappointed but not even slightly surprised … I think that we could reasonably characterize this kind of behavior as negligent. Not only do we have a company which is making a product which enables abuse, but they’re doing such a poor job of securing the information that’s exfiltrated that they are opening the targets of this abuse to even further abuse.
Needless to say, if someone you know is thinking about installing stalkerware on a partner’s device — tell them to think again. Not only is stalkerware abusive, but it also risks exposing their partner’s data to the world.
If you want to learn more about this issue, you can read this blog post about stalkerware on Apple platforms.
Of slippery slopes
When Apple announced that it would scan for child sexual abuse material (CSAM) on iOS, there was an outcry from the cybersecurity community.
We addressed the issue on Checklist 242, Checklist 243, and Checklist 246. But for those who missed it, the short version is that Apple was planning to implement on-device scanning of images that users wanted to upload to iCloud. Each image would be given a hash value, which would be compared to hash values of known CSAM images from the National Center for Missing and Exploited Children (NCMEC) database. This would allow Apple to tell if someone had CSAM on their iPhone and alert the authorities if need be.
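The matching logic described above can be sketched in a few lines of Python. This is a deliberate simplification: Apple's actual system uses NeuralHash, a perceptual hash designed to survive resizing and re-encoding, plus blinded on-device matching, whereas the cryptographic hash below only matches byte-identical files. The database contents here are invented placeholders.

```python
import hashlib


def image_hash(image_bytes: bytes) -> str:
    """Return a hash value for an image's raw bytes.

    Simplification: Apple's system uses NeuralHash, a perceptual
    hash; SHA-256 here only matches exact byte-for-byte copies.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def matches_known_database(image_bytes: bytes, known_hashes: set) -> bool:
    """True if the image's hash appears in the set of known hashes."""
    return image_hash(image_bytes) in known_hashes


# Hypothetical database of known hashes (illustrative only; the real
# NCMEC database is never distributed in readable form).
known = {image_hash(b"example-flagged-image")}

print(matches_known_database(b"example-flagged-image", known))   # True
print(matches_known_database(b"ordinary-vacation-photo", known)) # False
```

The design point worth noting is that only hashes are compared, never the photos themselves; the privacy controversy centers not on that mechanism but on who controls the database of hashes being matched against.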
Curbing the spread of CSAM is an obvious good. So why were people in the security community so worried about Apple’s plans? One of the main concerns was summed up well by EFF’s Eva Galperin:
Apple has now built this capability to do on-device scanning. And this opens them up to demands from governments to use it for other things; for things other than photos intended to be uploaded to iCloud. Having the system at all — under any circumstances — is very dangerous … Many countries already have laws on the books requiring the kind of scanning that Apple’s new system allows Apple to do. Apple has resisted (up until now) by telling governments that it’s simply not capable of doing what they ask. But that’s no longer true.
Fans of Apple’s CSAM scanning tech dismissed such concerns as overblown and hypothetical. And some have been rather derisive about it: One NCMEC executive famously referred to Apple’s critics as “the screeching voices of the minority”.
However, it’s starting to look as though the “screeching voices” were right all along.
We hate to say “I told you so”, but…
As 9to5Mac reports in a recent article:
Governments were already discussing how to misuse CSAM scanning technology even before Apple announced its plans, say security researchers.
9to5Mac cites a piece published in The New York Times that reads:
More than a dozen prominent cybersecurity experts [last week] criticized plans by Apple and the European Union to monitor people’s phones for illicit material, calling the efforts ineffective and dangerous strategies that would embolden government surveillance […]
The cybersecurity researchers said they had begun their study before Apple’s announcement. Documents released by the European Union and a meeting with E.U. officials last year led them to believe that the bloc’s governing body wanted a similar program that would scan not only for images of child sexual abuse but also for signs of organized crime and indications of terrorist ties.
Is Apple playing with fire?
You may be able to see where this is going. Apple says that its CSAM scanning tech will only be used to scan for CSAM. But governments in the EU and elsewhere already want to use this kind of technology to fight other types of crime. So what happens when a government decides that it wants to scan for anti-government memes, or blasphemy, or outlawed LGBTQ+ content? Will Apple become, in Galperin’s words, “repression’s little helper”?
Apple says that it will refuse to comply with such requests. All well and good, but the problem is that Apple is already on record as saying that it has to comply with local laws. The company already changes its software offerings based on what local governments require. Just recently, on Checklist 248, we discussed how Apple was forced to pull an opposition politician’s voting app from the App Store in Russia — and how the company won’t be releasing its Private Relay feature there either, due to local restrictions on VPN-like tech.
Apple has delayed the rollout of its CSAM scanning feature based on feedback from users and privacy advocates. If the company does go ahead with it in the end, we can only hope that it’s ready for the pressure that will follow.
All your notes are belong to us
One of the coolest new features in iOS 15 is Live Text. It lets you take a picture of words, and then converts those words to text that you can paste into other documents.
It’s meant as a convenience: as a quick way to copy and call a phone number that you see on a business’s sign, or to copy a time from a marquee into your calendar. But students are now using the feature to copy their classmates’ notes — directly from their laptop screens and without permission!
Now, some might call that “using Live Text to cheat”. Alternatively, you could think of it as “creating an ad hoc study buddy”. But whatever the ethics of it, educators are concerned that it could impact learning. As a Newsweek article notes:
Within education research, there exist two suggested benefits of taking notes while learning: external-storage hypothesis, which is looking back on them after, and encoding hypothesis, which is the idea that actively taking notes actually helps you comprehend and retain the information better.
The Newsweek piece also quoted a student who pointed out why one group will definitely not be fans of the new feature: “At my university people were selling their notes for money … sorry to see their business model destroyed.”
For more security tips and commentary, have a look at The Checklist archives. To ask us a question about Apple security — or about digital security and privacy in general — just send us an email!