Checklist 161: The Fixes Are In (Again)
This week on the Checklist, we’ll catch you up on what Apple has been doing to keep you safe. We’ll talk about some adware apps that got the boot from the App Store. And finally, we’ll look at the cybersecurity implications of Catholicism 2.0.
This week’s Checklist covers:
- A look at the security bits of this week’s Apple updates
- Apps kicked out of the App Store for adware violations
- Do rosary beads really need an app? Really?
To opt in, or to opt out, that is the question…
This week, Apple released a bevy of updates for its OSes.
Some of these were UX or performance improvements (like the enhancements to the Camera app on the iPhone 11), but a number of them bear on security and privacy, and they demonstrate that Apple is taking outside criticism seriously.
If this first issue sounds familiar, it’s because we discussed it on the Checklist back in August. As you may remember, Siri was sending recordings of user interactions to third-party contractors tasked with improving the virtual assistant’s performance.
Unfortunately, many of these recordings were the result of accidental activations—and thus captured all sorts of extremely personal moments, including confidential discussions with physicians, users having sex, and even drug deals!
Not a great look for the privacy-focused Apple, and even worse was the fact that the story only went public after a whistleblower came forward to let people know this was happening.
Apple faced some fairly harsh criticism over this, but they took their lumps and tried to do the right thing: They apologized, suspended the practice of sending recordings to QA teams, and promised users that in the future they’d be able to opt out of having their interactions with Siri sent to Apple’s development group.
As promised, that opt-out ability is now here. In the newly released iOS 13.2, you can opt out during installation, or later by adjusting your settings if you decide you no longer want to share Siri recordings with Apple. To do this, go to Settings, then to Privacy. There you’ll see a menu item called Analytics & Improvements, where you can switch off Improve Siri & Dictation.
But this raises a question: Should you automatically opt out?
Remember that Siri wasn’t sending these voice recordings to third parties in order to build a marketing profile or serve you more relevant ads. The whole purpose of the program was to improve Siri’s performance. Developers were trying to look at cases in which Siri failed to understand a question or provide an appropriate response, figure out what was going on, and then make improvements.
And those improvements are genuinely important, because Siri functions as much more than a productivity tool: Siri is an accessibility and safety feature as well. Without a wide range of samples to work with, Apple developers will find it harder to troubleshoot Siri issues, which may mean slower improvements and longer timelines for bug fixes.
So while opting in or opting out is obviously a completely personal choice, it’s worth bearing in mind that there may be good reasons to share your Siri interactions with quality control and development teams at Apple.
Moving on to a second major security update, iOS will now support HomeKit-enabled routers (another thing we were talking about on a recent Checklist). HomeKit-enabled routers will give users more granular control over their smart devices’ ability to communicate over networks.
This is significant because IoT devices are famously insecure. Being able to stop some or all of them from communicating with devices outside the home network reduces the number of possible targets available to malicious actors. It’s unclear when the actual router hardware will come out, but this is definitely a step in the right direction for IoT security.
Lastly, Apple found (and fixed) a vulnerability in their Books app that could potentially disclose personally identifiable information to bad actors under the right conditions.
Apple stores some of your personal information as metadata each time you buy a book—mainly your email address and name. They do this as a way of digitally watermarking files such that they can be linked to a single person. This is a way to prevent copyright infringement: If you’re distributing your copy of a book all over the web, Apple will be able to tie it back to you.
Unfortunately, Apple found that there was an issue in their app that could allow a bad actor to serve up a malicious book file that would give them access to that metadata—and thus to a user’s personal information.
While this isn’t a particularly dangerous bug, it’s definitely a good reminder that vulnerabilities can crop up just about anywhere…even in places where you wouldn’t really expect them.
This is why it’s always a good idea to update your OS whenever you have the chance, and ideally to set up automatic updates so you don’t forget.
You can’t do that
Apple tossed 17 apps out of the App Store recently, after third-party security researchers found that a development group in India had been creating apps which contained Trojan clicker malware. These apps were able to surreptitiously open ad-containing links in the background and then generate fake clicks on those ads.
So why would anyone want to do this? The answer is pretty simple: money.
Online ad revenue is often paid out on a per-click basis. If you can deploy millions of copies of ad clicker malware to click on whichever ads you want, you can artificially inflate the number of ad clicks you get credit for and thus “earn” more ad revenue. The companies affected are none the wiser, and the victims of your malware probably don’t even realize they’ve been infected (though they may wonder why their battery seems to be draining so quickly).
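To get a feel for the incentive, here’s a quick back-of-the-envelope sketch in Swift. Every figure in it is invented purely for illustration; we don’t know the real numbers behind this particular campaign.

```swift
// Hypothetical math only -- the device count, click rate, and payout
// below are invented for illustration, not real campaign figures.
let infectedDevices = 1_000_000        // copies of the clicker in the wild
let fakeClicksPerDeviceDaily = 20      // quiet background clicks per device
let payoutPerClick = 0.01              // dollars earned per "click"

let dailyTake = Double(infectedDevices) * Double(fakeClicksPerDeviceDaily) * payoutPerClick
print("Fraudulent ad revenue per day: $\(Int(dailyTake))")   // $200000
```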
But is this malware dangerous? For iOS users, probably not (though again, since these clicker Trojans are software, they take up system resources when they run, which can impact performance). You may wonder how malware can be considered “not too dangerous” if it can perform actions on your system without your knowledge. But the fact that apps can do things in the background isn’t necessarily a bad thing. You may, for example, want certain types of music or podcast apps to download things for you in the background for future use. And generally speaking, on iOS anyway, apps have limited permissions. So while an app may be able to open links in the background, it won’t be able to access your contacts or read your files.
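To make that sandboxing concrete, here’s a minimal Swift sketch, assuming an ordinary iOS app that has declared an NSContactsUsageDescription string in its Info.plist: even code running quietly in the background has to go through an explicit permission prompt before it can touch your contacts.

```swift
import Contacts

// Reading the contacts database requires the user's explicit consent;
// without it, iOS simply refuses and the data stays off-limits.
func tryToReadContacts() {
    let store = CNContactStore()
    store.requestAccess(for: .contacts) { granted, error in
        guard granted else {
            // The user never tapped Allow (or tapped Don't Allow):
            // the app gets an error instead of your address book.
            print("Contacts access denied: \(error?.localizedDescription ?? "not authorized")")
            return
        }
        print("Contacts access granted -- and only because the user agreed.")
    }
}
```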
Nevertheless, these apps definitely slipped through the App Store review process, and so hopefully Apple will take the opportunity to continue to improve its auditing of developers and the apps they submit. The good news is that their review protocols are improving, as is the automation used to handle the huge volume of apps submitted for review each day.
But until Apple perfects its review procedures, everyday iOS users should stay vigilant.
First of all, when browsing the App Store, look for apps that seem to have good reviews and come from trusted developers.
Secondly, keep an eye out for signs of a malware infection on your phone. Are strange files appearing out of nowhere? Are there telltale signs that your phone is constantly doing something in the background, for example running hot or losing battery charge quickly? Have you noticed charges on your credit card that you can’t explain? All of these can be indicators of malicious activity on your device.
Lastly, take those permission requests seriously! If an app asks for access to something, stop for a moment to think about it before saying yes. A good litmus test is whether the app is asking for something that makes sense. For example, an app that reads and translates Chinese-language menus would probably need camera access in order to function. But if a chess app is asking for your contacts list, that should strike you as a bit fishy. If something feels “off” about an app’s request, just say no.
Deliver us from e-vil
Here on the Checklist, we’ve covered plenty of “smart things” that probably didn’t ever need to exist, from smart freezers to Bluetooth-enabled self-lacing sneakers (yes, really).
In a way, this is all somewhat predictable. IoT is “the next big thing”. Companies naturally want to cash in on the trend by producing networked versions of just about everything—often in ways that defy logic, prudence, and good taste.
We thought that we’d seen it all: that at this point, nothing could surprise us.
Gentle reader, we were wrong.
The Roman Catholic Church recently released the eRosary, an IoT smart device to help the faithful pray the rosary.
For those of us who aren’t Catholic, the rosary is a traditional Catholic devotion associated with Mary. It’s a long series of prayers—long enough that people who pray the rosary will often keep count on a string of beads.
The eRosary is a “smart” version of traditional rosary beads: a wearable device that can be connected to another device through the eRosary companion app.
The eRosary was released as part of a wider church initiative to engage younger Catholics and involve them in a movement to pray for peace in our world. Laudable goals, but the Vatican soon discovered what so many businesses have found: IoT security is devilishly difficult.
The app gave users the option to sign in with their Facebook or Google accounts. So far so good. But for folks who wanted to log in directly with an email account, the developers decided to handle authentication by sending a 4-digit PIN to the user’s email address. Unfortunately, the PIN was then transmitted, unencrypted, in the API response. This meant that anyone with the ability to monitor network traffic could see the PIN and use it to access the associated account themselves. The researchers who discovered the vulnerability were able to break into an eRosary account with relative ease, and from there see all sorts of personal details and contact information.
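To make the flaw concrete, here’s a hypothetical Swift sketch of the anti-pattern. The type names are invented; this is not the actual eRosary API. The problem is a login response that carries the very secret it’s supposed to protect:

```swift
import Foundation

// Hypothetical illustration -- invented names, not the real eRosary API.
// Anti-pattern: the login response carries the one-time PIN itself,
// so anyone who can read that response can take over the account.
struct LeakyLoginResponse: Codable {
    let email: String
    let pin: String        // the secret should never travel back this way
}

// Safer shape: the PIN goes only to the user's inbox, and the response
// merely confirms that it was sent.
struct SaferLoginResponse: Codable {
    let email: String
    let pinSent: Bool
}
```

And either way, a one-time PIN should only ever travel over an encrypted connection.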
Fortunately, those researchers were able to contact the developers behind the eRosary, and a fix was released almost immediately. Credit where credit is due: Large organizations often take far longer to address concerns raised by third-party researchers, so kudos to the church’s development team, at least in terms of their response.
While no one was likely in any real danger from this vulnerability, the story serves as yet another reminder that connected devices are very hard to secure. Organizations of all sizes have released “smart things” with exploitable vulnerabilities, and as companies continue to chase the IoT gold rush, often bringing new products to market with little concern for security, you can expect this to continue.
So remember to protect yourself by following best practices for smart device security. And above all, ask yourself whether a smart version of whatever they’re trying to offer you is really, truly necessary. After all, dumb shoelaces worked pretty well for Michael Jordan…and humble wooden rosary beads were good enough for the saints.