Checklist 133: Hello, Turkmenistan!
This week, we’ve got many interesting stories on our hands, ranging from yet another example of someone abusing Apple’s enterprise developer program, to Congress rumbling about holding companies responsible for data problems, to a new development from Apple in terms of how it will handle certifying apps and developers. Those stories, and all their details, form the backbone of today’s show as we check them off our list:
- More Enterprise Certificate Abuse
- Threats to Put CEOs in the Pokey
- A Layer of Certification for Mac Apps
So let’s stoke the fires of conversation and delve into the nitty-gritty facts of these stories.
So, who is it that’s been caught bypassing good practices and using Enterprise Developer Certificates to side-load rule-breaking apps onto an iPhone? Is it Google again? Maybe Facebook? Not this week. In fact, this time we’ve seen how the program can be abused to drop bona fide malware onto someone’s device. What’s up with that?
More Enterprise Certificate Abuse
Let’s start with the good news and the answer to the question you’re most likely to have: no, this isn’t malware that’s easy to pick up, or even that you’re likely to encounter personally. In fact, we’d say it would be pretty hard for most people to get. However, those who did fall prey to its creator’s nefarious plot have certainly been taken for a ride. This story comes to us from TechCrunch.
The app, left unnamed by the researchers who discovered it, was originally developed to run only on Android phones; however, the developer seems to have figured out a way to target iPhone users, too. So, what is the app? TechCrunch says it features a design intended to disguise it as a “carrier assistance app,” the sort of thing your cell phone service provider might have you download (from the official App Store, we might add) to run diagnostics on your cellular performance or to troubleshoot problems you’re having with the service. This app, though it looked the part, did none of those things.
Instead, it would steal your information, and not just a little bit, but as much as it could grab: your phone book, all your pictures and videos, any voice memos or audio recordings you’ve saved, and even the phone’s current location. Lookout, the security firm which discovered the app, says that the malware was available for download from fake sites targeting cell users in Turkmenistan and Italy.
In other words, not the kind of software the average Checklist listener is likely to encounter. However, users in those countries do need to be wary: Lookout says it can connect this app to another, related Android app which bears the signature of a company called Connexxa. Connexxa makes surveillance software and works with the Italian police and government.
Okay, so that’s what it is, but here’s the crux of the story: on iOS, this isn’t something a user could just download from the App Store. Instead, TechCrunch says Connexxa used its own internal Enterprise Developer certificate to sign the app for distribution. The company could then walk iOS users, unaware of the app’s more sinister purpose, through the process of installing the certificate and enabling the app. Just as when Facebook, Google, and adult content distributors did the same thing, it’s against the rules. If you encounter an app on the web encouraging you to follow some special method for installation, don’t do it!
That doesn’t mean there aren’t legitimate reasons to install an app this way, though. Despite all the recent abuse, Enterprise Developer Certificates were, and still are, used for wholly kosher purposes. Some of the most common scenarios include developers testing their own apps, companies distributing internal apps to their employees, and quality assurance or beta testing; your coder friend might ask you for help squashing some bugs in his latest game before he submits it to the App Store. These are all above-board reasons to use this process, but most people aren’t going to encounter these scenarios daily.
While jailbreaking is like leaving the front door unlocked, profile installation is similar to deciding to let someone in after they knock on your door. In other words, be careful about who you let in, especially when you don’t know them.
Since we’ve discussed this topic a few times in recent weeks, listeners may be wondering how they can tell whether they’ve been affected by an app that required a new Enterprise Developer profile installation. Here’s what you need to do to check if you have any such certificates active:
1. Open your Settings app.
2. Go to General.
3. Go to Profiles.
If you don’t have anything listed there, you’re all good. If you do have something there, review them and make sure that you want the apps associated with them on your phone. Tap on the entries for more details to help you make an assessment. If you see something you don’t like, tap the Remove button, and it will be gone — which will also disable the app linked to the certificate. That’s it!
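For the more technically inclined, there’s another way to check from a Mac: if you can get hold of an app’s .ipa file (really just a zip archive around the .app bundle), you can peek at its embedded provisioning profile. The Swift sketch below is purely illustrative; `isEnterpriseDistributed` is our own helper, not an Apple API, and it assumes the convention that enterprise-distributed profiles carry a `ProvisionsAllDevices` key set to true.

```swift
import Foundation

// Illustrative only: `isEnterpriseDistributed` is our own helper, not an
// Apple API. It relies on enterprise (in-house) provisioning profiles
// setting the `ProvisionsAllDevices` key to true.
func isEnterpriseDistributed(appBundlePath: String) -> Bool? {
    let profilePath = appBundlePath + "/embedded.mobileprovision"
    guard let raw = FileManager.default.contents(atPath: profilePath) else {
        return nil  // no embedded profile at all
    }
    // The profile is an XML plist wrapped in a CMS signature; a crude but
    // common trick is to slice out the "<?xml" ... "</plist>" span.
    guard let start = raw.range(of: Data("<?xml".utf8)),
          let end = raw.range(of: Data("</plist>".utf8)) else { return nil }
    let plistData = raw.subdata(in: start.lowerBound..<end.upperBound)
    guard let plist = try? PropertyListSerialization.propertyList(
            from: plistData, options: [], format: nil),
          let dict = plist as? [String: Any] else { return nil }
    return (dict["ProvisionsAllDevices"] as? Bool) ?? false
}

// Usage: point it at the .app bundle unpacked from an .ipa.
switch isEnterpriseDistributed(appBundlePath: "/tmp/Payload/SomeApp.app") {
case .some(true):  print("signed for enterprise distribution; be careful!")
case .some(false): print("not an enterprise-distributed build")
case .none:        print("no readable embedded profile")
}
```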
Threats to Put CEOs in the Pokey
Back to Capitol Hill again this week.
Do you ever get tired of hearing what seems to be the same story, month after month? Equifax loses sensitive personal data on tens of millions of people, and they say they’re sorry. Facebook leaks personal info for tens of millions of people and overlooks data harvesting and spying on its users, and they say they’re sorry. Earl Enterprises, Marriott, and the list goes on and on — we get a nicely-worded apology and reminders about best practices, and it’s status quo after that. Would the CEOs and upper-level executives of these companies feel like doing more than saying sorry after the fact if they actually stood to lose something from the mistake? Some jail time, perhaps?
According to a piece from Apple Insider, that’s precisely what one bill recently introduced in the Senate aims to accomplish. Called the Corporate Executive Accountability Act, the bill would hold executives at big businesses criminally liable for severe data breaches and leaks. The proposal would only affect companies making more than $1 billion per year, and only data breaches that impacted at least 1% of a given population (of a particular state or, as would have been the case in the Equifax breach, of the nation). For scale, 1% of the United States is a bit over three million people; the Equifax breach exposed data on roughly 147 million Americans.
Those requirements mean that if your local deli gets hacked, its owners aren’t going to face criminal charges as a result. Big restaurant and hotel chains, though, along with tech companies and social networks like Facebook, would be on the hook. The consequences for a first offense would include a fine or up to a year in prison; the penalties grow harsher after that.
It’s a long shot for more reasons than one, but perhaps the most important attribute of this bill is its potential to push the conversation further along. When people hear that someone lost their data, there’s really no sense of what that means long term, especially since most people never feel the effects of such breaches directly. If a child told a parent they’d lost the car, the reaction would be justifiable outrage; yet we don’t react the same way about our data, which is extremely valuable in the hands of Big Data traders and businesses. Bills like this one emphasize the importance of our information, and will hopefully move us closer to a society that values bits and bytes as much as what’s physically in front of us.
There’s no real chance the bill moves forward, though. Introduced into the Republican-controlled Senate by Democratic Senator and presidential hopeful Elizabeth Warren, it seems unlikely to receive serious consideration, let alone a vote. Warren may know it’s not going anywhere, but as these issues are a centerpiece of her campaign, it makes sense to put the conversation out there. Though it’s political theater for now, in the future we do need to move toward stronger penalties that give corporations pause before they cavalierly spew our data everywhere.
In light of all this, though, we do need to remember that there is still and always will be an element of personal responsibility to online security and privacy. If Facebook loses your password and someone else gets a hold of it, that’s on Facebook. If that password is one you use for dozens of websites including your bank, well — those consequences are at least partly on the user.
That’s why sticking to best practices is still the best policy: use strong passwords, never re-use them, use a password manager, and turn on two-factor authentication everywhere it’s offered. Additionally, keep your software up to date and use a malware scanner. No matter what happens out there in the world, these steps will still help keep you safe.
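To put the “strong passwords” advice in concrete terms, here’s a minimal Swift sketch of how a password manager might generate one, drawing on the system’s cryptographic random source. The `randomPassword` helper and its character set are our own choices for illustration, not any particular product’s method:

```swift
import Foundation
import Security

// A quick sketch of what "strong password" means in practice: 20 characters
// drawn from the system's cryptographic random source (SecRandomCopyBytes),
// much like a password manager would generate.
func randomPassword(length: Int = 20) -> String? {
    let alphabet = Array("ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnpqrstuvwxyz23456789!@#$%^&*")
    var bytes = [UInt8](repeating: 0, count: length)
    guard SecRandomCopyBytes(kSecRandomDefault, bytes.count, &bytes) == errSecSuccess else {
        return nil  // random source unavailable
    }
    // Map each random byte onto the alphabet (slight modulo bias; fine for a sketch).
    return String(bytes.map { alphabet[Int($0) % alphabet.count] })
}

print(randomPassword() ?? "could not generate a password")
```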
A Layer of Certification for Mac Apps
Apple is renewing its push toward a new security requirement for Mac apps, and so far, the reaction is a little mixed. Apple Insider brings us this story, which comes in the wake of an advisory Apple issued to all developers about the requirements it will place on apps to function securely within the macOS ecosystem. According to the report, new apps developed with a new Developer ID (that is, when both the account and the app are brand new to the process) must receive Apple’s “notarization” approval; otherwise, Gatekeeper will not let them run. The change goes into effect with macOS 10.14.5 and will later affect all versions of the operating system.
What is notarization? If you don’t remember, we don’t blame you: it was introduced at WWDC 2018, and we haven’t talked much about it since then. Notarization is a process Apple created for developers to use when offering their software for download outside of the Mac App Store. A “notarized” app has been reviewed by Apple in a more in-depth manner and shown not to contain malware. The idea is that a user who sees a notarized app knows Apple has granted it a seal of safety.
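If you’re curious whether Gatekeeper will accept a given app on your own Mac, Apple’s built-in `spctl` command-line tool reports its verdict (including a source line such as “Notarized Developer ID”). Here’s a minimal Swift sketch that simply shells out to it; the `gatekeeperVerdict` wrapper is our own illustration, not an Apple API:

```swift
import Foundation

// A minimal sketch: we shell out to Apple's real `spctl` tool, which reports
// whether Gatekeeper would accept an app and why.
func gatekeeperVerdict(forAppAt path: String) -> String {
    let spctl = Process()
    spctl.executableURL = URL(fileURLWithPath: "/usr/sbin/spctl")
    spctl.arguments = ["--assess", "--verbose", path]
    let output = Pipe()
    spctl.standardOutput = output
    spctl.standardError = output  // spctl prints its verdict on stderr
    do {
        try spctl.run()
        spctl.waitUntilExit()
        let data = output.fileHandleForReading.readDataToEndOfFile()
        return String(data: data, encoding: .utf8) ?? "(unreadable output)"
    } catch {
        return "failed to launch spctl: \(error)"
    }
}

print(gatekeeperVerdict(forAppAt: "/Applications/Safari.app"))
```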
There’s more to it, though: if a notarized app does do untoward things or becomes compromised (think of the Handbrake fiasco from a few years back), Apple can revoke the certificate and stop the app from functioning. So, what’s the change? This process is currently voluntary. New developers and those who register new Developer IDs in the future will face mandatory notarization, ostensibly to cut down on the number of malicious apps that use brand-new IDs (as they only cost $99).
While this change has a small effect right now, Apple says it eventually intends to require notarization for all developers who choose not to submit their apps to the Mac App Store. There are plenty of legitimate reasons for that: some developers use techniques that simply aren’t allowed under App Store rules, such as accessing low-level system functions.
Plenty of legitimate paid programs do these things for perfectly normal purposes, but they would not be able to meet the App Store’s more rigorous requirements. Other developers simply don’t want to be part of the Apple corporate ecosystem and prefer to make their money selling software independently. That’s why, though the notarization change may be positive for users, it’s causing consternation elsewhere: it appears Apple is expanding the walls of its walled garden and bricking it up around those who voluntarily chose to play by the rules, just outside of the system.
Some developers have concerns about what changes Apple might make to the program in the future. What if they decide certain tools and techniques common today aren’t kosher anymore later? Could they arbitrarily decide to revoke notarized apps that still use those techniques, or refuse to grant the notarization at all? These are some of the troubling questions developers have, but for now, Apple plows onward.
Will this be good for users? That we couldn’t say for certain right now; we’ll have to wait and see. Nonetheless, the developments are worth noting and watching for the future as Apple continues to push its reputation for privacy and security even harder in advertising.