Checklist 103: Always Look a Gift Horse in the Mouth
Last week, we spent most of our discussion shaking our heads. This week, we’ve got an update on one of those stories, a look at a frankly quite scary USB device that doubles as malware, and a look at just how much the government doesn’t like apps that let you encrypt all your conversations. As we head into the “dog days of summer,” the security threats and the big stories keep adding up — but maybe we should be thankful we don’t have a multi-million-person data leak to discuss right now! So, these are the items we’ll tick off today’s list:
- The tale of the USBHarpoon
- The endless battle against end-to-end encryption
- Suing the search giant that is Google
If you needed a quick way to charge your phone and all you had on hand was your laptop, would you think twice about using a USB power cable offered up by a stranger? For many of us, the answer is probably “no” — but the concept behind the USBHarpoon challenges us to rethink that notion. What’s the story here?
The tale of the USBHarpoon
It’s no surprise that USB devices can pose a security risk at times. We’ve all heard the same advice for years: don’t plug mysterious USB thumb drives into your computer. It’s an excellent way to give malware a free pass to get onto your machine, and it’s why you should always be careful about accepting such devices from dubious or unclear sources. The adage says “don’t look a gift horse in the mouth,” but we’re starting to think that saying ought to be retired. The “free” things today are rarely free — we watch ads to see videos, we hand over data on our habits for free web searches, and so on. Now, new research has pushed the world of malicious USB devices even further.
According to Bleeping Computer, several researchers have built on a previous project called BadUSB. With BadUSB, an attacker could reprogram the control chip that external hard drives and flash drives use to interface over USB. Once reprogrammed, the drive would act like an "HID," or "human interface device." In other words, it puts on a mask and pretends to be something like a keyboard or a network card. Once the system recognizes it as such, it's a trivial matter for the device to begin rapidly executing commands.
The latest research project, dubbed USBHarpoon, takes the same concept but moves it away from hard drives and flash drives. After all, many of us are now conditioned to avoid those unless we purchase and use them ourselves. Instead, the researchers wanted to know if it would be possible for bad guys to circumvent this social training. The result: a malicious USB charging cable. All you need to do is plug it into your laptop, thinking you're about to charge it or your phone, and in seconds it could take control of the machine.
How? The key is the set of tiny metal pins USB uses to pass both data and power back and forth between devices. By modifying these connections, the researchers were able to sneak data into the target machine through channels it would not normally monitor. They even demonstrated that the same method of attack works on so-called "data blocker" devices, which sit between the computer and the USB device and are ostensibly meant to pass power while blocking all data signals. By tampering with the blocker itself, the researchers could turn it into a malicious tool as well.
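To give a rough sense of why the "disguise" matters: a charge-only cable shouldn't show up to the operating system as a keyboard at all. The sketch below is our own illustration, not part of the USBHarpoon research, and it assumes the third-party pyusb package (with a libusb backend) is installed. It simply lists connected USB devices and flags any that advertise a HID-class interface, the same class a BadUSB-style device impersonates.

```python
# Minimal sketch: flag USB devices that advertise a HID interface
# (interface class 0x03) -- the "keyboard" disguise a BadUSB-style device uses.
# Assumes the third-party pyusb package and a libusb backend are installed.
import usb.core

HID_CLASS = 0x03  # USB interface class for Human Interface Devices

for dev in usb.core.find(find_all=True):
    try:
        # Walk each configuration's interfaces and collect their classes.
        classes = {intf.bInterfaceClass for cfg in dev for intf in cfg}
    except usb.core.USBError:
        continue  # descriptors may be unreadable without sufficient permissions
    if HID_CLASS in classes:
        print(f"HID-capable device: vendor=0x{dev.idVendor:04x}, "
              f"product=0x{dev.idProduct:04x}")
```

A genuine charge-only cable has no data lines and wouldn't enumerate as a device at all, which is why a "cable" that suddenly shows up as a keyboard is the tell-tale sign here.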
The good news, for now, is that the attack isn't perfect. Though a USBHarpoon cable can already dump a malicious payload onto the target machine and let it run wild, it isn't subtle. Users can still see what's going on, and if you suddenly noticed a whole bunch of weird things happening on screen, it would be easy to figure out that something was amiss. For now, the researchers continue to look for ways to hide the activity, such as by adding a time delay to the payload.
Whatever the details, this is clearly a pretty concerning development, and it's not exactly a one-off, either. The Bleeping Computer piece mentions that when one researcher couldn't get his hands on a working example of the hardware, he challenged others to build the same device, and they succeeded, too. It's a good bet that at some point in the future, we'll need to pay attention to the cables we use as well.
Does that mean you should be wary of buying any non-standard cables? Probably not yet. If you're purchasing a USB charging cable in the store, or an official version from Apple or whoever your device maker happens to be, the risk level is probably relatively low. What you do need to be aware of are the freebies: the swag bags you might get at a conference or the free charger someone might offer you at a convention. These are the real sources of risk, because there's no true way of knowing where those things came from or what secrets they might be hiding.
Considering this topic made us remember another time when we’ve discussed the perils and pitfalls of inexpensive but highly insecure technologies — the Internet of Things. We’ve discussed plenty of issues with cheap Internet-connected devices exposing information and dumping malware in the past. If you’d like to revisit the conversation we had on that topic, you can head back to Checklist 42 and refresh your memory.
The endless battle against end-to-end encryption
When it comes to communicating securely today, there's no better technology to rely on than end-to-end encryption. In these setups, the entire conversation is put under digital lock and key from the start. In a proper implementation, there's no chance for a third party to intercept and read the messages exchanged over an E2E system. In fact, in the past, we've discussed some of the options Apple users have in this department. You can find that conversation in the archives under Checklist 29: Encrypted Chat Apps for Secure Messaging on iOS. Naturally, the secure nature of these communications poses a problem for both the government and law enforcement. As they point out, these technologies make their jobs harder to do, but as we've said before, the dangers of back doors are something we can't compromise on. So, the demands and legal battles continue.
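If you're curious what "end-to-end" looks like in practice, here's a toy sketch of the idea. This is our own illustration, not Facebook's or Apple's actual protocol, and it assumes the third-party PyNaCl library is installed. The point it makes is simple: each endpoint keeps its own private key, so anything relaying the message in the middle only ever sees ciphertext.

```python
# Toy end-to-end encryption sketch using the PyNaCl library.
# Real systems, such as those built on the Signal protocol, layer on
# key verification, forward secrecy, and much more.
from nacl.public import PrivateKey, Box

# Each endpoint generates its own keypair; only the PUBLIC keys are exchanged.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob with her private key and Bob's public key.
sender_box = Box(alice_key, bob_key.public_key)
ciphertext = sender_box.encrypt(b"meet at noon")

# Any server relaying `ciphertext` cannot read it; only Bob's private key can.
receiver_box = Box(bob_key, alice_key.public_key)
print(receiver_box.decrypt(ciphertext))  # b'meet at noon'
```

Breaking that guarantee for one investigation means changing the system for every user, which is exactly the fight described below.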
Reuters reported that in the latest salvo in federal court, the Department of Justice has demanded that Facebook "disable or break" the E2E technology its popular Messenger app uses so that law enforcement officials can break up suspected gang activity. Naturally, if Facebook were to do that for one user or a series of users, it would mean endangering the security and privacy of every single user on the platform. For that reason, Zuckerberg's social media juggernaut has so far politely declined the DOJ's requests and continues to argue against the demands in court. Facebook points out that it would either need to hack the government's target itself or rewrite the entire system from the ground up; neither is a pleasing option, to say nothing of the dangers of back doors.
Apple Insider points out the potentially far-reaching effects a decision in the DOJ's favor could have. For starters, the government and the courts would likely view it as establishing precedent. With the bar set for when the government can demand the breaking of encryption, it could then turn around and demand the same of other companies. Both iMessage and FaceTime use end-to-end encryption to keep your communications secure, and that would give the government an inroad into finally forcing Apple to open up user data to investigations.
While law enforcement truly has a difficult burden to bear, the simple reality is that there is no such thing as a back door that "only the good guys" can use. The mere existence of one, even if it is a secret, endangers users, and you can bet that the bad guys are always looking for a secret way to penetrate popular apps and services. If the bad guys ever got their hands on all our encryption keys, we can only imagine the kind of chaos that would create! What will be the ultimate decision in this case? For now, there's no way to know. Unfortunately, we can't keep a close watch on the proceedings either; the judge has sealed the case, so everything we know so far comes from anonymous sources. As soon as we hear more, we'll bring you an update.
Suing the search giant that is Google
For our final story today, we're jumping back to a story we brought you just last week. We encourage you to go back and check out that episode for the full details, but here's a quick recap: researchers discovered that Google was still tracking users' locations even when they thought they'd disabled the relevant setting. Even with "Location History" turned off on your phone, Google was using several other methods and pieces of data to figure out exactly where you were. In other words, the setting didn't really do what it said. That made some people very unhappy.
Unhappy enough to sue, as it turns out. According to Ars Technica, a man in California has initiated legal action against the search giant. The lawsuit claims that Google's continued tracking after user opt-outs violates a California invasion-of-privacy law as well as the state's own right to privacy. He isn't going it alone, either; in fact, the suit seeks class-action status on behalf of both iPhone and Android users.
The legal woes for Google don’t stop there. Lawyers with the Electronic Privacy Information Center have asked the Federal Trade Commission to open its own investigation, too. Why? Google operates under a “consent decree” from the FTC, issued in 2011, in which the Mountain View giant agreed to be clear and upfront about how it collected and used user data. That included a prohibition against “misrepresenting” their collection efforts — something which this tracking snafu certainly resembles.
Looking at these developments, we say "Good!" While it's true that frivolous lawsuits make headlines all too often in America, this is one lawsuit we don't hate. Sometimes, public pressure isn't enough to get the ball rolling on making changes, but legal pressure can do a world of wonders, especially when the negative publicity and a potential judgment threaten the bottom line.
Should you switch search providers? We considered this question last week, and generally, the answer hasn't changed. However, there are alternatives out there, such as DuckDuckGo, which provide more confidence about how your data is handled. While it can be tricky to "reshape your brain" to work without Google, it can be done, and as we've seen, it might be a good idea anyway. At the very least, if Google notices a downward tick in usage, that may spur the company to make changes. After all, some of the big "evil giants" of previous tech eras have since reformed themselves to a degree; even Microsoft plays nicer with the community and other developers today than it ever did in the past. Could Google do the same? We hope so.
For now, though, we don’t know whether this will spur any real changes in the way Google informs users about the data it collects. We can hope and dream, though, and who knows? Maybe we’ll be bringing you another update on this story next week, next month, or maybe even next year.