
Unicorn Wrangling 101: What is a Backdoor?

What constitutes a backdoor in software, firmware, or even hardware? This question nagged at us during a recent project that Duo Labs worked on.

We managed to get ahold of three Android-based phones that were for sale in China only, so naturally we figured we’d:

  1. Look for some government backdoors that some government put in;
  2. Perform some actions that involved informing the Internet about our findings and how awesome we were; and
  3. Argue about how to spend the Nobel Peace Prize money.

So what did we find? A ton, but in reality, a lot of nothing. You see, it all depends on what your definition of a backdoor is.

What Exactly is a Backdoor?

This is actually a hard question to answer. We’ll start by breaking things down into several categories, and yes this is an incomplete list.

There is an obviously bolted-on piece of code whose sole purpose is to provide some type of access (remote or otherwise) to an attacker. This is your traditional backdoor: it could come in the form of an extra program or app, installed to give a bad guy a foothold on the system. This usually means real-time remote access - many traditional rootkits fit into this category - but it could also grant special access if the bad guy is holding the device in their hands. Of course, the more obvious the backdoor, the easier it is to spot, and the more likely forensics could trace back and identify the attacker.

There is a backdoor, it isn’t supposed to be a backdoor, but when you see it, well, it is a backdoor. Is this vague enough? Sometimes the backdoor is rather obvious, particularly if the word “backdoor” is in the secret password, even if it is written backwards. Other times, the vendor adds a backdoor for “remote maintenance” or states that it was used during development and accidentally left in (which is often the truth, people do make both coding and design mistakes).

It is rather entertaining to go Googling for articles about backdoors in commercial products - virtually all of the vendors with a disclosed backdoor in their products release some type of statement on their website, and then delete that statement after a few weeks. The whole backdoor issue makes a vendor look really bad, so it’s not surprising they remove it from the website as soon as they think they can get away with it. This issue has a long history, with operating systems shipping with default accounts and passwords since nearly the beginning of operating systems. Alas, these types of issues still appear in modern systems.

There is a coding flaw that has been introduced specifically by the attacker to allow that attacker to bypass normal access methods. This is a little more obscure. It could involve simply adding a subroutine that does a badguy check “if (AreYouBadguy == true) then YouGetFullAccessYourTableIsWaiting();” and then the bad guy gets to do whatever is desired. Every once in a while you hear about this type of thing happening.
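The inline pseudocode above, made concrete as a minimal runnable sketch - every name and the magic username here are hypothetical, invented purely for illustration:

```c
#include <string.h>

/* Hypothetical login routine with a planted backdoor check.
 * "maint_7f3a" is an invented magic value - the "AreYouBadguy" test. */
static int check_login(const char *user, const char *pass)
{
    /* planted backdoor: a hardcoded bypass, your table is waiting */
    if (strcmp(user, "maint_7f3a") == 0)
        return 1;                       /* full access, any password */

    /* normal path: real credential verification would go here */
    return strcmp(user, "alice") == 0 && strcmp(pass, "s3cret") == 0;
}
```

A reviewer who greps for suspicious constants might catch this; a reviewer skimming the happy path probably won't.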

For example, there was the infamous equal sign backdoor attempt (which was caught). An attacker compromised a server containing Linux kernel source, and added a simple change that should have contained “current->uid == 0” (basically asking if the current process was root), but instead contained “current->uid = 0” (basically assigning the calling process root level access).
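A minimal sketch of those mechanics - the struct, macro, and function names below are invented stand-ins, not the actual kernel code:

```c
/* Sketch of the "equal sign" backdoor mechanics. WCLONE_WALL stands
 * in for the real __WCLONE|__WALL flag combination; the struct and
 * function names are hypothetical. */

#define WCLONE_WALL 0x3

struct task { int uid; };          /* uid 0 means root */

/* Reads like an error-path comparison, but the single '=' assigns 0
 * to uid. The assignment itself evaluates to 0, so the branch is
 * never taken and nothing visible happens - except the calling
 * process is now root. */
static int backdoored_wait_check(int options, struct task *t)
{
    if ((options == WCLONE_WALL) && (t->uid = 0))
        return -1;                 /* dead code: condition is always false */
    return 0;
}

/* Run the check for a given set of flags and report the resulting uid. */
static int demo_uid_after_check(int options)
{
    struct task t = { 1000 };      /* ordinary unprivileged user */
    backdoored_wait_check(options, &t);
    return t.uid;
}
```

Pass the magic flag combination and the "comparison" silently grants root; pass anything else and the short-circuited `&&` never touches the uid.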

This was in 2003, and people argued and debated about it for days. I guess in modern terms, they “Interneted” about it, but nonetheless, it was a backdoor method to bypass restrictions and gain root access. Again, forensics can potentially allow someone to track down the attacker that installed this bit of code. And while we’ve never seen proof of this in the wild, evil backdoor code could take the form of seemingly legitimate functions: the source contains security checks, but the code is written so that compiler optimization silently removes those checks.
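One well-known way an optimizer can delete a security check is a bounds test that relies on undefined behavior. A hypothetical sketch (function and parameter names invented):

```c
/* Hypothetical length handler. The test (n + 64 < n) can only be
 * true if the signed addition overflows - but signed overflow is
 * undefined behavior in C, so an optimizing compiler is permitted
 * to assume the condition is always false and delete the branch.
 * The source shows a security check; the compiled binary may not. */
static int add_header_len(int n)
{
    if (n + 64 < n)
        return -1;     /* intended overflow rejection - may be compiled away */
    return n + 64;
}
```

A safe version would check against a limit before adding (e.g. `if (n > INT_MAX - 64) return -1;`), which involves no overflow and cannot be optimized out.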

Bugs like this do actually occur, but they are rare - and a security review of source code alone may not catch them. Personally, we feel this would be the most elegant route to go, since the source code looks like the bad guy did not intentionally introduce the flaw to begin with, and this makes proving the evil intent of the coder even harder. Also, for a smartphone where one is working off of a known code base, this might be the proper route to go to really hide that backdoor.

There is a coding flaw that has been introduced accidentally by a legitimate coder that will allow an attacker to bypass normal access methods. This is arguably the best backdoor - a 0day. A flaw exists that the attacker knows about on the target system. The attacker didn’t have to get the flaw introduced into the code specifically, they just pored over the existing code or reverse engineered compiled code and located the flaws. Granted, this is an odd science in itself, and if anyone else finds the 0day, it could end up getting patched, effectively burning the attacker’s “backdoor.” However, there is no evidence that says the attacker put the flaw there, so there is no trail back to the attacker to trace.

A library or subsystem has been included into the overall build process, and it contains a flaw that allows for an attacker to bypass normal access methods. The phone manufacturer puts together their codebase, maybe starting from a fairly secure Android version, but they’ve added on additional programs that introduce flaws through third-party coding dependencies. This tends to happen a lot when people are in a hurry to build something and just make it work. You could easily combine this one with the previous one to a degree as well. Again, the attacker just takes advantage of the introduced flaw.

Normal processes are in place as they are on every phone, however due to configuration changes affecting options or timing, an attacker can bypass normal access methods. This is somewhat strange, but it happens. Let’s say it’s fine to run the questionable_program_that_backs_up_creds_to_the_cloud because the transmission is encrypted and the certificate is pinned. However, an attacker has made a change that weakens the encryption algorithm to one the attacker knows is breakable, and now just has to capture the traffic. Or the attacker knows that this configuration is weak to begin with and just camps out waiting for the traffic. Now all the code on the system looks fine, but a configuration adjustment is the weak link. This type of backdoor could happen in an obvious way, or a not so obvious way.

A known piece of good code is included on the phone, but it has been altered so that its security has been weakened. This is somewhat similar to the previous one, but with a twist. Let’s say a known web browser has been added, but a number of security features that would normally be included or turned on have been purposely adjusted - it could happen. There is a reason for this type of concern.

Now here is where it gets weird - substitute all those “attacker bypasses normal access methods” statements with “uploads private data” and that may actually provide the same intended result. Exfiltration of critical data off of the phone may be the intended end result anyway. Further imagine that the phone is just violating privacy aggressively as a normal part of doing business - all you have to do is figure out how it does that and monitor it, and that’s your “backdoor.”

We’ve barely scratched the surface on this backdoor business - really one could write a book on the subject. And people wonder why Duo Labs has such a large bar tab at the end of any given night - thinking about all of this makes one’s brain hurt.

So Those Results….

Ok so we went off on a bit of a tangent there, so let’s bring this back to our real world example - those Chinese phones.

The Chinese smartphones were running outdated software, in fact they were all based on older versions of Android code (4.4) with its well-documented known flaws. One could simply query for a list of older Android CVEs, spend a few minutes searching the interwebs, and there is your pile of potential backdoors. Ok, well not quite, but running older versions of software is certainly a valid starting place, particularly if you are looking for an easy entry into a phone’s OS.

Were there questionable apps with sketchy-looking code included with a basic install? Of course there were - just like most phones. Were there violations of privacy that could be leveraged? You bet - but again, just like most phones. How does one even write that stuff up? Trust us, you can’t simply turn in a report to management that says “yup, they are all shit, but no, we didn’t find anything anywhere close to new.” But that is exactly what we found - your basic hot mess of questionable code but no unicorn. Just saying “4.4,” or even just “non-current version of the OS,” should tell you enough.

We had toyed with the idea of looking at the firmware, but this is a tricky thing in the grand scheme of backdoors. It violates the first rule of attack - use the easiest method to achieve the objective.

It boiled down to this - there were issues with the phones that allowed for someone with the right resources to get to information on individual phones. Modern tech systems all phone home (no pun intended). The Chinese government has the means to get data from private firms based in China, and monitor the infrastructure that allows the phones to communicate. Most of the issues discussed above required either access to data being uploaded or being able to intercept and decrypt network traffic, so there would be no reason for the Chinese government to backdoor the firmware. They can easily enough get to pretty much anything, or at least most of what they need.

So then the question is, do you pursue pulling apart firmware looking for unicorns, a process that could take weeks or months (we’re also working on other projects at the same time), or do you spend your time looking at something a little more fruitful? After 30 days, we chose the latter.

Conclusion

So no Nobel Peace Prize, no fascinating new bugs, just a lot of looking at older code that already has been picked clean of anything interesting. But we did learn a lot about how to approach backdoors, so not a total loss. Special thanks to Darren Kemp and Khang Nguyen who also worked on this phone project with me.