As mobile adoption spreads like wildfire globally, pervasive government surveillance programs are coming to light and major internet security exploits are being uncovered. While mobile device security can be improved via mobile device management solutions (MDM), it’s our responsibility as app developers/publishers to ensure that our apps protect user privacy and critical business data. The problem is that the ways to secure your Android app and data are not always obvious or well-documented. In this talk from GOTO Conference CPH 2015, Scott covers current Android app threats and looks at how easily we can reverse engineer an Android app with freely available tools. He covers enhanced SSL validation, encryption, tamper protection, and advanced obfuscation techniques, and focuses on leveraging open source, commercially viable libraries to allow us to increase our app’s security with minimal effort.
Scott Alexander-Bown introduces the Android ecosystem and talks about survival tips to strengthen the security of your apps. He covers the topic in greater detail in the book he co-authored, Android Security Cookbook.
With 1.4 billion active users, Android is the largest mobile platform. That should build trust, but we should be skeptical of any security vendor or company claiming that they have a handle on security.
Android’s Ecosystem
Security Services (05:12)
- Google Play store, with a human approval process since 2015.
- Developer security notifications: e.g., when a developer has accidentally included their private keys in the app, or shipped a vulnerable version of OpenSSL.
- Android Bouncer, which runs all apps in emulated environments. It checks for malware signatures and dodgy behavior, then removes offending apps from the app store, suspends them, and, if warranted, can remotely wipe them from devices.
- Android Device Manager (device security; a free service)
- SafetyNet (intrusion detection). Any device with Google services gathers various data points and uploads them to Google services, which analyzes them for potential malware.
- Android for Work, a way of segregating your personal and business data and applications on the same device. Definitely geared towards a BYOD approach, but it gives system administrators API access for provisioning and managing profiles. Access can be revoked to wipe just the business data and apps, not your personal data.
Newer Versions of Android are More Secure (10:19)
Verified apps came in Android 4.2: a way for Google to monitor apps and detect potential malware when users install applications from outside the Google Play store. Newer versions of Android have more security features and stronger BYOD support.
Threats: App Hijacking or “Trojanizing” (11:06)
Taking an app and adding malware to it has raised concerns, because reversing Android apps is easy and app signing certificates do not need to be issued by a certificate authority. Additionally, sideloading apps (installing apps from outside the approved Play Store) is very simple.
OWASP - Top 10 Risks (16:56)
Stagefright affected almost all Android devices in the marketplace. Nexus devices now get a monthly security update and security bulletin; OEMs have also increased their security updates. As compiled by OWASP (the Open Web Application Security Project, a not-for-profit organization focused on improving software security), there are 10 top risks for our apps (the list is presented on the slides; each risk is broken down into attack vectors and sample code).
Survival Kit (18:16)
We want to harden the network communication between the app and our service, and protect any data that we have stored. We are going to use encryption for that, and look at ways to validate the device's and the app's integrity. Then, increase the binary's security.
Network Communications (19:40)
Using SSL/TLS for all your network communications should be the default (you have to opt out). The second point: use the platform's SSL/TLS validation, and do not disable it. There are different ways of doing this: you can use a form of pinning, or buy genuine certificates for your development servers. You should also use stronger cipher suites and the stronger TLS versions. If you control the service, then you control the type of connection and the strength of the encryption between your apps and the server; SSL defaults to lower versions to support older browsers. You could also use OkHttp 2.1.
SSL pinning (22:33)
SSL pinning is a way of reducing the amount of certificate authorities that you trust. There are two different types of pinning:
- pin the certificate
- pin the public key of that certificate (recommended, since you do not have to change your app or force an app update every time you renew your certificate)
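To make the recommended option concrete, here is a minimal sketch of how a public-key pin is derived: Base64 of the SHA-256 digest of the key's SubjectPublicKeyInfo, the "sha256/..." format used by OkHttp's pinning and by HPKP. A real pin would be computed from your server certificate's public key; this sketch generates a throwaway RSA key pair just to show the hashing and encoding steps.

```java
import java.security.KeyPairGenerator;
import java.security.MessageDigest;
import java.util.Base64;

public class PinDemo {
    // Pin = "sha256/" + Base64(SHA-256(SubjectPublicKeyInfo))
    static String pinOf(byte[] encodedPublicKey) throws Exception {
        byte[] digest = MessageDigest.getInstance("SHA-256").digest(encodedPublicKey);
        return "sha256/" + Base64.getEncoder().encodeToString(digest);
    }

    public static void main(String[] args) throws Exception {
        // Throwaway key pair standing in for your server certificate's key.
        KeyPairGenerator kpg = KeyPairGenerator.getInstance("RSA");
        kpg.initialize(2048);
        byte[] spki = kpg.generateKeyPair().getPublic().getEncoded();
        // SHA-256 digests are 32 bytes, which Base64-encode to 44 characters.
        System.out.println("pin length: " + pinOf(spki).length());
    }
}
```

Because the pin covers only the public key, you can renew the certificate without breaking the pin, as long as you reuse the same key pair.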
To patch against SSL exploits, Google Play services has introduced a dynamic security provider (see Google's training article on updating your security provider). You can call ProviderInstaller.installIfNeeded(getContext()); the provider will fix any issues the system has with its OpenSSL libraries. Because it is an update of the cryptography provider, you can also use it for your own encryption (instead of Bouncy Castle).
Encrypt (25:36)
Password-based encryption is the best-practice approach when you are doing encryption, because it sidesteps the problem of where to store the keys (see the slides to learn how to use password-based encryption).
Encryption libraries can make your encryption tasks easier:
- Conceal: from Facebook; great for files. Facebook uses it for encrypting photos cached on your SD card. Easy to use, and it defaults to the stronger algorithms. It uses AES-GCM (the preferred AES algorithm, though not the strongest).
- SQLCipher, which uses 256-bit AES in CBC mode. However, SQLCipher adds between four and seven MB to your APK size. The good news is that Google just upped the APK size limit in the Play store to 100 MB (fewer excuses not to use it).
- Secure-Preferences, one of my open-source libraries. You can also look at Hawk (a better-named and much more popular implementation). It is a key-value store backed by an XML file: an easy way of securing or obfuscating those files so that people on rooted devices cannot mess with your things. You can use Secure-Preferences with a password, and it does password-based encryption. If you do not provide a password, it will generate one and store it in the file (if you rip the file out, you cannot tell which entry is the key). That is still only obfuscation, because you could find the key with enough effort, but these are all small steps that make your app harder to hack.
Above all: avoid hardcoding your encryption key. Reverse engineering an app and finding a string constant holding your encryption key is trivially easy. Not all apps can use password-based encryption or an app passcode; if you have to generate a key, generate it dynamically per installation, and do not hardcode it.
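A minimal sketch of the per-installation alternative: generate a random AES key on first run instead of shipping a constant in the binary. (On Android you would then protect this key, for example by wrapping it with an Android KeyStore key, rather than storing it in plain text; that part is omitted here.)

```java
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;

public class InstallKeyDemo {
    public static void main(String[] args) throws Exception {
        // Generate a fresh 256-bit AES key from a secure random source.
        // Done once per installation, so no key constant exists in the APK.
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);
        SecretKey key = kg.generateKey();
        System.out.println("key bytes: " + key.getEncoded().length);
    }
}
```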
Verifying App Integrity (33:10)
Debuggable check and APK checksum: with PackageManager you can check whether the app is running in a debuggable state (which should not happen once you are in production). You can also calculate an APK checksum to see if the APK has been modified.
Signing certificate verification: build-time vs. runtime. This is my favorite one, because each of our apps has to be signed with a developer's key when we publish or distribute it, and that key remains the same throughout the lifetime of the app. It is one of the cornerstones of Android's security: the signatures have to match. Because you know the signature at build time, you can check it at runtime. You would use keytool (Java) to list your keystore's signature, then embed it in the app. At runtime, you ask the PackageManager for your signature, hash it (to match what you have embedded), and compare the strings. However, if someone can decompile the code, they can take this check out.
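The runtime half of that check can be sketched as follows. On Android the signature bytes would come from PackageManager (getPackageInfo with GET_SIGNATURES) and EXPECTED would be a constant computed at build time with keytool; here placeholder bytes stand in for both so the hash-and-compare logic is runnable anywhere.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Base64;

public class SigCheckDemo {
    // Base64 SHA-256 of the release signing certificate, embedded at build time.
    // (Placeholder bytes here; on Android this is a fixed string constant.)
    static final String EXPECTED = hash("placeholder-signature".getBytes(StandardCharsets.UTF_8));

    static String hash(byte[] signature) {
        try {
            byte[] digest = MessageDigest.getInstance("SHA-256").digest(signature);
            return Base64.getEncoder().encodeToString(digest);
        } catch (Exception e) {
            throw new IllegalStateException(e);
        }
    }

    // Compare the hashed runtime signature against the embedded value.
    static boolean signatureMatches(byte[] runtimeSignature) {
        return EXPECTED.equals(hash(runtimeSignature));
    }

    public static void main(String[] args) {
        byte[] runtime = "placeholder-signature".getBytes(StandardCharsets.UTF_8);
        System.out.println("signature ok: " + signatureMatches(runtime));
    }
}
```

As the talk notes, the check only raises the bar: a determined attacker who decompiles the app can remove it.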
Verifying Device Integrity (35:43)
If we want to verify the device's integrity, there are various things we can check. Emulator check: there is a handy library for detecting whether you are running in an emulator when you are in production, which could be an indication that you are being tampered with or dynamically debugged. Or run the Google SafetyNet test, part of Google Play services. It tells you whether the device is CTS compatible; the CTS tests each version of Android built from the open source tree, and if the device passes them all, it is verified as an okay device. SafetyNet will come back false if the device is rooted or has been compromised in other ways. You can check the state of the device.
root@android:/ # (39:04)
A root shell prompt like this is an indication that your app could be running on a compromised device. The SafetyNet sample library does not have a server-side component: you do the validation test on the device.
I commonly get asked about detecting root. There is no 100% reliable way of checking for root, but you can look for root apps and potentially dangerous apps. You can check certain system properties, which could have been changed on a rooted device. The most common way is to look for the su binary, which may or may not be present, or may be hidden; the same goes for the BusyBox binary. You can also check whether certain system paths are writable when they should not be. We pulled these checks together into a library called RootBeer. It also does a native check, and we managed to defeat some of the root cloaks that we found.
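The most common heuristic mentioned above, looking for the su binary in well-known locations, can be sketched like this. The path list is illustrative, and any single check is easy to defeat; libraries like RootBeer combine several heuristics (su, BusyBox, writable system paths, suspicious properties) plus a native check.

```java
import java.io.File;

public class RootCheckDemo {
    // Illustrative set of locations where su commonly lives on rooted devices.
    static final String[] SU_PATHS = {
            "/system/bin/su", "/system/xbin/su", "/sbin/su", "/system/sd/xbin/su"
    };

    // True if su is found at any known path; false means "not found",
    // not "definitely unrooted" -- the binary may be hidden or cloaked.
    static boolean suBinaryPresent() {
        for (String path : SU_PATHS) {
            if (new File(path).exists()) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println("checked " + SU_PATHS.length
                + " paths, su present: " + suBinaryPresent());
    }
}
```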
ProGuard (41:32)
To protect the binary we can obfuscate the code so it is harder to reverse, known as security through obscurity. Your code is not encrypted; it is just harder to read. ProGuard is the Java obfuscator that comes with the Android SDK (free), and it is very easy to turn on. But crash stack traces become unreadable: ProGuard includes “ReTrace”, which takes ProGuard's mapping output and turns an obfuscated stack trace back into a sensible one. Crashlytics, HockeyApp, and Crittercism support automatic de-obfuscation of crashes.
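As a minimal sketch (assuming a standard Android ProGuard setup, with file names and paths being illustrative), these are the rules that make the ReTrace workflow above possible:

```
# Write the obfuscation mapping so ReTrace (and crash reporters) can
# turn obfuscated stack traces back into readable ones.
-printmapping build/outputs/mapping.txt

# Keep source file names and line numbers so re-traced stack traces
# point at real locations in the code.
-keepattributes SourceFile,LineNumberTable
```

Keep the mapping file for every release build you ship; without the matching mapping, a crash report from that build cannot be de-obfuscated.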
If you want to go pro with ProGuard, use DexGuard (there are others, e.g. Metaforic and Arxan). ProGuard is a Java obfuscator; DexGuard has been built for Android and designed around protection. It has useful security utils: SSL pinning, root check, logging removal, etc. My two favorite features are string encryption and API hiding, and you can use them in conjunction. You can hide your tamper check or your SSL pinning verification, then use string encryption on the method and class names you reference in the reflection-based call, so what you have hidden is also encrypted. That makes it a double-strength check.
Testing (46:27)
Quick Android Review Kit, Drozer, and More (46:51)
There are many tools and great articles about how to reverse engineer: Apktool, dex2jar, Androguard. The Quick Android Review Kit, Qark, is fairly new; it was open-sourced a couple of months ago by LinkedIn. It is a Python script that you run against your app or your source code, and it gives you an HTML report about the weaknesses or any exploits that could potentially be carried out. The other one is Drozer, from MWR Labs (formerly called Mercury). It has been around for a while and is very feature-rich. Plus other useful links:
- 42+ Secure mobile development tips
- OWASP Mobile security risks
- Android Security Cookbook
- Android Security Internals
- Droidsec (whitepapers)
Questions? (49:25)
Q: You are saying not to persist the key on the device: how do you deal with a password change? Scott: You can do that. If you want to see an example, check out my Secure-Preferences library, because I handle it there in a fairly simple way: decrypt the data with the existing password, then re-encrypt it with the new password. That covers password change, but it does not help if someone has forgotten their password; in that case, you cannot decrypt the data. That is why I see it as really good for caching data. But there is no way of recovering that password, and that is the point. That is why it is the secure way of doing it.
Q: JavaScript bindings in WebView: safe to use? Scott: I would say no; I would err on the side of caution. I have always been scared of what you can do with that. It got better in API 17, with the annotation requiring you to explicitly annotate the methods that can be called from JavaScript into Java. But I have not used that binding much.
Q: Bootstrapping baseband firmware as a vulnerability: real or hoax? Scott: I do not know the answer to that question.
Q: Can you compare the Keystore to password-based encryption? Have you ever used it? Scott: Yes. One thing I did not have time to mention: there is the Android KeyStore, similar to the iOS Keychain. It is a place where you can store a certificate. Unfortunately it is not quite like the Keychain on iOS, where you can store arbitrary secrets; on Android you have to store a public/private key pair and then use that to encrypt other things. The public API came in Android 4.3, and there are a couple of issues with it not being consistent in the way it behaves: if the user changes their PIN, on some versions of Android it just wipes the Keystore, and at other times it still says the key is there, but it has actually been wiped. One of my friends wrote an article called “The Forgetful Keystore”, with metrics on which versions are affected. In terms of which is more secure, they are based on the same principle: using something the user puts in, which we do not store on the device, to generate a key that we use to do the encryption. I would say they are roughly the same; I just prefer doing password-based encryption internally, within my app, because I have control over that. Forcing the user to put a passcode or password into my app, when it is a secure app and there is a point to it, is an easy sell; forcing the user to have a device PIN or passcode when they do not have one is a much bigger ask. If you are a bank, you could probably get away with it, and probably with enterprise apps as well, but with other apps I think you would really struggle. Just to be clear: to use the Android Keystore, the user has to have a device PIN or passcode enabled, and that is the key thing.
You have to force that for the whole device rather than just your app; that is why I tend to prefer doing it in-app, but the Keystore is just as good a solution. That also assumes you are in the lucky position of being able to target API 17 and above; I do not know if anyone else is in that great situation, but I know I am not. Really good question.
Q: I just looked at the Conceal website, part of this Facebook library, and they claim that it is super fast. Have you worked with it? What do they do that makes it so much faster? Scott: I have used it, and it is pretty quick. It uses a couple of the OpenSSL algorithms, which are fast, and it runs native code. Rather than doing what SQLCipher does, it bundles only the algorithms and the bits you need, which is why it is quicker and not as big a payload. Also worth mentioning, as I said in the talk: it is one of those Facebook libraries that does not have the weird license about… I forget what it is. It is commercially viable.