There’s so much that goes into creating a new app, from design and development to testing, that security is often overlooked. The problem with lax security is that it puts users and organizations (such as the enterprise) at risk, and ultimately developers can be held responsible. The impact on users can be significant: consumers store all kinds of sensitive data that can easily be exposed, such as payment account details, personal photos, and email messages. And if consumers use their phone for work, as they often do, there’s also the potential for leaks of confidential company data.

It’s not just security we need to keep in mind, but user privacy as well. For instance, the FTC is currently investigating whether app developers are violating children’s privacy laws by sharing data without parental consent, and the state of California has introduced a law specific to mobile app privacy. This trend shows why developers must be more careful about how they build their apps, not just to protect their own reputation, but also for legal reasons.

In some cases, the lack of app security and privacy is holding back app adoption. With the BYOD or “bring your own device” movement, many organizations are adopting mobile policies that ban particular apps or app behaviors in the workplace. For example, some companies may not want their CEO walking around with an app that can track their location, perhaps giving away location data while on-site at a potential merger or acquisition target or some other sensitive location. Most companies also worry about apps that leak the corporate address book, which is usually intermingled with the personal contacts on an employee-owned device. So how can we start building apps with security and privacy in mind?

Insecure Programming Practices

One problem we’ve seen is developers taking raw data from a server and displaying it to the user without sanitizing it first. This is particularly a concern for HTML, JavaScript, and other web content. If that data is displayed without being validated, it can do things you might not expect. For example, some time back the Skype iOS app had a related vulnerability: it failed to properly sanitize data received from the network before displaying it to the user, which ultimately allowed an attacker to execute malicious JavaScript code and gain access to sensitive data.

Other issues we’ve observed relate to how apps store and transfer data. Data at rest must be stored and secured properly. On iOS, rather than writing sensitive data to disk in plain text or another insecure format, the Keychain can be used (with appropriate protection classes) to securely store smaller values; a sketch of this appears at the end of this section. Encryption can be used to secure larger files, and data in transit can be secured using TLS/SSL. On Android, take care not to modify file permissions in a way that allows other apps to access sensitive data, and avoid storing sensitive data on an SD card, which is accessible to every app on the device and can easily be removed from it.

Another Android-specific mistake is asking for too many permissions, which is a red flag from the security perspective. Only request the permissions your app actually needs. Limiting your app’s access to operating system features and functionality is a good thing, as it equates to less potential for bad or unintended app behavior. Some organizations may ban an app from the workplace if it exhibits what we call “risky behaviors,” such as accessing location information, contact lists, calendar details, or other sensitive corporate information. It is best to be selective about the permissions your app requires.
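Returning to the Keychain point above, here is a minimal sketch in Swift, shown purely for illustration: it stores a small secret with a restrictive protection class. The service and account identifiers are hypothetical placeholders, and kSecAttrAccessibleWhenUnlockedThisDeviceOnly is just one of several protection classes you might choose.

import Foundation
import Security

// Minimal sketch: keep a small secret (e.g. an auth token) in the iOS Keychain
// with a restrictive protection class rather than writing it to disk in plain text.
// The service and account strings below are illustrative placeholders.
func storeSecret(_ secret: String, account: String) -> Bool {
    guard let data = secret.data(using: .utf8) else { return false }

    // Attributes that identify the item.
    let baseQuery: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrService as String: "com.example.myapp",   // hypothetical service identifier
        kSecAttrAccount as String: account
    ]

    // Remove any existing item so SecItemAdd does not fail with errSecDuplicateItem.
    _ = SecItemDelete(baseQuery as CFDictionary)

    var attributes = baseQuery
    attributes[kSecValueData as String] = data
    // Protection class: the item is only readable while the device is unlocked,
    // and is never migrated to another device through a backup.
    attributes[kSecAttrAccessible as String] = kSecAttrAccessibleWhenUnlockedThisDeviceOnly

    return SecItemAdd(attributes as CFDictionary, nil) == errSecSuccess
}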

Best Practices for Authentication

If your app requires a login, make sure that tokens are set to expire after an hour (or less). Apps that generate tokens with shorter lifespans are much more difficult to hack. Previous versions of the Facebook mobile app are an interesting example of what not to do, since their tokens were set to never expire. In fact, we were able to hack the Facebook mobile app in less than five minutes. Once a token like that is compromised, any linked accounts, such as Foursquare, Twitter, and other apps you’ve granted access to your Facebook account, become vulnerable as well. For some of these apps, that could mean access to their services; at the very least, it means access to your Facebook account and private data.
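To make the short-lifetime idea concrete, here is a minimal client-side sketch in Swift. The type and field names are assumptions for illustration, not any particular provider’s API, and expiry must ultimately be enforced on the server; the client-side check simply avoids reusing a stale token.

import Foundation

// Minimal sketch of client-side handling for a short-lived access token.
// The names (AccessToken, lifetime, etc.) are illustrative assumptions.
struct AccessToken {
    let value: String
    let issuedAt: Date
    let lifetime: TimeInterval   // e.g. 3600 seconds (one hour) or less

    var isExpired: Bool {
        return Date() >= issuedAt.addingTimeInterval(lifetime)
    }
}

func makeAuthenticatedRequest(with token: AccessToken) {
    guard !token.isExpired else {
        // Re-authenticate (or run a refresh flow) instead of reusing a stale token.
        // A leaked short-lived token is only useful to an attacker for a brief window.
        print("Token expired; prompting user to sign in again.")
        return
    }
    // ... attach token.value to the request's Authorization header ...
}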

Compiler and Platform Security Features

In 2011, Apple introduced support for a feature in iOS 4.3 known as Address Space Layout Randomization (ASLR), which makes it more difficult for attackers to exploit vulnerabilities in iOS apps. Developers can take advantage of it by ensuring that their apps are built as Position Independent Executables (PIE) at compile time. In recent versions of Xcode, this setting is enabled by default.

Ad Networks and Other Tools

Using certain third-party SDKs, ad networks, and analytics frameworks can reflect poorly on your app. If you choose to use an SDK, be sure to understand what it does: What data does it collect and transfer? What are the security characteristics of that transfer? If you’re looking to support a free app with an ad network, ask whether it filters for malicious ad content. This sort of information can be difficult to find, so ask around about whether a particular SDK or other tool is known to be problematic. In some cases, it is best to use a third-party security and privacy review service that can help analyze and evaluate not just the app you are developing, but the SDKs and ad networks you are using as well.

Making security and privacy an integral part of your app development lifecycle can help streamline this entire process, and working with a third party to perform threat analysis will help build your app’s defenses. By maintaining a mindset of “security and privacy first,” you’ll have a better shot at selling your app to privacy-minded customers (such as enterprise, healthcare, and government), and your users will ultimately benefit. Not only will being a security- and privacy-minded developer strengthen your reputation and the reputation of your apps, it may also help protect you from legal issues.

Michael Price is an iOS internals expert and heads Appthority iOS R&D. Most recently, Michael was the head of McAfee Labs for Latin America. Previously, he was a core member of the Foundstone research team and was responsible for vulnerability research and signature development. Michael is a published author and co-founder of the 8.8 computer security conference, held annually in Santiago, Chile. He is currently working on a new book, titled “Hacking Exposed Mobile,” scheduled for publication in July 2013.