According to the Breach Level Index, more than 5 million records are lost or stolen every day. The total number of records compromised since 2013 is more than twice the number of global internet users, and only 4% of those breaches involved encrypted data, which would have been useless after being stolen.
It is easy to blame insecure applications on bad design, but a closer look shows that bad economics are at the heart of the problem. The natural market forces at play discourage the players in an interconnected market of services, products and consumers from investing in their security.
Ross Anderson, Professor of Security Engineering at the University of Cambridge, and Tyler Moore, Assistant Professor of Cyber Security at the University of Tulsa, show in their paper that information security is a common good on the internet: investing in one node's security increases the security of the whole network, and vice versa.
And like other common goods, information security, being a shared resource, suffers from the tragedy of the commons. The marginal costs of increasing a participant's security are higher than its marginal benefits, which discourages the investment necessary to improve and maintain overall security.
The market for secure applications is also a market for lemons: customers cannot tell a secure product from an insecure one in advance, which pushes them to prioritize cost over security when choosing applications. Since data breaches don't occur regularly, this approach seems cost-effective in the short term, but the end result is an increasingly insecure mishmash of applications caught in a downward spiral of security.
The key to creating, implementing and using secure ecosystems of applications is to pinpoint the stakeholders and give them incentives to push back against the aforementioned dynamics.
The responsibility for securing any piece of software falls on three main parties: the developers, implementers and users of the application. Developers are responsible for building and maintaining secure software, while implementers need to carefully vet and validate applications to make sure they meet the required security standards and work as intended. Users, in turn, are responsible for protecting their privacy and data by choosing secure applications and using them wisely. After all, even the most secure application in the world isn't immune to reckless behavior in a connected world.
To get a better picture, let's take a look at the disaster involving Strava, a mobile app for tracking and sharing athletic activity. The data of thousands of government and military employees was revealed: the buildings and bases where they were stationed, their individual jogging routes and more. The information leaked through connected wearables, where lenient privacy settings on the users' side allowed Strava to publish the private data on its own heatmap.
How did the Strava debacle happen?
In the case of Strava's heatmap, all three parties failed to live up to their responsibilities.
Strava, playing the role of the developer here, made a few mistakes. To begin with, anonymous GPS data sharing was on by default when it should have been an opt-in feature. In addition, opting out of GPS data sharing was not straightforward. And unlike Google, Strava hadn't removed military bases from its maps.
The military, playing the role of the enterprise implementing the software, failed to educate and guard its personnel against the dangers of fitness trackers and their mobile GPS. In contrast, the Chinese military acknowledged the security threats of wearables back in 2015 and imposed a ban on them.
And last but not least come the users, here the soldiers. Given their security-sensitive jobs, they should have been more vigilant about their privacy and data security, and ready to make sacrifices by setting more rigorous standards for sharing their data on social networks. One can assume that at least some of the military personnel who used Strava were tech-savvy individuals with IT security jobs.
The key to increasing security is for all three parties to live up to their responsibilities. Here's what each of them needs to do.
Developers designing more secure apps
There are a number of things developers can do to increase the security of their apps.
- Security by design: By creating a threat model from the start, you can build security into your application from the beginning. Put yourself in an attacker's shoes and investigate how they could exploit different aspects of your application. Ask yourself how users might compromise their own security when using it. Bring a professional security team into the loop from the start to play the bad guys by throwing exploits and vulnerabilities at you; these folks are professionals and live to find vulnerabilities.
- Avoid hardcoding secrets into your code. API tokens, transport layer security (TLS) keys, passwords, and secret keys should never be hardcoded in plaintext within your configuration files or source code. Support multifactor authentication and single sign-on. Encrypt all forms of sensitive data with strong, standard encryption and apply the least-privilege principle to access to back-end systems. In essence, the principle of least privilege (also known as the principle of least authority) grants a user only the minimal privileges their job requires.
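As a minimal sketch of the "no hardcoded secrets" advice, an application can read its credentials from the environment (populated by deployment tooling or a secrets manager) and fail fast when they are missing. The variable name `MYAPP_API_TOKEN` is a placeholder, not a real convention:

```python
import os

def load_api_token() -> str:
    """Read the API token from the environment instead of source code.

    MYAPP_API_TOKEN is a hypothetical variable name; use whatever your
    deployment tooling or secrets manager injects.
    """
    token = os.environ.get("MYAPP_API_TOKEN")
    if not token:
        # Fail fast rather than falling back to a hardcoded default.
        raise RuntimeError("MYAPP_API_TOKEN is not set")
    return token
```

Failing fast matters: a silent fallback to a default secret embedded in the code would reintroduce the very problem this pattern avoids.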
- Implement security as part of the user experience. Password complexity is a must for every application, but many users still won't bother with complex passwords on their own. Integrating these requirements into the application in a concise, secure, and unobtrusive way takes proper planning. A degree of gamification can further incentivize users to adhere to cybersecurity standards, and making password managers easy to use will simplify the use of complex passwords.
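One unobtrusive way to surface password requirements in the UX is to return actionable feedback instead of a bare rejection. The thresholds below are illustrative assumptions; in a real application, align them with a published guideline such as NIST SP 800-63B:

```python
import re

def password_issues(password: str) -> list[str]:
    """Return human-readable problems with a candidate password.

    The rules and thresholds here are illustrative, not a standard.
    """
    issues = []
    if len(password) < 12:
        issues.append("use at least 12 characters")
    if not re.search(r"[A-Z]", password):
        issues.append("add an uppercase letter")
    if not re.search(r"[a-z]", password):
        issues.append("add a lowercase letter")
    if not re.search(r"\d", password):
        issues.append("add a digit")
    return issues  # empty list means the password passes
```

Showing each unmet rule as the user types turns the policy into guidance rather than an error wall.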
Implementers increasing application security
Enterprises take center stage in application security. They plan, educate, organize, purchase, and orchestrate whole application ecosystems, and can pull the strings on both users and developers.
- Assess your ecosystem as a whole, not application by application. To properly understand an application's security posture, analyze it as part of your whole ecosystem. Cybersecurity, being a complex and multifaceted field, depends on the whole. Some apps may have no evident security flaws per se, but in combination with other applications can lead to your network's demise.
- Test, Test, Test. Don’t rely on unproven statements about an application’s security. Perform penetration testing before purchasing, or study previous pentest reports if available.
- Secure implementation and continuous monitoring. The most secure application won't help if it isn't implemented securely in your network and data-sharing ecosystem. Implement wisely and keep in mind that combinations of secure applications can result in insecure systems. In addition, monitor your applications for potential attacks and misuse; keeping an eye on performance and availability is not enough.
- Select applications that protect users and data. Make sure the software you use supports features such as proper encryption, data sharing that is opt-in by default and easy to opt out of, and multifactor authentication.
Users’ strategic position and what they need to do to keep data secure
No matter how hard developers and implementers try to create and provide secure software, negligent and security-unaware users can ruin everything. Conversely, privacy-aware users can compensate for the holes the other two parties have left open.
- Protect your data and privacy outside of the enterprise. Many an enterprise has been hacked through its employees’ private computers and accounts from outside the corporate network. Regularly review your applications and their privacy settings and use available security features like encryption and multifactor authentication. Whenever you feel at risk, alert technical staff.
- Create and manage your passwords responsibly. Use strong passwords, avoid password reuse and use a password manager to save and manage your secrets. Don’t forget to use multifactor authentication whenever possible.
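For the "strong passwords" advice above, the essential point is to use a cryptographically secure random source. In Python that is the standard-library `secrets` module, which is designed for security-sensitive values (unlike `random`); this sketch generates a random password from a broad character set:

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Generate a random password using a cryptographically secure RNG.

    Python's `secrets` module is meant for security-sensitive randomness;
    the `random` module is not suitable for this purpose.
    """
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))
```

A password manager does exactly this on the user's behalf, which is why pairing one with multifactor authentication removes most of the burden of password hygiene.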
- Avoid unencrypted wireless networks as much as possible. Use secure virtual private networks (VPNs) whenever you are in an open wireless network. The dangers are just too many to do otherwise.
Final thoughts about secure software
It is worth mentioning that there is much movement on many fronts toward more secure applications.
On the legislative side, governments are introducing new bills that force companies to disclose data breaches and ramp up security. The U.K.'s data protection bill, the European Union's General Data Protection Regulation, and Australia's Privacy Amendment (Notifiable Data Breaches) Act are some of the major steps forward. All of them will incentivize both developers and implementers to invest in cybersecurity.
And many organizations and nonprofits, new and mature alike, have made it their goal to improve cybersecurity standards by both advocating for better regulations and educating all the stakeholders, from developers to end users.
After all, an ever-growing cybersecurity talent shortage shows prominently what we've missed over the past two decades, what we have learned about the importance of cybersecurity, and what we need to make up for.
- This article was originally published on Tech Talks. Read the original article here.