We are well on our way to a world where communications traffic between mobile apps will be completely secure. Whether voice or text, monitored traffic will be encrypted and uncrackable, even with the cooperation of the app or device developers.
In one recent example, Facebook’s WhatsApp is reportedly causing law enforcement concern, as it appears to be impervious to decryption efforts. Government legislation forcing vendors to incorporate some type of backdoor password seems to be the only alternative to living with this new reality, but such legislation may be unenforceable in the international context of app development and distribution.
The debate encompasses legal policy and security architecture, treacherous topics for the layman. We profess no deep knowledge of those areas, but we do write mobile apps, and claim to know a bit about them. This column is a mobile app developer’s view.
For many years app developers did not consider security important. For example, Google only recently started blocking apps with “trust-all-servers” SSL (https) clients from its Play Store. Only Google engineers know how many lazy app developers this is catching, but the programming forums indicate it’s a large number. Similarly, many voice and text communications, even today, are transmitted en clair.
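The “trust-all-servers” pattern Google flags is, on Android, a custom TrustManager whose certificate check does nothing. A minimal Python sketch of the same mistake, and of the correct default, might look like this (the names `insecure` and `secure` are ours, chosen for illustration):

```python
import ssl

# The "trust-all-servers" anti-pattern: a TLS client context that accepts
# any certificate from any server, leaving the "encrypted" connection open
# to trivial man-in-the-middle interception.
insecure = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
insecure.check_hostname = False      # skip hostname verification
insecure.verify_mode = ssl.CERT_NONE # skip certificate-chain verification

# What a client should use instead: the platform defaults, which verify
# both the server's certificate chain and its hostname.
secure = ssl.create_default_context()

assert insecure.verify_mode == ssl.CERT_NONE      # any server is "trusted"
assert secure.verify_mode == ssl.CERT_REQUIRED    # certificates are checked
assert secure.check_hostname                      # hostnames are checked
```

The lazy version takes two extra lines to write and silently discards everything TLS is supposed to guarantee, which is why an app-store scanner can catch it mechanically.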
But encryption is becoming more widely implemented, thanks to growing respect for privacy on mobile devices. As an example of ongoing privacy measures, Android and iOS recently began to hide devices’ MAC addresses, both over the air (with anonymized probe requests) and from apps (by withdrawing APIs that provide global device-specific identifiers). The trend to encrypt communications is a consequence of this push for privacy. App developers mostly find security a pain and get by with the least possible effort, but they understand that users need privacy.
Meanwhile, cryptography has made great strides. The principles required to make a virtually uncrackable encryption system, though complex, are now widely understood. Nevertheless, implementation details remain complicated, with many opportunities to design in unintended vulnerabilities. But the gap between encryption theory and practice is narrowing: mobile app encryption software is much improved. Developers now have access to many well-tested software libraries and open-source projects that implement encryption correctly.
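Even a general-purpose standard library now ships vetted primitives, so a developer never needs to hand-roll the fiddly parts where flaws get designed in. A small sketch using Python’s standard library (the passphrase and message are obviously illustrative):

```python
import hashlib
import hmac
import secrets

password = b"correct horse battery staple"  # example passphrase, not a real secret
salt = secrets.token_bytes(16)              # random per-user salt

# Derive a 32-byte key with PBKDF2-HMAC-SHA256; the high iteration count
# deliberately slows brute-force attacks against intercepted traffic.
key = hashlib.pbkdf2_hmac("sha256", password, salt, 200_000)

# Authenticate a message with HMAC, and verify the tag with a constant-time
# compare -- a naive == comparison can leak a timing side channel, one of
# the classic "designed-in" implementation flaws.
message = b"meet at noon"
tag = hmac.new(key, message, hashlib.sha256).digest()
assert hmac.compare_digest(tag, hmac.new(key, message, hashlib.sha256).digest())
```

Every line that matters here — the key-derivation function, the MAC, the constant-time comparison — is a library call, which is exactly the point: the remaining implementation flaws live in the code developers write themselves.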
The other driver of secure communications is peer-to-peer architecture with end-to-end encryption. Many apps formerly routed traffic via a central cloud server, making a user-server-user connection and often decrypting messages at the server. But several forces, including privacy, are causing apps to move to a peer-to-peer architecture where traffic is routed directly to the recipient. There is no longer a central point where traffic may be decrypted. When encryption is end-to-end, an eavesdropper must first determine a connection’s route through the Internet and apply a wiretap, then set about cracking the encryption.
Well-implemented end-to-end encryption systems provide eavesdroppers, whether government-sponsored or illegal, with few opportunities. They could try to seize a device and recover the encryption keys. But examining one of the devices used in an exchange and extracting its keys or passwords will become increasingly difficult as app design and device encryption improve.
A second approach would be to try decoding intercepted traffic without knowledge of passwords. This is impractical for modern, well-designed encryption systems. That leaves open the possibility of detecting and exploiting implementation flaws, which can narrow the search space for a brute-force attack. But flaws will become increasingly rare over time, as white-hat hackers identify them and the authors implement fixes.
Third, passwords could be recovered from users by phishing or similar trickery, or by legally requiring app developers to design in backdoor passwords and disclose them on legal demand. Of all the alternatives, this is the only one likely to stand the test of time: the rest will become increasingly infeasible.
Therefore, requiring developers to design in specific vulnerabilities or backdoor passwords is the only long-term solution that meets the needs of law enforcement. But there are practical obstacles to enforcing backdoor design and disclosure. The international nature of the software industry means governments would be unable to constrain app developers without very tight international cooperation and controls. And distribution of unauthorized apps would be a constant issue, requiring national firewalls to prevent downloads from foreign app stores or international distribution by email or other means.
These are not practical solutions in most of the modern world: we must learn to live with communications that are not hackable by anyone, even their creators. Scott McNealy’s famous maxim will be reversed: “Users of communication apps will have 100% privacy, get over it”.
This article is published as part of the IDG Contributor Network.