Apple bans researcher for app exposing iOS security flaw

Flaw lets malware run on iPhones, iPads

Apple has banned well-known security researcher Charlie Miller from its developer program for creating an apparently benign iOS app that was actually designed to exploit a security flaw he had uncovered in the operating system.

Within hours of talking about the exploit with Forbes' security reporter Andy Greenberg, who published the details, Miller received an email from Apple: "This letter serves as notice of termination of the iOS Developer Program License Agreement ... between you and Apple. Effective immediately."


Based on Greenberg's follow-up story, Apple was clearly within its rights to do so. Miller created a proof-of-concept application to demonstrate the security flaw and how it could be exploited by malicious code. He then hid it inside an apparently legitimate stock ticker program, an action that, according to Apple, "violated the developer agreement that forbid[s] him to 'hide, misrepresent or obscure' any part of his app," Greenberg wrote.

Greenberg quoted Miller, who works for security consultancy Accuvant: "I'm mad. I report bugs to them all the time. Being part of the developer program helps me do that. They're hurting themselves, and making my life harder."

Miller, a former National Security Agency staffer, is a well-known "white hat" hacker (he made Network World's recent list of "Security All Stars"), with expertise in Apple's Mac OS X and iOS platforms, including the Safari browser, and in Android. Miller "has found and reported dozens of bugs to Apple in the last few years," Greenberg noted. Miller reported the latest one barely three weeks ago, and it was Greenberg's public account of it yesterday, in advance of a planned public presentation by Miller next week, that got the researcher kicked out of the developer program.

The vulnerability is a fascinating exercise in information security sleuthing. Miller uncovered a flaw in Apple's code-signing restrictions on iOS devices. Code signing is the process by which iOS ensures that only Apple-approved code runs in device memory, according to Greenberg's account.

Miller began to suspect a flaw when Apple released iOS 4.3 in March. He realized that to boost the speed of the mobile Safari browser, Apple for the first time had allowed JavaScript code from a website to run at a deeper level in memory. This entailed creating a security exception, allowing the browser to run unapproved code. According to Greenberg's story, Apple created other security restrictions to block untrusted websites from exploiting this exception, so that only the browser could make use of it.

Miller wasn't the only one to notice that Apple had done something different with Safari in iOS 4.3, but many didn't understand what was actually happening. Various news sites and bloggers claimed that Web apps running outside of Safari, and its new Nitro JavaScript engine, were slower. Some suggested that Apple was deliberately slowing them down to make Web apps less attractive than native ones.

Untrue, wrote tech blogger John Gruber, on his Daring Fireball blog. "What happened with iOS 4.3 is that web apps (and JavaScript in general) running inside Mobile Safari have been made significantly faster," Gruber wrote. "The Nitro JavaScript engine is only available within Mobile Safari. Outside Mobile Safari -- whether in App Store apps using the UIWebView control, or in true web apps that have been saved to the home screen -- apps get iOS's older JavaScript engine."

Why did Apple do this? For security reasons, Gruber explained. Nitro uses what's called "just in time" (JIT) compilation to speed its processing of JavaScript. "A JIT requires the ability to mark memory pages in RAM as executable, but iOS, as a security measure, does not allow pages in memory to be marked as executable," he wrote.

Most modern OSes do allow this, to optimize performance. iOS blocks it for everything except Mobile Safari, to optimize security. "If you allow for pages of memory to be escalated from writable to executable ... then you are enabling the execution of unsigned native code. It breaks the chain of trust. Allowing remote code to execute locally turns every locally exploitable security flaw into a remotely exploitable one."

And that, apparently, is exactly what Miller was able to do. Miller not only realized Apple had created this exception for Mobile Safari, but he also uncovered what he called "this one weird little corner case" -- a bug -- where it was possible for another program besides the browser to also use it.

Miller hasn't yet publicly revealed what the bug is. But he created a booby-trapped app, called Instastock, to demonstrate it. The app passed Apple's code inspection and was published on the App Store. (Yesterday, after Greenberg's story went live, it was removed.) On the surface, the app just listed stock tickers. But underneath, it connected to a server in Miller's St. Louis home. The app could then pull down from that server, and execute, whatever commands Miller coded. The accompanying video, made by Miller, shows the app reading files on an iPhone and making it vibrate.

"Now you could have a program in the App Store like Angry Birds that can run new code on your phone that Apple never had a chance to check," Miller said in the Forbes story. "With this bug, you can't be assured of anything you download from the App Store behaving nicely."

Apple apparently has not yet said anything publicly about the exploit or its implications.

John Cox covers wireless networking and mobile computing for Network World.

Twitter: http://twitter.com/johnwcoxnww

Email: john_cox@nww.com

Blog RSS feed: http://www.networkworld.com/community/blog/2989/feed
