Easter eggs and the Trusted Computing Base
The confluence of several security threats has destroyed the Trusted Computing Base (TCB) on which security has depended for the last two decades.
The TCB was the constellation of trustworthy hardware, operating systems and application software that allowed for predictable results from predictable inputs.
Did you know that there is a flight simulator concealed in Microsoft Excel 97? To access this game, use the following sequence of commands (detailed by Larry Werring in RISKS Digest 19.53 on January 5, 1998):
- Open Excel 97.
- Open a new worksheet and press the F5 key.
- Type X97:L97 and press the Enter key.
- Press the Tab key.
- Hold Ctrl-Shift and click the Chart Wizard button on the toolbar.
- Once the Easter egg is activated, use the mouse to fly around - right button for forward, left for reverse.
If you have DirectX drivers installed, a bizarre landscape appears and you can "fly" over (or under) the geometric forms using the mouse controls described above. If you look carefully in the virtual distance, you can find a stone monitor planted in the ground. If you get close enough, you can see the names of the development team scrolling by.
How much space in the source and object code does this Easter egg take? How much RAM and disk space are being wasted by all the people who have installed and are using this product? And much more seriously, what does this Easter egg imply about the quality assurance at the manufacturer's offices?
An Easter egg is presumably undocumented code - or at least, undocumented for users. I do not know whether it is documented in internal Microsoft documents. However, I think the fact that this undocumented function got through Microsoft's quality assurance process is terribly significant: the failure implies that there is no test-coverage monitoring in that quality assurance process.
When testing executables, one of the necessary (but not sufficient) measures is coverage - that is, how much of the executable code has actually been run at least once during the quality assurance process. Without executing all the code at least once, one can state with certainty that the test process is incomplete. Failing to execute all the code means there may be hidden functionality in the program: anything from an Easter egg to something worse. What if the undiscovered code were invoked under unusual circumstances and damaged a user's spreadsheet or system? We would call such code a logic bomb.
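To illustrate what coverage monitoring catches, here is a minimal sketch using Python's standard-library trace module. The recalc function is a toy stand-in with a deliberately hidden branch, not anything taken from Excel; only the lines actually executed during the test run appear in the coverage counts, so the secret branch shows up as a gap.

```python
import trace

def recalc(cells, secret=False):
    """Toy recalculation routine with a hidden, undocumented branch."""
    if secret:
        # The hidden "Easter egg" / logic-bomb branch: subtly skews every value.
        return {name: value * 1.0001 for name, value in cells.items()}
    return {name: value * 2 for name, value in cells.items()}

# The only test case our fictional QA process ever runs.
tracer = trace.Trace(count=True, trace=False)
tracer.runfunc(recalc, {"A1": 10.0, "A2": 20.0})

# Lines never executed under the tracer are absent from the counts; the
# missing line numbers of the secret branch are the tester's cue to ask
# what the unexercised code actually does.
covered = sorted(lineno for (_filename, lineno) in tracer.results().counts)
print("lines executed during the test run:", covered)
```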
That's bad enough, but it gets worse. Consider the following observations:
- There is already at least one family of Excel macro viruses that alters the contents of cells; the Macro.Excel.Sugar virus randomly inserts silly text into up to 200 cells. This payload is immediately obvious, but more insidious Excel macro viruses might cause subtle problems. For example, a virus could cause shifts in the low-order significant digits of constants - something that might not be noticed in individual cells but which might have significant effects on calculated results (see the numeric sketch after this list).
- Research projects by Coopers & Lybrand in London showed that 90% of spreadsheets with more than 150 rows contained errors. Studies of production spreadsheets by University of Hawaii scientists, covering 300 files tested and experiments with more than 1,000 users, found that many contained at least one significant formula mistake.
- In December 1999, Computer Associates issued a warning about the W95.Babylonia virus, described as an extensible virus whose payload could be modified remotely by its author. The December outbreak of Babylonia in the wild involved a Trojan horse disguised as a Y2K bug fix for Internet Relay Chat users. The Trojan horse would send itself to other users and also poll an Internet site in Japan for updated plug-ins that altered the effects of the malicious software.
- Distributed computing in today's Internet means that most naïve users accept code from Web sites with little awareness of the dangers of executing unknown and perhaps poorly tested or malicious code on their desktop.
- Recent distributed denial-of-service attacks have shown how easy it is to install unauthorized code on Internet-connected systems and have that code lie quiescent until instructions are broadcast from a master program on a remote system.
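To make the low-order-digit concern in the first bullet concrete, consider a single constant buried in a financial model. The interest rate, principal and the size of the nudge below are purely illustrative, not taken from any real virus:

```python
# A monthly interest rate whose low-order digits have been nudged; at a
# typical display precision (say, 0.42%) the two constants look identical.
clean_rate = 0.0041667      # roughly 5% annual, divided by 12
tampered_rate = 0.0041700   # shift in the fifth significant digit

principal = 1_000_000.00    # e.g., a pension-fund balance
months = 360                # 30 years of monthly compounding

clean = principal * (1 + clean_rate) ** months
tampered = principal * (1 + tampered_rate) ** months

print(f"clean result:    {clean:,.2f}")
print(f"tampered result: {tampered:,.2f}")
print(f"drift:           {tampered - clean:,.2f}")   # several thousand dollars
```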
Well then, here's the scenario: Bad Guys infiltrate a major software company and install undocumented code in widely distributed spreadsheet software. Faulty quality assurance allows the logic bomb to go into production releases.
The logic bomb in the spreadsheet software receives payload instructions from an Internet connection.
At a specified time, the spreadsheet program alters data in millions of spreadsheets in, say, the U.S. Calculations go awry in subtle but dangerous ways. Since almost no one bothers to document their spreadsheets or provide test suites that can validate the calculations, few people notice the changes.
Business, engineering, medical and academic users make mistakes - they allocate the wrong amounts to investments and inventory, they predict the wrong stresses on bridge components, they calculate bad dosages for patient medication and they assign good grades to bad students.
This situation leads to decreased efficiency in the U.S. economy and is a contributing factor to a national and eventually international recession.
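The scenario turns on the absence of even a crude test suite for spreadsheet calculations. For contrast, here is a minimal sketch of a checkpoint test that mechanically re-verifies a few critical figures; the file name, cell coordinates and expected value are hypothetical:

```python
import csv

# Checkpoint values recorded the last time the spreadsheet was reviewed
# (the file name, coordinates and figure below are hypothetical).
EXPECTED = {
    ("budget_q1.csv", 42, 7): 1_482_316.55,   # (file, row index, column index): value
}

def check(path, row, col, expected, tol=0.01):
    """Fail loudly if a checkpoint cell has drifted from its recorded value."""
    with open(path, newline="") as handle:
        rows = list(csv.reader(handle))
    actual = float(rows[row][col])
    if abs(actual - expected) > tol:
        raise ValueError(f"{path} row {row}, col {col}: "
                         f"expected {expected}, found {actual}")

for (path, row, col), value in EXPECTED.items():
    check(path, row, col, value)
print("all checkpoints verified")
```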
This scenario is an example of asymmetric information warfare - electronic sabotage on a grand scale but for low cost. Winn Schwartau used just this kind of scenario in his 1991 novel, Terminal Compromise - great fun and still available from Interpact (e-mail email@example.com).
So the next time you play with an Easter egg in commercial software, stop and think. Should you express your concerns to the manufacturer, instead of just chuckling over a programmer's joke?