Network World - You read last week's Backspin (you did, didn't you?) so you'll know it included a riposte from my old friend and verbal sparring partner, Winn Schwartau, to my April column "Curse You, Users."
Winn's contention was that "in so many ways, the IT community can be legitimately accused of Epic Fail. We have overestimated users beyond comprehension and that is our fault. There are three million of us geeks and three billion of them. Mark and I are both right, but seem to disagree on how high the digital literacy bar should be."
Winn raised a really interesting point: How high should the digital literacy bar be? Another way of thinking of that is, how low can it go and still make computers useful?
The all-time low-water mark in this debate was Microsoft Bob, the company's 1995 attempt to wrap Windows in a cartoonish "friendly" interface. What Bob showed the market was two things: First, Microsoft could make huge, expensive and inconceivably ridiculous mistakes; and second, "dumbing down" the user interface doesn't solve the problem of making the average user more comfortable with using a computer and more productive.
When I say "dumbing down" I really mean it. Microsoft Bob was death by metaphor, turning even the simplest operations into labyrinthine tasks involving "assistants," animated software characters that were supposed to help you but really just served to annoy the crap out of you.
And, of course, these assistants eventually led to the execrable Clippy, one of a handful of the most annoying conceits ever spawned by the insane engineers and designers trapped in the Microsoft laboratories. (Smithsonian Magazine called Clippy "one of the worst software design blunders in the annals of computing.")
Microsoft eventually realized that Bob wasn't so much a failure -- it was more of a disaster -- and dropped the product like it was radioactive. Which brings me to the whole issue of how to make computers simpler for users.
The answer is not to dumb down computers but to dumb them up: make the hardware and software so well designed that the barriers to understanding and productivity are minimal. And what product has managed to achieve this lofty goal? Roll the drums, please ... the Apple iPad.
As much as I hate to hand the future of end user computing to a single company, the iPad is da bomb! It really is. Not for heavyweight stuff like Adobe's Photoshop or Wolfram's Mathematica, but rather for everyday programs such as document creation and editing and handling email.
Is the Apple iPad the only game in town? Of course not; there are some other really nice pad-style hardware platforms out there, but none of them has the right software and many have poor hardware.
BlackBerry's PlayBook? Nice, but RIM made some dumb design decisions with it: charging works only while the device is on, and it doesn't come with an email client (you have to tether it to a BlackBerry phone to handle email). Tablets running Windows 7, meanwhile, tend to look like a collection of spare parts flying in formation. Android? Much better than the others, but compared to the iPad it subtly misses. It's just not as, well, elegant. It takes users too much time to get things done, and each vendor does its own thing with the Android operating system.