The contradictions of software releases and updates

Keeping up with the new features of software releases and upgrades is becoming more and more difficult.

Long ago, people waited in front of retail stores for Microsoft Windows releases, then Apple releases. Some of those releases turned out to be monumental screw-ups once users actually got their hands on them: all the genuflecting, tent-raising, and line-waiting ended in gnashed teeth, wailing, reformatting, and machines tossed from the roofs of office buildings.

If an operating system consists of a running kernel, plus apps, device drivers, and a raft of legacy user products, it becomes impossible to fully regression-test all of the possibilities. This in no way excuses vendors that don't test their products before releasing them. It does explain how our industry landed in a permanent state of partial, rolling change that is no longer marketing-event-driven.

Oh, boy! Apple Yosemite after Mavericks is Free! Free! And now—get this: if you bought any of the foolishness leading up to Windows 8.1, after all of its headaches (my opinion is that the tech press ganged up on some stupid stuff and blew the "failures" of Windows 8.x way out of proportion), you can get Windows 10 Free! Here, let me add some more !!!!!!, just to give things a sense of fanfare.

Virtually every desktop OS now has rolling, slipstreamed upgrades. Please, please get our beta, so that we can make you our crash-test dummies, and we'll give it to you for free when it's finally, umm, finished. Test all this stuff out for us! Please!

A decent part of the rationale behind this ploy is to test parts of a release against small kernel changes, while discovering what doesn't work with what—new stuff or legacy. No longer is there any guarantee that an OS platform will be adopted by all-new users with absolutely no legacy problems, because now, everything IS legacy. Occasionally, someone does something genuinely different. Thunderish IO. USB81. Pulse-taking keyboards. Wearable synchronizable stuff.

I remember Steve Ballmer promising, way back at Windows NT 4.0, that all codebases would be unified. Yes, folks, a single kernel. No more 16-bit for you, and 32-bit ring-zero for you. All one code base. Linux had a similar foray: prior to 2008, every Linux user had a server on their desktop, with a GUI in GNOME or KDE. Apple users were on a slow boat to nowhere until the tenets of the Mac finally became OS X. Then Apple made a huge investment in servers, but no one likes shiny, expensive 1U servers, no matter their specs.

The results:

  • Linux has three main server trees, divided by processor/bit support. These range from versions 2.x through 4.x, not counting a raft of sub-types, including various server editions, sub-server editions (think: OpenStack ISOs, etc.), and desktop editions (each with a UI varietal like GNOME, KDE, Xfce, etc.), each slipstreaming from various sources through two main updating branches, Red Hat/CentOS/SUSE and Debian/Ubuntu. New releases have become milestones, and a yawn. Even Linus Torvalds shrugged his shoulders and said something like, "Ok, guess it's 4.0 now…" (Even telling which series a given box runs takes a script; see the sketch after this list.)
  • Windows XP support folds entirely after July 2015 (when even the last anti-malware signature updates cease), orphaning a generation-plus of people whose machines sort of worked, and largely forcing them to leapfrog two versions (Windows 7 and 8.x), plus the ghost version of Windows 9, to get to Windows 10, which lots of people are testing right now through a semi-controlled pre-release program (the Windows Insider Program). Windows Server editions are in the 2012 R2 cycle, with various add-ins and add-ons now leaning toward partial cloud versions built on Windows Azure, a product that Microsoft hopes IT staff will get addicted to using for various projects. Those projects quickly turn to concrete, leaving them situated where else but in the Azure Cloud(s). Microsoft intends to keep these Azure advances chiming with a push toward Windows Phone—whose Windows 10 version may be forthcoming after a long catch-up.
  • Apple has largely left the server business, but its desktop OS is essentially free, and new feature sets go mostly to those who keep up with the latest upgrade, in the current case, Yosemite. Despite rumors, Apple's iOS and OS X upgrade in tandem and attempt to stay in feature sync.
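
Since I mentioned that 2.x-through-4.x kernel spread above, here is a minimal sketch (assuming a Linux host and Python 3; this is my own illustration, not any vendor's tooling) that reports which major kernel series a box is running:

```python
# Minimal sketch: report which major kernel series (2.x, 3.x, 4.x)
# this Linux box is running. Assumes Python 3 on a Linux host;
# the parsing is illustrative, not exhaustive.
import platform

def kernel_series() -> str:
    release = platform.release()      # e.g. "4.0.5-generic" (same as uname -r)
    major = release.split(".", 1)[0]  # leading major-version digit(s)
    return "{0}.x series (kernel release: {1})".format(major, release)

if __name__ == "__main__":
    print(kernel_series())
```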

Keeping apprised of current feature sets has therefore become more difficult, and it will remain difficult, because there are four vectors at work here concurrently.

Security/Asset Protection, Lifting All Boats Has Become Impossible, Nothing Is Foolproof, and People Trust Too Much

The most valid reason for micro-upgrades is that security components are built from core components that interact with other core components, and surgical changes are smarter than forklift upgrades. That's not to say that some part of your domain won't someday need heretofore untouched hardware components, like routers, swapped over to software-defined routing, perhaps very, very rapidly. Router compromises will stop your business cold, no matter the diversity of your infrastructure.

Securing your assets in such a way that the assets don't lose value or compromise privacy will continue to be terribly difficult, in my estimation. Even with one vendor, let's say Microsoft for purposes of argument, very few organizations have a flat profile of servers and software components. The business of Microsoft, and all of the sponsors of Linux, and Apple itself, is to sell you new stuff. Maintaining and hardening revisions is tough. It feels like each vendor just puts more and more updated stuff into their OS payload until they can call it a new version. And that new version of updated stuff, plastered with a few compelling new features, comprises value that we'll pay for. A new version lifts many boats, but by the time releases occur, updates are already in the pipeline, because: security entropy.

Recent zero-days have taught us that keeping patch/fix levels high is important, but nothing is foolproof. Look at how OpenSSL and the Heartbleed flaw made almost every HTTPS web server (yes, save Microsoft's and a handful of others) vulnerable. On the Dark Web, I'm guessing, there are many zero-days available for a price, along with data from Home Depot, Target, Anthem, and many others. Some were vulnerabilities; some were systems that weren't foolproof.
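
If you want a quick sanity check, here's a minimal sketch in Python (my choice of tooling, nothing the vendors ship) that asks whether the OpenSSL build your Python is linked against sits in the Heartbleed-vulnerable range, 1.0.1 through 1.0.1f. Note that it inspects only Python's own OpenSSL, not whatever your web server actually loads:

```python
# Minimal sketch: flag whether the OpenSSL that Python is linked
# against falls in the Heartbleed-vulnerable range (CVE-2014-0160),
# which spans 1.0.1 through 1.0.1f. This checks only Python's
# linked OpenSSL, not your web server's.
import ssl

# ssl.OPENSSL_VERSION_INFO is (major, minor, fix, patch, status);
# patch 0 means bare "1.0.1", patch 6 means "1.0.1f".
VULN_LOW = (1, 0, 1, 0)
VULN_HIGH = (1, 0, 1, 6)

def linked_openssl_vulnerable() -> bool:
    return VULN_LOW <= ssl.OPENSSL_VERSION_INFO[:4] <= VULN_HIGH

if __name__ == "__main__":
    print(ssl.OPENSSL_VERSION)
    print("In the Heartbleed-vulnerable range!" if linked_openssl_vulnerable()
          else "Outside the known vulnerable range.")
```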

My bottom line is that we have an incorrectly placed baseline of trust. Trust no site. Not your own, and no one else's, either. Governments have done pretty much nothing to bring these injustices to trial. But hey, our own NSA infects stuff. Our own government. For us, keeping it all patched and shrugging our shoulders isn't good enough anymore. Instead, my dystopic thoughts turn toward Sean Gallagher's essay about CyberGeddon.
