Microsoft Security Intelligence Report scores low IQ

Redmond recently published its semi-annual recap on the (in)security of its leading products.  The Microsoft Security Intelligence Report (MSIR), released approximately two weeks ago, provides an "in-depth perspective" on the second half (Jul-Dec) of 2007.  As usual, a professional-looking report with statistics and graphs is presented to the reader.  However, after successfully downloading and reading their Key Findings Summary, it appears to have been co-authored by Microsoft Bob and Clippy.

I'm aware that in the fast-paced world of security blogging, talking about the release of the MSIR is old news by now.  However, I thought I would wait a week or two to see if people actually read and analyzed the report.  Since I have yet to read any skepticism or criticism, I'm left to assume that the answer is no.  With my corrective red pen in hand, I downloaded and thoroughly read similar vendor reports covering the same time period.  I think there may be a few questionable graphs, statistics, and statements in Microsoft's report.

It is important to remember that these reports present their findings based on data each vendor collects, which can vary considerably.  However, with large enough data sets, studies will often yield similar trend results.  Microsoft's report is no exception, presenting many of the same trends found in the other vendor studies, though this may simply be a function of Microsoft's usage share of operating systems and web browsers.

While some of the data they provide is more speculative than authoritative, the ultimate downfall of the MSIR is its lack of standardization and clarification of data and graphs.  Many of the comparative statistics are rendered meaningless by erroneous assumptions and missing relative variables.  Frankly, it became pretty painful to read after a while.

The report begins with a few statements about vulnerability disclosures.  They note a 15% drop in new vulnerability disclosures, and a decline in disclosures of high-severity vulnerabilities.  They fail to mention that Microsoft was the vendor accountable for the most disclosed vulnerabilities, accounting for 3.7% of all those reported in 2007, according to IBM Internet Security Systems' X-Force 2007 Trend Statistics report.  Regardless, the number of vulnerability disclosures is a difficult statistic to interpret.  Historically, the motives and trends behind disclosure have been a subject of debate, with recent changes thought to be economically influenced.

Their discussion of vulnerability trends would have carried more contextual meaning if presented as a function of security patches.  Given the highlighted decrease in low-complexity exploits, one might assume the same trend for patches.  They go on to state that high-severity vulnerabilities are harder to exploit and require some level of specialization.  Is this always true?  Once disclosed, many exploitable vulnerabilities can have severe consequences until patched.

Microsoft matched publicly available exploit code with corresponding CVE identifiers and Microsoft security bulletins.  Presenting the annual number of bulletins as a function of unique vulnerabilities for 2006 and 2007, they demonstrated a decrease in both the number of bulletins and the number of unique vulnerabilities per bulletin last year.  However, there is no mention of the vulnerabilities' severity, the number of products affected, the time before patch deployment, or the consequences of exploitation.

When a product-by-product comparison was performed, they discovered that more recent products appear to be more secure than earlier products.  Let's see...newer products, more secure...older products, less secure...that sounds about right.  In fact, the obviousness of this statement made me question where this line of logic was leading.  To illustrate the point, they noted that the exploit risk for Microsoft Office had decreased from Office 2000 to Office XP, decreased again from Office XP to Office 2003, and then yet again from 2003 to 2007.  No surprise there.  In summation, they proudly declared that the risk posed to the Office 2007 system was an "impressive 41.3% decrease when compared to Office 2000".  Have they been reduced to marketing Office 2007 by demonstrating its superior security over Office 2000?  Are they referring to the same Microsoft Office 2000 whose mainstream support was retired in June 2005?

Their comments regarding security breach notifications "as a lens into security failures" were pointless.  Several hundred reports and in-depth studies on security breach notifications can already be found.  Even those are of questionable validity, since self-reported breaches account for only a fraction of the true number.  If a company does suffer a security breach, does it report directly to Microsoft?  As soon as I saw the graph that accompanied their two bullet points on security breaches, the reason for its inclusion was evident.  It is an incorrectly used bar graph, plotting breach types versus percentage of total, arranged so that a quick glance gives the impression that some security metric is decreasing over time.

The next section covered malware and served as a campaign platform for their "all-knowing, all-detecting and all-reporting" Malicious Software Removal Tool (MSRT).  Once again, they provide data that is rendered useless in the absence of context.  Stating only that the total amount of malware removed by the MSRT increased by 40% is meaningless on its own.  Is the increase due to the growing volume of malware?  Or the widespread use of the MSRT?  Is it because the MSRT runs on startup and removes the low-hanging fruit before Ad-Aware and Spybot have had a chance to run?  Or was the MSRT basically doing nothing before, and now it's just doing nothing 40% better?
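To make the base-rate objection concrete, here is a back-of-the-envelope sketch with entirely invented numbers: total removals can grow 40% while the tool's removal *rate* actually falls, if the number of MSRT executions grows faster.

```python
# Hypothetical figures only -- the MSIR does not publish these denominators,
# which is exactly the problem.

def removal_rate(removals, executions):
    """Removals per million MSRT executions."""
    return removals / executions * 1_000_000

# Period 1: 10M removals across 300M executions (invented)
rate_h1 = removal_rate(10_000_000, 300_000_000)

# Period 2: removals up 40%, but the install base grew 75% (invented)
rate_h2 = removal_rate(14_000_000, 525_000_000)

print(round(rate_h1))  # ~33333 removals per million executions
print(round(rate_h2))  # ~26667 -- removals rose 40%, yet the rate fell
```

With these made-up denominators the headline "40% more malware removed" would actually mask a declining detection rate, which is why the raw count tells us nothing.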

Actually, I'm not sure why the MSRT is even mentioned in a discussion of malware trends.  Microsoft describes the awesome power of the MSRT on its website, boasting, "The tool removes only specific, prevalent malicious software. Specific, prevalent malicious software is a small subset of all the malicious software that exists today," and then adding, "The tool cannot remove malicious software that is not running. However, an antivirus product can perform this task."  Pretty powerful stuff.

Nowhere do they make any disclaimer that the MSRT may not be reporting back to Microsoft at all.  Many privacy advocates, like me, never optionally allow any information regarding internet activity to be gathered and sent to third parties.  (Unless it's a really good party.)  Therefore, you may have individuals running the MSRT who have created the registry entry,


Entry name: DontReportInfectionInformation
Value data: 1

which disables its reporting feature.  For some reason, they go on to comment on the reduced prevalence of spyware, despite the same aforementioned MSRT site stating, "It does not remove spyware."  Comically, Microsoft's spyware removal tool, Windows Defender, is never mentioned once in the report's text.  It is referenced only in figure 9, one of the many examples of "how not to represent data."
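For readers who want to opt out themselves: per Microsoft's own KB article on the MSRT deployment (KB 891716), this entry lives under the MRT policy subkey.  A .reg fragment creating it would look like the following (path per that KB article; apply at your own discretion):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\MRT]
"DontReportInfectionInformation"=dword:00000001
```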

Apparently, the graphs for this report were created the night before it was due, a method I can relate to.  Microsoft, like many industry vendors, seems to lack personnel trained in the fundamental principles and practices of effective graph design.  The MSIR demonstrates this point quite well.  In figure 1, they have incorrectly used a non-normalized stacked bar chart to show the CVSS severity of vulnerability disclosures over time.  What they have titled figure 5 is a mis-scaled clustered column chart: the number of disinfections quantified on the Y-axis is too great to represent the numerous categories on the X-axis.  Furthermore, the colored legend, which indicates time periods, is not listed chronologically (an important trait for a variable like time).

They have creatively combined a process cycle diagram with a pie chart, creating a confusing hybrid graph called figure 6.  Worthless in value, it tries to show the normalized number of computers cleaned by the MSRT per OS version.  Instead, the multi-colored circles demonstrate why you never place similar colors adjacent to one another, and why pie(-ish) charts should be ordered by size to facilitate interpretation and meaning (both lacking here).  Instead of critiquing all the graphs, I will just recommend that they buy a copy of Edward Tufte's The Visual Display of Quantitative Information, read it, memorize it, and carry it at all times.

At this point, it's probably overkill to go into the poor use of inferential statistics, the lack of variable relationships, the absence of statistical correlation, and the fact that essentially nothing in the report distinguishes causality from chance.

You'll just have to take my word for it. 

Nevertheless, I really appreciate their effort and believe they gave it their best shot.  When it comes to security intelligence reports, everyone's a winner!

Thanks for playing.  I can be box plotted at:

Copyright © 2008 IDG Communications, Inc.
