Still using the SWAG, the "scientific wild-ass guess" method of valuing data?

Sometimes the value of data is defined by a service-level agreement, and sometimes we come up with our own valuation methods. Companies that lack an SLA and that, whether they admit it or not, don't really know much about their storage typically use the SWAG method. The SWAG most of us take when deciding whether a data set is valuable is based, in most cases, either on how recently the data was accessed or on how frequently a file has been accessed. Both are pretty poor contrivances for deciding how much service, and what kind of service, the data should receive.

Last issue, we discussed some of the problems with the assumption that the older data gets, the less valuable it becomes. I noted that value over time is often a far less reliable metric than an understanding of the new uses to which the data will be put, and of how those uses will drive access. This may seem a small distinction, but it lies at the heart of understanding the data lifecycle.

Now we come to what is probably the worst valuation method: judging data by how frequently it is accessed, or by whether it has been rewritten. In an ideal world, usage and value would have a direct relationship, so that the more frequently a file is accessed, the more valuable it could be assumed to be. That, of course, assumes that all the data currently in the system belongs in the system. In the real world, despite quotas and the best efforts of management, much of the data on our systems shouldn't be there in the first place. I have heard some system administrators estimate that more than half the data in their shops is personal or otherwise inappropriate in a corporate context. If there is no justifiable reason for this stuff to be there at all, there is no way we should be ensuring its continuity by throwing backup resources at it.

A trivial example: the spreadsheet files containing each department's sports pool will likely get heavy usage on Fridays and Mondays during football season. Spreadsheets, of course, are not easily tagged as potential offenders the way MP3 files and the like can be, so when a bit is flipped indicating that a file's content has changed, in most environments the file qualifies for a backup.

Need a real metric? I don't have one, but here is one way you might begin to build your own. Some percentage of the e-mail your site receives each day is for non-business use, and some percentage of that certainly carries attachments. How much of that material finds its way out of the e-mail system and into the file system? Because the accumulation repeats daily, it is easy to see how corporate disk assets get used up in a hurry; the sketch below runs the arithmetic.
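As a rough illustration only, here is a back-of-the-envelope sketch of that estimate in Python. Every figure in it is a placeholder assumption, not a measurement; substitute the rates you actually observe at your own site.

```python
# Back-of-the-envelope estimate of how fast non-business e-mail
# attachments can consume corporate disk. All numbers below are
# placeholder assumptions -- replace them with measured figures.

MESSAGES_PER_DAY = 50_000      # total inbound mail per day (assumed)
NON_BUSINESS_FRACTION = 0.30   # share that is personal (assumed)
ATTACHMENT_FRACTION = 0.20     # personal mail carrying attachments (assumed)
AVG_ATTACHMENT_MB = 2.0        # average attachment size in MB (assumed)
SAVED_TO_FS_FRACTION = 0.25    # share saved out of mail onto the file system (assumed)

# Daily disk growth attributable to non-business attachments.
daily_mb = (MESSAGES_PER_DAY
            * NON_BUSINESS_FRACTION
            * ATTACHMENT_FRACTION
            * AVG_ATTACHMENT_MB
            * SAVED_TO_FS_FRACTION)

print(f"Estimated non-business file-system growth: {daily_mb:,.0f} MB/day")
print(f"Over a 250-day work year: {daily_mb * 250 / 1024:,.1f} GB")
```

With these made-up rates the answer is about 1,500 MB a day, or roughly 366 GB a year, all of it eligible for backup the moment it lands on the file system.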
The only thing that can be said in favor of a system that backs up files just because they have changed is that it is still better than the alternative many shops rely on, even in 2003: make no decision at all and back up everything. Nobody's data gets slighted (except for those who couldn't get serviced during the maintenance window), and the approach does have added social value, in that it keeps IT managers on a first-name basis with their media providers.

We probably need a better way to manage our data.
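For the curious, here is a minimal sketch of the changed-since-last-backup test described above, assuming a POSIX-style file system; the share path and the cutoff time are hypothetical. Note that nothing in it asks whether a file is worth keeping, only whether it changed.

```python
#!/usr/bin/env python3
# Minimal sketch of the "changed, therefore back it up" test that most
# incremental schemes reduce to: any file modified since the last backup
# qualifies, whether it is a quarterly forecast or the department sports
# pool. The root path and cutoff below are assumptions for illustration.

import os
import time

LAST_BACKUP = time.time() - 24 * 3600   # assume the last run was 24 hours ago
ROOT = "/shared"                         # hypothetical departmental file share

def files_to_back_up(root, since):
    """Yield every file whose mtime is newer than the last backup.
    There is no notion of value here -- only of change."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.stat(path).st_mtime > since:
                    yield path
            except OSError:
                pass  # file vanished or is unreadable; skip it

for path in files_to_back_up(ROOT, LAST_BACKUP):
    print(path)  # football-pool.xls makes the list right alongside payroll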