
Protecting data in a hybrid cloud environment

May 01, 2018 | 10 mins
Cloud Security, Data Management, Hybrid Cloud

In an era when ‘data is the new oil,’ protecting your business’ data is a critical element of your storage strategy. Here’s how you can keep your customers’ information – and your own business – safe and sound.

The past few months have been incredibly instructive on the critical importance of keeping one’s data safe, be it customer data or your own intellectual property.  Data protection itself covers a broad span:

  • Physical data protection
  • Protection from device failure
  • Protection from data loss and breach

Not only is data security important to the success and reputation of your company; it is often IT that gets thrown "under the bus" when a security event occurs. That means your career may be on the line. As a result, your storage architecture had better be up to the task of maintaining the integrity of your data store.

Hybrid-cloud architectures provide one of the most secure means of protecting stored data

The good news is that the hybrid-cloud storage architecture we’ve been examining in this column is one of the best potential solutions for small- and medium-sized enterprises (SMEs) to leverage when security is of paramount importance. It delivers a secure, end-to-end architecture that provides the flexibility of the cloud with the performance of an on-premises solution, while still encrypting data flows from one site to the other.

You might well ask: why can't a data center be made as secure and fault-tolerant as the cloud? It can, but doing so is very costly, and while affordable for very large enterprises, this option is out of reach for SMEs. At their scale, cloud providers can afford highly qualified specialists in redundant facility design, network security and network operations, and can develop optimized products and processes. Public cloud data centers typically carry at minimum SOC 2, ISO 27001 and PCI DSS compliance, and many extend to federal compliance standards.

Public cloud providers are also starting to apply big data and AI techniques to monitoring their cloud operations for leaks and misconfigurations. Only the largest organizations can afford to build this expertise in-house. Public cloud providers rely on their brand to protect their business and invest accordingly, while many CIOs and IT managers will be only too aware that IT is still often considered a cost center. Hybrid-cloud storage enables SMEs to garner the benefits of cloud scale and efficiency, including the soft benefits of expertise and operational excellence.

Physical data protection

Cloud protection starts with physical security: protecting against theft, loss, accidents, power failures and natural disasters. Cloud data centers are physically secure facilities, often in remote areas, with multiply redundant, backed-up power supplies, redundant telecom connections and controlled building access; their sheer size and the nature of their storage management make it nearly impossible to identify the physical location or device storing any one organization's data. By comparison, many enterprises have at best a single data center, while SMEs might have just an in-building server room or data closet. Very small companies may have nothing more than a NAS sitting unprotected on site. To protect against physical data loss, it is essential to have a physically separate offsite backup copy. Unsurprisingly, simple backup to the cloud is the oldest cloud application and, until the advent of big data and cloud compute, was one of the largest consumers of cloud storage.

For physical separation, cloud storage is divided into redundancy or availability zones.  Users can select from multiple zones within one data center (locally redundant) or data can be duplicated across different data centers in different locations in a region (zone redundant) or in different regions (geo-redundancy).  Unlike traditional storage tiering or offsite backup, cloud-based storage is distributed across redundancy zones and handled by the cloud storage system software transparently to users.
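The redundancy-zone idea can be sketched in a few lines of Python. This is a hypothetical illustration, not any provider's actual implementation: separate temporary directories stand in for data centers, and every write is duplicated to all of them before being acknowledged, so a read survives the loss of any single "zone."

```python
import os
import tempfile

class RedundantStore:
    """Toy model of zone-redundant storage: each object is written to
    several zones (directories standing in for data centers)."""

    def __init__(self, zone_dirs):
        self.zones = list(zone_dirs)

    def put(self, name, data: bytes):
        # Duplicate the write to every zone before acknowledging it.
        for zone in self.zones:
            with open(os.path.join(zone, name), "wb") as f:
                f.write(data)

    def get(self, name) -> bytes:
        # A read succeeds as long as at least one zone still holds the object.
        for zone in self.zones:
            path = os.path.join(zone, name)
            if os.path.exists(path):
                with open(path, "rb") as f:
                    return f.read()
        raise FileNotFoundError(name)

zones = [tempfile.mkdtemp() for _ in range(3)]
store = RedundantStore(zones)
store.put("report.txt", b"quarterly numbers")

# Simulate losing an entire zone; the data is still recoverable.
os.remove(os.path.join(zones[0], "report.txt"))
print(store.get("report.txt"))
```

The key point the sketch captures is that the duplication happens inside the storage system, transparently to the user, rather than as a separate backup step.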

Protection from device failure

The next stage is protection from data loss stemming from device failure. No matter the storage medium, there is always the risk of device failure: with HDDs it is inevitable, and the flash devices used in SSDs wear out. RAID technology was developed to protect against drive failure, although with very large drives RAID is increasingly less effective. For traditional storage, industry best practice is to follow a 3-2-1 backup strategy: keep three copies of your data, on two different types of media, with one copy offsite. This quickly becomes expensive both in hardware and in IT time spent on maintenance, time that could be spent on strategic business initiatives.
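The 3-2-1 rule is easy to show concretely. In this minimal sketch, two directories stand in for the second local device and the offsite (cloud) target; the function name and paths are illustrative, not from any real backup product.

```python
import pathlib
import shutil
import tempfile

def backup_321(source: pathlib.Path,
               second_device: pathlib.Path,
               offsite: pathlib.Path) -> None:
    """Copy `source` to a second device and to an offsite location,
    yielding three copies in total (original + two backups)."""
    for target in (second_device, offsite):
        target.mkdir(parents=True, exist_ok=True)
        shutil.copy2(source, target / source.name)  # preserves metadata

base = pathlib.Path(tempfile.mkdtemp())
primary = base / "invoices.db"
primary.write_bytes(b"customer invoices")

# "usb_drive" plays the second medium; "cloud_bucket" plays the offsite copy.
backup_321(primary, base / "usb_drive", base / "cloud_bucket")
```

The expense the article mentions comes from doing this reliably at scale: scheduling, verifying, and rotating these copies is exactly the maintenance burden hybrid-cloud storage offloads.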

A variant of data loss is inadvertent or malicious deletion. Over time, users, and even IT managers, of file-hosting and collaboration solutions such as Dropbox and Office 365 have become so accustomed to cloud reliability that they assume files are always available. However, a deleted file is only recoverable for a short time. A 2015 study by EMC found the top causes of data loss were accidental deletion (41%), migration errors (31%) and accidental overwrites (26%). To protect against this, several new products that provide cloud backup, especially for Office 365, are becoming available.

Data can also be lost to corruption by viruses or ransomware. Ransomware is the most prevalent form of malware today, per Verizon's 2018 study of business risks. Recent examples include the WannaCry attack, and the major metro area of Atlanta, Georgia is still reeling from a ransomware attack that crippled the city's applications, from payroll to public transportation.

With a hybrid-cloud architecture, the authoritative data store is in the cloud and gains all the benefits of cloud storage, while still presenting a traditional on-premises filer interface, with the added advantage that the filer is no longer a critical, high-maintenance component. Because the filer is just a cache of the cloud data, a replacement filer will simply replenish itself with the most active files as they are accessed.
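The cache-filer behavior can be modeled as a simple read-through cache. In this hypothetical sketch, one directory is the authoritative "cloud" copy and another is the on-premises cache; a cache miss is served by fetching from the cloud, so a brand-new (replacement) filer refills itself on first access.

```python
import os
import shutil
import tempfile

class CacheFiler:
    """Toy read-through cache: the cloud copy is authoritative and the
    local 'filer' directory merely caches what users actually touch."""

    def __init__(self, cloud_dir: str, cache_dir: str):
        self.cloud = cloud_dir
        self.cache = cache_dir

    def read(self, name: str) -> bytes:
        cached = os.path.join(self.cache, name)
        if not os.path.exists(cached):
            # Cache miss: replenish from the authoritative cloud copy.
            shutil.copy2(os.path.join(self.cloud, name), cached)
        with open(cached, "rb") as f:
            return f.read()

cloud = tempfile.mkdtemp()
with open(os.path.join(cloud, "plan.txt"), "wb") as f:
    f.write(b"v1")

filer = CacheFiler(cloud, tempfile.mkdtemp())
filer.read("plan.txt")  # first access populates the cache

# If the filer hardware is lost, a replacement simply refills on demand.
replacement = CacheFiler(cloud, tempfile.mkdtemp())
print(replacement.read("plan.txt"))
```

This is why the filer stops being a critical component: its entire contents can be reconstructed from the cloud store.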

Data in cloud storage is spread across multiple drives, and the drives are managed throughout their lifecycle by the cloud provider to prevent data loss and make failed-drive replacement transparent to the user. As noted above, data can also be saved in geo-redundant locations for maximum protection.

For additional protection, the cloud object store can be configured with versioning and made immutable, meaning data can only be written, not erased (in practice, time limits can be set after which erasure is enabled). This ensures any saved version of a file is always available for recovery.
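A minimal sketch of that write-once, versioned behavior, with an illustrative retention window (the class, naming scheme and API are assumptions for the example, not a real object-store interface):

```python
import pathlib
import tempfile
import time

class ImmutableStore:
    """Toy versioned, write-once store: every put adds a new version,
    and deletion is refused until a retention window has elapsed."""

    def __init__(self, root: str, retention_seconds: float):
        self.root = pathlib.Path(root)
        self.retention = retention_seconds
        self.versions = {}  # name -> list of (timestamp, path)

    def put(self, name: str, data: bytes) -> None:
        history = self.versions.setdefault(name, [])
        path = self.root / f"{name}.v{len(history)}"
        path.write_bytes(data)  # writes only ever ADD a version
        history.append((time.time(), path))

    def get(self, name: str, version: int = -1) -> bytes:
        return self.versions[name][version][1].read_bytes()

    def delete(self, name: str) -> None:
        newest_ts, _ = self.versions[name][-1]
        if time.time() - newest_ts < self.retention:
            raise PermissionError("retention window still active")
        self.versions.pop(name)

store = ImmutableStore(tempfile.mkdtemp(), retention_seconds=3600)
store.put("policy.doc", b"draft")
store.put("policy.doc", b"final")
print(store.get("policy.doc", version=0))  # earlier version still on hand
```

Even if ransomware overwrites the current version, every prior version remains recoverable, and the retention window blocks deletion outright.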

Disaster recovery/file level recovery 

With legacy NAS devices based on hard drives, we know the drives will inevitably fail; it is only a matter of time before data must be recovered. Disaster recovery is one of the most basic protection mechanisms available, and everybody recognizes it as an important baseline to have in place. However, many businesses today maintain two separate systems: one for primary storage and another for backup and disaster recovery (DR).

Leveraging the hybrid-cloud model streamlines this significantly, as SMEs use the same cloud storage service for both primary storage and backup/DR. The hybrid-cloud storage architecture consolidates files into a single store. This is especially beneficial to organizations with multiple sites, as it avoids storing multiple copies on separate file servers, with the attendant replication costs, version headaches and overhead. With the scalability and falling cost of cloud storage, combined with full namespace visibility and cached cloud filers, it makes sense to keep every file available in the cloud at all times.

Hybrid-cloud storage services support file-level restore combined with versioning, letting users find prior versions of their files; you can restore individual files without having to recover the whole data store. And all of this runs over a high-performance connection as part of the on-premises acceleration.
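File-level restore falls out naturally when versions are kept side by side. In this hypothetical sketch the `name.v0`, `name.v1`, ... naming scheme is invented for illustration; the point is that rolling one file back is itself just another append, so it never disturbs the rest of the store.

```python
import pathlib
import tempfile

def list_versions(root: pathlib.Path, name: str):
    """All stored versions of one file, oldest first (toy naming scheme)."""
    return sorted(root.glob(f"{name}.v*"))

def restore(root: pathlib.Path, name: str, version: int) -> pathlib.Path:
    """Make a chosen prior version current again, as a NEW version."""
    versions = list_versions(root, name)
    data = versions[version].read_bytes()
    new = root / f"{name}.v{len(versions)}"
    new.write_bytes(data)  # restore is an append, not an in-place edit
    return new

root = pathlib.Path(tempfile.mkdtemp())
(root / "budget.xls.v0").write_bytes(b"good numbers")
(root / "budget.xls.v1").write_bytes(b"corrupted by ransomware")

restore(root, "budget.xls", 0)           # roll just this file back
print(list_versions(root, "budget.xls")[-1].read_bytes())
```

Only the one damaged file is touched; every other file in the namespace, and every prior version, stays exactly where it was.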

Protection from data loss and breach

The third part of data protection is protection from data breaches incurred through human behavior. Many breaches, and even ransomware incidents, start with phishing attacks built on social engineering. Another problem, especially with file-hosting solutions, is shadow IT: employees uploading restricted data to an unauthorized personal cloud file-hosting application such as Google Drive, OneDrive or Dropbox.

Many of these services do NOT deliver encrypted end-to-end traffic, though that is perhaps to be expected from consumer-oriented offerings. The bigger issue is that all of them readily facilitate file sharing, leaving IT with no knowledge of what files have been shared and with whom. This can easily violate compliance regimes like CJIS (Criminal Justice Information Services), FERPA (Family Educational Rights and Privacy Act), HIPAA (Health Insurance Portability and Accountability Act), MPAA (Motion Picture Association of America) and GDPR (General Data Protection Regulation).

Data breaches remain a significant IT problem, mostly as a result of human error. While the best prevention is training, systems and process, an ongoing challenge is knowing that a breach has occurred. By avoiding shadow IT, investing in audit tools, using identity management (such as Azure AD) combined with device management, and encrypting files at rest and in transit, breaches can be better avoided, and identified when they do occur.

Until recently there was no requirement to report breaches, and they typically became publicly known only when they hit the news. The GDPR changes that by making breach reporting mandatory. The GDPR, which comes into effect May 25, 2018 with severe penalties, both monetary and otherwise, applies not only to organizations located within the EU but also to organizations outside the EU that offer goods or services to, or monitor the behavior of, EU data subjects. It applies to all companies processing and holding the personal data of data subjects residing in the European Union, regardless of the company's location.

While most major cloud vendors (AWS, Azure, etc.) fully intend to be GDPR compliant, it’s incumbent upon you and your IT organization to ensure your on-premises and global file system together make for a compliant storage architecture.

By adopting a hybrid-cloud architecture with secure on-premises filers for access, encrypting data at rest and in transit, utilizing identity and device management and audit capabilities, preventing shadow IT, and limiting who files can be shared with and how, breaches can be minimized. In the unfortunate event of a breach, accurate log files, immutable data and versioning will speed forensics and recovery.

Maintaining security on an ongoing basis – audits/reviews

Of course, once you secure your hybrid-cloud storage architecture, there is no guarantee it will stay that way. Constant vigilance is warranted, as is regularly reviewing your platform to ensure it still meets your requirements. You should perform regular cloud-compliance audits to confirm everything is as it should be. These audits can span both your cloud storage provider (or providers) and your own on-premises architecture.

In many ways, securing your business’ data has become the most critical role for your IT group. As this dynamic market creates even more sophisticated attacks and glaring vulnerabilities, it will be IT’s responsibility to stay ahead of the game. A hybrid-cloud storage architecture should smooth that pathway.


Paul Tien, CEO of Morro Data, is a storage industry veteran who has been developing new models for storage technology over the last two decades. He helped to create the market for consumer and SMB Network Attached Storage (NAS) with the popular Infrant ReadyNAS line of storage appliances; Infrant was acquired by NETGEAR in 2007.

Prior to Infrant, Paul founded two other successful semiconductor companies. Paul has an MS EECS degree from University of California, Berkeley.

The opinions expressed in this blog are those of Paul Tien and do not necessarily represent those of IDG Communications, Inc., its parent, subsidiary or affiliated companies.