Today’s enterprise organizations must satisfy constant demands for secure and efficient access to information in order to remain relevant and maintain a competitive edge. The reality is that every business-critical function of an organization relies on the use of and access to information. Depending on the organization and its business, it has become almost unrealistic from a cost perspective, both CapEx and OpEx, to retain all of the data a company generates and requires locally. The demand for data use and access is constantly balanced against cost, weighing characteristics such as usability, compliance, and availability. As a result of this balancing act, compromises are made to move and relocate valuable data assets to cost-effective locations, infrastructures, and public cloud platforms.

While the approach described above may temporarily solve some of the challenges related to cost, a different set of problems presents itself: data portability risks (physical or electronic), data integrity, security, loss of data visibility, and increased RPOs/RTOs, just to name a few. Information is at the core of any business’s prosperity; there should be little to no compromise when it comes to satisfying business requirements for data access and visibility. There are also data compliance regulations and laws such as GDPR that carry hefty financial penalties for failing to manage data according to the regulations, something that becomes far more likely when visibility is lost. Moving or transporting data to places such as distant mountains 😉 may expose an organization’s data to the risks I mentioned above. The same applies to the use of legacy gateways and software solutions to push data to the cloud. It becomes incredibly challenging to use your data when you lose visibility into it. For example, how efficient will a process be when it doesn’t know the type of data, the point in time, or the different data sets that may be available for processing?

Expand on this concept and apply it to a BCDR scenario where the requirement is to restore unstructured data such as files, databases, and media files, and not just virtual machines. The deeper we go down the rabbit hole, the more issues keep coming up. Enterprise organizations should be able to take advantage of the benefits provided by public clouds and modern data management solutions without compromise. They should never have to lose visibility into their data, and they should always have the means to manage it efficiently and effectively. Knowing where data resides at all times, accessing it, and cataloging it remotely all need to be possible electronically. As the hybrid cloud model continues to evolve, enterprise organizations need to expand their horizons and look for modern data platforms and public and private cloud solutions capable of satisfying their data management business requirements.

One platform solution worth looking into is Cohesity’s DataPlatform, a solution that is predominantly recognized as a data protection and recovery product but has proven to be much more than just backup. Through the DataPlatform’s native cloud integration capabilities, Cohesity provides enterprise organizations with the ability to maintain visibility of their data and allows them to catalog, inventory, and download data from any of the supported data repository targets and locations where data has been stored for long-term retention or business continuity and disaster recovery purposes. With Cohesity, organizations can access, retrieve, and catalog their data from any supported internal (on-premises) and external (public cloud) repository regardless of the accessibility of the originating data source (cluster). For example, consider an on-premises cluster that archives data for long-term retention by replicating it to an external cloud target (S3, Blob, NFS), and a need to access that archived information to validate compliance from a different site on a separate cluster, as illustrated below:

Through its combination and integration with public clouds, the Cohesity DataPlatform allows organizations to maintain visibility of their data and enables secure access and retrieval of data from any of the available Cohesity cluster deployment models (on-premises, Virtual Edition, Cloud Edition). Accessibility also includes search and metadata/data download capabilities, as well as optimized data egress from public clouds for selective BCDR scenarios.
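To make the idea of remote cataloging and selective retrieval a bit more concrete, here is a minimal, purely illustrative sketch in Python. This is not Cohesity’s API or archive format; it simply shows the general pattern of inventorying an external cloud target (an S3 bucket with a hypothetical name and prefix) and then downloading only the objects a selective restore actually needs, which is what keeps egress costs under control.

```python
# Illustrative only: inventory an external S3 archive target and selectively
# download objects for a compliance check or partial restore.
# The bucket name, prefix, and filter below are hypothetical.
import os
import boto3

s3 = boto3.client("s3")

BUCKET = "example-archive-target"    # hypothetical archive bucket
PREFIX = "cluster-archive/"          # hypothetical archive prefix
RESTORE_DIR = "restored"             # local destination for downloads

# 1. Catalog: list every object under the archive prefix with its metadata.
paginator = s3.get_paginator("list_objects_v2")
catalog = []
for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        catalog.append({
            "key": obj["Key"],
            "size": obj["Size"],
            "last_modified": obj["LastModified"],
        })

print(f"Cataloged {len(catalog)} archived objects")

# 2. Selective retrieval: pull back only what the BCDR scenario needs
#    (here, anything under a hypothetical 'databases/' path), minimizing egress.
os.makedirs(RESTORE_DIR, exist_ok=True)
for entry in catalog:
    if "databases/" in entry["key"]:
        local_path = os.path.join(RESTORE_DIR, os.path.basename(entry["key"]))
        s3.download_file(BUCKET, entry["key"], local_path)
        print(f"Restored {entry['key']} -> {local_path}")
```

The sketch glosses over how a data platform actually lays data out on the target; the underlying point is that maintaining a catalog of what was archived, and where, is what makes selective retrieval and compliance validation possible without pulling the entire data set back on premises.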

The demonstration below showcases the scenario and capabilities discussed above.

I’ll say it once again: one of the most valuable assets of any business is information. As has been said before, “data is the new oil,” and it’s time for everyone to step their game up and treat it like it is!

–    Enjoy

For future updates about Hyperconvergence, Cloud Computing, Networking, Storage, and anything in our wonderful world of technology, be sure to follow me on Twitter: @PunchingClouds.
