The more data in one place, the more data it attracts.

This “data gravity” is a familiar phenomenon for enterprises, even if the term isn’t. As the number of applications hosted on local servers increases, so too does the amount of data necessary for them to operate. Add more data, and more applications are required to manage it. Over time, the cycle repeats again and again as data gravity builds.

Now, this gravity is shifting to the cloud. With companies making the move to cloud storage, analytics and compute services, the volume of data — and its commensurate gravity — is on the rise. But are the very same clouds designed to boost performance at risk of becoming data black holes?

What is Data Gravity?

Coined in 2010 by Dave McCrory, the term “data gravity” draws a direct analogy to its physical counterpart.

In the world around us, large objects attract smaller masses. It’s why we don’t fly off the Earth and why the Earth orbits the sun. Moving large objects away from a center of mass is difficult; it’s why launching spacecraft requires thousands of tons of rocket fuel to break free from our planet’s gravitational pull.

In the digital world, data gravity refers to the tendency of large data “masses” to attract and retain more data. For example, if a company uses a cloud-based ERP system, this system naturally attracts data related to customer histories, transaction details and key business operations. These data types are themselves governed by applications such as CRM solutions or eCommerce portals, which are carried along toward the data center. These applications also come with their own data and, in turn, the applications required to manage that data — and on and on it goes.

The result is a growing data mass whose pull strengthens with every new dataset it brings in. This mass also makes it prohibitively expensive, in both time and resources, to run functions outside the center. Consider a security control located at the edge of a company network. Because data must travel back and forth between the control and the central storage mass, the time required to complete key processes goes up. In addition, data may be compromised on its journey to or from the center, in turn lowering the efficacy of these edge tools.

To address this loss of performance and increase in lag time, many companies are now centralizing key services in the cloud — creating even bigger data masses.

From Shared Responsibility to Shared Fate

In a shared responsibility model, ensuring cloud services are available and secure is the role of the provider. Cloud customers, meanwhile, are responsible for configuring and using the cloud, and for any issues that arise from that configuration and use.

The problem? According to research firm Gartner, cloud customers are the primary driver of cloud security failures. In fact, Gartner estimates that by 2025, 99% of cloud security failures will be the customers’ fault.

To combat this challenge, companies like Google are moving to a “shared fate” model that takes a more active role in cloud configurations, offering guidance, tools and blueprints to help customers succeed. IBM, meanwhile, has developed solutions such as Continuous Cloud Delivery, which help companies create and implement cloud application toolchains that enhance app management and ensure process repeatability.

While the primary impact of this effort is reduced cloud misconfigurations, it also comes with a knock-on effect: increased gravitational pull. If companies know that providers are willing to take on additional responsibilities for data protection and service operation, they’re more likely to accelerate their move to the cloud.

Laws of Attraction: Navigating the Cloud Paradox

If enough physical mass is concentrated in one area, it collapses under its own gravity into a black hole. Not only do these holes in space consume everything around them, but once mass passes the event horizon, there’s no coming back.

This is the cloud paradox. As providers recognize the shift toward all-in cloud models, they’re creating solutions that make it possible for enterprises to shift every aspect of their IT framework into the cloud. Underpinned by evolving solutions such as software-defined networking (SDN), powerful data analytics and artificial intelligence, it’s now possible for cloud services to outpace on-premises options when it comes to everything from security to performance to collaboration.

The challenge? The more data in the same place, the harder it is to leave. While storing services and data across multiple providers increases complexity, it also lowers the overall escape velocity of that data. Put simply, it’s much easier for companies to leave a cloud they use for only a few services or applications. It’s much more difficult to make the switch if critical functions and data are housed in a single cloud, and the time and effort required to move increase sharply as more services are added.
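The scale of this “escape velocity” is easy to underestimate. A minimal back-of-the-envelope sketch makes it concrete; the egress price and bandwidth below are hypothetical placeholders, not any provider’s actual rates:

```python
# Illustrative sketch: rough cost and time to migrate data out of a cloud.
# The $/GB egress price and link speed are hypothetical, not real quotes.

def migration_estimate(data_tb: float,
                       egress_usd_per_gb: float = 0.09,
                       bandwidth_gbps: float = 1.0) -> dict:
    """Estimate egress cost (USD) and transfer time (days) for data_tb TB."""
    data_gb = data_tb * 1000
    cost = data_gb * egress_usd_per_gb
    # gigabits to move divided by line rate gives seconds; convert to days
    seconds = (data_gb * 8) / bandwidth_gbps
    return {"cost_usd": round(cost, 2), "days": round(seconds / 86400, 1)}

# A modest 50 TB footprint vs. a 2 PB "all-in" estate on the same 1 Gbps link:
print(migration_estimate(50))     # {'cost_usd': 4500.0, 'days': 4.6}
print(migration_estimate(2000))   # {'cost_usd': 180000.0, 'days': 185.2}
```

Even with simple linear math, an all-in estate takes months of sustained transfer to move, before accounting for re-platforming the applications that sit on top of it.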

Breaking Free

When it comes to black hole clouds, the solution lies not in avoidance but in agility.

As noted above, cloud providers are moving to a shared fate model as they recognize the role of data gravity in enterprise operations. To bring businesses on board, they’re both reducing prices and improving performance, making data attractors hard to resist.

To make the most of these solutions without drifting past the point of no return, companies need comprehensive, consistent policies that define which data and applications belong in large clouds, which are better served by specialized providers and which should stay on-site. For example, a financial institution might shift its client demographic analysis to a large-scale cloud provider and make use of its computational power. The same company might procure cybersecurity services, such as threat intelligence and incident detection and response, from a provider that specializes in these solutions. Finally, it might opt to keep core financial data in-house and under lock and key.
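A placement policy like this can be made explicit and auditable rather than ad hoc. The sketch below is purely illustrative: the attribute names and rules are hypothetical, and a real policy would weigh regulation, latency and cost in far more detail.

```python
# Hypothetical placement policy: routes a workload to a hyperscale cloud,
# a specialty provider, or on-premises based on a few coarse attributes.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    data_sensitivity: str   # "low" | "regulated" | "core"
    needs_specialist: bool  # e.g. managed threat detection and response
    compute_heavy: bool     # benefits from elastic compute and analytics

def place(w: Workload) -> str:
    if w.data_sensitivity == "core":
        return "on-premises"          # keep crown jewels under lock and key
    if w.needs_specialist:
        return "specialty provider"   # buy expertise, not just capacity
    return "hyperscale cloud"         # exploit scale for everything else

print(place(Workload("client-analytics", "low", False, True)))   # hyperscale cloud
print(place(Workload("threat-intel", "low", True, False)))       # specialty provider
print(place(Workload("core-ledger", "core", False, False)))      # on-premises
```

Encoding the policy in code (or in infrastructure-as-code guardrails) means every new workload gets the same answer, which is what keeps a single attractor from quietly absorbing everything.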

This approach naturally creates a barrier against data attractors and helps companies resist the pull. It’s also worth digging into provider policies around data lock-in, such as any costs for removing data from cloud frameworks or limits on the amount of data that can be moved at once.

Put simply? Data gravity is growing. To avoid being caught in cloud black holes, enterprises need to identify their IT needs, determine the best-fit location for disparate data sources and deploy specialty providers where appropriate to keep operational data in orbit.
