Managing Multi-Cloud Complexity Across the Globe
- by 7wData
Digital transformation has heightened the focus on data as value, so businesses need to reevaluate the legal and logistical obstacles that come with the global distribution of data and determine what infrastructure will be required to make it work.
Dana Gardner, Principal Analyst at Interarbor Solutions, sits down with leading IT industry analysts for his BriefingsDirect Voice of the Analyst podcast series to wrestle with the mounting complexities businesses face as they transform their IT strategy. To tackle the implications of globalization, Gardner spoke with Peter Burris, Head of Research at Wikibon in Palo Alto, CA.
To begin, Burris breaks down three main concerns that companies should be wary of when adjusting their IT strategy for globalization: latency, privacy, and control.
In a digital world, it’s easy for businesses to forget the physics of cloud computing at global scale: moving data of any size across regions can be extremely expensive due to bandwidth costs and latency. Factoring those costs, and the decision of where to run particular applications, into a global strategy becomes complicated when a service is consumed thousands of miles from where the data resides.
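The scale of those transfer costs is easy to underestimate. The sketch below estimates transfer time and egress cost for moving a dataset between regions; the throughput and per-GB price are illustrative assumptions, not figures from the article or any provider's price list.

```python
# Back-of-the-envelope estimate of moving a dataset between cloud regions.
# The egress price and link throughput below are illustrative assumptions,
# not quotes from any cloud provider's actual pricing.

def transfer_estimate(dataset_gb, throughput_gbps=1.0, egress_usd_per_gb=0.09):
    """Return (hours to transfer, egress cost in USD)."""
    # 1 GB = 8 gigabits; divide by sustained link throughput in Gbps.
    seconds = dataset_gb * 8 / throughput_gbps
    hours = seconds / 3600
    cost = dataset_gb * egress_usd_per_gb
    return hours, cost

# Example: replicating a 50 TB dataset over a sustained 1 Gbps link.
hours, cost = transfer_estimate(50_000)
print(f"~{hours:.0f} hours, ~${cost:,.0f} in egress fees")
```

Even at these optimistic assumptions the move takes days and costs thousands of dollars, which is why the architectural choices discussed next aim to avoid moving the data at all.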
Gardner suggests that if providing sufficient bandwidth would require substantial heavy lifting, international businesses might consider sticking to a private cloud or on-premises approach with a small, local data center. Burris identifies two architectural means of achieving this: edge centers, where data can be processed closer to its source at lower storage and processing cost, or a true private cloud.
“True private cloud is our concept for describing how the cloud experience is going to be enacted where the data requires, so that you don’t just have to move the data to get to the cloud experience,” explains Burris.
Ultimately, the less data that has to be moved, the less latency and bandwidth will drive up costs to impede a strong IT strategy. Deployment decisions won’t completely absolve these concerns, however; once the architecture is established, IT teams will need tools to keep everything running efficiently.
According to Burris, the second major concern in a global hybrid cloud scenario is privacy. Intellectual property is treated differently from one global region to another, and because most major hyperscale cloud-service providers are US-based corporations, there is legitimate concern about how that property is handled globally. Under certain circumstances, governments can access hyperscalers’ infrastructure, assets, and data, so non-US companies may worry that US-based firms are fairly liberal in how they share data with the government. A truly global-minded enterprise needs to think about privacy in a way that accommodates local markets and local approaches to property.
“All hyperscalers are going to have to be able to demonstrate that they can, in fact, protect their clients, their customers’ data, utilizing the regime that is in place wherever the business is being operated,” says Burris.