Enterprises are embracing a hybrid, multi-cloud approach — Putcha

Shiv Putcha

As we emerge from the pandemic, enterprise data strategies are being reshaped by several major changes and trends. As economies open up and rebuild supply chains, the landscape looks distinctly different. One of the biggest changes affecting the enterprise is that the future of work looks increasingly hybrid, which is driving demand for connectivity. Beyond hybrid work, a growing number of use cases that rely on low-latency communications are also driving connectivity demand. 

Another shift is that the number of data sources for the enterprise is increasing rapidly. Enterprises are deploying connected endpoints not only on their production lines but also across their facilities, for logistics and transport, for employee safety and much more. Greater mobile connectivity through 4G/5G and IoT networks will only accelerate this trend, adding further to the growing data volume. 

A third shift is that the types of data and their velocity are also changing. Traditionally, data generated by the enterprise tended to be more structured, handled with databases and simpler querying tools. Now, however, structured data is being augmented by unstructured and semi-structured data arriving at high volume and velocity. For example, the rising number of CCTV cameras is generating high volumes of video, all of which needs to be analyzed and potentially transported back to corporate servers. A manufacturing plant may keep machine-related data on premises while other data from the production line is stored and analyzed in the cloud, as is happening with digital twins. 

Hybrid, multi-cloud will require hybrid data architectures

The upshot of all these shifts is that enterprises must prepare for a future where remote employees connect to a much larger number of endpoints for daily business operations. This will also lead to a diversification of the cloud, with different vendors' clouds potentially chosen for different applications. In other words, enterprises are moving toward a hybrid, multi-cloud environment, increasingly combining on-premises data with reliable connectivity to the cloud by implementing both hybrid and multi-cloud network strategies. No one hyperscaler will meet all requirements, and in many cases critical enterprise data will continue to be hosted and processed on premises. This is especially true for industry verticals like manufacturing, telecommunications, energy and utilities.


The rising complexity of hybrid and multi-cloud networks has implications that go beyond mere connectivity and extend to enterprise data as well. As enterprise workloads become increasingly distributed across a hybrid, multi-cloud network, the data those workloads generate will become increasingly hybrid too, covering multiple scenarios. Data may be stored on premises while employees are remote. Data may be stored in the public cloud, or across multiple clouds, yet employees will increasingly need access to data across those clouds, or to a combination of data in the public cloud and on premises.

For example, different lines of business may use different clouds for their domain-specific applications. There are also an increasing number of cases of enterprises moving workloads back from the cloud to an on-premises setup. 

As data becomes increasingly distributed, enterprises will quickly have to come to grips with storage solutions. But distributed data throws up a number of challenges as well. Business analytics tends to cut across domains, and with data distributed it is not practical for every data user to keep copies: doing so runs up not only storage costs but also data transfer costs. Traditional data lakes also face challenges around available compute resources, with multiple teams potentially competing for a finite pool. Moreover, data stored in multiple locations will not look the same or be accessed in the same way, which inevitably causes additional problems with the portability and usability of the data.

That said, data still needs to move across the systems an enterprise uses, and it needs to do so in a consistent manner. Universal data distribution, with tools like Apache NiFi, is doing a lot to help enterprises with these challenges.  

All of these requirements are pushing enterprises to adopt modern data architectures that enable agility, security and easier governance of enterprise data. Many are adopting or evaluating new technologies like the data lakehouse, data mesh and data fabric as they seek to strike a balance between the sometimes-conflicting objectives of keeping costs low and accessing data in a consistent and easy-to-use manner that scales.

Several enterprises are also looking at ways to connect their data to third-party marketplaces, where they can access third-party data, applications and developers. This will require new frameworks that abstract away the physical location of the data and allow it to be moved seamlessly, with consistent interfaces, features and tools. 
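To make the idea of "abstracting away the physical location of the data" concrete, the pattern can be sketched in a few lines of code. Everything below is illustrative only: the class names, methods and placement rules are hypothetical and do not represent the API of any real data-fabric product. The point is simply that callers use one consistent interface while a routing layer decides whether a dataset lives on premises or in a cloud.

```python
# Illustrative sketch only: all names here are hypothetical, not a real
# product API. One consistent read/write interface sits in front of data
# that physically lives in different places.

from abc import ABC, abstractmethod


class StorageBackend(ABC):
    """Common interface every storage location must implement."""

    @abstractmethod
    def read(self, key: str) -> bytes: ...

    @abstractmethod
    def write(self, key: str, data: bytes) -> None: ...


class OnPremStore(StorageBackend):
    """Stand-in for an on-premises store (just an in-memory dict here)."""

    def __init__(self) -> None:
        self._objects: dict[str, bytes] = {}

    def read(self, key: str) -> bytes:
        return self._objects[key]

    def write(self, key: str, data: bytes) -> None:
        self._objects[key] = data


class CloudStore(OnPremStore):
    """Stand-in for a public-cloud object store (same toy behavior)."""


class DataFabric:
    """Routes each dataset to its home backend; callers never see which."""

    def __init__(self) -> None:
        self._backends: dict[str, StorageBackend] = {}
        self._placement: dict[str, str] = {}  # key prefix -> backend name

    def register(self, name: str, backend: StorageBackend) -> None:
        self._backends[name] = backend

    def place(self, prefix: str, backend_name: str) -> None:
        self._placement[prefix] = backend_name

    def _backend_for(self, key: str) -> StorageBackend:
        for prefix, name in self._placement.items():
            if key.startswith(prefix):
                return self._backends[name]
        raise KeyError(f"no placement rule for {key!r}")

    def read(self, key: str) -> bytes:
        return self._backend_for(key).read(key)

    def write(self, key: str, data: bytes) -> None:
        self._backend_for(key).write(key, data)


fabric = DataFabric()
fabric.register("onprem", OnPremStore())
fabric.register("cloud", CloudStore())
fabric.place("plant/", "onprem")   # machine data stays on premises
fabric.place("sales/", "cloud")    # analytics data lives in the cloud

fabric.write("plant/telemetry", b"vibration=0.03")
fabric.write("sales/q3", b"revenue=1.2M")

# Same call path for both datasets, regardless of physical location.
print(fabric.read("plant/telemetry"))
print(fabric.read("sales/q3"))
```

Real frameworks in this space add what the sketch omits: governance, security, lineage and the actual movement of data between locations, which is where tools like Apache NiFi and the architectures discussed above come in.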

The telco industry is a great case study for modern data architectures

A great example of an industry vertical embracing modern data architectures is telecommunications. Viewed as a vertical, telcos easily rank among the highest producers and consumers of data. Many are well along the path of internal transformation, becoming more data-centric and cloud-native. This transformation certainly makes communications service providers (CSPs) more agile and cost-competitive, but it doesn't automatically translate into greater monetization if they are unable to leverage the modern data architecture to serve enterprise requirements.

CSPs can offer crucial differentiation with new data architectures that enable data transfers across these diverse data sources and points of storage and analysis. Put another way, the data itself needs to be "networked," and will require connectivity that helps enterprises analyze the data, make sense of it and generate actionable insights that add strategic value.


CSPs are already partnering with leading proponents of hybrid data clouds like Cloudera to help secure and manage data that is in transit and crossing different domains. The biggest reason for this approach is that the public cloud isn't a silver bullet for all workloads and datasets, each of which has its own cost, security, performance and privacy demands. Some will be better off on premises, which necessitates a hybrid approach to data management and analytics. In some cases, data will move entirely within the CSP's network. In most cases, data will have to cross multiple domains, with the CSP acting as an honest broker that enables transit across them. This will require CSPs to work closely with ecosystem partners like the hyperscalers, but they can still play a crucial role and add value.

Shiv Putcha is the Founder and Principal Analyst at Mandala Insights, an independent, boutique analyst firm that offers insights, opinions and research on the network and emerging technologies that will drive the next billion digital opportunities in Asia. Shiv is also keenly focused on the intersection of rising enterprise productivity, Industry 4.0 and 5G. Prior to founding Mandala, Shiv covered the telecommunications industry in Asia-Pacific for IDC and Ovum, along with stints at the Yankee Group, Qualcomm and LogicaCMG while based in the United States.

"Industry Voices" are opinion columns written by outside contributors—often industry experts or analysts—who are invited to the conversation by Fierce staff. They do not necessarily represent the opinions of Fierce.