The growing adoption of digitalization presents significant resource and data processing challenges, writes Jackson Lee
Data defines the world as we know it today. Don’t believe me? Take this into consideration – by the time you have finished reading this article, over 14 million queries will have been performed on Google alone and Facebook users will have sent an average of 186 million messages and viewed 16 million videos. Staggering? I agree.
As data volumes continue to explode, the brunt of the impact will be felt by data centre providers across the globe. We are already seeing digital powerhouses such as Amazon, Google, IBM and Microsoft using more compute capacity, networking and storage through hyper-scale server farms to meet growing data demands and workloads.
Hyper-scale computing is not a new development – in recent years it has risen prolifically across a number of verticals. From banking to manufacturing, more and more industries are adopting these cost-effective digital technologies as a means to scale up and improve agility to meet customer demands.
On the other side of the same coin is the evolution of edge computing and micro-data centres. Speed is everything: latency can make or break a business, especially when dealing with financial transactions and trading.
These days, the entertainment and automotive industries are also demanding faster connectivity with the lowest possible latency. Media streaming services cannot afford buffering when viewers are watching a movie or video online. The same goes for connected car manufacturers: a split second of connection downtime could result in an accident.
When applications and data are moved from centralized points to the outer layers of a network, the distance between users and that data narrows, making delivery of the right information to the user or device at the right time quicker and more efficient. The increase in interconnectivity between machines, applications and other IoT devices via cloud providers is directly tied to this trend. As virtual reality (VR), the connected home and driverless cars become mainstream products and services, low-latency infrastructure that sits closer to the user is key.
Today, almost every company and user requires near-instant access to data in order to be successful. This might explain why edge computing has been publicized as the next multibillion-dollar tech market. Organizations across the board are increasingly looking to double down on customer experience through the delivery of services, content and data in real time.
The hybrid generation
The growing adoption of digitalization has given rise to new forms of competition and lifestyle improvement for end users. However, more digitalization also presents significant resource and data processing challenges.
Firstly, a data centre strategy that treats hyper-scale and edge computing as one and the same, or chooses one over the other, is neither cost-effective nor competitive. It is no longer practical for every connected device or application to use the cloud in the same way smartphones do.
Consider the millions of connected artificially intelligent devices, medical equipment, manufacturing robots and VR headsets in use today. The strain these devices place on network bandwidth and speed soon becomes apparent. In short, it is highly likely that the user experience of such devices will rapidly deteriorate if congestion and latency are not addressed.
This is why a hybrid strategy – one that embraces both hyper-scale (centralized) and edge (decentralized) computing – is so important. If the product or service offered is not latency- or bandwidth-driven (for example, the billing process after a purchase has been made on Amazon), it makes more sense to host it in a server farm that sits out of town, away from the user. Low-level processing, backup and storage are other examples.
However, technologies such as drones, driverless cars and connected fridges are latency-driven products that require more ‘edges’ so that information can be distributed more quickly and the distance between device and data narrowed, improving the end-user experience. These products produce too much data for it to be processed in a location far away; to function effectively and meet the demands of the user, they need immediate results. This is particularly true of driverless cars.
‘Connected things’ will continue to grow in popularity over the coming years, resulting in an abundance of data being created every single day. Data centre providers will play an important role in helping organizations meet new demands as the data boom continues.
To match user expectations effectively, edge computing and hyper-scale technology must work in tandem. This will give organizations the best of both worlds: meeting customer needs effectively while lessening the resulting IT workload and operating costs.
The journey towards a fully hybrid future is already under way. Organizations and data centre providers alike must be ready to embrace the change.
Jackson Lee is vice-president of corporate development at Colt Data Centre Services