INFORMATION TECHNOLOGY : Sometimes it seems as if the cloud is swallowing corporate computing. Last year businesses spent nearly $230 billion globally on external (or "public") cloud services, up from less than $100 billion in 2019.

Revenues of the industry's three so-called "hyperscalers", Amazon Web Services (AWS), Google Cloud Platform and Microsoft Azure, are growing by over 30% a year.

The trio are beginning to offer clients newfangled artificial-intelligence (AI) tools, which big tech has the most resources to develop. The days of the humble on-premises company data centre are, surely, numbered.

Or are they? Though cloud budgets overtook in-house spending on data centres a few years ago, firms continue to invest in their own hardware and software.

Last year these expenditures passed $100 billion for the first time, reckons Synergy Research Group, a firm of analysts. Many industrial companies, in particular, are finding that on-premises computing has its advantages.

A slug of the data generated by their increasingly connected factories and products, which Bain, a consultancy, expects soon to outgrow data from broadcast media or internet services, will stay on premises.

The public cloud's convenience and, thanks to its economies of scale, cost savings come with downsides. The hyperscalers' data centres are often far away from the source of their customers' data.

Transferring these data from their source to where they are crunched, sometimes half a world away, and back again takes time. Often that does not matter; not all business information is time-sensitive to the millisecond. But sometimes it does.

Many manufacturers are creating "digital twins" of their brick-and-mortar factories, to detect problems, reduce downtime and improve efficiency. They are also constantly tweaking new products under development, often using data streaming in from existing products out in the world.

For all such purposes data need to be analysed in as close to real time as possible, ideally with no "jitter" (inconsistency of data transfer), data loss or service outages, all of which are surprisingly common in the public cloud.

Many firms also prefer to keep any data on which they train their AI models close to their chest. Giordano Albertazzi, chief executive of Vertiv, which provides data-centre infrastructure, thinks this may become a competitive advantage.

Running your data centre close to your factory also pre-empts looming requirements on localisation and "data sovereignty" from governments afraid of letting data leak across their borders.

Countries which have passed some version of a data-sovereignty law include China, where plenty of manufacturers have factories, and India (although its rules apply primarily to financial companies for now).

It is for such reasons that industrial firms are still spending on their own data centres, to keep the data they need close at hand while shipping off less time-critical information to the hyperscalers.

Companies that embrace this dual approach include industrial champions such as Volkswagen, a German carmaker, Caterpillar, an American maker of diggers, and Fanuc, a Japanese manufacturer of industrial robots.

This essay's publishing continues. The World Students Society thanks The Economist.

