On a recent visit with Schneider Electric to Cisco’s hub in London, The Stack heard from industry experts Daniel Bizo of 451 Research and Robert Price of Cisco UK on the move towards edge computing, the challenges it presents, and some of the striking numbers behind all the internet-connected ‘things’ out there.

During a wide-ranging discussion of perspectives and approaches to edge computing, Bizo and Price considered the growing demand for edge technologies and micro data centres, what we actually mean when we talk about the emerging Internet of Things, and some unexpected consequences of moving data to the edge.

451 Research analyst Bizo noted that the cloud is now accepted in many enterprise circles as a way forward, if not the way forward. While it may seem contradictory to talk about the growth of smaller data centres, Bizo argued that the edge is simply an ‘organic continuation of that trend’.

It is an idea, Bizo added, that is strongly supported by cloud thought leaders such as James Hamilton, AWS’s chief architect, who has commented: “Taking both the number of regions and the number of data centers required in each of these regions into account, the total data center count of the world’s largest cloud operators will rise from the current O(10^2) to O(10^5).”

Edging in on the cloud?

Defining the edge is relatively simple, according to Bizo: it is the place where data is generated, including by sensors and ‘things’. Crucially, he argues that edge presence will be defined and driven by data requirements.

Bizo identified three key categories of edge requirements: latency, data criticality, and data volume.

Of the three, data criticality carries the most weight; data volume, by contrast, is largely a business decision about whether to pay for more bandwidth when more data is produced. Latency matters little in some applications and is crucial in others – imagine the consequences of two seconds of latency in a self-driving car.

IoT – so what?

IoT tends to conjure up images of the future – everything connected, everything smart. Cisco’s Price put the question to the audience: So what?

Echoing a common view, Price argued that the value of this connectivity lies in the data itself. He labelled this process ‘digitisation’: ‘If IoT is concerned with connecting things up, digitisation is what you do with that connection.’

He reiterated Cisco’s much-cited forecast of 20.8 billion devices connected to the internet by 2020. These 20.8 billion devices, Price explained, would also be producing around 600 zettabytes of data by that time – enough to stream trillions of hours of music, or enough that, had you started listening when the Earth formed, you still wouldn’t be finished.
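
As a rough sanity check on that comparison, a few lines of Python put the orders of magnitude side by side (the streaming bitrate is an assumption for illustration, not a figure from the talk):

# Back-of-envelope check of the 600-zettabyte listening comparison.
BITRATE_BYTES_PER_SEC = 320_000 / 8        # assumed 320 kbit/s stream, i.e. 40 kB/s
DATA_BYTES = 600e21                        # 600 zettabytes

listening_hours = DATA_BYTES / (BITRATE_BYTES_PER_SEC * 3600)
earth_age_hours = 4.5e9 * 365.25 * 24      # Earth's age, roughly 4.5 billion years

print(f"hours of audio in 600 ZB: {listening_hours:.2e}")  # ~4.2e15 hours
print(f"hours since Earth formed: {earth_age_hours:.2e}")  # ~3.9e13 hours
print(f"ratio: ~{listening_hours / earth_age_hours:.0f}x the age of the Earth")

Even at generous bitrates, the listening time exceeds the age of the planet many times over, which is the point Price was making.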

The edge and IoT

Given the amount of data soon to be produced by IoT devices, the micro data centres at the edge will be handling a significant proportion of the world’s data – it is estimated that up to 40% will be stored, processed, analysed and acted upon at the edge by 2020.

The analysis of this data, Price argued, is the crux of the issue. At this scale, several difficulties arise: first, there is more data than could reasonably be stored or transferred. Transfer is the particular problem at the edge: data collected there is typically sent on to a centralised data centre for storage or analysis, which becomes a major bottleneck when there is so much of it.

To illustrate the point, Price introduced the Square Kilometre Array, a massive array of individual antennas spread across the southern hemisphere, designed for astronomical study in unprecedented detail. One of the project’s challenges is the sheer volume of data the antennas bring in – estimated at around one exabyte every day.

In order to process this otherwise unmanageable volume of data, each antenna comes with an analytics device that filters out the majority of the unusable data, in this case background radiation, allowing only the useful data to be sent back to a centralised location. Price suggested that the same principle applies at smaller scales: any data collected at the edge needs to be analysed before it becomes useful.
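
A minimal sketch of that filter-at-the-source pattern, in Python, might look like the following (the sensor reading and noise threshold are illustrative stand-ins, not the SKA’s actual signal processing):

import random

# Hypothetical noise floor: readings at or below it are treated as
# background and discarded at the edge rather than transmitted.
NOISE_FLOOR = 0.8

def read_sensor() -> float:
    """Stand-in for a real sensor; most readings here are just noise."""
    return random.random()

def filter_at_edge(readings: list[float]) -> list[float]:
    """Keep only readings above the noise floor, so the uplink to the
    central data centre carries a fraction of the raw volume."""
    return [r for r in readings if r > NOISE_FLOOR]

raw = [read_sensor() for _ in range(10_000)]
useful = filter_at_edge(raw)
print(f"collected {len(raw)} readings, forwarding {len(useful)} "
      f"({100 * len(useful) / len(raw):.1f}%) to the central site")

The design point is simply that the analysis runs where the data is produced, so only the small useful fraction ever crosses the network.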

Projects like this highlight the necessity of moving towards the edge, as we find ever more ways to connect, across ever more devices, and are presented with whole new sets of challenges and opportunities.


Schneider Electric is an official Knowledge Partner for The Stack, providing industry expertise on many aspects of data centre infrastructure. To read more from Schneider Electric, please visit its Partner Page.