According to The Information Catastrophe, a study by Dr Melvin Vopson, a senior lecturer in physics at the University of Portsmouth, in 110 years there could be more digital bits than there are atoms on Earth, and the power required to sustain them could exceed today's total planetary consumption (assuming a 50 per cent annual growth rate in data and no further innovations in computing technology).
That is mind-boggling and, while it will be a challenge for future generations, our immediate challenge is how best to manage the volume of data we're currently generating. In recent years, many companies have consolidated operations and are sending data to the cloud – traditional data servers and cloud data centres – for processing and storage. Increasingly, however, there is a shift to a distributed model in which some computing occurs at the edge of the network, closer to where the data is created. This is known as edge computing.
According to a report by Intel entitled The Edge Outlook: The Now, the New and the Next of Edge Computing, putting computing closer to data sources at the edge – that is, all the data endpoints that exist outside the data centre, from smartphones and PCs to Internet of Things devices and sensors – is the only way to manage the volume of data being generated, as it enables rapid, near-real-time analysis and response. The edge also functions as an interface with the cloud, which is then supplied with only the most critical and processed data, helping to reduce the amount of data traffic.
Benefits of connectivity
In the Intel report, Inma Martinez, artificial intelligence (AI) scientist, explains how the edge is key to maximising the benefits of what she calls “a magical world of connected devices”. She goes on to say: “The edge makes possible a world where, all of a sudden, every single object has the potentiality for information – information that can be extracted and used in real time”.
In an Industry 4.0 and smart factory set-up, where a great deal of raw data is being generated by connected machines and devices, edge computing allows data to be processed and analysed effectively in real time, right where it is generated. This is especially important as new technologies come online, such as AI, that require actionable intelligence in near real time.
The Intel report highlights some of the use cases of edge computing, including predictive maintenance and how it can work to remove human error and limitations. For instance, with the help of machine learning, edge computing collects, aggregates and filters data from multiple machines, processes and systems to adapt the manufacturing process in real time and deliver precision monitoring and control.
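The collect-aggregate-filter pattern described above can be sketched in a few lines of Python. This is a minimal conceptual illustration, not part of Intel's or Exor's design: the machine names, temperature readings and alert threshold are all invented for the example. The idea is simply that the edge node summarises raw samples locally and forwards only compact summaries and critical alerts to the cloud.

```python
from statistics import mean

# Hypothetical raw readings collected at an edge node
# (machine id -> recent temperature samples, in degrees C).
raw_readings = {
    "press_01": [71.2, 71.5, 71.4, 71.3],
    "lathe_02": [64.0, 64.1, 93.7, 64.2],  # contains one anomalous spike
    "mill_03": [58.9, 59.0, 59.1, 58.8],
}

ALERT_THRESHOLD = 90.0  # illustrative limit for "critical" data


def process_at_edge(readings, threshold):
    """Aggregate locally; forward only summaries and critical alerts."""
    payload = {"summaries": {}, "alerts": []}
    for machine, samples in readings.items():
        # Aggregate: one average per machine instead of every raw sample.
        payload["summaries"][machine] = round(mean(samples), 2)
        # Filter: flag only readings that exceed the threshold.
        for value in samples:
            if value > threshold:
                payload["alerts"].append((machine, value))
    return payload


to_cloud = process_at_edge(raw_readings, ALERT_THRESHOLD)
print(to_cloud["alerts"])  # -> [('lathe_02', 93.7)]
```

Only the per-machine summaries and the single alert cross the network, rather than all twelve raw samples – which is the traffic reduction the report attributes to edge processing.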
The rise of 5G is also critical to the future of the edge. The low-latency capabilities of 5G will make the shorter connection between the device and the edge even more efficient. This will be explored in a new smart factory that is being set up in Verona, Italy. For the project, Intel is teaming up with Exor International, a manufacturer of industrial PCs and human-machine interfaces.
Utilising multiple Intel products, including processors and edge-computing technologies, the factory will highlight what can be achieved with the latest 5G and AI technologies, in an agile and modular application environment.
Intel vice-president Christine Boles said: “We’re seeing Industry 4.0 adoption accelerating. Exor’s new smart factory is a great example of how deploying solutions based upon standards with open architectures can help lower maintenance costs and increase productivity.”
Content published by Professional Engineering does not necessarily represent the views of the Institution of Mechanical Engineers.