What is Edge Computing (vs. Cloud Computing)?
Edge computing is a decentralized, distributed computing infrastructure that has evolved alongside the growth of IoT. IoT devices often generate data that requires quick processing and/or real-time analysis. In the cloud computing model, that data is processed in a centralized location (often a datacenter) many miles from the device, which adds latency and bandwidth cost. Edge computing, by contrast, brings data computation, analysis, and storage closer to the devices where the data is collected, removing the need to backhaul information to the cloud. And with a properly designed architecture that combines hardware and software components at the edge, the data can also be secured.
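As a rough illustration of this pattern, the Python sketch below shows an edge gateway that processes a window of sensor readings locally, acts on anomalies on-site, and forwards only a compact summary upstream instead of backhauling every raw sample. The names here (read_sensor, send_to_cloud, the threshold value) are illustrative stand-ins, not a specific product API.

```python
"""Minimal sketch of an edge-gateway pattern: process sensor data locally
and send only small aggregates to the cloud, rather than every raw reading.
All functions are hypothetical stand-ins for real device and network I/O."""

import random
import statistics


def read_sensor() -> float:
    # Stand-in for reading a local IoT sensor; here, a simulated
    # temperature in degrees Celsius.
    return random.gauss(21.0, 0.5)


def send_to_cloud(summary: dict) -> None:
    # Stand-in for an HTTPS or MQTT upload to a cloud ingest endpoint.
    print(f"uploading summary: {summary}")


def run_edge_loop(window: int = 100, alert_threshold: float = 25.0) -> None:
    """Collect a window of readings at the edge, respond to anomalies
    locally, and forward only an aggregate upstream."""
    readings = [read_sensor() for _ in range(window)]

    # Local, low-latency decision: no round trip to the cloud required.
    if max(readings) > alert_threshold:
        print("local alert: threshold exceeded, triggering on-site response")

    # Backhaul a few numbers instead of `window` raw samples.
    send_to_cloud({
        "count": len(readings),
        "mean": round(statistics.mean(readings), 2),
        "max": round(max(readings), 2),
    })


if __name__ == "__main__":
    run_edge_loop()
```

In this sketch, the time-critical decision (the threshold alert) never leaves the device, while the cloud still receives enough aggregate data for fleet-wide analytics.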
Why Edge Computing?
By moving data processing and analysis closer to the point where the data is captured, edge computing minimizes the need for expensive bandwidth, reduces response times, improves performance, and lowers operational costs. Organizations can maximize the value of their IoT devices with deeper insights and faster, more reliable customer experiences. This is why, according to Gartner, 75% of data will be processed outside the traditional datacenter or cloud by 2025. Popular examples of edge computing include autonomous vehicles, smart cities, industrial IoT, remote weather sensing, streaming services, and smart homes.
Related Solutions and Resources
Exascale workloads benefit from the increased performance, energy efficiency, and design flexibility of Arm Neoverse.
This Forrester Total Economic Impact Study demonstrates how Arm Neoverse-based servers and cloud instances are driving IT transformation.
Cloud and hyperscale vendors are experiencing increased performance and efficiency with Arm Neoverse.
Building the world's fastest supercomputer doesn't mean breaking the power budget, as engineers at Fujitsu and RIKEN proved.