Edge vs Cloud Computing: What Are the Differences?
Jacqueline L. Mulhern : 1/31/22 11:15 AM
Modern development requires running advanced calculations and processing large amounts of data. Because these workloads demand more resources than local hardware can provide, they rarely run on the device itself. Instead, they often run either in the cloud or on processors located physically close to the device. The former is known as cloud computing, the latter as edge computing. Let’s look at the differences between cloud computing and edge computing, and when each type of processing is used.
What is Cloud Computing?
Cloud computing is the practice of managing and processing data in the cloud, i.e., on remote servers hosted on the internet. Cloud computing makes these processes accessible and cost-effective compared with running the same computations locally.
For AI- and ML-based products, cloud computing is especially valuable, since these workloads require significant compute resources. Thanks to the cloud, scaling becomes achievable at the click of a button, and the required servers and software are immediately within reach.
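As a loose illustration of the cloud pattern, here is a minimal sketch of a device offloading heavy inference to a remote server. The endpoint URL, payload shape, and response format are all hypothetical, not any particular provider’s API:

```python
import requests

# Hypothetical cloud inference endpoint -- in practice this would be a
# managed ML service or a model server running on rented cloud capacity.
CLOUD_ENDPOINT = "https://api.example.com/v1/predict"

def classify_in_cloud(sensor_readings: list[float]) -> dict:
    """Ship raw data to the cloud and let remote servers do the heavy compute.

    The device needs only network access; scaling up means provisioning
    more cloud capacity, not upgrading the device itself.
    """
    response = requests.post(
        CLOUD_ENDPOINT,
        json={"readings": sensor_readings},
        timeout=10,  # every request pays a network round trip to the cloud
    )
    response.raise_for_status()
    return response.json()  # e.g. {"label": "anomaly", "score": 0.97}

if __name__ == "__main__":
    print(classify_in_cloud([0.1, 0.4, 0.35, 0.9]))
```

The trade-off is visible in the timeout: the device gains effectively unlimited compute, but every prediction depends on connectivity and pays network latency.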
However, cloud computing is not always the best solution. Data that leaves the operator’s premises and is transferred to the cloud can pose a considerable cybersecurity risk. In addition, cloud computing is sometimes technically challenging: for field devices installed on infrastructure, such as water systems, bridges, or sewage networks, transmitting data to the cloud is not logistically easy. This is where edge computing comes in.
What is Edge Computing?
Not all processing and calculations take place in the cloud. In some cases, processors are embedded in edge devices, and calculations run physically close to the data source, without the need to transmit data to the cloud. This is known as edge computing.
Edge computing is especially useful when calculations are needed in real time, as in autonomous vehicles, or when privacy is a concern, as with smart glasses or smart homes. Edge computing also improves the user experience by reducing latency, for example in live TV streaming.
Edge computing can also be applied to field devices. By installing tiny IoT devices with embedded AI directly on infrastructure, AI capabilities can be brought into the field. When an anomaly is detected, the information is transmitted immediately to operational teams so they can take action.
This enables alerting on events that require immediate action, like a sewage spill, as well as forecasting, giving operators and governments ample time to act before a disaster, for example by measuring vibrations to predict whether a bridge is at risk of collapse. A sketch of this pattern follows below.
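To make the edge pattern concrete, here is a minimal sketch of an on-device monitoring loop. The sensor driver, alert channel, threshold, and site name are all made up for illustration; a real deployment would run a compact trained model on the device rather than a fixed threshold:

```python
import random
import statistics
import time

# Illustrative limit only -- a real deployment would use a trained,
# lightweight anomaly-detection model embedded on the device.
VIBRATION_THRESHOLD_MM_S = 4.0

def read_vibration_mm_s() -> float:
    """Stand-in for a real accelerometer driver on the edge device."""
    return random.gauss(2.0, 1.5)  # simulated sensor noise

def send_alert(payload: dict) -> None:
    """Stand-in for notifying the operations team (e.g. over MQTT or SMS)."""
    print("ALERT:", payload)

def monitor(samples_per_window: int = 60) -> None:
    window: list[float] = []
    while True:
        window.append(read_vibration_mm_s())
        window = window[-samples_per_window:]  # keep the most recent samples
        # The analysis runs on the device itself: raw samples never leave
        # the bridge; only the verdict is transmitted, and only when needed.
        if len(window) == samples_per_window:
            mean = statistics.mean(window)
            if mean > VIBRATION_THRESHOLD_MM_S:
                send_alert({"site": "bridge-17", "mean_mm_s": round(mean, 2)})
        time.sleep(1)

if __name__ == "__main__":
    monitor()
```

The key design point is that raw samples stay on the device and only the verdict crosses the network, which is what keeps the bandwidth, latency, and privacy costs low.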
Conclusion
Cloud computing and edge computing each have their own advantages, and each fits different use cases. While cloud computing is more accessible and scalable and requires less maintenance, edge computing keeps sensitive data on-site and improves performance by reducing latency.