Evolution of Edge Computing
Due to the Internet revolution, we now deal with enormous volumes of data, and handling them is a challenge. If you need a particular piece of information, you have to search through the cloud and other remote locations to collect it, which costs time and bandwidth. 'Edge computing' is a technique that brings computation close to where the data is produced and needed. The growth of Internet of Things (IoT) devices means that real-time processing of data is necessary, and edge computing can be very helpful in this regard.
The concept of edge computing originated in the 1990s with Akamai's Content Delivery Network (CDN). The idea was to place nodes near the end-user to serve images and videos, reducing data transfer time. In 1997, it was shown that applications running on mobile devices could offload work to other servers, thus saving the battery life of the mobile devices.
By 2001, the concept of the decentralized distributed application was in place: such networks provided load balancing, object location, and efficient routing. Cloud computing had a great impact on edge computing. In 2006, Amazon launched its 'Elastic Compute Cloud', which created new ways of storing information and performing computation. However, cloud computing alone was not enough to process data from IoT devices; that data requires local processing so that decisions can be made in real time.
In 2009, the term 'cloudlet' was introduced: a small cloud data center at the edge of the internet. It supported resource-intensive mobile applications by providing computing resources to mobile devices in close proximity. Later, in 2012, Cisco introduced the concept of 'fog computing' to handle big data and IoT workloads in real-time, low-latency applications.
The IoT is on the rise, and many businesses now operate across multiple sites. This generates data outside traditional data centers, which is why edge computing has become essential. Computing has thus shifted from a centralized model to a decentralized one. The cost of adopting edge computing is relatively low, so more organizations are embracing it.
With 5G now available, reducing data latency is critical, and this requires placing edge data centers near the end-user. 5G has made it possible to run virtual classrooms and other remote activities. The demand for edge computing will rise with the increased use of IoT, and more enterprises will adopt it to handle higher volumes of data with lower latency. It will process and store data faster, thus supporting real-time applications.
Edge computing is now considered one of the most important drivers of growth in the storage and server industry, and it is helping usher in a new technological era.