THE EVOLUTION OF EDGE COMPUTING

The origins of edge computing trace back to the 1990s and the creation of the first content delivery network (CDN), which placed nodes closer to end users to serve cached content. But that technology was limited to images and video, not large-scale data workloads. In the 2000s, the growing shift toward mobile and early smart devices put increasing strain on existing IT infrastructure. Concepts such as pervasive computing and peer-to-peer overlay networks sought to alleviate some of that strain.

However, it wasn’t until the mainstream adoption of cloud computing that true decentralization of IT began, giving end users enterprise-level processing power along with greater flexibility, on-demand scalability, and the ability to collaborate from anywhere in the world.

Yet as more end users demanded cloud-based applications and more businesses operated from multiple locations, it became necessary to process more data outside the data center, right at the source, while still managing it from one central location. That’s when mobile edge computing became a reality.

Today, the “Era of IoT” is changing how businesses allocate IT resources, turning previously complex data collection into a far less arduous task.