What is edge computing?

Edge computing is a distributed computing paradigm that brings computation and data storage closer to the sources of data. By decentralizing compute services and moving them toward where data is produced, it can significantly reduce latency, because far less data needs to be moved and it travels a much shorter distance.
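To make the latency and data-volume point concrete, here is a minimal sketch of edge-side preprocessing. It assumes a hypothetical stream of temperature readings from a local sensor: rather than shipping every raw sample to a distant cloud service, the edge node aggregates a window of readings locally and forwards only a compact summary. The sensor values, the summary fields, and the idea of sending the result upstream via HTTP or MQTT are all illustrative assumptions, not a specific product's API.

```python
# Minimal sketch: aggregate raw sensor readings at the edge and keep
# only a small summary to send upstream (hypothetical data and fields).
from statistics import mean
from typing import Iterable


def summarize_window(readings: Iterable[float]) -> dict:
    """Reduce a window of raw sensor readings to a small summary record."""
    values = list(readings)
    return {
        "count": len(values),
        "min": min(values),
        "max": max(values),
        "mean": round(mean(values), 2),
    }


if __name__ == "__main__":
    # Raw samples collected at the edge (hypothetical values).
    raw_window = [21.3, 21.4, 21.6, 22.0, 21.9, 21.7]

    summary = summarize_window(raw_window)

    # In a real deployment this summary would be sent to an upstream
    # service (e.g. over HTTP or MQTT); here we simply print it.
    print(summary)
```

The design choice illustrated here is the core trade-off of edge computing: a little local computation replaces many long-haul transfers, so the round trip to a central data center happens once per summary instead of once per sample.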
The origins of edge computing lie in the content delivery networks of the late 1990s, which served web and video content from edge servers deployed close to users.
There are various types of edge computing:
Sensor edge
Compute edge
Device edge
Cloud edge
