Many people have heard of “the edge,” but few are clear on what it means or why it matters for their business. The topic can be especially confusing for business leaders. So where do we start in unpacking what may be the biggest paradigm shift in computing since the cloud?
In this episode of the Cutting Edge podcast, I break down a common naming mistake – conflating edge computing with endpoint computing. I explain how the two approaches differ, how they relate, and why recognizing the difference matters.
01:03: The difference is simple: edge computing is about distributed computing, while endpoint computing is about localized computing, commonly described as computing on a device, or “edge device.”
05:50: Putting things in the cloud is taking on a new meaning as the gravity of computing shifts out to the edge, driven by endpoint computing. The advent of the edge cloud, along with 5G and edge AI, also creates new possibilities for how applications can be distributed.
06:55: Edge computing and endpoint computing are changing how we think about the cloud, as well as about distributed and localized computing. To see the possibilities, we need to arrive at common terminology and concepts that advance the conversation and uncover value.