Cloud Computing – The Age of Alexa
Like any faithful Amazon shopper, I made sure to keep an eye out this year for the best Prime Day deals during the tech giant’s Black Friday-like event. Over the last few years, I’ve noticed that the variety of on-sale products actually manufactured by Amazon has grown, from the expansive “AmazonBasics” brand to the incredibly popular Echo product line. What began as Amazon’s single digital assistant is now an entire product and service ecosystem, a perfect illustration of the recent boom in IoT tech.
Over the last month, my family has experienced something rather peculiar: whenever friends with kids visit our house for the first time, at least one child from each family has at some point spoken a request out into the air, addressed to one of several digital assistants, most commonly Alexa. Somehow there was an unspoken assumption that a digital assistant was listening somewhere nearby, waiting at their beck and call (and ready for impromptu dance party requests).
While that assumption happens to be inaccurate in my own house, the undeniable grain of truth here is that for an increasing portion of the world, IoT devices are quickly becoming an inseparable part of life. The explosive growth in the variety and ownership of connected devices has had a profound effect on a number of industries, especially service providers. Each added smart device demands additional bandwidth and faster connection speeds, making any gap in speed or reliability painfully obvious. With IoT infrastructure expanding faster than network speeds increase or service costs fall, service providers are looking for new ways to maintain customer satisfaction as demands rise. One particularly promising development in the tech world is the push toward “Edge Computing.”
What Exactly is Edge Computing?
A fairly new tech buzzword, “edge computing” refers to moving data processing geographically closer to the device that sources the data, rather than pushing data out to the cloud and a distant server for computation and then receiving the results back in final form [1]. Edge computing is the next logical step after the boom of cloud computing. We’ll get into the mechanics shortly, but the big takeaway is that the exponential growth of smart devices, and the cloud computing they rely on, is straining networks faster than the networks themselves can grow. Edge computing reduces that strain by performing tasks locally, on or near the device in question, rather than across large distances and multiple potential points of failure.
While this may feel like a fairly abstract or technical development, the difference between cloud and edge computing bears a strong resemblance to a dynamic most businesses already know well: in-house development versus outsourcing. Many small businesses find it far more efficient to outsource particular tasks or operational objectives to third-party resources, such as handing website design and maintenance over to a marketing agency. The work then gets done at lower cost and with greater skill than hiring a full marketing team or making do with less-skilled internal resources. However, if the marketing agency is located on the other side of the country, let alone in another country, several issues arise: multi-step handoffs, communication delays, even translation problems.
The same is true of the cloud computing functionality at the core of most IoT smart devices. Raw data is captured by a smart device in one location, uploaded to the cloud and processed at a second physical location, and then the analyzed results are transmitted back to the smart device or another hub for practical use. This process consumes additional bandwidth, increases latency for every device sharing the network, and creates three to four times as many points where vital resources can fail [2]. These costs are then compounded with each additional cloud computing device. Edge computing, on the other hand, carries higher startup costs but adds efficiencies over time. Each individual device needs greater processing capability, which means more power and more expensive components. But because the data is processed at the source, less traffic lands on the shared network, and both the per-request latency and the potential downtime of distant resources are removed. And just as with cloud computing, these efficiencies multiply with each added device.
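The trade-off described above can be sketched in a few lines of code. This is a purely illustrative model, not a benchmark: the hop names and millisecond figures are assumptions chosen to show how a cloud round trip accumulates both latency and potential failure points, while on-device processing collapses them into one.

```python
# Illustrative latency sketch: a cloud round trip vs. on-device (edge)
# processing. All figures below are assumed for illustration only.

CLOUD_HOPS = [
    ("device -> local network", 5),    # ms, assumed
    ("local network -> cloud", 40),    # long-haul transit, assumed
    ("cloud processing", 10),          # fast data-center hardware, assumed
    ("cloud -> local network", 40),
    ("local network -> device", 5),
]

EDGE_HOPS = [
    ("on-device processing", 15),      # slower local chip, but no transit
]

def total_latency(hops):
    """Sum per-hop latencies; each hop is also a potential point of failure."""
    return sum(ms for _, ms in hops)

print("cloud round trip:", total_latency(CLOUD_HOPS), "ms across",
      len(CLOUD_HOPS), "potential failure points")
print("edge processing: ", total_latency(EDGE_HOPS), "ms across",
      len(EDGE_HOPS), "potential failure point")
```

Even with generous numbers for the cloud tier, the round trip loses: the transit hops dominate, and every hop is another place the request can stall or fail.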
Even more importantly, edge computing is a spectrum: it is the movement of processing closer to the data source. While the goal of many edge computing devices is to process data at its immediate source, another option is to prioritize processing at the closest available data processing center. Continuing our marketing agency analogy, this would resemble hiring an agency in town rather than one across an ocean, a middle ground between fully in-house and distant outsourcing. This requires more intelligent network infrastructure and resource utilization, and latency and potential points of failure may not be eliminated, but both can still be significantly reduced. By triaging each task according to the location of its data source, edge computing processes as much data locally as possible, increasing processing speed and shortening the route each request travels.
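The triage idea above can be made concrete with a small routing sketch. The tier names, distances, and task capabilities here are hypothetical assumptions, not a real deployment: the point is only that each task is sent to the nearest tier able to handle it, so heavy jobs still reach the cloud while lightweight ones never leave the premises.

```python
# Hypothetical tier-selection sketch for "edge is a spectrum": route each
# task to the nearest processing tier that can handle it. All tier names,
# distances, and capabilities are illustrative assumptions.

TIERS = [  # sorted nearest-first
    {"name": "on-device",   "km": 0,    "can_do": {"wake-word"}},
    {"name": "local hub",   "km": 0.01, "can_do": {"wake-word", "voice-to-text"}},
    {"name": "regional dc", "km": 50,   "can_do": {"wake-word", "voice-to-text",
                                                   "intent-parsing"}},
    {"name": "cloud",       "km": 2000, "can_do": {"wake-word", "voice-to-text",
                                                   "intent-parsing", "model-training"}},
]

def nearest_tier(task):
    """Return the closest tier capable of processing the given task."""
    for tier in TIERS:
        if task in tier["can_do"]:
            return tier["name"]
    raise ValueError(f"no tier can handle {task!r}")

for task in ("wake-word", "voice-to-text", "intent-parsing", "model-training"):
    print(f"{task:>14} -> {nearest_tier(task)}")
```

In this sketch only model training ever travels the full 2,000 km, which is exactly the triage behavior the paragraph describes: distance is paid only when local capability runs out.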
What’s Next for Service Providers?
While broadband speed and reliability have been steadily increasing, the data suggests that cloud computing IoT devices have been growing at a far faster rate [3]. Most of us have experienced this firsthand: the number and variety of connected smart devices available for purchase have expanded at a much higher rate than broadband speeds have increased or service rates have decreased. A statistically average family of three may have had three mobile phones and a laptop or two just 10 years ago, but could easily now have three mobile phones, two tablets, a smart TV, a game console or two, a digital assistant with several access points, and a variety of smart-home management devices, just to start [4, 5]. The load this has already placed on broadband service providers could easily be dwarfed by continued growth over the next few years, and the most direct response currently being considered is edge computing.
Movement from limited copper network infrastructure to more expandable, future-proof fiber plant is only likely to grow in importance for providers planning to stick around for the long term. If improving and expanding physical plant is one side of the coin, the other is the actual management of that plant. It’s more important than ever that providers capitalize on every mile of plant, making sure that every strand of fiber is tracked, maintained, and utilized. If that brings to mind concerns about how your broadband network is currently being managed, then please, don’t wait until you’re playing catch-up. Let us know, and allow us to help you get the most out of your existing network and plan more proactively and intelligently for your network’s future growth.
While I’m not likely to be found purchasing an Alexa-connected AmazonBasics microwave oven anytime soon (nope, not joking, that totally exists), it’s impossible to deny the increasing breadth and depth of IoT and cloud computing’s impact on our lives. The areas of our lives untouched by IoT steadily shrink, while the depth and complexity of usage skyrocket. The upstream effects are sure to be felt by providers as per-user and per-access-point bandwidth and speed needs grow. While there are a number of factors that could conceivably slow future IoT expansion, from general security concerns to hard caps on the number of connected devices per subscriber, all signs currently point to significant continued growth – and service providers will have to keep up, or better yet find ways to get ahead of the growth curve.