Cutting into the Edge: Why Data is Coming Closer to You

By Juan Penaranda
Published: December 15, 2020

Have you ever had a package lost in limbo? With the holidays upon us, many of us are ordering and sending packages, hoping they arrive in time. Getting those packages where they need to go requires a network of postal sorting centers that route parcels efficiently.

What's true of packages is also true of data -- from a text message to the complex code that operates autonomous vehicles. To get the "package" of data from Point A to Point B, avoiding bottlenecks is critical. More critical than ever, in fact, because we're sending record amounts of data: In the minute or so that you’ve been reading this article, over 18 million text messages and 187 million emails have been sent. Without an efficient network to process requests, the last text you sent would travel no more efficiently than a holiday parcel routed to the wrong sorting center.

That's where edge data centers come in. They store and process local copies of data you might need, allowing your request to be handled closer to you in the network. With that data and processing nearer to your location, you get a response faster, which means quicker load times on videos or higher-resolution TV streaming. And that's more important than ever, because of the data requirements driven by machine learning, artificial intelligence, virtual reality, telehealth, smart transportation and so much more.

All these emerging applications require the ultra-low latency that edge computing provides. Latency refers to the round-trip time it takes for a data center to receive a request, process it, and send back the necessary information -- like a video, text response, and so on. Historically, network capacity and technology determined latency – imagine the difference in internet speeds between dial-up and fiber optic broadband.

To understand why edge computing is so valuable, let’s look at a use case: cloud gaming. That’s where the game runs on a data center server instead of your home PC or console. The game is streamed live to your TV, and your control inputs are sent to the data center in real time, allowing you to control your game character. This lets users who don’t have top-end consoles or PCs play the newest games – but it requires incredibly low latency, or the game will lag and stutter.

As we begin to implement new technologies like cloud gaming, existing network capacity and technology won’t be sufficient to support wide-scale adoption. To make these technologies possible, service providers are moving their data centers closer to the “edge” of the network. In doing so, the data has less distance to travel before it’s processed, and latency is reduced to the necessary levels for these emerging applications.

Edge computing isn’t just applicable to these emerging, low-latency applications. It also has an important role to play with mainstream data-intensive applications.

When you click to watch a video online, that request makes its way through the network, going from your phone to a cell tower, through a fiber cable, and eventually ending up at the data center housing the video you requested. Transmitting this information through the network costs money, in the form of power – the farther it needs to travel, the more expensive it is. With sustainability being top-of-mind for operators, keeping power usage down is both fiscally and socially responsible. Edge data centers are a key component in delivering superior service, while maintaining an acceptable level of cost.

The drivers for edge computing are three-fold: emerging applications that demand lower levels of latency, keeping cost down despite the enormous amount of data we generate on a daily basis, and doing so in a sustainable fashion. While these emerging applications are exciting, the reality of edge computing lies in the cost efficiencies it generates for data center operators. Edge computing is both a necessary part of a sustainable networking future and a key driver in enabling future technologies. Just as logistics networks are investing in their distribution infrastructure to efficiently capture future demand, telecommunications providers need to invest in edge infrastructure to do the same.

To keep up with these demands, data center operators are looking for innovation. Corning has answered with our CleanAdvantage™ and extreme density solutions, which save time and maximize capacity. Corning’s Data Center solutions are uniquely positioned to prepare you for any upcoming data center deployment, edge or otherwise.

Juan Penaranda is a data center specialist in market development for Corning Optical Communications. He specializes in multi-tenant data centers and data center interconnects, analyzing the market and focusing on how trends such as AI, the Internet of Things, and 5G will affect this data center market segment.

Interested in learning more?

Contact us today to learn how our end-to-end fiber optic solutions can meet your needs.
