Data Expedition, Inc.



The Global Cloud Bottleneck

by Seth Noble | Blog | Apr 19, 2017

It is often said that cloud computing is coming to dominate modern network workflows, but a look at the numbers shows just what an understatement that really is.

According to the Cisco Global Cloud Index: Forecast and Methodology, 2015–2020, global cloud IP traffic will account for 92 percent of total data center traffic by 2020, with an annual total of 14.1 zettabytes (14.1 trillion gigabytes per year). This growth is driven by the many familiar benefits of the cloud, a list which a recent Interop ITX survey (2017 State of the Cloud Report) found to be topped by greater scalability, higher performance, better and faster access to technology, and cost savings.

But taking these two factors together, staggering network volume and substantial in-cloud benefits, also reveals the cloud's greatest weakness: getting those benefits means moving a lot of data. Simply put, you can't do anything in the cloud until your data gets there, and nothing you do there matters until you get your results out.

As anyone who has tried to FTP a file over any distance knows, having big pipes is not enough. The TCP/IP software infrastructure behind familiar technologies like FTP, HTTP, and SCP was designed for the networks of 1974, not 21st century data paths. Faced with trying to lift petabytes of data into the cloud, even an 18-wheeler full of hard drives may start to look attractive.
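Back-of-the-envelope arithmetic shows why distance, not pipe size, is the limit. A single TCP stream's throughput is capped by its window size divided by round-trip time, and packet loss lowers the ceiling further (the well-known Mathis approximation). This is a rough illustrative sketch of those two formulas, not a description of any particular product's behavior; the window, RTT, and loss figures are example values:

```python
import math

def bdp_limit_mbps(window_bytes: float, rtt_ms: float) -> float:
    """Max throughput (Mbps) of one TCP stream: window / round-trip time."""
    return (window_bytes * 8) / (rtt_ms / 1000) / 1e6

def mathis_limit_mbps(mss_bytes: float, rtt_ms: float, loss_rate: float) -> float:
    """Mathis approximation: throughput ~ (MSS/RTT) / sqrt(loss)."""
    return (mss_bytes * 8 / (rtt_ms / 1000)) / math.sqrt(loss_rate) / 1e6

# A classic 64 KB window across the continental US (~80 ms RTT)
# caps a stream at roughly 6.6 Mbps, no matter how big the pipe is:
print(f"{bdp_limit_mbps(65535, 80):.1f} Mbps")

# Even with large windows, 0.1% packet loss at 80 ms RTT
# holds a 1460-byte-MSS stream to a few Mbps:
print(f"{mathis_limit_mbps(1460, 80, 0.001):.1f} Mbps")
```

Multiply either result out against a petabyte-scale transfer and the truck full of drives stops sounding like a joke.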

However, this shift from on-premises data centers toward cloud infrastructure has created new opportunities as well as challenges. On the one hand, cloud data is never local; it always has to travel to reach the people and systems that use it. On the other hand, having only a few big IaaS vendors means that the infrastructure is well known and fairly predictable.

At Data Expedition, Inc., we're in the business of getting the most performance out of networks through the use of our thoroughly modern data transport technology, MTP™/IP (Multipurpose Transaction Protocol®). CloudDat is the distillation of our years of experience helping companies move data through cloud workflows. Whether it is a lift-and-shift data center migration or an ongoing media-rich workflow, it is critical to be able to move data at a cost point that doesn't negate the cloud's benefits. Moving data in and out of the cloud at up to 900 megabits per second per file or data stream, over commodity internet connections, means a huge reduction in both the cloud's cost of entry and the cost of doing business.

Exporting data to hard drives, shipping them, and then sorting out the imported files is labor-intensive, expensive, and error-prone. Leasing a dedicated network line or licensing acceleration software that charges by the byte is just plain expensive. That's why we're launching CloudDat: an alternative to both that brings down the barriers to cloud adoption and breaks the bottleneck.