We’re facing the end of the cloud. It’s a bold statement, I know, and maybe it even sounds a little mad. But bear with me.
The conventional wisdom about running server applications, be it web apps or mobile app backends, is that the future is in the cloud. Amazon, Google, and Microsoft keep adding layers of tools to their cloud offerings to make running server software ever easier and more convenient, so it would seem that hosting your code on AWS, GCP, or Azure is the best you can do: it's convenient, cheap, easy to fully automate, it scales elastically … I could keep going. So why am I predicting the end of it all?
A few reasons:
It can’t meet long-term scaling requirements. Building a scalable, reliable, highly available web application is difficult even in the cloud, and if you do it right and make your app a huge success, that scale will cost you both money and effort. Even if your business is really successful, you eventually hit the limits of what the cloud, and the web itself, can do: the compute speed and storage capacity of computers are growing faster than the bandwidth of the networks that connect them. Setting aside the net neutrality debate, this may not be a problem for most companies at the moment (Netflix and Amazon aside), but it will be soon. The volume of data we push through the network is growing massively as we move from HD to 4K to 8K video, and soon there will be VR datasets to move around as well.
This is a problem mostly because of the way we’ve organized the web. There are many clients that want to get content and use programs, and only relatively few servers that have those…