Cloud computing becomes decentralized: Future of Computers P5
It's an abstract term that snuck its way into our public consciousness: the cloud. These days, most people under 40 know it's something the modern world can't live without, something they personally can't live without, yet few could explain what the cloud really is, let alone the coming revolution set to turn it on its head.
In this chapter of our Future of Computers series, we’ll review what the cloud is, why it’s important, the trends pushing its growth, and then the macro trend that will change it forever. Friendly hint: The future of the cloud lies back in the past.
What is the ‘cloud,’ really?
Before we explore the big trends set to redefine cloud computing, it's worthwhile to offer a quick recap of what the cloud actually is for the less tech-obsessed readers.
To start, the cloud is made up of a server or network of servers, where a server is simply a computer or computer program that manages access to a centralized resource (I know, bear with me). For example, there are private servers that manage an intranet (an internal network of computers) within a given large building or corporation.
And then there are the commercial servers that the modern Internet operates on. Your personal computer connects to your local telecom provider's servers, which connect you to the Internet at large, where you can interact with any publicly available website or online service. But behind the scenes, you are really just interacting with the servers of the various companies that run these websites. For example, when you visit Google.com, your computer sends a request through your telecom provider's servers to the nearest Google server, asking for its homepage; the Google server then responds by sending that page back to your browser.
In other words, a server is any application that listens for requests over a network and then performs an action in response to said request.
So when people refer to the cloud, they are actually referring to a group of servers where digital information and online services can be stored and accessed centrally, instead of inside individual computers.
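To make that definition concrete, here's a minimal sketch of a server written in Python using only the standard library. The handler class, greeting text, and port number are all just illustrative; real cloud servers are this same idea multiplied across thousands of machines.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class HelloHandler(BaseHTTPRequestHandler):
    """A server in miniature: listen for a request, perform an action, respond."""
    def do_GET(self):
        # The "action" here is trivial: build a greeting and send it back.
        body = b"Hello from a (very small) cloud server."
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Any client on the network can now send requests to port 8000.
    HTTPServer(("0.0.0.0", 8000), HelloHandler).serve_forever()
```

Run this script and point a browser at port 8000, and you've recreated, at toy scale, the request-and-response loop described above.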
Why the cloud became central to the modern Information Technology sector
Before the cloud, companies owned private servers to run their internal networks and databases. Typically, this meant buying new server hardware, waiting for it to arrive, installing an OS, mounting the hardware in a rack, and then integrating it with your data center. This process required many layers of approval, a large and expensive IT department, ongoing upgrade and maintenance costs, and chronically missed deadlines.
Then in the early 2000s, Amazon decided to commercialize a new service that would allow companies to run their databases and online services on Amazon's servers. This meant companies could continue to access their data and services via the web, but what then became Amazon Web Services would take on all hardware and software upgrade and maintenance costs. If a company needed additional data storage or server bandwidth or software upgrades to manage their computing tasks, they could simply order the added resources with a few clicks instead of slogging through the months-long manual process described above.
In effect, we went from a decentralized server management era where every company owned and operated their own server network, to a centralized framework where thousands-to-millions of companies save significant costs by outsourcing their data storage and computing infrastructure to a very small number of specialized ‘cloud’ service platforms. As of 2018, the top competitors in the cloud services sector include Amazon Web Services, Microsoft Azure, and Google Cloud.
What’s driving the cloud’s continued growth
As of 2018, over 75 percent of the world's data is housed in the cloud, with well over 90 percent of organizations now operating some-to-all of their services on the cloud as well; this includes everyone from online giants like Netflix to government organizations, like the CIA. But this shift isn't just due to cost savings, superior service, and simplicity; a range of other factors is driving the cloud's growth. Four such factors include:
Software as a Service (SaaS). Aside from outsourcing the costs of storing big data, more and more business services are being offered exclusively over the web. For example, companies use online services like Salesforce.com to manage all their sales and customer relationship management needs, thereby storing all of their most valuable client sales data inside Salesforce’s data centers (cloud servers).
Similar services have been created to manage a company's internal communications, email delivery, human resources, logistics, and more—allowing companies to outsource any business function that's not their core competency to low-cost providers accessible solely via the cloud. Essentially, this trend is pushing businesses from a centralized to a decentralized model of operations that's usually more efficient and cost-effective.
Big data. Just as computers consistently grow exponentially more powerful, so too does the amount of data our global society generates year over year. We're entering the age of big data where everything is measured, everything is stored, and nothing ever gets deleted.
This mountain of data presents both a problem and an opportunity. The problem is the physical cost of storing ever larger amounts of data, which accelerates the aforementioned push to move data into the cloud. Meanwhile, the opportunity lies in using powerful supercomputers and advanced software to discover profitable patterns inside said data mountain, a point discussed below.
Internet of Things. Among the biggest contributors to this tsunami of big data is the Internet of Things (IoT). First explained in the Internet of Things chapter of our Future of the Internet series, the IoT is a network designed to connect physical objects to the web, to "give life" to inanimate objects by allowing them to share their usage data over the web and enable a range of new applications.
To do this, companies will begin placing miniature-to-microscopic sensors onto or into every manufactured product, into the machines that make these manufactured products, and (in some cases) even into the raw materials that feed into the machines that make these manufactured products.
All these connected things will create a constant and growing stream of data that will likewise create a constant demand for data storage that only cloud service providers can offer affordably and at scale.
Big computing. Finally, as hinted at above, all this data collection is useless unless we have the computing power to transform it into valuable insights. And here too the cloud comes into play.
Most companies don't have the budget to purchase supercomputers for in-house use, let alone the budget and expertise to upgrade them annually, and then purchase many additional supercomputers as their data crunching needs grow. This is where cloud services companies like Amazon, Google, and Microsoft use their economies of scale to enable smaller companies to access both unlimited data storage and (near) unlimited data-crunching services on an as-needed basis.
As a result, various organizations can pull off amazing feats. Google uses its mountain of search engine data not only to offer you the best answers to your everyday questions, but to serve you ads tailored to your interests. Uber uses its mountain of traffic and driver data to turn a profit connecting drivers with underserved commuters. Select police departments worldwide are testing new software that tracks various traffic, video, and social media feeds not only to locate criminals, but to predict when and where crime is likely to occur, Minority Report-style.
Okay, so now that we’ve got the basics out of the way, let’s talk about the future of the cloud.
The cloud will become serverless
In today's cloud market, companies can add or subtract cloud storage/computing capacity as needed—well, kind of. Often, especially for larger organizations, updating your cloud storage/computing requirements is easy, but it's not real time; the result is that even if you only needed an extra 100 GB of memory for an hour, you might end up renting that extra capacity for half a day. Not the most efficient allocation of resources.
With the shift toward a serverless cloud, server machines become fully ‘virtualized' so that companies can rent server capacity dynamically and precisely. Using the previous example, if you needed an extra 100 GB of memory for an hour, you'd get that capacity and be charged only for that hour. No more wasted resource allocation.
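A back-of-the-envelope sketch of why this matters, with a made-up price purely for illustration:

```python
# Hypothetical price, for illustration only; real cloud pricing varies widely.
PRICE_PER_GB_HOUR = 0.005  # dollars

def cost(gb: float, hours: float) -> float:
    """Cost of renting `gb` gigabytes for `hours` hours."""
    return gb * hours * PRICE_PER_GB_HOUR

# Old model: a 1-hour spike forces you to rent the capacity for half a day.
coarse = cost(gb=100, hours=12)
# Serverless model: you pay only for the hour you actually used.
fine = cost(gb=100, hours=1)

print(f"Coarse-grained billing: ${coarse:.2f}")  # $6.00
print(f"Serverless billing:     ${fine:.2f}")    # $0.50
```

Multiply that twelvefold difference across thousands of workloads and it's clear why fine-grained, pay-for-what-you-use billing is where the market is headed.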
But there’s an even bigger trend on the horizon.
The cloud becomes decentralized
Remember earlier when we mentioned the IoT, the tech that's poised to make many inanimate objects ‘smart'? This tech is being joined by the rise of advanced robots, autonomous vehicles (AVs, discussed in our Future of Transportation series), and augmented reality (AR), all of which will push the boundaries of the cloud. Why?
If a driverless car drives through an intersection and a person accidentally steps into the street in front of it, the car has to decide whether to swerve or apply the brakes within milliseconds; it can't afford to waste precious seconds sending the person's image to the cloud and waiting for the cloud to send back the brake command. Manufacturing robots working at 10X the speed of humans on the assembly line can't wait for permission to stop if a human accidentally trips in front of them. And if you're wearing future augmented reality glasses, you'd be pissed if your Pokeball didn't load fast enough to capture the Pikachu before it ran off.
The danger in these scenarios is what the layperson calls ‘lag,' and what engineers call ‘latency.' For many of the most important technologies coming online over the next decade or two, even a few milliseconds of latency can render them unsafe and unusable.
As a result, the future of computing is (ironically) in the past.
In the 1960s and '70s, mainframe computers dominated: giant machines that centralized computing for business use. Then in the 1980s through the 2000s, personal computers came on the scene, decentralizing and democratizing computing for the masses. Then the Internet went mainstream, followed shortly after by the smartphone, enabling individuals to access a limitless range of online offerings that could only be offered economically by centralizing digital services in the cloud.
And soon during the 2020s, IoT, AVs, robots, AR, and other such next-gen ‘edge technologies’ will swing the pendulum back towards decentralization. This is because for these technologies to work, they will need to have the computing power and storage capacity to understand their surroundings and react in real time without a constant dependence on the cloud.
Switching back to the AV example: This means a future where highways are loaded with supercomputers in the form of AVs, each independently collecting vast amounts of location, vision, temperature, gravity, and acceleration data to drive safely, then sharing that data with the AVs around them so that they collectively drive more safely, and finally sharing that data back to the cloud to direct all AVs in the city to efficiently regulate traffic. In this scenario, processing and decision making happens at the ground level, while learning and longer-term data storage happens in the cloud.
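As a rough sketch of that division of labor (all names, numbers, and thresholds here are hypothetical, not any real AV system), the time-critical decision runs on the vehicle, while raw readings are batched for the cloud:

```python
import time

SAFE_DISTANCE_M = 15.0  # hypothetical braking threshold, in meters

def on_sensor_frame(obstacle_distance_m: float, telemetry_buffer: list) -> str:
    """Runs on the vehicle itself: decide in milliseconds, no network round trip."""
    decision = "BRAKE" if obstacle_distance_m < SAFE_DISTANCE_M else "CONTINUE"
    # Log the raw reading locally; it will reach the cloud later, not now.
    telemetry_buffer.append((time.time(), obstacle_distance_m, decision))
    return decision

def sync_to_cloud(telemetry_buffer: list) -> None:
    """Runs occasionally, off the critical path: ship batched data for fleet-wide learning."""
    # In a real system this would upload to the fleet operator's cloud endpoint.
    print(f"Uploading {len(telemetry_buffer)} readings for storage and training")
    telemetry_buffer.clear()

buffer: list = []
print(on_sensor_frame(8.2, buffer))   # BRAKE: decided locally, instantly
print(on_sensor_frame(42.0, buffer))  # CONTINUE
sync_to_cloud(buffer)                 # learning happens in the cloud, later
```

The design choice is the point: nothing on the braking path ever waits on the network, while everything the fleet can learn from still flows back to centralized storage.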
Overall, these edge computing needs will spur growing demand for ever more powerful computing and digital storage devices. And as is always the case, as computing power goes up, the applications for said computing power grow, leading to increased use and demand, which then drives prices down through economies of scale, finally resulting in a world consumed by data. In other words, the future belongs to the IT department, so be nice to them.
This growing demand for computing power is also the reason why we're ending this series with a discussion of supercomputers, followed by the coming revolution that is the quantum computer. Read on to learn more.
Future of Computers series
Emerging user interfaces to redefine humanity: Future of Computers P1
Future of software development: Future of Computers P2
The digital storage revolution: Future of Computers P3
A fading Moore’s Law to spark fundamental rethink of microchips: Future of Computers P4
Why are countries competing to build the biggest supercomputers? Future of Computers P6
How Quantum computers will change the world: Future of Computers P7