Future of software development: Future of computers P2
In 1969, Neil Armstrong and Buzz Aldrin became international heroes after being the first humans to set foot on the Moon. But while these astronauts were the heroes on camera, thousands of unsung heroes worked behind the scenes, and without their involvement, that first manned Moon landing would not have been possible. A few of these heroes were the software developers who coded the flight. Why?
Well, the computers that existed at that time were far simpler than those of today. In fact, the average person's worn-out smartphone is several orders of magnitude more powerful than anything onboard the Apollo 11 spacecraft (and all of 1960s NASA, for that matter). Moreover, computers at that time were coded by specialized software developers who programmed software in the most basic of machine languages: AGC assembly code or, simply, 1s and 0s.
For context, one of these unsung heroes, the Apollo space program’s Director of the Software Engineering Division, Margaret Hamilton, and her team had to write a mountain of code (pictured below) that, using today’s programming languages, could have been written with a fraction of the effort.
(Pictured above is Margaret Hamilton standing next to a stack of paper containing the Apollo 11 software.)
And unlike today, when software developers code for roughly 80 to 90 percent of possible scenarios, the code for the Apollo missions had to account for everything. To put this in perspective, Hamilton herself said:
"Due to an error in the checklist manual, the rendezvous radar switch was placed in the wrong position. This caused it to send erroneous signals to the computer. The result was that the computer was being asked to perform all of its normal functions for landing while receiving an extra load of spurious data which used up 15% of its time. The computer (or rather the software in it) was smart enough to recognize that it was being asked to perform more tasks than it should be performing. It then sent out an alarm, which meant to the astronaut, I'm overloaded with more tasks than I should be doing at this time, and I'm going to keep only the more important tasks; i.e., the ones needed for landing ... Actually, the computer was programmed to do more than recognize error conditions. A complete set of recovery programs was incorporated into the software. The software's action, in this case, was to eliminate lower priority tasks and re-establish the more important ones ... If the computer hadn't recognized this problem and taken recovery action, I doubt if Apollo 11 would have been the successful moon landing it was."
— Margaret Hamilton, Director of Apollo Flight Computer Programming, MIT Draper Laboratory, Cambridge, Massachusetts, "Computer Got Loaded", letter to Datamation, March 1, 1971
As hinted at earlier, software development has evolved since those early Apollo days. New high-level programming languages replaced the tedious process of coding in 1s and 0s with coding in words and symbols. Functions like generating a random number that once required days of coding can now be written in a single line.
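To make that leap in abstraction concrete, here is a minimal sketch in a modern high-level language (Python); the example is ours, chosen purely for illustration:

```python
# Generating a pseudo-random number, which once required hand-written
# low-level routines, is now a single call to the standard library.
import random

dice_roll = random.randint(1, 6)  # random integer from 1 to 6, inclusive
print(dice_roll)
```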
In other words, software coding has become increasingly automated, intuitive, and human with each passing decade. These qualities will only continue into the future, guiding the evolution of software development in ways that will have a profound impact on our day-to-day lives. This is what this chapter of the Future of Computers series will explore.
Software development for the masses
Replacing the need to code in 1s and 0s (machine language) with words and symbols (human-readable language) is referred to as adding layers of abstraction. These abstractions have come in the form of new programming languages that automate complex or common functions for the field they were designed for. But during the early 2000s, new companies emerged (like Caspio, QuickBase, and Mendix) that began offering what are called no-code or low-code platforms.
These are user-friendly, online dashboards that enable non-technical professionals to create custom apps tailored to the needs of their business by way of snapping together visual blocks of code (symbols/graphics). In other words, instead of cutting down a tree and fashioning it into a dressing cabinet, you build it using pre-fashioned parts from Ikea.
While using these services still requires a certain level of computer savvy, you no longer need a computer science degree to use them. As a result, this form of abstraction is enabling the rise of millions of new "software developers" in the corporate world, and it's enabling many children to learn how to code at an earlier age.
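To make the Ikea analogy concrete, the sketch below imagines what a low-code platform might be doing behind its visual dashboard: assembling an app from prebuilt components rather than from hand-written logic. The Component and App classes here are invented for this illustration and don't correspond to any vendor's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class Component:
    kind: str                      # e.g. "form", "table", "chart"
    options: dict = field(default_factory=dict)

@dataclass
class App:
    name: str
    components: list[Component] = field(default_factory=list)

    def add(self, component: Component) -> "App":
        self.components.append(component)
        return self                # allow chaining, like snapping blocks together

# "Snap together" an expense-tracking app from ready-made parts.
expense_app = (
    App("Expense Tracker")
    .add(Component("form", {"fields": ["date", "amount", "category"]}))
    .add(Component("table", {"source": "expenses", "sortable": True}))
    .add(Component("chart", {"type": "bar", "group_by": "category"}))
)
print(f"{expense_app.name}: {len(expense_app.components)} components")
```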
Redefining what it means to be a software developer
There was a time when a landscape or a person’s face could only be captured onto a canvas. A painter would have to study and practice for years as an apprentice, learning the craft of painting—how to blend colours, what tools are best, the correct techniques to execute a specific visual. The cost of the trade and the many years of experience needed to perform it well also meant that painters were few and far between.
Then the camera was invented. And with the click of a button, landscapes and portraits that would otherwise take days or weeks to paint were captured in a second. And as cameras improved, became cheaper, and became so plentiful that they're now included in even the most basic smartphone, capturing the world around us became a common and casual activity that everybody takes part in.
As abstractions progress and new software languages automate ever more routine software development work, what will it mean to be a software developer in 10 to 20 years' time? To answer this question, let’s walk through how future software developers will likely go about building tomorrow’s applications:
*First, all standardized, repetitive coding work will disappear. In its place will be a vast library of predefined component behaviors, UIs, and data-flow manipulations (Ikea parts).
*Like today, employers or entrepreneurs will define specific goals and deliverables for software developers to execute through specialized software applications or platforms.
*These developers will then map out their execution strategy and begin prototyping early drafts of their software by accessing their component library and using visual interfaces to link them together—visual interfaces accessed through augmented reality (AR) or virtual reality (VR).
*Specialized artificial intelligence (AI) systems, designed to understand the goals and deliverables implied by the developer's initial drafts, will then refine the drafted software design and automate all quality assurance testing.
*Based on the results, the AI will then ask a multitude of questions to the developer (likely through verbal, Alexa-like communication), seeking to better understand and define the project’s goals and deliverables and discuss how the software should act in various scenarios and environments.
*Based on the developer’s feedback, the AI will gradually learn his or her intent and generate the code to reflect the project goals.
*This back and forth, human-machine collaboration will iterate version after version of the software until a finished and marketable version is ready for internal implementation or for sale to the public.
*In fact, this collaboration will continue after the software is exposed to real-world usage. As simple bugs are reported, the AI will fix them automatically in a manner that reflects the original, desired goals outlined during the software development process. Meanwhile, more serious bugs will call for a human-AI collaboration to resolve the issue.
Overall, future software developers will focus less on the 'how' and more on the 'what' and 'why.' They will be less craftsperson and more architect. Programming will become an intellectual exercise that requires people who can methodically communicate intent and outcomes in a manner that an AI can understand and then auto-code into a finished digital application or platform.
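As a purely speculative sketch of what 'communicating intent' could look like, the snippet below describes a project's goals as structured data and stubs out the AI step that would turn that description into working software. Both the IntentSpec class and the generate_application placeholder are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class IntentSpec:
    goal: str
    inputs: list[str]
    outputs: list[str]
    constraints: list[str] = field(default_factory=list)

spec = IntentSpec(
    goal="Notify a customer when their order ships",
    inputs=["order_id", "customer_email"],
    outputs=["confirmation email", "audit log entry"],
    constraints=["send within 5 minutes of shipment", "retry on failure"],
)

def generate_application(spec: IntentSpec) -> str:
    """Stand-in for the AI code-generation step imagined above."""
    return f"# generated service for: {spec.goal}"

print(generate_application(spec))
```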
Artificial intelligence-driven software development
Given the section above, it’s clear that we feel AI will play an increasingly central role in the field of software development. But its adoption isn't purely about making software developers more effective; there are business forces behind this trend as well.
Competition between software development companies is getting fiercer with each passing year. Some companies compete by buying out their competitors. Others compete on software differentiation. The challenge with the latter strategy is that it isn't easily defensible: any software feature or improvement one company offers its clients, its competitors can copy with relative ease.
For this reason, gone are the days when companies released new software every one to three years. These days, companies that focus on differentiation have a financial incentive to release new software, software fixes, and software features on an increasingly regular basis. The faster companies innovate, the more they drive client loyalty and increase the cost of switching to competitors. This shift towards the regular delivery of incremental software updates is a trend called “continuous delivery.”
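At its simplest, continuous delivery means every change is tested automatically and only a passing build is released. The toy Python sketch below captures that gate; the deploy() function is a placeholder, not a real deployment tool.

```python
import subprocess
import sys

def tests_pass() -> bool:
    """Run the project's test suite and report whether every test passed."""
    result = subprocess.run([sys.executable, "-m", "pytest", "-q"])
    return result.returncode == 0

def deploy(version: str) -> None:
    # Stand-in for a real release step (packaging, pushing to servers, etc.).
    print(f"Releasing version {version} to users...")

if __name__ == "__main__":
    if tests_pass():
        deploy("1.4.2")
    else:
        print("Tests failed; release blocked.")
```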
Unfortunately, continuous delivery ain't easy. Barely a quarter of today's software companies can execute the release schedule demanded of this trend. And this is why there's so much interest in using AI to speed things along.
As outlined earlier, AI will eventually play an increasingly collaborative role in software drafting and development. But in the short term, companies are using it to increasingly automate quality assurance (testing) processes for software. And other companies are experimenting with using AI to automate software documentation—the process of tracking the release of new features and components and how they were produced down to the code level.
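A present-day glimpse of that automation is property-based testing, where a library generates test inputs mechanically instead of a human writing each case by hand. The example below uses Python's hypothesis library (run with pytest); it illustrates machine-generated testing generally, not any specific vendor's AI tooling.

```python
from hypothesis import given, strategies as st

def normalize_discount(percent: float) -> float:
    """Clamp a discount percentage into the 0-100 range."""
    return max(0.0, min(100.0, percent))

# hypothesis generates hundreds of varied inputs, probing edge cases
# a human tester might never think to write down.
@given(st.floats(allow_nan=False, allow_infinity=False))
def test_discount_always_in_range(percent):
    assert 0.0 <= normalize_discount(percent) <= 100.0
```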
Overall, AI will increasingly play a central role in software development. Those software companies that master its use early will ultimately enjoy exponential growth over their competitors. But to realize these AI gains, the industry will also need to see advancements in the hardware side of things—the next section will elaborate on this point.
Software as a service
All manner of creative professionals use Adobe software when creating digital art or design work. For nearly three decades, you purchased Adobe’s software as a CD and owned its use in perpetuity, buying future upgraded versions as needed. But in the mid-2010s, Adobe changed its strategy.
Instead of buying software CDs with annoyingly elaborate ownership keys, Adobe customers would now have to pay a monthly subscription for the right to download Adobe software on their computing devices, software that would only work alongside a regular-to-constant Internet connection to Adobe servers.
With this change, customers no longer owned Adobe software; they rented it as needed. In return, customers no longer had to repeatedly purchase upgraded versions of Adobe software; so long as they subscribed to the Adobe service, they would always have the latest updates delivered to their devices immediately upon release (often several times a year).
This is only one example of one of the biggest software trends of recent years: software transitioning into a service instead of a standalone product. And not only smaller, specialized software, but entire operating systems, as we've seen with the release of Microsoft's Windows 10. In other words, software as a service (SaaS).
Self-learning software (SLS)
Building on the industry shift toward SaaS, a new trend is emerging in the software space that combines SaaS and AI. Leading companies such as Amazon, Google, Microsoft, and IBM have begun offering their AI infrastructure as a service to their clients.
In other words, AI and machine learning are no longer accessible only to software giants; now any company or developer can access online AI resources to build self-learning software (SLS).
We'll discuss the potential of AI in detail in our Future of Artificial Intelligence series, but for the context of this chapter, we'll say that current and future software developers will use SLS to build new systems that anticipate tasks that need doing and simply auto-complete them for you.
This means a future AI assistant will learn your work style at the office and begin completing basic tasks for you, like formatting documents just as you like them, drafting your emails in your tone of voice, managing your work calendar and more.
At home, this could mean having an SLS system manage your future smart home, including tasks like pre-heating your home before you arrive or keeping track of groceries you need to buy.
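As a toy sketch only, the core SLS idea of learning from past behaviour and acting ahead of time can be expressed in a few lines; the arrival data and helper functions below are invented for illustration.

```python
from collections import Counter
from datetime import datetime, timedelta

# (hour, minute) arrival times over the past week (hypothetical data).
arrival_history = [(18, 5), (17, 55), (18, 10), (18, 0), (17, 50)]

def typical_arrival_hour(history: list[tuple[int, int]]) -> int:
    """Learn the most common arrival hour from past behaviour."""
    return Counter(hour for hour, _ in history).most_common(1)[0][0]

def schedule_preheat(history: list[tuple[int, int]], lead_minutes: int = 30) -> str:
    arrival = datetime.now().replace(
        hour=typical_arrival_hour(history), minute=0, second=0, microsecond=0
    )
    start = arrival - timedelta(minutes=lead_minutes)
    return f"Pre-heat scheduled for {start:%H:%M}, ahead of a ~{arrival:%H:%M} arrival"

print(schedule_preheat(arrival_history))  # "Pre-heat scheduled for 17:30, ahead of a ~18:00 arrival"
```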
By the 2020s and into the 2030s, these SLS systems will play a vital role in the corporate, government, military, and consumer markets, gradually helping each improve their productivity and reduce waste of all kinds. We'll cover SLS tech in more detail later in this series.
However, there is a catch to all this.
The only way the SaaS and SLS models work is if the Internet (or the infrastructure behind it) continues to grow and improve, alongside the computing and storage hardware that runs the 'cloud' these SaaS/SLS systems operate on. Thankfully, the trends we're tracking look promising.
To learn about how the Internet will grow and evolve, read our Future of the Internet series. To learn more about how computer hardware will advance, read on using the links below!
Future of Computers series
Emerging user interfaces to redefine humanity: Future of computers P1
The digital storage revolution: Future of Computers P3
A fading Moore’s Law to spark fundamental rethink of microchips: Future of Computers P4
Cloud computing becomes decentralized: Future of Computers P5
Why are countries competing to build the biggest supercomputers? Future of Computers P6
How Quantum computers will change the world: Future of Computers P7