A Brief History of Edge
Come with me, and you'll be in a world of computing innovations. Take a stroll through the history of cloud computing, leading up to edge computing.
How did edge computing begin, and what made its start possible? In just a few decades, the IT world has evolved from mainframes to client/servers, to the cloud, and now to edge computing. How do these eras interconnect, what spurred these evolutionary transitions, and where will we go next? The historical perspective outlined below explains how we landed where we are today: outside of the traditional data center, at the rise of the edge computing era.
1960s-1970s: The Mainframe Era
Business computing began in the era of the mainframe. These large, monolithic systems were a treasured commodity that only sizable organizations could afford to deploy and maintain. Companies conducted all network computing in a physical data center, and computing evolved to meet the rising demands for information access and availability.
Imagine an antiquated world where computers offered no font style or size options, no graphics, and not even a mouse. Data entry employees relied on “dumb” terminals to capture digital data at the user level. Airline reservations during this time, for example, were entered manually by a data entry clerk, in a single font color, on a system with very slow processing power. In fact, some airlines still have these legacy systems in place today.
1980s-1990s: The Client/Server Era
Speed and power were the key drivers of the 1980s and 1990s. Intel revolutionized the IT world with its standard microprocessors, and companies like Dell and Hewlett-Packard delivered computing into smaller organizations, and even our homes, with the introduction of small, powerful servers and desktop and laptop computers.
Years earlier, Intel co-founder Gordon Moore had observed that transistor densities were doubling roughly every two years based on incremental technology improvements. “Moore’s Law” held true well into the 21st century, and countless IT vendor organizations have relied on it for research, development, and accurate roadmap planning.
The IT landscape had to accommodate increasingly distributed processing demands as business boomed in the 1980s. Computing power was now directly in the hands of users, thanks to applications like word processing, spreadsheets, and databases. While some compute was handled locally, organizations still relied on connections to a primary data center for large projects and data storage. Envision an engineering or accounting firm with multiple personal computers used for design and file creation, saving work to a local network server, then connecting out to a larger data center for further processing, backup, and storage or archival.
2000s-2010s: The Cloud Era
Fast-forward to the new century. Y2K is behind us, computers small and large are a staple of every organization, and the Internet is in every household. Smartphones and tablets enter the market and instantly turn users into professional photographers and social media members; everyone’s briefcase, backpack, and back pocket holds gigabytes of storage and a growing set of applications to choose from and manage.
Businesses have by now achieved large economies of scale and are running dozens or hundreds of servers onsite in a single location, similar to practices in the mainframe days. With digital data exploding, and regulations now in place holding organizations legally accountable for responsible data use, businesses are looking for a lily pad to migrate that data to, one that protects it from the growing volume of cybercrime and is fronted by a seamless experience for the IT administrator (or personal user).
The cloud enters as a beautiful, buttoned-up option during this time: users can access applications locally from a computer, phone, or tablet, with all of the processing and storage happening offsite in the cloud. Many organizations around the world have significantly reduced the size of their own on-site data centers and instead leverage public cloud sites to run applications and store their data. The benefit of the cloud is that CAPEX drops to almost zero and the administrative time spent caring for those systems is cut significantly; however, networking costs and cloud provider fees can add up quickly. Industry behemoths that arose during the cloud era include online and Software-as-a-Service companies like Amazon, Netflix, Salesforce, and Spotify, and public cloud platforms like Amazon's AWS, Microsoft's Azure, and Google's GCP.
Today: The Edge Computing Era
This leads us to where we are today: the edge computing era, which in many ways parallels the client/server days. With the rise of IoT, the increase in the number of businesses with multiple sites, and the fact that more and more data is being generated outside the data center, computing systems need to be deployed at the edge. These systems must be low-cost and lightweight so they can be deployed in many small locations, and they must be manageable from one central location, since IT resources at the edge are sparse.
The natural evolution of IT transitioned from a centralized model during the mainframe era, to a decentralized model during the client/server era, back to a centralized model with cloud computing, and today the trend is moving steadily back toward a decentralized computing model as edge adoption grows.
Because edge computing technology is making the complex simple for IT generalists, the adoption of decentralized edge computing will continue to rise for the foreseeable future.