I've been around technology since circa 1970. I spent my entire career running and programming computers. Needless to say, I've seen exponential changes and advancements in the technology. Except for maybe warp drives and transporters, computers and technology have advanced beyond what even the science fiction writers could dream of just a few years ago. It was an exciting career. The faint of heart or those who didn't embrace constant change need not apply. This post is not meant to detail the specific advancements and changes but rather to give my view of the general ebb and flow of the industry.
When I started my career in the late 1960's, all the computers were huge and kept in a special room. The input data came in via punched cards, paper tape or magnetic tape. The output was punched cards, magnetic tape or printouts. No one except the computer operator ever touched a computer. There was no real-time updating and no instantaneous results. Applications were processed once a day at the most. Maybe by Wednesday morning you would know what happened on Tuesday by reading the printed reports. Computers and data processing were only for large companies.
Within a couple of years, a few CRT terminals were introduced into the landscape. These CRT's, aka dumb terminals, were big and expensive. Initially they were for inquiry only. The information available was usually a very limited subset of what was on the printed reports from the night before. They could only display a few lines of text, with no graphics or color. CRT's were few and far between; a department might have only one or two for several people. All the computing was still done in the data center.
The next step was a wider distribution of CRT's with bigger screens and the addition of data entry. Now people in the user departments could add, delete or edit data directly on the computers. Applications were still typically updated only once a day, usually overnight. Most everything was still centrally located, with only a few terminals connected by primitive, slow phone-line networks.
A couple of companies like Digital Equipment Corporation (DEC) began selling smaller computers. These systems were for smaller companies or for specific applications. They were still company or department systems with maybe a terminal or two. Batch processing was still the mainstay.
The model of a central data center with dumb terminals prevailed until the mid to late 1980's. By then minicomputers, personal computers (PCs) and faster networks had come on the scene. Computing began to migrate from the big central computer room toward the desktop. By the 1990's the dumb terminals had been almost completely replaced by PCs. Many were still connected to a big computer for some information, but once the data reached the PC it was reformatted, combined and manipulated by the desktop computer.
More and more processing moved from the big mainframe computers to minicomputers to small servers to desktop PCs. It seemed to be a race to get applications onto the smallest computer possible. This was not necessarily the best strategy, but it was the popular trend. I'm not sure when this trend reached its peak, but it was probably in the early 2000's.
About then, companies started to move back toward more central control of data and applications. Instead of huge water-cooled computers they installed several smaller servers, often in racks, networked together. Part of this was economics; part was security and control.
In the past couple of years we have seen an explosion of cloud storage and computing. This is a more centralized approach than local servers or even company data centers. Now instead of a corporate computer there is a national shared data center run by Google, Amazon, Microsoft or any number of cloud companies.
As we move again to more centralized storage and processing, we need less computing power at the desktop or on our mobile devices. So even though your smartphone is more powerful and has more storage than a PC of a few years ago, it couldn't function without all the cloud data and computing behind it. We will never go back to the days of completely dumb terminals, but we may be headed toward less smart desktop and mobile devices.
As networks get faster and cloud computing gets cheaper, there will be less and less reason to compute or store locally. An example is the new pricing for Google Drive storage. You can now get a terabyte of storage for $10/month, which is arguably less than the cost of buying, powering and backing up a local TB disk drive. Besides the storage, you can use the cloud servers' computing power to run your applications. This post is being written, edited, previewed and published in a web browser tab. None of it is stored or processed locally on my PC.
The Chromebook and Chromebox are today's equivalent of the old dumb terminal. They depend on a big computer center to do useful work. While everything has gotten faster, bigger (in capacity, not in size), cheaper and more available, we seem to be migrating back toward a centralized computing model. Will this be permanent or is it just the latest phase (fad) of computing technology?
There will always be an ebb and flow between central and individual computing. Companies like centralization for control and security reasons. Individuals like decentralized, local computing for the freedom and customization. There is also the cost issue: are a couple of big data centers cheaper than hundreds or thousands of smart desktops or individual devices?
Regardless of the technology or cost issues, never underestimate the effect of social trends. Many a CEO will adopt a new technology direction because it is the popular thing to do. He/she may have read in Forbes or seen on CNBC that everyone is now doing whatever the technology du jour is. It doesn't matter whether that strategy is right or cost effective for their company. Individuals are the same. It's trendy to have a MacBook or iMac, but if all you do is email and Facebook it is very expensive overkill. Get a Chromebook for about $1,000 less.
So in the last 40 years we have gone from behemoth central computers with dumb terminals, to minicomputers, to local servers, to desktop PCs, to web browsers, to national cloud data centers with limited-function desktop and mobile devices. Will we continue the shift toward centralized computing or will there be a swing back toward individual computing? Technology advancements will have much influence on this, but so will social pressure.
The pendulum never stops swinging. Only the direction is in doubt.
Got any predictions?
wjh