Consequentially Limited

IT in language you understand

Personal computing - a retrospective

It is 37 years since IBM launched its first PC, and while many will argue whether it was truly the first personal computer, our relationship with these machines has developed ever since. In many ways, it has been like that of a new parent with their first child. In the early days, the PC was purely functional, running on an operating system without a graphical user interface, something almost unimaginable nearly 40 years later. Little had preceded the PC to give a user any clue how to use it. Slowly, man and machine had to get to grips with each other. Like a newborn child crying, the PC could signal that something was wrong with a cryptic error message, but frequently that message was incomprehensible to the user.

As the PC moved into its second decade, it had definitely grown. It had adopted the GUI and become more user-friendly. The arrival of Microsoft's Windows 3.0 was, for many, the beginning of modern personal computing. The child had learned to walk. The Internet of the early 1990s was used by the few, with connections made over dial-up modems. Websites were in their infancy, and much of the content lived on bulletin boards. Email too was primitive, used more internally by large companies than in the wider community. To many, it was simply not worth the effort.
With the turn of the millennium, PCs were becoming more productive and useful. The website was a common phenomenon. Connection speeds were increasing through ISDN, if one was prepared to pay for it. Email use was growing fast, both internally and externally. The PC could walk and talk, and through networks it was beginning to build relationships. Naturally, with more exposure to others, viruses were waiting to take advantage, and another industry was spawned.

Today, with high-speed broadband and miniaturisation, smartphones and tablets are the leading personal computing devices, perhaps the children of the PC. The PC itself, now approaching its 40s, has become the parent.

What have we learned from 37 years of the PC? Its creative output is a vast amount of data. That data is the bedrock of the current interest in big data and data analytics, which in turn has allowed concepts such as machine learning and artificial intelligence to bubble up. We are on the cusp of immense change, almost science fiction. The press is full of articles about job losses as a direct result of technological intervention, but other jobs will arise in their place. There will still be a need for people.

People have been at the heart of the PC's growth and influence. We are drawn to these machines, almost as a moth to a flame. It once seemed ridiculous that Facebook or Twitter could become as influential as they are, but it happened because society made it so. Society can just as easily withdraw its support, and what seems a colossus can be turned back to dust. The ability for almost anyone to post their opinion on an open platform is democratising freedom. In the past, the news was controlled and spun by press barons. They dictated how the public should think, reported what they wanted, and excluded inconvenient counterpoints. Today it is possible for all sides of an argument to get airtime; the debate is alive and real, and the mass moderates it. Fake news is not new, and the news has always been subject to manipulation. People need to learn to question what they read, and indeed to question everything.

Looking to the future is always foolhardy, but it would seem that personal computing will become wearable. It will rely on more resilient high-speed networking, both mobile and fixed. The growth of AI is inevitable, as interrelated systems share data to enhance services. Will cryptocurrencies become ubiquitous? Perhaps, but if not, the smartphone will play a more significant part in transactions. I am an early adopter, keen on the advantages new technology can bring, but I understand the concerns of others too. Personal computing has changed the way most of us live, and it has generally been a force for good. The future depends on us, and personal computing will go where we ask.