IBM has come a long way since it introduced its first personal computer in 1981. The advent of the device transformed the world. In the 36 years since, engineers have continued to innovate with computer technology, and some might say the effect computers have had on society is beyond measure.
The history of computers reaches back long before 1981, but IBM’s maiden voyage into unfamiliar territory undoubtedly changed the game and influenced countless engineers. The company is, after all, International Business Machines.
Now personal computers are everywhere: in our pockets, in our televisions, and even in our watches. And, unfortunately for the technophobic, almost everything in our world is becoming a computer, thanks to the Internet of Things.
The World Economic Forum is closely monitoring the development of computers and tracking their role in creating a better world. As we know, computers have had an immense impact on industrial, commercial and domestic environments in the 21st century, which has encouraged the World Economic Forum to put them on its agenda. It plans to shape the changes these technologies bring about and to measure the effect of the internet. Justine Cassell, the Associate Vice-Provost for Technology Strategy and Impact at Carnegie Mellon University, wrote in a report for the WEF:
“In the same way we have a tendency to think of computers as rectangular boxes, we have a tendency to think of the internet as being some kind of ether that floats around us. But quite recently researchers have made enormous breakthroughs in creating a way for all objects to communicate; so your phone might communicate to your refrigerator, which might communicate to the light bulb. In fact, in a near future, the light bulb will itself become a computer, projecting information instead of light.”
Designing the future
IBM is continuing the trend of engineering the future with computational technologies (software and hardware), enabling engineers to work more efficiently and build a more interconnected, smarter future. It does this with what it calls ‘Continuous Engineering’. The company writes:
“IBM IoT Continuous Engineering helps teams keep on top of the complexity of developing smart, connected products. Running in the cloud, it helps systems engineers and software developers to deliver against requirements, respond efficiently to change and create high-quality designs faster-- while controlling development costs and meeting compliance needs.”
In truth, many tech companies are working tirelessly to equip computers with the ability to interconnect almost everything: our clothes, our cars, our houses. They are trying to make machines more intelligent so that they can take over the repetitive tasks humans find tedious. In essence, they want computers to think for us.
This trend is perhaps best exemplified by the recent boom in in-home artificial intelligence assistants. These devices combine hardware, software and an internet connection to an artificial intelligence. That intelligence learns your preferences, understands what you need when you need it, and can talk to you as well.
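The basic flow such an assistant follows can be sketched in a few lines. This is an illustrative toy, not any vendor's actual architecture: the class name, wake word and "preference counting" are all hypothetical stand-ins for the listen, interpret and learn steps described above.

```python
# Toy sketch of an in-home assistant's loop (all names hypothetical):
# wait for a wake word, interpret the request, and tally repeated
# requests so the device "learns your preferences" over time.

class Assistant:
    WAKE_WORD = "hey"

    def __init__(self):
        # Learned preferences: how often each request has been made.
        self.preferences = {}

    def hears_wake_word(self, utterance):
        return utterance.lower().startswith(self.WAKE_WORD)

    def handle(self, utterance):
        if not self.hears_wake_word(utterance):
            return None  # stay idle until directly addressed
        # Strip the wake word to recover the actual request.
        request = utterance.lower()[len(self.WAKE_WORD):].strip()
        # Count each request; frequent ones become "preferences".
        self.preferences[request] = self.preferences.get(request, 0) + 1
        return f"Okay, doing this now: {request}"

assistant = Assistant()
print(assistant.handle("hey play some jazz"))
# -> Okay, doing this now: play some jazz
```

In a real product, of course, the transcription and interpretation steps are performed by cloud-hosted speech and language models rather than simple string handling.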
IBM has its very own ‘Watson’, Amazon has ‘Alexa’, and Google has its ‘Home’ device. The future of computers seems to involve engineers attempting to build seemingly omniscient machines. But are they getting it right?
Jeff Hawkins, founder of Palm Computing Inc., the company behind the Palm Pilot, has moved on to neuroscience technology. His new company, Numenta, draws on current neuroscience and engineering research in the hope of enabling computers to mimic the human mind. He writes:
“The solution is finally coming within reach. It will emerge from the intersection of two major pursuits: the reverse engineering of the brain and the burgeoning field of artificial intelligence. Over the next 20 years, these two pursuits will combine to usher in a new epoch of intelligent machines.”
Numenta found that for machines to think like humans, they need what is called sensorimotor integration: the ability to see and ‘feel’ the world the same way humans do. Hawkins calls it the “principle of combining movement with changing sensations”. He believes computers need to mimic the way the human neocortex absorbs information; until they are built with this capability, the truly ‘intelligent machine’ will remain elusive:
“Intelligent systems need to learn multidimensional models of the world. Sensorimotor integration doesn’t occur in a few places in the brain; it is a core principle of brain function, part of the intelligence algorithm. Intelligent machines also must work this way.”
The video below from IEEE sums it up:
Hawkins’ aim is to reverse engineer the neocortex and understand it thoroughly, so that he can transfer that understanding into a computer. He does not think that those working in machine learning and artificial intelligence are aiming high enough. He concludes:
“While it is exciting for today’s computers to classify images and recognize spoken queries, we are not close to building truly intelligent machines. I believe it is vitally important that we do so.”
IBM. "IoT Continuous Engineering." IBM. Web. 27 June 2017.
Hawkins, Jeff. "What Intelligent Machines Need to Learn From the Neocortex." IEEE Spectrum: Technology, Engineering, and Science News. IEEE Spectrum, 02 June 2017. Web. 28 June 2017.
Cassell, Justine. "By 2030, This Is What Computers Will Be Able to Do." World Economic Forum. Web. 27 June 2017.