“The trouble with machines is people.” –Edward R. Murrow
Don’t forget history’s lessons
IT (Information Technology) has no physical form of its own; it needs a host to inhabit so that it can push technical innovation out onto the world stage.
We now have over sixty years of experience with IT residing in computers, and we’ve personally felt the results as IT hosted in mainframes, on desktops, on tablets, on our laps, and in our pockets changed just about everything around us. Ourselves included!
Put IT into a cellphone and it becomes a Smartphone. Put IT into a home thermostat and it becomes Nest. And so on.

In 1972, Hewlett-Packard (H-P) put IT into a 35-key handheld calculator, named it the HP-35, and the device revolutionized science and engineering.
As H-P wrote about it back then: “The HP-35 Scientific Calculator, and others like it, quickly replaced the faithful slide rule that had been used by generations of engineers and scientists for rapid calculation and simple computation.”
By 1975 over 300,000 HP-35s had been sold (at $395 each!), and in the process the device completely gutted the multi-billion-dollar slide rule industry. With it went thousands of jobs, careers, livelihoods, and a magnificent craft.
In that very same process, though, thousands of new jobs, careers, and livelihoods came into being, as well as another magnificent craft: the handheld calculator industry.
That industry, of course, now resides, with the appropriate app in place, on our Smartphones. IT elegantly and innovatively recombined itself into a new host that we now tote around in our pockets and purses.
With the advent of Google Glass, we can now wear a Smartphone on our faces.
The many faces of information
As Erik Brynjolfsson and Andrew McAfee offer up in The Second Machine Age, the inexorable churn of Moore’s Law, digitization, and recombinant innovation has brought us to an inflection point where IT is powerfully ubiquitous and omnipresent. IT can literally go anywhere and do anything, naturally eliminating industries and jobs or naturally creating them. The app industry is one of IT’s most recent creations. IT-enabled robots will do likewise.
In the July/August 2014 issue of Foreign Affairs, McAfee and his writing buddy, Erik Brynjolfsson, along with Michael Spence from NYU’s Stern School of Business, put out an essay called New World Order. Three paragraphs down, they hot-stamp their grand driving force into the essay:
“Machines are substituting for more types of human labor than ever before. As they replicate themselves, they are also creating more capital. This means that the real winners of the future will not be the providers of cheap labor or the owners of ordinary capital, both of whom will be increasingly squeezed by automation. Fortune will instead favor a third group: those who can innovate and create new products, services, and business models.”
Robot innovators are in that third group.
The key to it all is not the devices within which IT dwells, but rather, the information itself. As Charles Seife writes in Decoding the Universe:
“Each creature on Earth is a creature of information; information sits at the center of our cells, and information rattles around in our brains. But it’s not just living things that manipulate and process information. Every particle in the universe, every electron, every atom, every particle not yet discovered is packed with information.”
Information Technology’s newest host body, its newest form factor, is that of the robot.
If we are information machines, so too are the machines that we make in our image; and those machines will do the work that humans equip them to perform, and they will do it better than humans. That’s an inescapable fact that humans have been grappling with for decades: feeling marginalized by the very machines that we create.
And like the thousands of jobs, careers, livelihoods, and the masterful craft lost when the slide rule industry fell to the calculator industry, so too will thousands or tens of thousands more be lost to these newly IT-enabled robots.
Moore’s Law, digitization, and recombinant innovation
Modern robotics, which is robotics with IT hosted within it, is a relatively recent development, one that can be dated to 1993 and the construction of Carnegie Mellon’s robot Dante.

Microprocessor-based robotics, as with Dante, was made possible by miniature electronics like Intel’s 80386 microprocessor, introduced in 1985, which held 275,000 transistors. Today, Intel’s Core i7 processor holds 2.27 billion transistors, or more than eight thousand times as many.
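That transistor ratio is worth a quick back-of-the-envelope check (my arithmetic, not an Intel figure):

$$\frac{2.27 \times 10^{9}}{2.75 \times 10^{5}} \approx 8{,}254 \approx 2^{13}$$

Thirteen doublings spread over the roughly 26 years between the two chips works out to a doubling about every two years: Moore’s Law running right on schedule.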
That power of exponential growth, embodied in those billions of transistors on the i7, and the accompanying miniaturization of computing mean everything for the advance of robotics.
At the inflection point for robotics that we are now entering, the capabilities of robots will be amazing and dramatic to most, yet frightening, maybe terrifying, to many others, especially those whose jobs are in imminent jeopardy.
We’ve already been warned, made painfully aware of which jobs have a high probability of being lost to machines. A 2013 Oxford study, The Future of Employment: How Susceptible Are Jobs to Computerisation?, forecasts that 47 percent of U.S. jobs are under threat, and it even names the jobs.
What we don’t know is what new jobs will come from this genocide of occupations, and what we will have to do and go through to make those new jobs our new reality.
History offers a good look at what the future may hold.
1952
Everything has a beginning.
In 1952, the U.S. didn’t have an IT or robot worry in the world.
Time magazine that year, in reporting the nation’s $350 billion in gross national product, commented that it was the “greatest material outpouring in its history.” Incredibly, the U.S. produced 52 percent of the world’s mechanical energy and made 65 percent of the world’s manufactured goods.
According to the World Bank, the U.S. today produces 18 percent of global manufactured goods.
Adlai Stevenson’s presidential campaign slogan for 1952 was “You Never Had It So Good.”
Indeed!
World population was 2.6 billion; the U.S. unemployment rate was 3.3 percent; and the Dow stood at 269. Yes, 269!

Purring away for CBS-TV that fall was America’s first commercial computer, the UNIVAC (Universal Automatic Computer). UNIVAC famously predicted the result of the 1952 presidential election: with a sample of just 1 percent of the voting population, the computer picked Eisenhower in a landslide while the conventional wisdom favored Stevenson.
CBS was aghast that UNIVAC could be so far off in its prediction; the network had technicians rejigger the machine to produce a result that simply said: too close to call. The plan was to junk the malfunctioning machine soon after the election.
The next morning, UNIVAC’s prediction was proven correct: Eisenhower’s landslide was right on the money.
“The trouble with machines,” CBS commentator Edward R. Murrow famously remarked, “is people.”
That was the beginning of computers entering our lives.
Six years later, two professors from the University of Chicago coined the term “Information Technology” for an article in the Harvard Business Review titled Management in the 1980s.
The IT juggernaut had begun. Soon after, people started to lose their jobs.
Bank tellers, elevator operators, airline reservationists, teachers, telephone operators, linotype operators, typists and librarians, plus thousands of analog engineers and tens of thousands of analog technicians were suddenly unemployed.
George Valley, an MIT physicist who worked on the SAGE digital computer project, lamented years later the severity and speed of the losses. Through no fault of their own, he said, thousands had their livelihoods erased. The digital world had suddenly become all too real for millions.
Spencer Tracy and Katharine Hepburn even made a popular movie about the clash of people and computers: Desk Set (1957).
By 1963, the unemployment rate had spiked to 5.7 percent; there were six million people out of work; and Congressional hearings on “persistent stubborn unemployment” and “Automation” began in earnest. In a special message to the Congress, President Kennedy said: “Large scale unemployment during a recession is bad enough, but large scale unemployment during a period of prosperity would be intolerable.”

Free-market economist and University of Chicago professor Yale Brozen penned his famous Automation: The Retreating Catastrophe, in which he tried to state the case for the necessary advance of technology and to point out how the millions of skilled jobs created by IT overwhelmed the loss of so many unskilled ones.
As the wave of digital automation rose and broke over the country, and then dragged those made newly unnecessary back out to sea, another wave began to rise over the same shores carrying with it millions of new jobs. Many of those new jobs and functions were totally unknown until the wave arrived.
In 1952 there were fewer than a dozen computer programmers in the U.S., probably because there was less than a megabyte of RAM on the entire planet.
Things changed. They will again.
Here in 2014, all of the foregoing rings all too familiar, and all too discomforting.
IT is up to its old tricks once more. This time robots are the targeted hosts; and as with all previous waves of IT, great things are ahead.
If we want to catch a glimmer of what a wave of robot-IT transformation can bring, we need look no further than Pittsburgh.