Attack of the Job-Killing Robots, Part 3

The final months of World War II included the first-ever battle of robots: on one side, the German V-1 missile, and on the other, an Allied antiaircraft system that automatically tracked the enemy missiles, performed the necessary fire-control computations, and directed the guns accordingly. This and other wartime projects greatly contributed to the understanding of the feedback concept and the development of automatic control technology. Also developed during the war were the first general-purpose programmable digital computers: the Navy/Harvard/IBM Mark I and the Army/University of Pennsylvania ENIAC…machines that, although incredibly limited by present-day standards, were at the time viewed with awe and often referred to as ‘thinking machines.’

These wartime innovations in feedback control and digital computation would soon have enormous impact on the civilian world.

This is one in a continuing series of posts in which I attempt to provide some historical context for today’s discussions of automation and its impact on jobs and society…a context of which people writing about this topic often seem to have little understanding.

 

By the early 1950s computers were being used for business as well as scientific and engineering applications, and the huge clerical staffs that had built up in many kinds of companies (in the insurance industry, for example) were an obvious target for computer-based automation.  Although clerical operations often proved more difficult to automate than initially expected,  it’s fair to say that by the mid-1970s most functions such as payroll, billing, and order processing had been automated to a considerable degree.

The early 1950s also saw the introduction of the numerically-controlled machine tool. Instead of the metal-cutting path of a lathe or milling machine being controlled by an operator turning wheels–or by a physical prototype as in Blanchard’s copying lathe–the process could be automatically directed by a sequence of instructions on a punched paper tape. The tape itself could be produced by a computer program that performed the necessary geometrical calculations. The advantages for increased manufacturing flexibility, as well as labor reduction, were tremendous, and much of the contemporary journalism concerning these systems was quite similar to today’s coverage of 3-D printing.
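
To make the idea concrete, here is a minimal modern sketch, in Python (which of course did not exist then), of the kind of geometrical calculation such a program performed: it approximates a circular cutting path as a sequence of short straight-line moves and emits them as instructions. The G-code command names used here are later conventions, shown purely for illustration; the early paper-tape instruction formats were machine-specific.

    import math

    def circular_path(radius_mm: float, steps: int = 36, feed: float = 100.0) -> list[str]:
        """Approximate a circular cutting path as a sequence of straight-line
        moves, expressed in modern, illustrative G-code-style commands."""
        lines = [
            "G21 ; units: millimeters",
            "G90 ; absolute coordinates",
            f"G00 X{radius_mm:.3f} Y0.000 ; rapid move to starting point",
        ]
        for i in range(1, steps + 1):
            angle = 2 * math.pi * i / steps
            x = radius_mm * math.cos(angle)
            y = radius_mm * math.sin(angle)
            lines.append(f"G01 X{x:.3f} Y{y:.3f} F{feed:.0f} ; straight cutting move")
        return lines

    if __name__ == "__main__":
        for line in circular_path(radius_mm=25.0):
            print(line)

Instruction sequences in this general style still drive CNC machine tools and 3-D printers today; they just arrive from a computer rather than from a punched paper tape.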

Special-purpose electronic and electromechanical systems reduced labor content in a whole range of activities. In the early 1950s, there were something like 500,000 elevator operators in the US. Most of them were soon to be replaced by automatic systems. (See this 1931 article about an early implementation of such technology: the ‘robot elevators’ of the Empire State Building. The article makes the important point that real, useful robots are likely to look nothing like the ‘tin men’ of popular imagination; indeed, even today, proposals for taxing robots seem to define the concept of a ‘robot’ rather narrowly.)

Feedback control and electronic computing (initially mostly analog, with an increasing role for digital) enabled automation of refineries and chemical plants.  While this automation did reduce the number of workers required, the impact was predictably overstated by the media and ‘experts’ of various kinds.  The president and founder of the Institute for Cybercultural Research wrote in 1964:

In Texas and New Jersey, in the oil refineries–the silent, lifeless ghost towns of this century–crude oil is processed into different grades of gasoline and various byproducts…Crude oil is piped in–gasoline and byproducts emerge…There are no workers, no supervisors, no executives; just a few highly trained engineers standing by in the central control room, watching their brainchild fend for itself.

Fortune editor Charles Silberman, in his 1966 book The Myths of Automation, responded:  “Unfortunately (the author) is closely guarding the identity of these refinery ‘ghost towns.’  In Port Arthur, Texas, however, the Texaco refinery alone employs 5000 people, the Gulf refinery 4000.”

Silberman cites sociologist-physicist Donald Michael as coining (circa 1963) the term Cybernation, referring to the marriage of computers with automatic machinery, which combination, according to Michael, meant “an end to full employment.”  An influential group called the Ad Hoc Committee on the Triple Revolution went further: in its view, ‘cybernation’ meant an end to all employment, or almost all.  The committee warned President Johnson in March 1964 that “the nation will  be thrown into unprecedented economic and social disorder.”

There was plenty of hype…one example was the media attention to a manufacturing system called the TransferRobot, made by a company called US Industries. A 1963 article in Life magazine asserted that “almost anything hands born of woman can do” the TransferRobot “can do better, faster, more cheaply.” But very few of these systems were actually sold.

Despite the excessive hype, though, there really were considerable advances in productivity-related automation. The ERMA check-sorting machine was prototyped in 1955 and production models were available by 1959…the system displaced the labor of those who had previously been required to sort checks physically and post them to individual accounts. The first automated teller machines in the US were installed in 1969, allowing customers to get money without interacting with a human teller.

Prior to 1946, airline reservations were handled by roomfuls of clerks retrieving pieces of paper from filing cabinets, checking for flight availability, and writing down new ticket sales. In that year, special-purpose electromechanical and electronic systems began enhancing the manual systems, with considerable labor savings, and in 1964 the IBM/American Airlines SABRE system, with fully computerized reservations, was operational. Air traffic control was also computerized: the labor requirements for tracking the vastly increased numbers of flights would have been insupportable with purely manual methods.

The checkout of customers in retail stores had long been partly mechanized by mechanical cash registers: labor content and skill requirements were further reduced when bar codes were introduced in the early 1970s.  Although there was initially some public resistance…worries about potential price fraud and conspiracy theories involving the number 666, as well as concerns about privacy…the use of bar codes quickly spread throughout the retail landscape.

The standardization of product identifiers that was required to make bar coding work also helped to enable direct electronic communication between companies and their suppliers, greatly reducing human work required for data entry. Electronic Data Interchange, as it was called, predated the commercial Internet, and by the late 1980s was prevalent in many industries.

A huge transformation in office work, with great consequences for employment patterns, was the introduction of word processing systems, such as the dedicated systems sold very successfully by Wang Laboratories and others starting in the 1970s, followed by PC-based software such as WordPerfect and Microsoft Word. Secretaries became much less common in businesses and other organizations. (One company president remarked that ‘the main effect of the computer revolution to-date has been the conversion of highly-paid executives into incompetent clerk-typists.’)

The work of engineers was also greatly affected by automation: the early scientific computers, and even simple calculators, greatly reduced the amount of time that engineers needed to spend on number-crunching. With the introduction and improvement of computer-aided design, the amount of engineering labor required for design projects was still further reduced.

Computer programming itself was also subject to substantial automation: compilers, which convert somewhat-English-like symbolic instructions into machine code, greatly improved the productivity of software development. Grace Hopper’s A-0 system (1952) was probably the first true compiler, and her English-like FLOW-MATIC language, in use by 1958, strongly influenced COBOL (1959); FORTRAN had appeared in 1957. Tremendous strides in programming productivity were also accomplished by prepackaged software for such things as payroll and by problem-oriented languages that let an engineer or a financial analyst specify a problem without knowing anything about programming per se. Microsoft Excel and Google Sheets are current examples of such systems.
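
As a small illustration of what that conversion involves, here is a toy sketch in Python (a deliberate anachronism) that turns an invented, COBOL/FLOW-MATIC-flavored English statement into an equally invented three-instruction ‘machine code.’ The statement syntax and the instruction names are assumptions made purely for illustration; they are not those of any historical compiler.

    def compile_statement(stmt: str) -> list[str]:
        """Translate 'ADD <a> TO <b> GIVING <c>' or 'MULTIPLY <a> BY <b> GIVING <c>'
        into a made-up load/operate/store instruction sequence."""
        words = stmt.upper().split()
        if len(words) == 6 and words[0] == "ADD" and words[2] == "TO" and words[4] == "GIVING":
            op, a, b, c = "ADD", words[1], words[3], words[5]
        elif len(words) == 6 and words[0] == "MULTIPLY" and words[2] == "BY" and words[4] == "GIVING":
            op, a, b, c = "MUL", words[1], words[3], words[5]
        else:
            raise ValueError(f"unrecognized statement: {stmt!r}")
        return [f"LOAD {a}", f"{op} {b}", f"STORE {c}"]

    if __name__ == "__main__":
        print(compile_statement("ADD OVERTIME TO BASE-PAY GIVING GROSS-PAY"))
        # prints: ['LOAD OVERTIME', 'ADD BASE-PAY', 'STORE GROSS-PAY']

The point is not the particular notation but the labor saving: one readable statement replaces several hand-written machine instructions, and the translation is done by the machine itself.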

Again, this series of posts has been an attempt to provide some historical context for the often rather hysterical discussions of the economic and social impact of automation. Technological changes have certainly led to suffering for workers in particular industries and communities, but…so far at least…not to the overall disappearance of work for most people and consequent widespread unemployment and impoverishment projected in many dark visions. Do today’s innovations represent a sharp upward break in labor productivity, or are they rather a continuation of a long-existing trend line? Neither the historical experience nor the quantitative statistics can provide a definitive answer to this question, but they do provide grounds for being cautious about extreme and panicky conclusions.

Related posts:

About those job-killing robots

Attack of the job-killing robots, part I

Attack of the job-killing robots, part II

Technology, work, and society:  a view from 1836

Are those robots slacking off on the job?

New Jobs Contest! You could be a winner!

10 thoughts on “Attack of the Job-Killing Robots, Part 3”

  1. If you believe that Say’s Law is correct, then temporary dislocation would be accompanied by increased output per capita.

    The question comes down to: is there a possibility that production technology changes faster than solutions to dislocation due to structural unemployment? It seems to me that a similar rate of technological change could occur in the remediation of labor dislocation. There is likely some lag due to institutional rigidities arising from a variety of factors, such as governmental control of education, job training, and standards/certifications, and from the fact that such remediation efforts are reactive to actual, proven production innovations. With increasingly higher rates of production innovation, it might turn out that the structural component of unemployment increases more than proportionally.

    Concerns that the impact of technology on employment results in an increasing and largely permanent net reduction in labor demand do not seem to follow actual historical experience, as discussed in the post. I think these concerns might be based on an assumption that the aggregate increase in output will not, by itself, resolve structural unemployment. I think it likely that an increase in the rate of technological change in production will compound labor market rigidities and inefficiencies in structural unemployment remediation. Only rigid shortages of new tech labor can cause a permanent displacement of aggregate labor by technology-driven capital substitution, as long as input prices (especially labor) are flexible and labor markets are reasonably efficient. To the extent that education/retraining becomes less efficient and labor markets become less efficient, the lag time between structural unemployment and its remediation will become even longer.

    If these impediments to resolving increasing structural unemployment persist, I think it is likely that market forces will generate more efficient means in the areas of tech training and labor mobility, just as national and international labor markets enabled by the internet have tended to reduce search time (frictional unemployment) in the face of increased labor turnover in the information age.

    Death6

  2. Those 1950s Computer Numerical Control machine tool codes (G-codes) are still in use and are what 3D printers understand. They just come from a PC now rather than from a paper tape.

    Great article. I look forward to more.

    Stan W

  3. I came across cybernation while reading Robert Theobald when I was a kid. He was on that ‘Triple Revolution’ committee and was one of the early advocates of basic guaranteed income. I remember sparring with my father over Theobald’s contentions. Fortunately for me, my father, a much more sensible person than me, was a systems engineer who pointed me to Deming and the Toyota Production System.

    Looking back on it now, Theobald was a misanthrope who believed people weren’t just interchangeable with machines but that they behaved like machines. He was a firm believer in the Peter Principle and that organizations just naturally reach a level where they start destroying the world. He and his comrades had no idea of the concept of culture or how it affected performance and productivity.

    While lambasting hierarchies, their simple-minded thinking was actually hopelessly stuck in them. They were unable to imagine the second and third order effects of technology that were about to transform hierarchies. They totally missed the management revolution of constantly adding value, continuous improvement, coordinating and exploiting organizational knowledge, and, most importantly for the so-called ecologically minded, eliminating waste.

    Sadly, these solutions aren’t new or exotic, as they’ve been known to every craftsman throughout history. Only recently have we come to have so little faith in humanity.

  4. Just a random thought — from the narrow perspective of a First World nation, is there any major employment-impact difference between (a) automation and (b) off-shoring of manufacturing?

    The world has certainly seen big impacts from off-shoring in the last few decades — highly beneficial for China & India; not so great for the UK and USA. But nothing happens in isolation. In parallel with the off-shoring of production & employment, the world has seen (since the 1970s) an explosion in the growth of Government and consequent bureaucratic employment. The change in the relative importance of New York City and Washington DC is only one example, which is repeated right down to the County level.

    Setting aside for the moment the discussion of whether increased Government & Bureaucracy is beneficial or merely Overhead — what other form of increased employment could compensate for further job losses due to automation?

  5. “Just a random thought — from the narrow perspective of a First World nation, is there any major employment-impact difference between (a) automation and (b) off-shoring of manufacturing?”

    It’s a very interesting question…Seems to me there *is* a difference. With automation, the manufacturing of the capital equipment required *may* be in the US (or other First World nation), and the installation labor certainly will be in that nation.

    If Henry Ford had been able to make cars with people getting paid 20 cents a day in Mexico…and hence had not bothered with the assembly line and other productivity improvements…would anyone have started paying US workers $5/day?

    The issue needs some serious analysis, but, intuitively, offshoring does not have the same positive effect on worker standards of living that internal automation has generally proven to have.

    Also, of course, the offshoring path raises issues of dependence and national security that are not raised by internal automation.

    To tie this to another current CB discussion thread: many Southern leaders didn’t think they needed to do anything but raise cotton and other agricultural products, since everything else could be ‘outsourced’ in exchange for these goods. Didn’t work so well once they were at war.

    Similarly, a Native American approach of focusing exclusively on the beaver-fur trade:

    “In truth, my brother, the Beaver does everything to perfection. He makes for us kettles, axes, swords, knives and gives us drink and food without the trouble of cultivating the ground.”

    …probably worked just fine until the White trading partners got hostile.

  6. David Foster makes excellent points about the impacts of automation on domestic wages/living standards. But I would like to take half a step back and consider only numbers of people employed … not their living standards.

    The number of people working directly on the land declined dramatically in the 20th Century. However, many jobs were created to support the 1% who worked directly on the land — the factory jobs making farm equipment which made the farmers more productive; and the miners who provided the raw materials for the factories; etc.

    Many of those factory jobs went away after 1970 (not incidentally, when the Environmental Protection Agency set up shop) — some to automation, for sure, and many more to off-shoring. But this loss of manufacturing jobs was offset by the huge growth in bureaucracies, thus continuing to keep most of us employed.

    Personal hunch — we are about at Peak Government (because any further significant expansion of the bureaucracy will cause the collapse of society). Expansions in automation are inevitable, with associated job losses. Government will not be able to pick up the slack, so what will happen? Will we see a further expansion of the depressing European model of delayed entry into the workforce through pointlessly extended “education”, along with forced early retirement? Is there a better model to aim for?

  7. That automated gun was probably designed by Alfred Loomis’ people at Tuxedo Park – if you haven’t read the book on Tuxedo Park it is highly recommended. He was the most influential “facilitator” for science in WW2 that virtually no one knows anything about (bad sentence but it is late). He was a Wall Street financier who just loved science – helped build Lawrence Livermore labs.

    His people had an interesting quote that was on the PBS program American Experience – “The atomic bomb may have ended the war but radar won it”.

    His people took the British invention and greatly improved its range.

    I can remember cars in the 50s, with the gaps around doors, trunks, and hoods all uneven.

    Automation changed all that.

    “Hand built” used to connote exclusivity and quality – I suspect the connotation of quality is gone now thanks to robotic assembly.

  8. Thanks for that older link, David. Quite thought-provoking.

    Sadly, one of the thoughts that link’s discussion provoked in me was that any changes which created substantial new numbers of jobs would require political adjustments. And political adjustments are very difficult to accomplish, since the changes would probably disadvantage or put at risk the individuals who benefit from the current situation. History suggests that political adjustments tend to occur only through war or revolution.

    History also suggests that the long-term trend line for the human race is upwards — increasing percentages of the human population living longer, healthier, more comfortable lives — despite the occasional downturns of Dark Ages, plagues, wars, and depressions. So we can probably be optimistic about the long term future for the human race; but getting there may be a very rough ride.

  9. I don't think that those past worries about automation were completely ridiculous, in retrospect. The cost of so many things has gotten insanely cheap. For example, I tend to buy button down shirts on sale for < $10 (retail was maybe $20). If you make $50 an hour (I make 2-7 times that, depending on circumstances), that means you can buy maybe 5 shirts per hour of work, if you shop sales (not economically worth it, of course), which means that it takes 12 minutes of work for a shirt. 24 minutes at retail. Contrast this to the hundreds of hours required several centuries ago, when you had to spin the thread, weave it, etc. How many shirts do you need? I may have 100, accumulated over the last 40 years, that still fit. Now, if I lose a button, it goes to Goodwill. Any fraying? Ditto. Same with slacks, jeans, sport coats, suits, shoes, etc.

    The two of us have 2 houses, in two states, with over 5,000 sq feet between them, stuffed with stuff, because it is so friggen cheap. Easier to buy a bigger house than go through and sort the stuff. The new house we bought last year was extremely well built, easily large enough for six, and not that expensive. Part of that, again, was due to automation. Watched them frame a house in the neighborhood in one day. Roof supports and beams the next, and the roof itself the next. Everything shows up already partially assembled. The studded walls just have to be put together like a jigsaw puzzle. Everything fits perfectly, because they were machine cut and assembled at the factory. Cheaper, faster, better.

    Food is the same. We go out maybe once a week, for maybe an hour's worth of work, for one of us. A couple nights a week, we order in salads for half that. And the rest of the time, we cook, for half that.

    It was a big thing when the work week was cut to 40 hours. Before that, 50, 60, etc. were standard. We worked to survive. Not anymore. We have record numbers of unemployed, disabled, on welfare, etc. We have them because we can. Older generations worry about the work ethic of the Millennials, who seem to often begrudge even 40 hours of work a week, because many of them have learned that if they minimalize, they can survive quite nicely on less, leaving the rest of their time free to, essentially, play.

    We have spent millennia defined by our work, but now, with automation, many can survive quite well without working that hard, or that long. I am not sure if that is a good thing. Maybe that is my Puritan heritage speaking. What happens in a world where work doesn't define a large percentage of the population?

Comments are closed.