The final months of World War II included the first-ever battle of robots: on one side, the German V-1 missile, and on the other, an Allied antiaircraft system that automatically tracked the enemy missiles, performed the necessary fire-control computations, and directed the guns accordingly. This and other wartime projects greatly contributed to the understanding of the feedback concept and the development of automatic control technology. Also developed during the war were the first general-purpose programmable digital computers: the Navy/Harvard/IBM Mark I and the Army/University of Pennsylvania ENIAC…machines that, although incredibly limited by present-day standards, were at the time viewed with awe and often referred to as ‘thinking machines.’
These wartime innovations in feedback control and digital computation would soon have enormous impact on the civilian world.
This is one in a continuing series of posts in which I attempt to provide some historical context for today’s discussions of automation and its impact on jobs and society…a context of which people writing about this topic often seem to have little understanding.
By the early 1950s computers were being used for business as well as scientific and engineering applications, and the huge clerical staffs that had built up in many kinds of companies (in the insurance industry, for example) were an obvious target for computer-based automation. Although clerical operations often proved more difficult to automate than initially expected, it’s fair to say that by the mid-1970s most functions such as payroll, billing, and order processing had been automated to a considerable degree.
The early 1950s also saw the introduction of the numerically-controlled machine tool. Instead of the metal-cutting path of a lathe or milling machine being controlled by an operator turning wheels–or by a physical prototype as in Blanchard’s copying lathe–the process could be automatically directed by a sequence of instructions on a punched paper tape. The tape itself could be produced by a computer program that performed the necessary geometrical calculations. The advantages in manufacturing flexibility, as well as in labor reduction, were tremendous, and much of the contemporary journalism concerning these systems was quite similar to today’s journalistic coverage of 3-D printing.
Special-purpose electronic and electromechanical systems reduced labor content in a whole range of activities. In the early 1950s, there were something like 500,000 elevator operators in the US. Most of them were soon to be replaced by automatic systems. (See this 1931 article about an early implementation of such technology: the ‘robot elevators’ of the Empire State Building. The article makes the important point that real, useful robots are likely to look nothing like the ‘tin men’ of popular imagination; indeed, still today proposals for taxing robots seem to define the concept of a ‘robot’ rather narrowly.)
Feedback control and electronic computing (initially mostly analog, with an increasing role for digital) enabled automation of refineries and chemical plants. While this automation did reduce the number of workers required, the impact was predictably overstated by the media and ‘experts’ of various kinds. The president and founder of the Institute for Cybercultural Research wrote in 1964:
In Texas and New Jersey, in the oil refineries–the silent, lifeless ghost towns of this century–crude oil is processed into different grades of gasoline and various byproducts…Crude oil is piped in–gasoline and byproducts emerge…There are no workers, no supervisors, no executives; just a few highly trained engineers standing by in the central control room, watching their brainchild fend for itself.
Fortune editor Charles Silberman, in his 1966 book The Myths of Automation, responded: “Unfortunately (the author) is closely guarding the identity of these refinery ‘ghost towns.’ In Port Arthur, Texas, however, the Texaco refinery alone employs 5000 people, the Gulf refinery 4000.”
Silberman cites sociologist-physicist Donald Michael as coining (circa 1963) the term Cybernation, referring to the marriage of computers with automatic machinery, which combination, according to Michael, meant “an end to full employment.” An influential group called the Ad Hoc Committee on the Triple Revolution went further: in its view, ‘cybernation’ meant an end to all employment, or almost all. The committee warned President Johnson in March 1964 that “the nation will be thrown into unprecedented economic and social disorder.”
There was plenty of hype…one example was the media attention given to a manufacturing system called the TransferRobot, made by a company called US Industries. A 1963 article in Life Magazine asserted that “almost anything hands born of woman can do” the TransferRobot “can do better, faster, more cheaply.” But very few of these systems were actually sold.
Despite the excessive hype, though, there really were considerable advances in productivity-related automation. The ERMA check-sorting machine was prototyped in 1955 and production models were available by 1959…the system displaced the labor of those who had previously been required to sort checks physically and post them to individual accounts. The first automated teller machines in the US were installed in 1969, allowing customers to get money without interacting with a human teller.
Prior to 1946, airline reservations were handled by roomfuls of clerks retrieving pieces of paper from filing cabinets, checking for flight availability, and writing down new ticket sales. In that year, special-purpose electromechanical and electronic systems began augmenting the manual processes, with considerable labor savings, and in 1964 the IBM/American Airlines SABRE system, with fully computerized reservations, became operational. Air traffic control, too, was computerized: the labor requirements for tracking the vastly increased numbers of flights would have been insupportable with purely manual methods.
The checkout of customers in retail stores had long been partly mechanized by mechanical cash registers: labor content and skill requirements were further reduced when bar codes were introduced in the early 1970s. Although there was initially some public resistance…worries about potential price fraud and conspiracy theories involving the number 666, as well as concerns about privacy…the use of bar codes quickly spread throughout the retail landscape.
The standardization of product identifiers that was required to make bar coding work also helped to enable direct electronic communication between companies and their suppliers, greatly reducing human work required for data entry. Electronic Data Interchange, as it was called, predated the commercial Internet, and by the late 1980s was prevalent in many industries.
A huge transformation in office work, with great consequences for employment patterns, was the introduction of word processing systems, such as the dedicated systems sold very successfully by Wang Laboratories and others starting in the 1970s, followed by PC-based software such as WordPerfect and Microsoft Word. Secretaries became much less common in businesses and other organizations. (One company president remarked that ‘the main effect of the computer revolution to date has been the conversion of highly-paid executives into incompetent clerk-typists.’)
The work of engineers, too, was greatly affected by automation: the early scientific computers, and even simple calculators, greatly reduced the amount of time that engineers needed to spend on number-crunching. With the introduction and improvement of computer-aided design, the amount of engineering labor required for design projects was reduced still further.
Computer programming itself was also subject to substantial automation: compilers, which convert somewhat-English-like symbolic instructions into machine code, greatly improved the productivity of software development. Probably the first true compiler was Grace Hopper’s A-0 system of the early 1950s; her FLOW-MATIC language followed in 1958, alongside the FORTRAN and COBOL languages. Tremendous strides in programming productivity were also accomplished by prepackaged software for such things as payroll and by problem-oriented languages that let an engineer or a financial analyst specify a problem without knowing anything about programming per se. Microsoft Excel and Google Sheets are current examples of such systems.
Again, this series of posts has been an attempt to provide some historical context for the often rather hysterical discussions of the economic and social impact of automation. Technological changes have certainly led to suffering for workers in particular industries and communities, but…so far at least…not to the overall disappearance of work for most people and the consequent widespread unemployment and impoverishment projected in many dark visions. Do today’s innovations represent a sharp upward break in labor productivity, or are they rather a continuation of a long-existing trend line? Neither the historical experience nor the quantitative statistics can provide a definitive answer to this question, but they do provide grounds for being cautious about extreme and panicky conclusions.