Retrosupercomputing

The comment thread on this post segued (oddly enough!) into a discussion of supercomputer designer Seymour Cray and a comparison of his multi-million-dollar systems with today’s ordinary personal computers. I thought it might be interesting to take a look at a supercomputer from almost 60 years ago–the Naval Ordnance Research Calculator (NORC), built by IBM for the US Navy and delivered in 1954, which held the computing speed record for several years.

NORC came only 10 years after the computer age was kicked off by the announcement of the Harvard-IBM Mark I (click here for an interesting contemporary magazine article on that innovation), but it was vastly faster and more powerful. NORC’s arithmetic was done at the rate of about 15,000 additions or 12,000 multiplications per second, and the machine could store 3600 words (16-digit decimal numbers) with a memory cycle time of 8 microseconds. Lots of NORC information and pictures at this site. Applications included hydrodynamics, weather forecasting, logistics simulations, and the motion of celestial bodies. The hydrodynamics problems included studies of torpedo cavitation and of the earth’s liquid core. (Remarks by John von Neumann at the NORC dedication, including audio, here.)

NORC’s circuits used vacuum tubes–9000 of them–and the memory was electrostatic, employing what were basically TV picture tubes with bits stored on the face as charges and continually refreshed. This technology represented the best speed/cost tradeoff for a high-end computer at the time, but it was very sensitive–apparently, a woman wearing silk stockings walking near the computer would likely cause memory errors because of the static electricity generated. (No doubt leading to much speculation about the correlation between female hotness and computer memory error rate.)

Construction of NORC cost $2.5MM, which equates to about $20MM in 2012 dollars. Some of the cost can probably be attributed to the one-of-a-kind nature of the machine and the pull-out-all-the-stops-and-make-it-the-fastest spirit of its design. But even a computer intended as a standard commercial product, the roughly contemporaneous IBM 701, went for about $1 million in early 1950s money.

At first glance, it seems hard to believe that such a massive investment in such relatively slow and limited machines (by our present-day standards) could have made economic sense. But consider: a calculation taking 30 minutes on NORC might have required something like 30 person-years if done by human beings using the desk calculators of the time. The economics probably did make sense if the workload was appropriate; however, I bet a fair number of these early machines served more as corporate or government-agency status symbols than as paying propositions. (As a side note, I wonder if the awe generated by early computers would have been lessened had the machines not been so physically impressive–say, if they had been about the size of a modern desktop PC?)
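As a rough sanity check on that 30-person-year figure, here is a quick back-of-envelope calculation in Python, using the NORC addition rate quoted above; the assumed human desk-calculator speed and working hours are my own guesses, not figures from any source:

    # Rough check: 30 minutes of NORC time vs. computation by hand
    norc_adds_per_sec = 15_000                      # NORC addition rate (from above)
    norc_ops = 30 * 60 * norc_adds_per_sec          # ~27 million operations in 30 minutes

    secs_per_hand_op = 10                           # assumed: ~10 seconds per desk-calculator operation
    hours_per_person_year = 2_000                   # assumed: working hours in one person-year
    ops_per_person_year = hours_per_person_year * 3600 / secs_per_hand_op   # ~720,000

    print(norc_ops / ops_per_person_year)           # prints ~37.5 person-years

Under those assumptions, "something like 30 person-years" looks about right.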

NORC, which was in operation through 1968, has of course been surpassed by orders of magnitude by much cheaper and more compact machines. Its computational capabilities are trivial compared with those of the computer on which you are reading this. Yet, strange as it may seem, there are a lot of problems for which today’s computer power is inadequate, and the frontiers of supercomputing continue to be pushed outwards.

While researching this post, I ran across several articles dealing with a particular highly-demanding supercomputer application currently being addressed by computer scientists. This is the modeling of the physical behavior of cloth, which is important both for creation of realistic animated movies and in the textiles/apparel industry. (See for example this paper.) Simulating the movement of a virtual actress’s dress, as she walks down the street in a light breeze, apparently requires far more computer power than did the development of America’s first hydrogen bombs.
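To give a sense of why cloth is so demanding, here is a minimal sketch of the kind of mass-spring model often used to introduce cloth simulation–a generic illustration only, not the method from the linked paper, and with arbitrary parameters. A grid of particles is joined to its neighbors by springs and advanced with semi-implicit Euler; stiff springs force tiny time steps, and production simulations use far more particles plus collision handling and much more sophisticated solvers:

    # Minimal mass-spring cloth sketch (generic illustration only)
    import numpy as np

    n = 20                                     # particles per side (real cloth models use far more)
    dt = 1e-3                                  # time step; stiff springs demand small steps
    k, rest, mass, g = 500.0, 1.0 / (n - 1), 0.01, 9.81

    xx, yy = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n), indexing="ij")
    pos = np.stack([xx, yy, np.zeros_like(xx)], axis=-1)   # (n, n, 3) particle positions
    vel = np.zeros_like(pos)

    def step(pos, vel):
        force = np.zeros_like(pos)
        force[..., 2] -= mass * g                          # gravity on every particle
        for axis in (0, 1):                                # springs along rows and columns
            d = np.diff(pos, axis=axis)                    # vector from each particle to its neighbor
            length = np.linalg.norm(d, axis=-1, keepdims=True)
            f = k * (length - rest) * d / length           # Hooke's law for each spring
            pad = [(0, 0)] * 3
            pad[axis] = (0, 1)
            force += np.pad(f, pad)                        # pull each particle toward its neighbor
            pad[axis] = (1, 0)
            force -= np.pad(f, pad)                        # equal and opposite on the other end
        vel = vel + dt * force / mass                      # semi-implicit Euler: velocity first...
        vel[0, :] = 0.0                                    # pin one edge of the cloth
        return pos + dt * vel, vel                         # ...then position from the new velocity

    for _ in range(1000):                                  # one simulated second in 1,000 small steps
        pos, vel = step(pos, vel)

Even this toy 20×20 grid works out to millions of arithmetic operations per simulated second; scaling up to a full garment at realistic resolution, with self-collision and friction, is where the supercomputer-class demands come from.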

Related post: computation and reality

11 thoughts on “Retrosupercomputing”

  1. I wonder what computational power the average touchscreen cell phone has in relation to computer history?

    Years ago during my flying days the owner of an FBO I used to use (a general aviation “service station” for aircraft) – anyway, he had an interesting history, and like most interesting people, I found out about it through a third party.

    He was short, stocky with a crew cut – unassuming in appearance – and I learned that he had a PhD from MIT – had developed some of the first compilers – telling me about the days when programmers would watch every byte in the memory and free it if not needed.

    How those times have changed ;-)

    Anyway he found that sales was far more lucrative than research – started selling, or more accurately, leasing mainframes from Honeywell – made a bunch of money – got tired of the whole thing and started the flying biz.

  2. I am learning about car computers and it's likely the 8061 at the heart of my '93 Lincoln's EEC-IV is quite a bit stronger than the NORC. Certainly much faster.

  3. Simulating the movement of a virtual actress’s dress, as she walks down the street in a light breeze, apparently requires far more computer power than did the development of America’s first hydrogen bombs.

    As a computer scientist of sorts, I obviously need to be examining this problem more closely.

  4. David – not far in the future, I read, we will be able to see virtual Cary Grants – John Waynes – Marilyn Monroes – in new movies – all from computers mapping their voices – and movements – and making new actions

  5. “telling me the days when programmers would watch every byte in the memory and free it if not needed.”

    I was programming an IBM 650 in 1959. We had 2000 addressable memory spaces, each 10 digits, two for the instruction, four for the address of the data to be manipulated and four for the next instruction. We usually used addition to increase the address of the next instruction since the data took most of the room. The memory was on a rotating drum that spun at 20,000 rpm. We had another program, called SOAP, which would modify addresses of data so the memory space was under the read head (there were 20 of them) when the program called for it. “SOAPing” the program cut the time to run it way down.

    Over in the main plant, they used the IBM 704 and while I was there, they went to a transistor version called the 7040. The origin of the term bug, I was told, came from a case in which a moth got into a vacuum tube computer running COBOL and created havoc until they found it.

  6. I recently bought some processors for 40c each which are self-contained systems and are vastly more powerful than NORC. They’re 60MHz ARM7s with 64KB of SRAM and 128KB of flash memory. Power consumption is around 55 milliwatts at full speed and they have a very nice 3-phase motor controller peripheral built-in.

    Incredible, really. Those were on special as they are a bit outdated, but the retail price for a Cortex-M0, which is almost as fast and less power-hungry, is only a couple of dollars.

  7. Seymour Cray was an eccentric genius. The tunnel was explained in the Wiki article:

    ” Another favorite pastime was digging a tunnel under his home; he attributed the secret of his success to “visits by elves” while he worked in the tunnel: “While I’m digging in the tunnel, the elves will often come to me with solutions to my problem.”[10][11]”

    At the time, there was an issue with another man working on design who went in the “massive parallel computing” direction. His name isn’t mentioned in the Wiki article, which seems incomplete – but then my memory is too. He had an Asian name.

  8. The origin of the term bug, I was told, came from a case in which a moth got into a vacuum tube

    IIRC, the term ‘bug’ is already in Edison’s notebooks. I suspect it traces back to the days of the telegraph.

  9. Michael,

    I’m a lot younger than you. My earliest programming experience was on a PDP-11/40, with its generous 32 kilobytes per user, and a hard drive the size of a washing machine that could hold an impressive 2 Mbytes of storage.

    But, golly, did it have a nice instruction set. I haven’t seen its equal in all the years since.
