Thinking, Making, Profiting

256 years ago this month, James Watt made the conceptual breakthrough that enabled a much more efficient steam engine…an engine that would play a major role in driving the Industrial Revolution.  He had been thinking about possibilities for improving the coal-hungry Newcomen engine, then the best available, which lost huge amounts of heat every cycle through the successive heating and cooling of the cylinder walls:

It was in the Green of Glasgow.  I had gone to take a walk on a fine Sabbath afternoon…I was thinking upon the engine at the time…when the idea came into my mind, that as steam was an elastic body it would rush into a vacuum, and if a communication was made between the cylinder and an exhausted vessel, it would rush into it, and might be there condensed without cooling the cylinder.

But in addition to the many details involved in reducing this idea to practice, there was another problem inhibiting the creation of reasonably-efficient steam engines.  The boring of the cylinders…even when the best tools and the highest skills of the day were applied…was so imprecise that considerable quantities of steam escaped around the piston, greatly lowering the overall efficiency of the engine.

Enter Matthew Boulton, who became Watt’s partner, and John Wilkinson, a Boulton associate and foundry operator who was obsessed with all things cast iron.   Boulton and Wilkinson wanted a steam engine to provide the blast for Wilkinson’s foundry, and they wanted an engine with especially-large cylinders…which made the problem of tight cylinder/piston fit even harder to solve.

Wilkinson saw that the technology he had already developed for the very precise boring of cannon could, with some modifications, be adapted to the boring of steam engine cylinders.   Amid “searing heat and grinding din,” he achieved a cylinder, four feet in diameter, which “does not err the thickness of an old shilling at any part.”  With the combination of Watt’s separate condenser and Wilkinson’s improved boring process, the steam engine was ready for the starring role that it was to hold for the next century and beyond.

Key point: It wasn’t only the design of the improved steam engine that mattered, but also the process for making it.

What if Britain had been offshoring its foundry operations, with their “searing heat and grinding din,” to another country?  Spain, let’s say.  Given the importance of the interaction between the design talent and the manufacturing talent, would the improved steam engine have been developed in the 1770s timeframe at all?  And whenever it had been developed, to which individuals and countries would the financial benefits of steam power have accrued?

The present-day parallel is the relationship between microchip designers and microchip manufacturing facilities…foundries, as they are actually called.

More about John Wilkinson, here.

An Early and Excellent Example of a High-Technology Product Press Release

The poet/historian Antipater sings the wonderfulness of the vertical waterwheel as a power source:

Cease from grinding, ye women who toil at the mill

Sleep late, even if the crowing cocks announce the dawn

For Demeter has ordered the Nymphs to perform the work of your hands

And they, leaping down on the top of the wheel, turn its axle which

With its revolving spokes, turns the heavy concave Nisyrian millstones

Learning to feast on the products of Demeter without labour

(circa 65 BC)


I would so hire that man for a Marketing Communications job.


“You Better Go to Raw Data”

People operating complex machines and systems–ships, aircraft, and nuclear power plants, for example–are often dependent on information that has been processed or filtered in some way. The same is true of people exercising their responsibilities as citizens in a large and complex society, inasmuch as they cannot directly and personally observe most of the relevant facts and events.  Disasters that occur in complex physical systems can serve as a metaphor to help shed light on disasters–actual and potential–in the political sphere.

On June 9, 1995, the cruise ship Royal Majesty was on a routine voyage in good weather.  The vessel was equipped with GPS, which displayed latitude and longitude position…which the crew diligently plotted…and also drove a moving map overlaid on the radar scope.

Unfortunately, the information being displayed and plotted bore little resemblance to the actual reality.

As the gray sky turned black veil, the phosphorus-lit radar map with its neat lines and digital indication seemed clearer and more inviting than the dark world outside. As part of a sophisticated integrated bridge system, the radar map had everything–from a crisp radar picture, to ship position, buoy renderings, and up to the last bit of data anyone could want–until it seemed that the entire world lived and moved transparently, inside that little green screen. Using this compelling display, the second officer was piloting a phantom ship on an electronic lie, and nobody called the bluff.

The bluff was finally called by reality itself, at 10 PM, when the ship jerked to the left with a grinding noise.  It was hard aground on the Rose and Crown Shoal, and could not be backed off.

It was quickly determined that the cable to the GPS antenna had come loose, and the system was not actually obtaining the real, current positions. The captain ran to the LORAN unit, a completely separate electronic navigation system. The position accurately displayed on the LORAN differed from the displayed GPS position by 17 miles.
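The cross-check the captain finally made can be sketched directly: compute the great-circle separation between two independent position fixes and flag any disagreement beyond a tolerance. The function names, sample positions, and tolerance below are hypothetical, chosen only to illustrate how cheap this check is:

```python
import math

def haversine_nm(fix_a, fix_b):
    """Great-circle distance in nautical miles between two
    (latitude, longitude) fixes given in decimal degrees."""
    R_NM = 3440.065  # Earth's mean radius in nautical miles
    (lat1, lon1), (lat2, lon2) = fix_a, fix_b
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R_NM * math.asin(math.sqrt(a))

def fixes_agree(fix_a, fix_b, tolerance_nm=2.0):
    """Return (agreement flag, separation in nm) for two independent fixes."""
    d = haversine_nm(fix_a, fix_b)
    return d <= tolerance_nm, d
```

Two fixes 17 miles apart, as the GPS and LORAN positions were, would fail any reasonable tolerance; the comparison costs almost nothing, but someone has to think to run it.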

The GPS unit had in fact honestly disclosed its lack of current information: it did this by displaying the characters ‘DR’…for Dead Reckoning, i.e., extrapolating the current course and speed…but the annotation appeared in small characters and was not noticed. The crew thought they were getting an actual portrayal of the current reality, rather than an estimate that would progressively become a guesstimate with the passage of time.
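Dead reckoning itself is nothing mysterious: the last known position is simply advanced along the assumed course and speed, which is exactly why its error grows silently with time. A minimal flat-earth sketch (the function name and numbers are illustrative, not the actual algorithm in the ship’s unit):

```python
import math

def dead_reckon(lat, lon, course_deg, speed_kt, hours):
    """Advance a (lat, lon) position in decimal degrees along a true
    course (degrees) at a given speed (knots) for a given time.
    Flat-earth approximation: one minute of latitude = one nautical mile."""
    dist_nm = speed_kt * hours
    dlat = dist_nm * math.cos(math.radians(course_deg)) / 60.0
    dlon = (dist_nm * math.sin(math.radians(course_deg))
            / (60.0 * math.cos(math.radians(lat))))
    return lat + dlat, lon + dlon
```

Steaming due north at 12 knots for an hour moves the DR position 0.2 degrees of latitude. Run that extrapolation for a day with a loose antenna cable, and the estimate and the reality can quietly drift 17 miles apart.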

To use the term which has become common in media and political circles, the GPS and its associated display units were creating a convincing narrative…a narrative so convincing that no one, evidently, took the trouble to cross-check it with the LORAN, or to do a celestial fix.

How many American citizens live in a media and information environment which is as closed and as convincing as what the crew of the Royal Majesty was seeing on their bridge?  Consider how quickly overwhelming media narratives were put together concerning, for example, the Hunter Biden laptop or the murders of the women in Atlanta.  In most such cases, you could watch CNN, MSNBC, and some of the old-line TV networks, you could listen to NPR, you could look at the memes being circulated on social media–and they would all be telling you the same story, an overall narrative which for most people would be as consistent and as convincing as the phantom world displayed on the Royal Majesty‘s radar scope and plotted on the paper charts was to that ship’s Second Officer.

As disasters go, the Royal Majesty affair was a fairly minor one: embarrassing and expensive, but no one was killed or injured.  Here’s a case which was much worse–the approach of a Delta Airlines flight into Boston Logan Airport, on July 31, 1973.

At 11:40:07, the Captain advised the First Officer, who was doing the flying for this approach:

You better go to raw data.  I don’t trust that thing.

“That thing” was a Flight Director, an instrument which displays the calculated actions needed to follow a desired flight path.  Both the Captain and the FO had become concerned about indications on this instrument which didn’t seem to make sense.

It was too late.  Twenty-five seconds later, the plane slammed into the seawall.  There were no survivors.

The NTSB determined that the Flight Director’s ‘mode’ switch was incorrectly set: while the Captain and the FO believed it was displaying the calculated actions required for the airplane to follow the Instrument Landing System radio beam down to the runway, it was actually doing no such thing.  “Raw data” refers to the display of the plane’s actual, physical vertical and horizontal deviation from where it should be on the ILS beam…and would have shown that the airplane was not where it needed to be.  The Raw Data was not, however, so prominently displayed on the instrument panel as were the Flight Director commands.

Convincing displays, convincing narratives, can be very dangerous.  New information tends to be absorbed into the overall picture.  When the navigating officer of the Royal Majesty observed the radar reflection of a buoy on his radar screen, and, shortly thereafter, the passage of a buoy was reported on the ship’s port side, it confirmed in his mind that it was the ‘BA’ buoy, which marks the entrance to the Boston traffic lanes…and the whole GPS-derived picture became even more convincing.  But it wasn’t really BA–it was actually the Asia Rip buoy, anchored to a sunken wreck, which marks the Rose and Crown Shoal.

In the political/media sphere, the misleading narratives that are convincingly presented are not a matter of mechanical or human error; they are a matter of human design.  Some of the people and organizations propagating these narratives know they are false, some would rather–for career or social reasons–not think about it too deeply, and some actually believe the narratives.  It happens on both/all political sides, but it happens a lot more, and more effectively, on the Left, because the Left/Woke dominance of media is so nearly complete.

The pilot and copilot of Flight 723 had only a matter of seconds to question and cross-check the ‘narrative’ that they were seeing on their Flight Director.  Citizens, operating in the political/media sphere, have less time pressure…but the time available is not infinite.  Multiple sources of information are more available than at any point in history–but the Narrative of the like-thinking media and its influence strategies is overwhelming, especially for people who don’t have a lot of time to follow political matters.  Confirmation bias, too, plays a strong role.

Will a sufficient number of people, metaphorically speaking, check the displayed GPS position against the LORAN, or check the Flight Director command bars against the raw localizer and glideslope data?  And will they do so before it is too late for recovery?

(More on the Royal Majesty incident at my post here.  Detail on the Delta Flight 723 accident is provided in the NTSB report.)



To Disappear in Dreams

An article in Wired says: The future of virtual reality is far more than just video games. Silicon Valley sees the creation of virtual worlds as the ultimate free-market solution to a political problem. In a world of increasing wealth inequality, environmental disaster, and political instability, why not sell everyone a device that whisks them away to a virtual world free of pain and suffering?

and quotes John Carmack, Doom co-creator and the former CTO of Oculus:

People react negatively to any talk of economics, but it is resource allocation. You have to make decisions about where things go. Economically, you can deliver a lot more value to a lot of people in the virtual sense.

Actually, I doubt that there is any kind of tech-industry-wide conspiracy to cool the people out and keep them from revolting by enmeshing them into virtual worlds…mostly, this is just about making money and doing cool technical stuff…on the supply side, that is.  On the demand side, it should be of more than a little concern that escapism is so important to so many.

I’m reminded of some of the reactions when the movie Avatar came out.  CNN reported at the time:

James Cameron’s completely immersive spectacle “Avatar” may have been a little too real for some fans who say they have experienced depression and suicidal thoughts after seeing the film because they long to enjoy the beauty of the alien world Pandora.

According to the article, there were more than 1000 posts to a forum for people trying to cope with the depression they experienced after seeing this film…and not being able to stay within its world permanently.

Neptunus Lex responded: “Some folks don’t get the point. You have to come home when it’s over.”

But we seem to have an increasing number of people who don’t want to come home when it’s over…who don’t want it to ever be over…but want to stay in that virtual world permanently.

And, relatedly, there is also pharmaceutical-based escapism, legal or illegal.  Various forms of addiction, already at concerning levels, have risen considerably over the last year.  And, apparently, it has long been true that considerable numbers of people find an ordinary trip on an ordinary commercial airliner to be so stressful that they medicate themselves beforehand.

In my 2010 post on the Avatar reactions, I said:

I immediately thought of the old Chinese opium dens…which were largely inhabited by people whose lives were so miserable that their desire to disappear in dreams was entirely understandable.

But what misery or bleakness are the would-be permanent habitués of the Avatar den seeking to escape?

And this question can be extended to other types of addiction-dens, as well.

The title of this post was inspired by a line in Tom Russell’s song Ambrose Larsen and another song on the same album, The Dreamin’.

The Computer Age Turns 75

In February 1946, the first general-purpose electronic computer…the ENIAC…was introduced to the public.  Nothing like ENIAC had been seen before, and the unveiling of the computer–a room-filling machine with lots of flashing lights and switches–made quite an impact.

ENIAC (the Electronic Numerical Integrator and Computer) was created primarily to help with the trajectory-calculation problems for artillery shells and bombs, a problem that was requiring increasing numbers of people for manual computations.  John Mauchly, a physics professor attending a summer session at the University of Pennsylvania, and J. Presper Eckert, a 24-year-old grad student, proposed the machine after observing the work of the women (including Mauchly’s wife Mary) who had been hired to assist the Army with these calculations.  The proposal made its way to the Army’s liaison with Penn, and that officer, Lieutenant Herman Goldstine, took up the project’s cause.  (Goldstine apparently heard about the proposal not via formal university channels but via a mutual friend, which is an interesting point in our present era of remote work.)  Electronics had not previously been used for digital computing, and a lot of authorities thought an electromechanical machine would be a better and safer bet.

Despite the naysayers (including RCA, which actually refused to bid on the machine), ENIAC did work, and the payoff was in speed.  This was on display in the introductory demonstration, which was well-orchestrated from a PR standpoint.  Attendees could watch the numbers changing as the flight of a simulated shell proceeded from firing to impact, which took about 20 seconds…a little faster than the actual flight of the real, physical shell itself.  Inevitably, the ENIAC was dubbed a ‘giant brain’ in some of the media coverage…well, the “giant” part was certainly true, given the machine’s size and its 30-ton weight.

In the photo below, Goldstine and Eckert are holding the hardware module required for one single digit of one number.

The machine’s flexibility allowed it to be used for many applications beyond the trajectory work,  beginning with modeling the proposed design of the detonator for the hydrogen bomb.   Considerable simplification of the equations had to be done to fit within ENIAC’s capacity; nevertheless, Edward Teller believed the results showed that his proposed design would work. In an early example of a disagreement about the validity of model results, the Los Alamos mathematician Stan Ulam thought otherwise.  (It turned out that Ulam was right…a modified triggering approach had to be developed before working hydrogen bombs could be built.)  There were many other ENIAC applications, including the first experiments in computerized weather forecasting, which I’ll touch on later in this post.

Programming ENIAC was quite different from modern programming.  There was no such thing as a programming language or instruction set.  Instead, pluggable cable connections, combined with switch settings, controlled the interaction among ENIAC’s 20 ‘accumulators’ (each of which could store a 10-digit number and perform addition & subtraction on that number) and its multiply and divide/square-root units.  With clever programming it was possible to make several of the units operate in parallel.  The machine could perform conditional branching and looping…all-electronic, as opposed to earlier electromechanical machines in which a literal “loop” was established by gluing together the ends of a punched paper tape.  ENIAC also had several ‘function tables’, in which arrays of rotary switches were set to transform one quantity into another quantity in a specified way…in the trajectory application, the relationship between a shell’s velocity and its air drag.
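The trajectory computation described above can be sketched in miniature: step the shell’s state forward in time, reading drag from a lookup table much as ENIAC’s function tables mapped velocity to air resistance. Everything below (the table values, step size, and function names) is illustrative, not ENIAC’s actual numerics:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

# Hypothetical 'function table': velocity (m/s) -> drag deceleration (m/s^2)
DRAG_TABLE = [(0, 0.0), (100, 0.5), (200, 2.0), (300, 4.5), (400, 8.0), (600, 18.0)]

def drag_lookup(v):
    """Linear interpolation between table entries, analogous to reading
    a function table set up on banks of rotary switches."""
    for (v0, d0), (v1, d1) in zip(DRAG_TABLE, DRAG_TABLE[1:]):
        if v <= v1:
            return d0 + (d1 - d0) * (v - v0) / (v1 - v0)
    return DRAG_TABLE[-1][1]

def trajectory(v0, angle_deg, dt=0.01):
    """Integrate a shell's flight with simple Euler time steps;
    returns (range in meters, time of flight in seconds)."""
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x = y = t = 0.0
    while y >= 0.0:
        v = math.hypot(vx, vy)
        d = drag_lookup(v)          # drag opposes the velocity vector
        ax = -d * vx / v
        ay = -G - d * vy / v
        vx, vy = vx + ax * dt, vy + ay * dt
        x, y, t = x + vx * dt, y + vy * dt, t + dt
    return x, t
```

A shell fired at 300 m/s and 45 degrees lands well short of its vacuum range of roughly 9,200 m, which is the whole reason the drag tables mattered.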

The original ‘programmers’…although the word was not then in use…were six women selected from among the group of human trajectory calculators.  Jean Jennings Bartik mentioned in her autobiography that when she was interviewed for the job, the interviewer (Goldstine) asked her what she thought of electricity.  She said she’d taken physics and knew Ohm’s Law; Goldstine said he didn’t care about that; what he wanted to know was whether she was scared of it!  There were serious voltages behind the panels and running through the pluggable cables.

“The ENIAC was a son of a bitch to program,” Jean Bartik later remarked.  Although the equations that needed to be solved were defined by physicists and mathematicians, the programmers had to figure out how to transform those equations into machine sequences of operations, switch settings, and cable connections.  In addition to the logical work, the programmers had also to physically do the cabling and switch-setting and to debug the inevitable problems…for the latter task, ENIAC conveniently had a hand-held remote control, which the programmer could use to operate the machine as she walked among its units.

Notoriously, none of the programmers were introduced at the demonstration event or invited to the celebration dinner afterwards.  This was certainly due in large part to their being female, but part of it was probably also that programming was not then recognized as an actual professional field on a level with mathematics or electrical engineering; indeed, the activity didn’t even yet have a name.  (It is rather remarkable, though, that in an ENIAC retrospective in 1986…by which time the complexity and importance of programming were well understood…The New York Times referred only to “a crew of workers” setting dials and switches.)

The original programming method for ENIAC put some constraints on the complexity of problems that could be handled, and it also tied up the machine for hours or days while the cable-plugging and switch-setting for a new problem was done.  The idea of stored programming had emerged (I’ll discuss later the question of who the originator was): a machine could be commanded by instructions stored in a memory just like data, with no cable-swapping necessary.  It was realized that ENIAC could be transformed into a stored-program machine, with the function tables…those arrays of rotary switches…used to store the instructions for a specific problem.  The cabling had to be done only once, to set the machine up for interpreting a particular vocabulary of instructions.  This change gave ENIAC a lot more program capacity and made it far easier to program, though it did sacrifice some speed.
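The conversion amounts to building an interpreter in fixed wiring: opcodes read from the function tables select among a small set of pre-cabled operations. A toy sketch of that stored-program pattern (the opcode names and sample program are invented for illustration, not ENIAC’s actual order code):

```python
def run(program, acc=0):
    """Interpret a list of (opcode, operand) pairs held in 'memory',
    the way the converted ENIAC read instructions from its function
    tables instead of from cable connections."""
    pc = 0  # program counter: which stored instruction to execute next
    while pc < len(program):
        op, arg = program[pc]
        if op == "ADD":
            acc += arg
        elif op == "SUB":
            acc -= arg
        elif op == "JNZ":          # conditional branch, which ENIAC could do
            if acc != 0:
                pc = arg
                continue
        elif op == "HALT":
            break
        pc += 1
    return acc
```

For example, `run([("ADD", 3), ("SUB", 1), ("JNZ", 1), ("HALT", 0)])` loops the accumulator down from 3 to 0, with both the loop and its data held in ‘memory’ rather than in cabling.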

Read more