Rough-Hewn Land: California to the Rocky Mountains

Keith Meldahl, a geologist and professor of geology, has written one of the most interesting books on the history of the American West I’ve ever encountered. It’s a history of how the West got to be the way it is, physically. He covers the creation of California – it has only recently been pasted onto North America – how the Sierra Nevada formed and what it actually is, why Nevada looks the way it does, how the Colorado Plateau got there, how the Rocky Mountains were formed, and some very interesting and odd details as well. Along the way, he provides a few vignettes of the early explorers and settlers and their often brutal encounters with these features.

Probably the two most important players in all this are something you’ve probably never heard of, the Farallon Plate, and the North American continent itself. Long story short: 240 million years ago the world’s landmasses had merged into a single massive conglomeration called Pangea (“All Land”). Prior to that time, North America had moved from west to east; the East Coast was the active margin, and the West Coast, which then ended along a line running from Wyoming across Utah and through Nevada, trailed along behind. The eventual impact with Africa raised the Appalachians to Himalayan scale and merged us to it the way India merged with Asia. By 150 million years ago Pangea was breaking apart, and a newly born mid-ocean ridge opened the Atlantic Ocean for the first time. As the ridge continued to build new seafloor, the ocean spread apart. Everything east of that ridge began being pushed to the east, and everything west of it, including North America, began being pushed to the west. It was then that things began changing for the western states. You can page through those 100 million years at Arcadia Street for a glimpse of the plant and animal life you would have seen, had you been there.


Catch D’Wave

 

The D-Wave quantum chip

 

D-Wave Systems, located in British Columbia, is a builder of commercial quantum computers. Its machines store bits as magnetic directions in one of three states: clockwise, counterclockwise, or both directions simultaneously. The math and physics are far beyond me, but the company claims to solve certain sets of optimization problems up to 100,000,000 times faster than classical computers. Customers for the machines, which cost $10 million apiece, include Lockheed Martin, an unnamed intelligence agency (the NSA?), Google, JPL, and NASA’s Ames Research Center.

Applications appear to be computationally intensive problems with lots of variables. The solution involves a process called quantum annealing, in which millions of candidate solutions are explored simultaneously until the most efficient solution path emerges. I’m reminded of a discussion of the famous double-slit experiment, a classic physics experiment that demonstrates photons displaying the behaviors of both waves and particles, known as wave-particle duality. Most interesting is that quantum probabilistic behaviors are also observed, in that the experiment works differently when the particle paths are observed and when they are not. When the photons in the experiment are observed, the probability function collapses and the photons behave like particles. If they are not observed, the photons take many paths through the slits and create a dispersed pattern on the target. That behavior has been described as “spooky”, because the particles seem to know when they are being observed. Weird, I know. It’s been said that anyone who claims to understand quantum mechanics is lying. But that doesn’t mean we can’t describe its behavior. Richard Feynman explained that at the quantum level, every possible path a photon can take is considered, and the path actually taken follows a probability distribution, like a bell curve. As photons are emitted from a source, the most likely path is taken most often, some photons take slightly less probable paths, still others even less probable paths, and so on. Quantum annealing seems to be a form of that: many paths are considered simultaneously until a most probable path emerges, and then it is chosen.
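To get a feel for annealing in general, here is a minimal classical simulated-annealing sketch in Python. It’s the classical cousin of what a quantum annealer does (the quantum version can tunnel through energy barriers instead of hopping over them thermally). The Ising-style toy problem, the field and coupling values, and the cooling schedule below are all illustrative assumptions for the sake of the sketch, not D-Wave’s API or hardware model.

```python
import math
import random

def energy(spins, h, J):
    """Ising-style energy: E = sum_i h[i]*s_i + sum_(i,j) J[(i,j)]*s_i*s_j."""
    e = sum(h[i] * s for i, s in enumerate(spins))
    for (i, j), coupling in J.items():
        e += coupling * spins[i] * spins[j]
    return e

def simulated_annealing(h, J, steps=5000, t_start=5.0, t_end=0.01):
    """Start hot (lots of random exploration), cool slowly, and let the
    system settle into a low-energy spin configuration."""
    n = len(h)
    spins = [random.choice([-1, 1]) for _ in range(n)]
    current = energy(spins, h, J)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)  # geometric cooling
        i = random.randrange(n)
        spins[i] *= -1                       # propose flipping one spin
        proposed = energy(spins, h, J)
        if proposed > current and random.random() >= math.exp((current - proposed) / t):
            spins[i] *= -1                   # reject the uphill move: undo the flip
        else:
            current = proposed               # accept the move
    return spins, current

# Toy 4-spin problem; the fields and couplings are made-up illustrative values.
h = [0.5, -0.2, 0.3, -0.4]
J = {(0, 1): -1.0, (1, 2): 0.8, (2, 3): -0.6, (0, 3): 0.4}
best_spins, best_energy = simulated_annealing(h, J)
print(best_spins, best_energy)
```

The idea is the same in both cases: give the system lots of freedom at the start, reduce it slowly, and let it settle into a low-energy, near-optimal configuration.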


The Fermi Paradox and SETI

The Atacama Compact Array

In 1950, amidst the UFO hoopla that was sweeping the world, Italian physicist Enrico Fermi posed a simple question: “Where are they?” By that he meant: lots of people were arguing that in a universe full of stars, presumably with planets, there should be lots of intelligent life out there. That seems plausible. So, he wondered, how come there isn’t a shred of evidence for it? After all, if we lived in a city full of people, wouldn’t we see them, or at least see evidence of them being there? So why don’t we?

The Kepler spacecraft

In 1961 astronomer Frank Drake, interested in that very question, made an estimate of how many intelligent civilizations should exist in our galaxy. The Drake Equation has seven terms, each a guess, running from how many stars are born per year and how many of those have habitable planets through how many of those planets develop technologies (like radio) that allow them to be detected. In 1961 there was not enough data to give reliable estimates for any of the terms. In the intervening 50 years we’ve accomplished enough basic research to apply actual values to the first few terms.
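For reference, the standard form of the equation is

$$N = R_{*} \cdot f_{p} \cdot n_{e} \cdot f_{l} \cdot f_{i} \cdot f_{c} \cdot L$$

where $R_{*}$ is the rate of star formation, $f_{p}$ the fraction of stars with planets, $n_{e}$ the number of potentially habitable planets per star that has planets, $f_{l}$ the fraction of those on which life actually appears, $f_{i}$ the fraction of those where intelligence evolves, $f_{c}$ the fraction that develop technology detectable from afar (like radio), and $L$ the length of time such a civilization remains detectable.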

The Milky Way produces about seven new stars per year. Virtually every star forms within a disc of gas and rock/metal dust called a protoplanetary disc that eventually condenses into planets. According to research derived from data collected by the Kepler spacecraft, at least 22% of Sun-like G-type stars have an Earth-like planet in the habitable zone, the habitable zone being defined as the range of distances at which water neither boils off nor stays continuously frozen. Result: the number of habitable Earth-like planets in the Milky Way is at least 50 billion.
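Just to make the structure concrete, here is a toy Drake-style calculation in Python. Only the ~7 stars per year and the 22% Kepler figure come from the numbers above; every other value is an illustrative guess plugged in purely to show how the terms multiply out.

```python
# Toy Drake Equation calculation: N = R* * fp * ne * fl * fi * fc * L
R_star = 7.0     # new stars formed in the Milky Way per year (figure quoted above)
f_p    = 1.0     # fraction of stars with planets ("virtually every star")
n_e    = 0.22    # habitable Earth-like planets per star (Kepler figure quoted above)
f_l    = 0.1     # assumed: fraction of those planets on which life arises
f_i    = 0.01    # assumed: fraction of those on which intelligence evolves
f_c    = 0.1     # assumed: fraction that develop detectable technology (radio)
L      = 10_000  # assumed: years such a civilization remains detectable

N = R_star * f_p * n_e * f_l * f_i * f_c * L
print(f"Detectable civilizations in the galaxy: about {N:.1f}")
```

With these particular guesses N comes out to roughly 1.5. Change any of the guessed terms by a factor of ten and the answer swings by a factor of ten, which is exactly why the last four terms are the interesting ones.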


The Ice Age Floods

About 18,000 years ago, the Earth began to warm substantially. That was a really big deal, because the Northern Hemisphere was in an ice age. Ice sheets as much as two miles (~3 km) thick blanketed the northern part of the continent. Because so much of the global water supply was locked up in ice, sea level dropped roughly 400 feet (~120 m), and beaches and coastlines would have been miles further offshore than their current locations. Coastlines on the Atlantic Seaboard, and presumably globally, contain buried river channels cut deep into the continental shelf. During the Ice Age they weren’t buried; they were river valleys leading to the then more distant shorelines.

Last Glacial Maximum, 20,000 years ago

A wide lobe of the Cordilleran Ice Sheet crept across the valley of the Clark Fork River, eventually shutting off the flow completely, while the river pooled into the vast watershed behind it, including Missoula Valley, Flathead Valley, Thompson Valley, Mission Valley and Clearwater Valley. By 15,000 to 17,000 years ago the lake that was created, Glacial Lake Missoula, exceeded 2,000 feet (~600 m) in depth, had a surface area of about 3,000 square miles (~7,800 sq km), and held 600 cubic miles (~2,500 cubic km) of water, as much as Lake Erie and Lake Ontario combined.
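A quick sanity check on those unit conversions, in plain Python (the conversion factors are the standard ones):

```python
MILE_KM = 1.609344   # kilometres per statute mile
FOOT_M  = 0.3048     # metres per foot

depth_m    = 2_000 * FOOT_M        # about 610 m
area_km2   = 3_000 * MILE_KM ** 2  # about 7,800 sq km
volume_km3 = 600 * MILE_KM ** 3    # about 2,500 cubic km

print(round(depth_m), round(area_km2), round(volume_km3))
```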

Glacial flood map, 17,000 - 15,000 years ago


The Crash of the XB-70 in 1966

North American XB-70A Valkyrie just after collision. Note the F-104 is at the forward edge of the fireball and most of both XB-70A vertical stabilizers are gone. (U.S. Air Force photo)

I’m getting a bit tired of politics and corruption right now. How about some aviation history? This is an interesting article on the crash of the supersonic bomber prototype.

The two test pilots were in the cockpit of a T-38 trainer flying off the left wing of the new XB-70 Valkyrie bomber, aircraft number 62-0207. They had just seen the civilian-registered NASA F-104N Starfighter of pilot Joe Walker slide upside down across the top of the huge white bomber, shear off both of its twin tails, skid sideways, then break in two, killing Walker instantly. Behind the XB-70, Walker’s F-104N tumbled end over end, a pinwheel of bright orange flame nearly six hundred feet long tracing its convulsive death spiral.

The flight was a photo shoot for GE, which made the jet engines of all the aircraft being photographed.

The fatal error was including an F-104 Starfighter, which had unreliable handling characteristics in low-speed flight.

The Starfighter’s poor safety record brought the aircraft into the public eye, especially in German Air Force service. Fighter ace Erich Hartmann was famously retired from the Luftwaffe because of his protests against having to deploy the unsafe F-104s. The F-104 was also at the center of the Lockheed bribery scandals, in which Lockheed gave bribes to a considerable number of political and military figures in various nations in order to influence their judgment and secure several purchase contracts; this caused considerable political controversy in Europe and Japan.

It was considered a “widowmaker” at low speed, especially during takeoff and landing.

The F-104 series all had a very high wing loading (made even higher when carrying external stores). The high-angle-of-attack region of the flight envelope was protected by a stick-shaker system to warn the pilot of an approaching stall, and if that warning was ignored, a stick-pusher system would pitch the aircraft’s nose down to a safer angle of attack; pilots often overrode this despite flight manual warnings against the practice. At extremely high angles of attack the F-104 was known to “pitch up” and enter a spin, which in most cases was impossible to recover from. Unlike the twin-engined McDonnell Douglas F-4 Phantom II, for example, the single-engined F-104 had no safety margin in the event of an engine failure, and it had a poor glide ratio without thrust.
