Are science and religion both standing on thin air?

We can argue the pros and cons of religion all we like, but I’m not sure it will tell us much more than what our own basic hunches are.

This is part I of a two-part post.

Look, there have been a couple of books out recently suggesting that something (the universe, or the universe of universes) might just have come out of nothing, without that nothing having had to come from anywhere special itself.

And I’ve recently read a couple of reviews of those books, and the reviewers don’t think that “it’s nothings, nothings all the way down” is any better than “it’s turtles, turtles all the way down.”

If that discussion interests you, the next two chunks will too; if not, you can skip them and go right to my follow-up post, That’s it in a nutshell!, with its three short quotes and a question.

So here are the two chunks (gobbets, if you’re a literary type) that I thought might be of interest.

**

The first comes from David Albert’s review of A Universe From Nothing by Lawrence M. Krauss. Albert is a philosopher at Columbia and the author of Quantum Mechanics and Experience:

It happens that ever since the scientific revolution of the 17th century, what physics has given us in the way of candidates for the fundamental laws of nature have as a general rule simply taken it for granted that there is, at the bottom of everything, some basic, elementary, eternally persisting, concrete, physical stuff. Newton, for example, took that elementary stuff to consist of material particles. And physicists at the end of the 19th century took that elementary stuff to consist of both material particles and electro-magnetic fields. And so on. And what the fundamental laws of nature are about, and all the fundamental laws of nature are about, and all there is for the fundamental laws of nature to be about, insofar as physics has ever been able to imagine, is how that elementary stuff is arranged. The fundamental laws of nature generally take the form of rules concerning which arrangements of that stuff are physically possible and which aren’t, or rules connecting the arrangements of that elementary stuff at later times to its arrangement at earlier times, or something like that. But the laws have no bearing whatsoever on questions of where the elementary stuff came from, or of why the world should have consisted of the particular elementary stuff it does, as opposed to something else, or to nothing at all.

The fundamental physical laws that Krauss is talking about in “A Universe From Nothing” — the laws of relativistic quantum field theories — are no exception to this. The particular, eternally persisting, elementary physical stuff of the world, according to the standard presentations of relativistic quantum field theories, consists (unsurprisingly) of relativistic quantum fields. And the fundamental laws of this theory take the form of rules concerning which arrangements of those fields are physically possible and which aren’t, and rules connecting the arrangements of those fields at later times to their arrangements at earlier times, and so on — and they have nothing whatsoever to say on the subject of where those fields came from, or of why the world should have consisted of the particular kinds of fields it does, or of why it should have consisted of fields at all, or of why there should have been a world in the first place. Period. Case closed. End of story.

**

The second comes from Kathryn Schulz’s review of Jim Holt’s Why Does the World Exist? Schulz is the book critic for New York Magazine:

You can say A because B, B because C, C because D—but what explains D? If you say D because E because F because G (H, I, J, K … ), your explanation is an infinite regress: a taller stack of turtles. You might, instead, argue that D is explained by X, where X is some kind of necessary truth, a logical deduction or physical law. But this presents an interesting question: How, exactly, do you get a material universe out of a necessary truth? “Are the laws of physics somehow to inform the Abyss that it is pregnant with Being?” Holt asks. “If so, where do the laws themselves live? Do they hover over the world like the mind of God? … How do they reach out and make a world? How do they force events to obey them?”

Got me. Got everybody. Try as we might, we can’t find a way to tell a sound causal story about the origins of the universe. The absence of an explanation is one thing, but the absence of any imaginable form that an explanation could take is something else, and it has caused many cosmologists to throw up their hands.

**

So my general impression is that both religion and science are standing on thin air. And I don’t mind if you call it God, or Nothing, or Mystery.

I like it. No, make that: I love it.

But that’s me. Hopefully, you’re ready now for part II.

Retrosupercomputing

The comment thread on this post segued (oddly enough!) into a discussion of supercomputer designer Seymour Cray and a comparison of his multi-million-dollar systems with today’s ordinary personal computers. I thought it might be interesting to take a look at a supercomputer from almost 60 years ago: the Naval Ordnance Research Calculator (NORC), built by IBM for the US Navy and delivered in 1954, which held the computing speed record for several years.

NORC came only 10 years after the computer age was kicked off by the announcement of the Harvard-IBM Mark I (click here for an interesting contemporary magazine article on that innovation), but it was vastly faster and more powerful. NORC’s arithmetic ran at about 15,000 additions or 12,000 multiplications per second, and the machine could store 3,600 words (16-digit decimal numbers) with a memory cycle time of 8 microseconds. There are lots of NORC pictures and plenty more information at this site. Applications included hydrodynamics, weather forecasting, logistics simulations, and the motion of celestial bodies; the hydrodynamics problems included studies of torpedo cavitation and of the earth’s liquid core. (Remarks by John von Neumann at the NORC dedication, including audio, are here.)
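For a sense of the gulf between then and now, here is a back-of-envelope comparison using the figures above. The modern-CPU rate and the digit-packing factor are my own rough assumptions, good for order of magnitude only:

```python
# Back-of-envelope comparison of NORC with a modern machine.
# NORC figures are the ones quoted above; the modern rate and the
# BCD packing factor are assumptions, order-of-magnitude only.

NORC_ADDS_PER_SEC = 15_000            # quoted above
MODERN_ADDS_PER_SEC = 1_000_000_000   # assumed: ~1e9 additions/sec on one core

print(f"speedup on additions: {MODERN_ADDS_PER_SEC / NORC_ADDS_PER_SEC:,.0f}x")
# -> roughly 67,000x, before counting multiple cores or vector units

# NORC's entire memory: 3,600 words of 16 decimal digits each.
# Assuming 2 decimal digits per byte (BCD packing):
words, digits_per_word = 3_600, 16
print(f"total memory: {words * digits_per_word // 2:,} bytes")
# -> 28,800 bytes, i.e. under 30 KB for the record-holding machine of 1954
```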

NORC’s circuits used vacuum tubes (9,000 of them), and the memory was electrostatic, employing what were basically TV picture tubes, with bits stored on the face as charges and continually refreshed. This technology represented the best speed/cost tradeoff for a high-end computer at the time, but it was very sensitive: apparently, a woman wearing silk stockings who walked near the computer would likely cause memory errors because of the static electricity generated. (No doubt leading to much speculation about the correlation between female hotness and computer memory error rate.)

Construction of NORC cost $2.5MM, which equates to about $20MM in 2012 dollars. Some of the cost can probably be attributed to the one-of-a-kind nature of the machine and the pull-out-all-stops-and-make-it-the-fastest spirit of its design. But even a computer intended as a standard commercial product, the roughly contemporaneous IBM 701, went for about $1 million in early-1950s money.

At first glance, it seems hard to believe that such a massive investment in such relatively slow and limited machines (by our present-day standards) could have made economic sense. But consider: a calculation taking 30 minutes on NORC might have required something like 30 person-years if done by human beings using the desk calculators of the time. The economics probably did make sense if the workload was appropriate; however, I bet a fair number of these early machines served more as corporate or government-agency status symbols than as paying propositions. (As a side note, I wonder whether the awe generated by early computers would have been lessened had the machines not been so physically impressive, say, if they had been about the size of a modern desktop PC.)
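That 30-person-years figure is easy to sanity-check. A quick sketch, in which the manual-calculation rate and the length of a working year are my own assumptions rather than sourced numbers:

```python
# Sanity check of the claim that a 30-minute NORC run equals roughly
# 30 person-years of desk-calculator work. The human-rate and working-year
# figures below are assumptions for illustration, not sourced data.

norc_ops_per_sec = 15_000                 # NORC addition rate, from above
total_ops = norc_ops_per_sec * 30 * 60    # a 30-minute run: 27,000,000 operations

secs_per_manual_op = 8                    # assumed: one desk-calculator operation per 8 s
work_secs_per_year = 2_000 * 3_600        # assumed: a 2,000-hour working year

person_years = total_ops * secs_per_manual_op / work_secs_per_year
print(f"{total_ops:,} operations = about {person_years:.0f} person-years by hand")
# -> about 30 person-years, consistent with the estimate in the text
```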

NORC, which was in operation through 1968, has of course been surpassed by orders of magnitude by much cheaper and more compact machines. Its computational capabilities are trivial compared with those of the computer on which you are reading this. Yet, strange as it may seem, there are a lot of problems for which today’s computer power is inadequate, and the frontiers of supercomputing continue to be pushed outwards.

While researching this post, I ran across several articles dealing with a particular highly demanding supercomputer application currently being addressed by computer scientists. This is the modeling of the physical behavior of cloth, which is important both for the creation of realistic animated movies and in the textiles/apparel industry. (See for example this paper.) Simulating the movement of a virtual actress’s dress, as she walks down the street in a light breeze, apparently requires far more computer power than did the development of America’s first hydrogen bombs.
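To get a feel for why cloth is so expensive, consider the crudest possible model: a grid of particles joined by springs, stepped forward in tiny time increments. The sketch below is purely my own toy illustration, nothing like the methods in the linked paper (which must also handle bending stiffness, damping, and collisions), and all of its parameter values are arbitrary. Even so, it already does on the order of 2N² spring evaluations per step, and stability forces the timestep to stay tiny:

```python
import numpy as np

# Toy mass-spring cloth: an N x N grid of particles connected to their
# grid neighbors by springs, integrated with Verlet steps. A sketch only;
# all parameter values are arbitrary.

N = 32                       # particles per side
REST = 1.0 / N               # rest length of each structural spring
K = 500.0                    # spring stiffness (arbitrary units, unit mass)
DT = 1e-3                    # timestep; too large and the springs blow up
GRAVITY = np.array([0.0, -9.8, 0.0])

pos = np.zeros((N, N, 3))
pos[..., 0], pos[..., 1] = np.meshgrid(
    np.linspace(0, 1, N), np.linspace(0, 1, N), indexing="ij")
prev = pos.copy()            # Verlet keeps the previous positions as state

def spring_forces(p):
    """Hooke's-law forces from the springs along grid rows and columns."""
    f = np.zeros_like(p)
    for axis in (0, 1):
        d = np.diff(p, axis=axis)                        # neighbor-to-neighbor vectors
        length = np.linalg.norm(d, axis=-1, keepdims=True)
        pull = K * (length - REST) * d / np.maximum(length, 1e-9)
        sl = [slice(None)] * 3
        sl[axis] = slice(0, -1); f[tuple(sl)] += pull    # force on the lower-index end
        sl[axis] = slice(1, None); f[tuple(sl)] -= pull  # equal and opposite reaction
    return f

def step(pos, prev):
    acc = spring_forces(pos) + GRAVITY        # unit mass, so force equals acceleration
    new = 2 * pos - prev + acc * DT * DT      # position-Verlet update
    new[:, -1] = pos[:, -1]                   # pin the top edge so the cloth hangs
    return new, pos

for _ in range(1000):                         # 1,000 steps = one simulated second
    pos, prev = step(pos, prev)
```

A production solver does vastly more work per particle per step, at far higher resolutions, which is how dress-in-a-breeze simulations end up in supercomputer territory.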

Related post: computation and reality

News That You Can Optionally Use

Tonight will be the occasion of a “super Moon”, which seems to be a term used by people who know a lot about the Moon to denote a full Moon that coincides with the Moon’s closest approach to Earth (perigee) in its elliptical orbit. Cosmophobia aside, there is no reason to be alarmed. However, some Chicagoboyz are finding it necessary to shave more frequently than usual, and if you have a hot date tonight it might be a good idea to pack a disposable razor with your breath mints. Don’t let that stubble become any trouble. (I’m not sure if this advice applies to the ladies as well.)


Guess the Profanity

I tried to submit the following comment to the FT on a climate change story, “GE rejects Republicans’ climate change doubts.” It was rejected due to “suspected profanity”. For the life of me, I can’t figure out what the problem is.

There is no doubt that climate is changing. What is under doubt is whether the changes are man-made and whether they require trillions in expenditures to address. Those claims are not nearly as well established, since they are largely economic and political questions, not scientific ones. The scientific fact is that we are undergoing a pause in warming that sits, at best, at the far end of the current models’ lower error bounds. Once the error bars are exceeded, you toss the model out and get something better.
 
If you’re depending on these models to justify trillions in expenditures over the coming decades, you should take a step back and be a bit humble. A cold year or two and the problem isn’t even arguable anymore: it would mean the science wasn’t right and that we’re driving the world economy blind, without working models. We might as well call in a shaman; he would have as much scientific validity as models whose error bars are exceeded by stubborn, plain, empirically observed reality.
 
We are currently undoing major scientific damage that the climate modelers have inflicted on science by hiding their data and stonewalling independent inquiry. The Berkeley BEST effort at least will be open, and the skeptics can take their best shot at working out the problems. And there are problems, from Briffa’s magic tree in the Yamal data set (the one tree whose inclusion or exclusion reverses the entire conclusion of a highly influential tree-ring data set) to the reversal of sign in the Tiljander sediment data (you don’t get to just reverse the signs of measurements when they are inconvenient to your conclusions). Time after time, data that has been stonewalled turns out to have problems when it is finally pried out into the light of public scrutiny.
 
There is nothing wrong with GE betting on efficiency and reducing pollution. That isn’t what this is about. It is about public monies and massive changes in the world economy based on science that is tough to check because the original data is often kept out of the hands of skeptics.
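As a footnote to that rejected comment: the error-bar argument in it boils down to an ordinary statistical consistency test. Here is a toy version; every number in it is invented purely for illustration and is not real climate data:

```python
# Toy version of the "observations vs. model error bars" test described
# above. All numbers are made up for illustration; NOT real climate data.

model_mean_trend = 0.20   # hypothetical ensemble-mean warming trend, deg C/decade
model_sigma = 0.05        # hypothetical ensemble spread (one standard deviation)
observed_trend = 0.07     # hypothetical observed trend, deg C/decade

z = (observed_trend - model_mean_trend) / model_sigma
if abs(z) > 2:            # outside the roughly 95% band
    print(f"z = {z:.1f}: observation falls outside the model error bars")
else:
    print(f"z = {z:.1f}: observation is consistent with the model spread")
```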