"Restore(s) a little sanity into current political debate" - Kenneth Minogue, TLS "Projects a more expansive and optimistic future for Americans than (the analysis of) Huntington" - James R. Kurth, National Interest "One of (the) most important books I have read in recent years" - Lexington Green
Chicago Boyz is an Amazon affiliate and earns money from any Amazon purchases you make after you click on an Amazon link on this blog.
Chicago Boyz is also a BlogAds affiliate and earns money from advertising placed on this blog through the BlogAds network.
Some Chicago Boyz advertisers may themselves be Amazon affiliates who earn money from any Amazon purchases you make after you click on an Amazon link on their ad on Chicago Boyz or on their own web sites.
Chicago Boyz will consider publishing advertisements for goods or services that in the opinion of Chicago Boyz management would benefit the readers of this blog. Please direct any inquires to
Chicago Boyz is a registered trademark of Chicago Boyz Media, LLC. All original content on the Chicago Boyz web site is copyright 2001-2014 by Chicago Boyz Media, LLC or the Chicago Boyz contributor who posted it. All rights reserved.
On balance, therefore, the probability is that the virus is not airborne — yet — but it is more dangerous than its predecessors. This would account for its ability to slip through the protocols designed for less deadly strains of the disease. It’s not World War E time, but it’s time to worry.
The results of full genetic sequencing suggest that the outbreak in Guinea isn’t related to others that have occurred elsewhere in Africa, according to an international team that published its findings online in the New England Journal of Medicine (NEJM). That report was from April 2014.
His wife, Decontee Sawyer, said that she had spoken to him a week earlier and that he had made plans to be stateside in early August to celebrate the birthdays of two of his three young daughters. She said the couple had been separated.
He is believed to be the first American to have died from the current outbreak, which has killed 672 people since March, according to World Health Organization figures.
He was American, not African.
The man who brought the Ebola virus to Nigeria probably knew he was infected. Surveillance video of Patrick Sawyer before boarding his flight at Liberia’s James Sprigg Payne’s Airport showed “Mr. Sawyer lying flat on his stomach on the floor in the corridor of the airport and seemed to be in ‘excruciating pain.’ The footage showed Mr. Sawyer preventing people from touching him.”
He collapsed upon arrival in Nigeria, after a layover in Togo, and was rushed to a Nigerian hospital. Upon being told he had Ebola, he acted with what the Nigerians called “indiscipline”: a burst of rage and despair against the world and everyone in it.
Upon being told he had Ebola, Mr. Sawyer went into a rage, denying and objecting to the opinion of the medical experts. “He was so adamant and difficult that he took the tubes from his body and took off his pants and urinated on the health workers, forcing them to flee.”
Amazingly, he was even then in the process of being sprung by his political connections before death intervened. Had he lived, Sawyer might have gotten out and, protected by the juju of expensive watches and status symbols, mingled among the muckety-mucks of ECOWAS.
“It’s a total contradiction in terms to spend your public time castigating Medicaid as something that never should have been expanded for poor people and as a broken, problem-riddled system, and then turn around and complain about the length of time to enroll people,” said Sara Rosenbaum, a member of the Medicaid and CHIP Payment and Access Commission, which advises Congress.
Most of the new enrollees are Medicaid members, and those enrolled in “private insurance” learn that they have a severely restricted choice of doctor or hospital.
Cash medical practice or, in the phrase favored by leftist critics, “Concierge Medicine,” seems to be growing.
Becker is shifting to a new style of practice, sometimes called concierge or retainer medicine. With the help of a company that has been helping physicians make such shifts for over 13 years, he will cease caring for a total of 2,500 patients and instead cut back to about 600. These patients will pay an annual fee of $1,650. In exchange, they will receive a two-hour annual visit with a complete physical exam, same-day appointments, 24-hour physician phone access, and personalized, web-based resources to promote wellness.
The article suggests that all these doctors choosing to drop insurance and Medicare are in primary care. Many are, but I know orthopedists and even general surgeons who are dropping all insurance.
The concierge model of practice is growing, and it is estimated that more than 4,000 U.S. physicians have adopted some variation of it. Most are general internists, with family practitioners second. It is attractive to physicians because they are relieved of much of the pressure to move patients through quickly, and they can devote more time to prevention and wellness.
…I wonder if we are witnessing the “death of expertise:” a Google-fueled, Wikipedia-based, blog-sodden collapse of any division between students and teachers, knowers and wonderers, or even between those of any achievement in an area and those with none at all.
By this, I do not mean the death of actual expertise, the knowledge of specific things that sets some people apart from others in various areas. There will always be doctors, lawyers, engineers, and other specialists in various fields.
Rather, what I fear has died is any acknowledgement of expertise as anything that should alter our thoughts or change the way we live. A fair number of Americans now seem to reject the notion that one person is more likely to be right about something, due to education, experience, or other attributes of achievement, than any other.
Indeed, to a certain segment of the American public, the idea that one person knows more than another person is an appalling thought, and perhaps even a not-too-subtle attempt to put down one’s fellow citizen. It’s certainly thought to be rude: to judge from social media and op-eds, the claim of expertise — and especially any claim that expertise should guide the outcome of a disagreement — is now considered by many people to be worse than a direct personal insult.
This is a very bad thing. Yes, it’s true that experts can make mistakes, as disasters from thalidomide to the Challenger explosion tragically remind us. But mostly, experts have a pretty good batting average compared to laymen: doctors, whatever their errors, seem to do better with most illnesses than faith healers or your Aunt Ginny and her special chicken gut poultice. To reject the notion of expertise, and to replace it with a sanctimonious insistence that every person has a right to his or her own opinion, is just plain silly. (Emphasis added.)
I encourage visitors to the Stage to read Dr. Nichols’s entire piece. It was prompted by what has become a common experience every time he (or fellow UNWC professor and former NSA employee John Schindler) decides to publish a new essay or speak publicly about a pressing issue of the day. Soon after his work is published, a flood of acrimonious tweets and e-mails follows, declaring that he does not really understand how American intelligence agencies, the Kremlin, or the Obama administration actually work.
Most of these responses are misinformed. Many are simply rude and mean. They are not an impressive example of what laymen commentators can add to America’s political discourse. Dr. Nichols suggests four rules of thumb for engaged citizens that he believes would improve matters:
1. The expert isn’t always right.
2. But an expert is far more likely to be right than you are.
3. Your political opinions have value in terms of what you want to see happen, how you view justice and right. Your political analysis as a layman has far less value, and probably isn’t — indeed, almost certainly isn’t — as good as you think it is.
4. On a question of factual interpretation or evaluation, the expert’s view is likely to be better-informed than yours. At that point, you’re best served by listening, not carping and arguing. 
The trouble with this advice is that there are plenty of perfectly rational reasons to distrust those with political expertise.
Today, I find a nice discussion of global warming and cooling over the past epoch. The Greenland ice cores are, or should be, the gold standard of temperature measurement. For example:
Records of past temperature, precipitation, atmospheric trace gases, and other aspects of climate and environment derived from ice cores drilled on glaciers and ice caps around the world. Parameter keywords describe what was measured in this data set. Additional summary information can be found in the abstracts of papers listed in the data set citations.
Posted by Michael Kennedy on 15th August 2013
My sentiments on the whole drug question have been influenced by some experience with the medical aspect of the problem. Drugs are slipping out of any control due to developments in synthetic variations of older substances that stimulate brain chemistry, sometimes in unknown ways. The traditional drugs, if we can use that term, are also slipping out of control, with Mexican drug cartels, even more violent than their predecessors, replacing the Colombian cartels.
What about marijuana? It is widely used by the younger generation and, while I do think there are some harmful consequences, especially in potential schizophrenics, the fact is that the laws are widely ignored and do little good and much harm. First, what about the link to psychosis?
Epidemiological studies suggest that Cannabis use during adolescence confers an increased risk for developing psychotic symptoms later in life. However, despite their interest, the epidemiological data are not conclusive, due to their heterogeneity; thus modeling the adolescent phase in animals is useful for investigating the impact of Cannabis use on deviations of adolescent brain development that might confer a vulnerability to later psychotic disorders. Although scant, preclinical data seem to support the presence of impaired social behaviors, cognitive and sensorimotor gating deficits as well as psychotic-like signs in adult rodents after adolescent cannabinoid exposure, clearly suggesting that this exposure may trigger a complex behavioral phenotype closely resembling a schizophrenia-like disorder. Similar treatments performed at adulthood were not able to produce such phenotype, thus pointing to a vulnerability of the adolescent brain towards cannabinoid exposure.
The night of October 20-21, 2012 was a good time to see Orionid meteors, which are bits of Halley’s Comet that hit the Earth’s atmosphere every October and appear to originate from the constellation Orion. With Jay Manifold’s expert advice and a star map that he emailed to me, I drove to the darkest convenient place and spent a few hours taking pictures.
This coming Sunday’s full moon will be a supermoon, a full moon that coincides with the moon’s closest passage of the year to the Earth. The moon will appear to be a bit bigger in the sky than it does at other times. Might be worth a look. Also could be a good time to go surfing.
(I usually need to shave at least two or three times on a supermoon day but maybe that’s just me. If you have similar issues you might find this to be helpful, or in extreme cases one of these.)
Much of what medical researchers conclude in their studies is misleading, exaggerated, or flat-out wrong. So why are doctors—to a striking extent—still drawing upon misinformation in their everyday practice?
The arguments presented in this article seem like a good if somewhat long presentation of the general problem, and could be applied in many fields besides medicine. (Note that the comments on the article rapidly become an argument about global warming.) The same problems are also seen in the work of bloggers, journalists and “experts” who specialize in popular health, finance, relationship and other topics and have created entire advice industries out of appeals to the authority of often poorly designed studies. The world would be a better place if students of medicine, law and journalism were forced to study basic statistics and experimental design. Anecdote is not necessarily invalid; study results are not necessarily correct and are often wrong or misleading.
None of this is news, and good researchers understand the problems. However, not all researchers are competent, a few are dishonest and the research funding system and academic careerism unintentionally create incentives that make the problem worse.
(Thanks to Madhu Dahiya for her thoughtful comments.)
Anderson is not merely making a technologically oriented argument, but a profoundly cultural one. In his view, the Maker movement, operating on the collaborative, “open-source” ethos, is an iterative, accelerative driver of economic change that complements the technology. Anderson writes: “…In short, the Maker Movement shares three characteristics, all of which are transformative:
I recently reviewed Chris Anderson’s book Makers. What 3D printing needs is the affordable, user-friendly, versatile device to move 3D printing from the arcane realm of techno-hobbyist geeks to the general population’s “early adopters,” which will put the next “consumer model” generation on everyone’s office desk, eventually as ubiquitous as cell phones or microwaves.
Formlabs should send one of these to John Robb and Shloky for a product review.
Posted by Michael Kennedy on 15th December 2012
There is information still coming to light about this awful case. Early reports, such as the name of the shooter and the alleged murder of the father, were predictably wrong. It turns out that the shooter, Adam Lanza, a 20-year-old with a history of odd behavior and some evidence of mental illness, such as autism, was living with his mother, who was his first victim. There are a number of suggestive reports that she decided to “stay home to care for” her 20-year-old son.
The treatment of severe mental illness in this country has been altered for the worse by a movement that began in the 1960s, when mental illness began to be described as a “civil rights” issue. Several books and movies described abuse of power in commitment of the mentally ill. The first such movie was “The Snake Pit,” in which a young woman is committed for what sounds like schizophrenia. The treatment of the time (1948) can be seen as barbaric, but there was nothing else available. She did recover, although we know that without adequate treatment, recovery from schizophrenia is unlikely.
Posted by Ginny on 7th December 2012
The musings on the random and tragic nature of life remind us of how little we know – and control. But it reminded me of the marketing of a step toward more control: how good are the DNA products? My daughter’s friend, visiting for Thanksgiving, sent her spit to 23andme. The results included a genetic tendency toward weight-related diseases, which led her to a diet and gym membership. Not surprisingly, it linked her with her mother, but also with a cousin neither she nor her mother knew existed. They met, looked each other over, compared notes: they were cousins.
Anyway, she sat in our living room flipping through her smart phone (it gives monthly updates); she was vulnerable to diabetes but less so to Parkinson’s. Genetic weaknesses are becoming obvious as we near retirement; unfortunately, we learn our vulnerabilities at every office visit.
Still, has anyone done this or similar ones? How accurate, how useful, and how much does this (or do others) add to the cloud-knowledge of genes & disease? (Other friends used a different site, but learned what human history would say – that they were both from England and before that Africa.)
Of course, whether it is worth the money or not, whether it is accurate or not, ignores the big question: does such knowledge lead us to believe we have an autonomy still not – never will be – ours? Will knowing more of “who we are” mislead or arm us?
“On the afternoon of December 2, 1942, the Atomic Age began inside an enormous tent on a squash court under the stands of the University of Chicago’s Stagg Field. There, headed by Italian scientist Enrico Fermi, the first controlled nuclear fission chain reaction was engineered. The result—sustainable nuclear energy—led to creation of the atomic bomb and nuclear power plants—two of the twentieth century’s most powerful and controversial achievements.”
I was there halfway between then and now. I am a by-product of the Manhattan Project, being the son of a onetime rifleman in an infantry platoon who was on a troopship in the Pacific on August 6, 1945, in transit for Operation Downfall. He went to the Philippines instead, and never heard a shot fired in anger. I did not matriculate at Chicago to repay a debt – which is fortunate, because as things went, the University spent a good deal of money on me for (so far) no return whatsoever.
I didn’t hear all that much that was new, but I didn’t expect to. It was well worth going, however; I suppose the biggest “delta” was about how his writing changed after he had children and especially when two of them served in the military in WWII. She also pointed out that all the heroic leaders in the trilogy lead from the front, while the villainous leaders are far in the rear, the equivalent of the “chateau generals.”
Another insight was how much the “black breath” and Frodo’s melancholia resemble PTSD. In combination with her remarks about parent-child relationships, this caused me to ask a question about what turns out to be Letter #74, written to Stanley Unwin on 29 June 1944, which includes the sentence: “I have at the moment another son, a much damaged soldier, at Trinity trying to do some work and recover a shadow of his old health.” – a reference to his son Michael, who was pretty severely PTSD’d for a while. So out of slightly morbid curiosity, I asked if she knew anything more about that episode. She did not but said that there are probably more letters, unpublished, that would have details, and perhaps they will eventually see the light of day.
Scripture reading in church this morning was Isaiah 2:1-5. Verse 4 is of course poignant in light of today’s anniversary. If we really are entering the Crisis of 2020, those swords won’t be beaten into plowshares any time soon. Indeed, some future analog of December 2nd, 1942, presumably involving nanomachinery rather than tons of graphite blocks and lumps of enriched uranium, will happen in a laboratory somewhere in the world in another decade or so.
I have thought for some time that life on Mars is going to consist of microorganisms and be buried several feet below the surface of the planet’s soil. I have even blogged about it before.
Now, there is a possibility of a nucleotide sequencer that could go to Mars on the next probe in 2018.
In what could become a race for the first extraterrestrial genome, researcher J. Craig Venter said Tuesday that his Maryland academic institute and his company, Synthetic Genomics, would develop a machine capable of sequencing and beaming back DNA data from the planet.
“We want to make sure an Ion Torrent goes to Mars,” Rothberg told Technology Review.
Although neither team yet has a berth on a Mars rocket, their plans reflect the belief that the simplest way to prove there is life on Mars is to send a DNA sequencing machine.
“There will be DNA life forms there,” Venter predicted Tuesday in New York, where he was speaking at the Wired Health Conference.
Venter said researchers working with him have already begun tests at a Mars-like site in the Mojave Desert. Their goal, he said, is to demonstrate a machine capable of autonomously isolating microbes from soil, sequencing their DNA, and then transmitting the information to a remote computer, as would be required on an unmanned Mars mission. Heather Kowalski, a spokeswoman for Venter, confirmed the existence of the project but said the prototype system was “not yet 100 percent robotic.”
Doing this on Mars would avoid the problem of contamination by Earth organisms. New life forms that don’t use DNA might be a problem, but most people who have thought about this believe that DNA is the genetic material of all life forms. Of course protein, which may have been the original genetic material on Earth, could also be the Martian equivalent.
We are starting to see commercial spacecraft develop, and one was used to reach the International Space Station recently. A Mars mission is another order of complexity, but by 2018 it may be an option.
So a little over six weeks to go until Election Day; I guess we can call this the final heat. Texas is pretty much a red state stronghold, although there are pockets of blue adherents throughout. Yes, even in my neighborhood, there are a handful of defiant Obama-Biden yard signs visible, although outnumbered at least two to one by Romney-Ryan signs. It amounts to about a dozen, all told; I think that most of my neighbors prefer keeping their political preferences this time around strictly to themselves.
Sorry I haven’t been blogging any during the last eight months but the truth is that I’ve been wrestling with a big decision that affects everyone and I didn’t quite know how to explain it. Now, I’ve come to a decision and I think it only right that I inform you all of it so that you have some time to prepare yourself.
Here goes… I’m turning off the Universe.
Yep, that’s right, the whole shebang, from littlest Higgs Boson to the greatest galaxy clusters. Say goodnight, Gracie.
I know this will be hard to accept, but, you see, you’re not real. I mean, you are real as far as the experience of yourself and every other human being you know of but you aren’t, you know, real real.
I’m not explaining this very well.
You see, I wrote you. That is to say I programmed you. I programmed you and every other person, place and thing in your universe. You’re just a simulation, a very big video game, based loosely on once-real people, places and things that I created. Not only did I create the simulation but I can start it, stop it, rewind it and alter it at will.
And all that kinda makes me god. I mean not GOD god but just god of the universe you experience. Let’s just say, “god as far as you are concerned.”
The recent discovery of a huge pumice island that is larger than the state of Israel has led to scientific excitement, but not just scientific excitement. The questions arise: who owns these rocks, and can one live on one of these pumice islands?
We can argue the pros and cons of religion all we like, but I’m not sure it will tell us much more than what our own basic hunches are.
This is part I of a two part post.
Look, there have been a couple of books out recently that suggest something — the universe or universe of universes — might just have come out of nothing, without nothing having to have come out of anywhere special itself.
And there have been a couple of reviews of those books that I’ve read recently, and they don’t think that “it’s nothings, nothings all the way down” is any better than “it’s turtles, turtles all the way down”.
If you’re interested in that discussion, the next two chunks will be of interest to you — but if not, you can skip them and go right to my follow up post — That’s it in a nutshell! — with its three short quotes and a question.
So here are the two chunks — gobbets, if you’re a literary type — that I thought might be of interest.
It happens that ever since the scientific revolution of the 17th century, what physics has given us in the way of candidates for the fundamental laws of nature have as a general rule simply taken it for granted that there is, at the bottom of everything, some basic, elementary, eternally persisting, concrete, physical stuff. Newton, for example, took that elementary stuff to consist of material particles. And physicists at the end of the 19th century took that elementary stuff to consist of both material particles and electro-magnetic fields. And so on. And what the fundamental laws of nature are about, and all the fundamental laws of nature are about, and all there is for the fundamental laws of nature to be about, insofar as physics has ever been able to imagine, is how that elementary stuff is arranged. The fundamental laws of nature generally take the form of rules concerning which arrangements of that stuff are physically possible and which aren’t, or rules connecting the arrangements of that elementary stuff at later times to its arrangement at earlier times, or something like that. But the laws have no bearing whatsoever on questions of where the elementary stuff came from, or of why the world should have consisted of the particular elementary stuff it does, as opposed to something else, or to nothing at all.
The fundamental physical laws that Krauss is talking about in “A Universe From Nothing” — the laws of relativistic quantum field theories — are no exception to this. The particular, eternally persisting, elementary physical stuff of the world, according to the standard presentations of relativistic quantum field theories, consists (unsurprisingly) of relativistic quantum fields. And the fundamental laws of this theory take the form of rules concerning which arrangements of those fields are physically possible and which aren’t, and rules connecting the arrangements of those fields at later times to their arrangements at earlier times, and so on — and they have nothing whatsoever to say on the subject of where those fields came from, or of why the world should have consisted of the particular kinds of fields it does, or of why it should have consisted of fields at all, or of why there should have been a world in the first place. Period. Case closed. End of story.
You can say A because B, B because C, C because D—but what explains D? If you say A, your explanation is circular. If you say because E because F because G (H, I, J, K … ), your explanation is an infinite regress: a taller stack of turtles. You might, instead, argue that D is explained by X, where X is some kind of necessary truth, a logical deduction or physical law. But this presents an interesting question: How, exactly, do you get a material universe out of a necessary truth? “Are the laws of physics somehow to inform the Abyss that it is pregnant with Being?” Holt asks. “If so, where do the laws themselves live? Do they hover over the world like the mind of God? … How do they reach out and make a world? How do they force events to obey them?”
Got me. Got everybody. Try as we might, we can’t find a way to tell a sound causal story about the origins of the universe. The absence of an explanation is one thing, but the absence of any imaginable form that an explanation could take is something else, and it has caused many cosmologists to throw up their hands.
So my general impression is that both religion and science are treading on thin air. And I don’t mind if you call it God, or Nothing, or Mystery.
I like it. No, make that: I love it.
But that’s me. Hopefully, you’re ready now for part II…
The comment thread on this post segued (oddly enough!) into a discussion of supercomputer designer Seymour Cray and a comparison of his multi-million-dollar systems with today’s ordinary personal computers. I thought it might be interesting to take a look at a supercomputer from almost 60 years ago–the Naval Ordnance Research Calculator (NORC), built by IBM for the US Navy and delivered in 1954, which held the computing speed record for several years.
NORC came only 10 years after the computer age was kicked off by the announcement of the Harvard-IBM Mark I (click here for an interesting contemporary magazine article on that innovation), but it was vastly faster and more powerful. NORC’s arithmetic was done at the rate of about 15,000 additions or 12,000 multiplications per second, and the machine could store 3600 words (16-digit decimal numbers) with a memory cycle time of 8 microseconds. Lots of NORC information and pictures at this site. Applications included hydrodynamics, weather forecasting, logistics simulations, and the motion of celestial bodies. The hydrodynamics problems included studies of torpedo cavitation and of the earth’s liquid core. (Remarks by John von Neumann at the NORC dedication, including audio, here.)
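For a sense of scale, the quoted speed and memory figures can be related with a little arithmetic. The inputs below come straight from the numbers above; the derived cycle counts are back-of-envelope inferences for illustration, not documented NORC design parameters.

```python
# Back-of-envelope arithmetic on NORC's quoted figures (from the text above).
CYCLE_US = 8            # memory cycle time, in microseconds
ADDS_PER_SEC = 15_000   # additions per second
MULTS_PER_SEC = 12_000  # multiplications per second

us_per_add = 1_000_000 / ADDS_PER_SEC    # time budget per addition, ~66.7 us
us_per_mult = 1_000_000 / MULTS_PER_SEC  # ~83.3 us per multiplication

# Implied memory cycles available per operation -- roughly 8 and 10,
# consistent with each operation touching memory several times.
cycles_per_add = us_per_add / CYCLE_US
cycles_per_mult = us_per_mult / CYCLE_US

print(f"~{us_per_add:.1f} us per addition (~{cycles_per_add:.0f} memory cycles)")
print(f"~{us_per_mult:.1f} us per multiplication (~{cycles_per_mult:.0f} memory cycles)")
```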
NORC’s circuits used vacuum tubes—9000 of them—and the memory was electrostatic, employing what were basically TV picture tubes with bits stored on the face as charges and continually refreshed. This technology represented the best speed/cost tradeoff for a high-end computer at the time, but it was very sensitive—apparently, a woman wearing silk stockings walking near the computer would likely cause memory errors because of the static electricity generated. (No doubt leading to much speculation about the correlation between female hotness and computer memory error rate.)
Construction of NORC cost $2.5MM, which equates to about $20MM in 2012 dollars. Some of the cost can probably be attributed to the one-of-a-kind nature of the machine and the pull-out-all-stops-and-make-it-the-fastest spirit of its design. But even a computer intended as a standard commercial product, the roughly contemporaneous IBM 701, went for about $1 million in early 1950s money. At first glance, it seems hard to believe that such a massive investment for such relatively slow and limited machines (by our present-day standards) could have made economic sense. But consider: a calculation taking 30 minutes on NORC might have required something like 30 person-years if done by human beings using the desk calculators of the time. The economics probably did make sense if the workload was appropriate; however, I bet a fair number of these early machines served more as corporate or government-agency status symbols than as paying propositions. (As a side note, I wonder if the awe generated by early computers would have been lessened had the machines not been so physically impressive—say, if they had been about the size of a modern desktop PC?)
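The person-years comparison above is easy to reproduce. The NORC rate is from the text; the human desk-calculator throughput is an assumption chosen to land near the article's rough 30-person-year estimate, so treat the result as illustrative rather than historical accounting.

```python
# Rough cost-of-computation comparison. NORC figures come from the text above;
# the human throughput numbers are assumed, illustrative values.
NORC_ADDS_PER_SEC = 15_000
RUN_MINUTES = 30
HUMAN_OPS_PER_HOUR = 450      # assumption: several operations per minute by hand
WORK_HOURS_PER_YEAR = 2_000   # assumption: a standard working year

ops_in_run = NORC_ADDS_PER_SEC * RUN_MINUTES * 60      # 27,000,000 operations
person_years = ops_in_run / (HUMAN_OPS_PER_HOUR * WORK_HOURS_PER_YEAR)

print(f"Operations in a {RUN_MINUTES}-minute run: {ops_in_run:,}")
print(f"Equivalent manual effort: ~{person_years:.0f} person-years")  # ~30
```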
NORC, which was in operation through 1968, has of course been surpassed by orders of magnitude by much cheaper and more compact machines. Its computational capabilities are trivial compared with those of the computer on which you are reading this. Yet, strange as it may seem, there are a lot of problems for which today’s computer power is inadequate, and the frontiers of supercomputing continue to be pushed outwards.
While researching this post, I ran across several articles dealing with a particular highly-demanding supercomputer application currently being addressed by computer scientists. This is the modeling of the physical behavior of cloth, which is important both for creation of realistic animated movies and in the textiles/apparel industry. (See for example this paper.) Simulating the movement of a virtual actress’s dress, as she walks down the street in a light breeze, apparently requires far more computer power than did the development of America’s first hydrogen bombs.
Tonight will be the occasion of a “super Moon”, which seems to be a term used by people who know a lot about the Moon to denote a full Moon that coincides with the Moon’s closest passage to Earth in its elliptical orbit. Cosmophobia aside, there is no reason to be alarmed. However, some Chicagoboyz are finding it necessary to shave more frequently than usual, and if you have a hot date tonight it might be a good idea to pack a disposable razor with your breath mints. Don’t let that stubble become any trouble. (I’m not sure if this advice applies to the ladies as well.)
I tried to submit the following post to FT on a climate change story, GE rejects Republicans’ climate change doubts. My comment was rejected due to “suspected profanity”. For the life of me, I can’t figure out what the problem is.
There is no doubt that climate is changing. The questions under doubt are whether the changes are man-made and whether they require trillions in expenditures to address. The latter is not as well established, as these are economic and political questions, not scientific ones. The scientific fact is that we are undergoing a pause in warming that is, at best, at the far end of the lower error-bar bounds of the current models. Once the error bars are exceeded, you toss that model out and get something better.
If you’re depending on these models to justify trillions in expenditures over the next decades, you should take a step back and be a bit humble. A cold year or two and the problem isn’t even arguable anymore. It would mean that the science wasn’t right and we’re driving the world economy blind, without working models. We might as well call in a shaman; he would have equal scientific validity to models whose error bars are exceeded by stubborn, plain, empirically observed reality.
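The "toss the model out" rule just described amounts to a simple consistency check. Here is a minimal sketch; every number in it is hypothetical, invented purely to show the logic, and is not taken from any actual climate model or observational record.

```python
# Toy illustration of the falsification rule described above: if the observed
# trend falls outside a model's stated uncertainty band, reject the model.
# All numbers are hypothetical, for demonstration only.

def model_consistent(observed_trend, model_mean, model_halfwidth):
    """Return True if the observation lies within the model's error bars."""
    return abs(observed_trend - model_mean) <= model_halfwidth

# Hypothetical: a model predicts 0.20 +/- 0.12 degrees/decade.
print(model_consistent(0.18, 0.20, 0.12))  # True  -> model survives
print(model_consistent(0.05, 0.20, 0.12))  # False -> model rejected
```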
We are currently undoing major scientific damage that the climate modelers have inflicted on science by hiding their data and stonewalling independent inquiry. The Berkeley BEST effort at least will be open, and the skeptics can take their best shot at working out the problems. And there are problems, from Briffa’s magic tree in the Yamal data set (the one tree whose inclusion or exclusion reverses the entire conclusion of a highly influential tree-ring data set) to the reversal of sign in the Tiljander sediment data (you don’t get to just reverse signs on measurements when they are inconvenient to your conclusions). Time after time, data that has been stonewalled turns out to have problems when it is finally pried out into the light of public scrutiny.
There is nothing wrong with GE betting on efficiency and reducing pollution. That isn’t what this is about. It is about public monies and massive changes in the world economy based on science that is tough to check because the original data is often kept out of the hands of skeptics.
Health care reform is no laughing matter, but MIT economist Jonathan Gruber’s new comic book on the subject aims to communicate some pretty complicated policy details in a way that, if not exactly side-splitting, is at least engaging.
In Health Care Reform: What It Is, Why It’s Necessary, How It Works, Gruber steps into the pages of a comic book to guide readers through many of the major elements of the law, including the individual mandate to buy insurance, the health insurance exchanges where people will be able to buy coverage starting in 2014 and how the law tackles controlling health care costs.
While I was buying a copy of Persepolis from a real-life book store a few years ago, a young woman at the sales counter mentioned that there was a “great” graphic novel about North Korea that I might like. I’m not a graphic novel reader and I think Persepolis is it for me unless I decide to review the health care book, but it interested me that she seemed so enthusiastic about the topic of North Korea and graphic novels. I guess it makes sense given our “information overload” society. I don’t know. Why not look for clarity?
PS: Linking is not endorsement and all that.
PPS: What’s the “all that” about? Eh, I’ve been burning the candle at both ends for the past week or so and my blogging has been pretty terrible because of it. I linked the health care graphic novel because it amused me, not because I am simpatico with the message. I think you all knew that already….