Chicago Boyz
  • Archive for the 'Systems Analysis' Category

    “Lightning Fall” – Review

    Posted by Jay Manifold on 20th April 2014

    While this will not be a uniformly positive review, I must immediately note that the purely literary quality of Bill Quick’s Lightning Fall (subtitled either “A Novel of Destruction” or “A Novel of Disaster,” depending on whether one is looking at the spine or the cover of the paperback edition) ranks it alongside Pat Frank’s Alas, Babylon and comes within metaphorical striking distance of Larry Niven and Jerry Pournelle’s Lucifer’s Hammer. It is a classic page-turner and a serious threat to a good night’s sleep; I began reading it after awakening shortly before 3:00 AM one morning, expecting to drift off in a few minutes, and eventually noticed that I was somewhere around page 250 and the time was after 6:00 AM. This sort of thing has not happened to me more than a handful of times in a half-century of reading, and I read a lot.
    Other reviews have included – well, not exactly spoilers, but more specifics about the events in the novel than I intend to provide here. I will mention three things that I think it useful for prospective readers to know, and then use the general thrust of the novel as a springboard for extended commentary of my own.
    First, those who self-identify as Democrats, liberals, or progressives should certainly be forewarned that every such character in this novel, major or minor, is portrayed as a brigand or a fool. (To paraphrase a well-known slogan: I report, you decide.)
    Second, the attack on the US that provides the setup for the novel’s action has three parts. One is, unsurprisingly, an EMP strike. Another is similar to the first, but aimed in such a way as to decapitate national leadership. The third is straight out of the Einstein–Szilárd letter: “A single bomb of this type, carried by boat and exploded in a port, might very well destroy the whole port together with some of the surrounding territory.” Two of the three parts of the attack succeed; to find out which two, you’ll have to read the book.
    Third, persons with an aversion to reading lots of cussing might need to skip this book. No sex scenes, though.
    And now for my customary barrage of disclaimers before following through on the aforementioned threat of extended commentary. Bill Quick obviously worked very hard on this book and is widely acknowledged as a subject-matter expert on disaster preparation. He is probably smarter than me and certainly far more knowledgeable about the hardware required for off-grid survival. None of what follows is intended to impugn either his accomplishments or his motivations.
    I nonetheless offer what I think of as a more nuanced set of predictions:
    0. As regular readers know all too well, I subscribe to a cyclical theory of American history which predicts a “secular crisis,” designated the “Crisis of 2020,” not occurring in a single year, but in the sense that the Great Depression and World War II could be said to have constituted the “Crisis of 1940.” To that extent, I find much of the novel’s near-future setting entirely plausible; the Crisis of 2020 is obviously already underway, and seems certain to involve both a domestic economic component (in progress) and a geopolitical military component (which, if I had to guess, will indeed begin in the year specified in the novel). Given the technologies available, the urge many people feel to somehow escape the next world war is entirely understandable.
    1. Where I first part ways with the novel’s scenario is in its – probably necessary, if only due to considerations of length – portrayal of a relative uniformity of effects. An entire region of the US is rendered, with only minor exceptions, utterly uninhabitable, and the remainder of the country is, again almost completely, hit very hard; total casualties are projected to be a noticeable fraction of the population.
    The actual situation in the event of such an attack, however, would be significantly more complex, for physical – but also sociological – reasons. Physically, due to characteristics of both the electrical grid and soil types, EMP effects will vary widely on scales as small as tens of kilometers. In the northeastern US in particular, a surprising number of virtually unaffected counties could form a sort of checkerboard pattern among those that lose electric power for two years or more.
    The sociological issue is touchier. There is something to the progressive critique of American inequality; where progressives of course fail is in prescribing Federal action to alleviate it – the DC metro area is already, and notoriously, the wealthiest in the country, Versailles-on-the-Potomac – but differences in net worth, in particular, vary enormously on surprisingly small scales. In the metro area I live in, there are entire square miles in the inner city with less aggregate wealth than single households in the tonier areas. I lack progressive credentials; I strongly believe these inequities to be an emergent property of the overall system, an artifact of culture and especially generational temperament rather than anything readily ameliorated by the proper legislation. The Silent Generation (birth years 1925-42) was deeply concerned with equality. The first wave of the baby boomers (’43-’51) was somewhat less so, the last-wave boomers (’52-’60, which includes me) much less so, and the Gen Xers (’61-’81) scarcely at all. [All dates are from Strauss and Howe; note that these are cultural, not demographic, generations, hence the departure from the usual ’46-’64 definition for the boomers.]
    So, to bring this home – and I encourage readers to plug analogous neighborhoods in their own cities into this paragraph – the east side of KC, which as I commented recently has a homicide rate around 80 per 100,000 per year, so high as to be characteristic of failed states elsewhere in the world, might very well experience a population crash from starvation and disease, while southern Johnson County on the Kansas side loses 1% or less of its people. And that is likely to be true even if the physical infrastructure of both areas is equally affected. Relative wealth connotes many other kinds of preparedness and resiliency, including the psychological.
    The implications of all this are that refugee movements in large (six-digit) numbers but on short (much less than the range of a single tank of fuel in an automobile) distance scales will need to be dealt with in many places, simultaneously. A significant revival of Civil Defense seems in order, as well as general improvements in civil society, about which more below.
    2. I strongly believe that recovery will be, for some industries in some places, surprisingly rapid, contrary to the novel’s portrayal of (almost) uniform prostration for an indefinite period. The real-world example is obviously Cantor Fitzgerald, seemingly annihilated on 9/11/2001 but fully operational before the end of that week. To generalize, all data centers of large organizations have disaster-recovery sites, sometimes in deliberately undisclosed locations. They certainly know about EMP. Not all will succeed, of course – see the point about inequality vs uniformity above – but many will. Given that nuclei of recovery will attract refugees (assuming situational awareness, which people have a way of acquiring), we are again back to the civil defense / overall health of civil society issue.
    3. Now to veer sharply in a nonobvious but, if I may say so, rather insightful direction: suppose there were an existing society in which the electrical grid chronically malfunctions, there is no regular supply of potable water, availability of motorized transport is scant, malnutrition is a constant backdrop, and a variety of illnesses (often vector-borne) are at pandemic levels.
    According to the survivalist/prepper model, the inhabitants of that society should be dying in heaps, not least from slaughtering one another. Furthermore, the safest people in that society should be the most remote, studiously avoiding human contact and devoting their energies to becoming, and remaining, entirely self-reliant.
    But that society does exist, and in it, other than a tiny elite, the largest number of people living in some (admittedly by North American standards rather slight) comfort are those with the greatest degree of interaction with others. The worst off, at imminent risk of death, are the rural isolates – and the wealthiest elites are not found in the deep countryside, but on the very outskirts of the largest city in the nation. Also, there’s no slaughtering going on.
    Well, that’s Haïti, and my time there over the past three years has convinced me that the oft-extolled strategy of holing up somewhere as far away from other people as you can get, with everything you think you’re going to need, is nothing more than elaborate and painful suicide. In a decade or two, someone will find your bones in your hideout and wonder what the hell you could possibly have been thinking.
    4. A detailed course of action does not derive from this, but the general answer, upon which we must elaborate for our individual situations and with which readers of Matt Ridley’s The Rational Optimist will be familiar, is the exact opposite of isolation and self-reliance: trade and specialization. As Julian Simon masterfully documented, people are the ultimate resource. In anticipation of a significant disruptive event, therefore, we would do well to look toward (to borrow a term) community organizing. How well do you know your neighbors, and what can you offer them, whether material or informational, in trade?
    To be clear, I do think that some geographical locations are worse than others. Be wary of resource constraints, and especially of a lack of redundancy. Areas with few roads (and no interstate highways), few reliable water sources, and a low number of high-voltage generator step-up transformers per capita seem likely to be particularly problematic. As implied above, low-population-density areas will be most vulnerable, and will recover last. Cities will recover first.
    5. To return to the topic of emergent behavior: a healthier and more resilient, ideally even an antifragile society, must be built up from healthier priorities and actions on the part of its individual members. My critiques above notwithstanding, “Lightning Fall” is ultimately a salutary attempt to encourage individual preparation for otherwise poorly-managed risks to civilization stemming from exogenous violence. If enough people think this through and prepare properly – which as I have argued, usually means preparing in, with, and for your existing neighborhood – the recovery time may be reduced by a full order of magnitude. The hardest thing will not be procuring and installing hardware or even acquiring unfamiliar skills; it will be inculcating the proper attitude and values in the population to keep too many people from ending up dying alone in worthless hideouts, when they could have been thriving and helping others to thrive in large communities. We need some engineers, but we really need some evangelists.
    6. I must also mention a “stretch objective,” as we say in Corporate America – in this case, a deliberate tendency to move toward the sound of gunfire, metaphorical or otherwise; to prepare not only for your immediate vicinity, but for the eventuality of large, nearby refugee movements, and perhaps even to work in a nearby community that has obvious vulnerabilities. This may mean volunteering with a Community Emergency Response Team, or in a ministry that works in distressed areas. It will certainly not be something everyone is capable of doing, for many reasons, some more forgivable than others (viewing your fellow human beings primarily as a source of problems would be one of the less forgivable ones).
    7. Finally, liberal patriotism is a complex – and, no doubt, somewhat self-contradictory – phenomenon, but complex with a dash of self-contradiction ≠ nonexistent, and I cannot merely dismiss it as unhelpful. The defense of civilization will cross ideological boundaries in surprising ways. This point could easily be expanded into a full-length post, but I’d certainly want to talk it over with a few of my liberal friends first.

    Posted in Book Notes, Civil Society, Human Behavior, International Affairs, National Security, Predictions, Society, Systems Analysis, Terrorism, USA, War and Peace | No Comments »

    My health care posts from 2013

    Posted by Michael Kennedy on 2nd January 2014

    David has a good idea. I often read the archives of my personal blog to see how I did in forecasting the future or understanding the present. A major concern of mine is, of course, health care and what is happening to it. When I retired from surgery after my own back surgery, I spent a year at Dartmouth Medical School’s center for the study of health care. My purpose was to indulge an old hobby: how do we measure quality in health care? I had served for years on the board of a company called California Medical Review, Inc., the official Medicare review organization for California. For a while I was the chair of the Data Committee. The organization seems to have gone downhill since I was there. First, it changed its name in an attempt to get more business from private sources. Then it lost the Medicare contract.

    Lumetra, which lost a huge Medicare contract last November, is changing its name and its business model as it seeks to replace more than $20 million in lost revenue.
    The San Francisco-based nonprofit’s revenue will shrink this year from $28 million last fiscal year, ending in March 2009, to a projected $4.5 million, CEO Linda Sawyer told the Business Times early this week.
    That’s in large part because it’s no longer a Medicare quality improvement contractor, formerly its main line of work. And in fact, the 25-year-old company’s revenue has been plummeting since fiscal 2007, when it hit $47 million.

    I see no sign that it is involved with Obamacare, which is being run from Washington, along with a state organization that seems no better run than the parent organization.

    Beginning Jan. 1, 2015, the Affordable Care Act no longer will provide federal grants to fund state health exchanges. In addition, California law prohibits using the state’s general fund to pay for the exchange.

    Anyway, for what it is worth, here are the links to the 2013 health posts.

    The Lost Boys

    Alternatives to Obamacare.

    Why the Obamacare Site Isn’t Working.

    Where Healthcare May be Going.

    Conservatives Invented the Mandate; say the Democrats.

    A Critical Insight.

    A Rolling Catastrophe.

    Why Health Care is in Trouble.

    Where Do We Go Now?

    Building the Airplane During Takeoff.

    Posted in Blogging, Current Events, Health Care, Medicine, Obama, Politics, Systems Analysis | 17 Comments »

    Interesting Post

    Posted by David Foster on 5th December 2013

    Bruce Webster writes about the parallels (and differences) between the design of legislation and the design of software systems.

    (via a thread at Bookworm)

    Posted in Health Care, Law, Political Philosophy, Politics, Systems Analysis, Tech | 3 Comments »

    Building the airplane during takeoff.

    Posted by Michael Kennedy on 19th November 2013

    UPDATE: The Wall Street Journal on how to fix the Obamacare crisis.

    What can be done is Congress creating a new option in the form of a national health insurance charter under which insurers could design new low-cost policies free of mandated benefits imposed by ObamaCare and the 50 states that many of those losing their individual policies today surely would find attractive.

    What’s the first thing the new nationally chartered insurers would do? Rush out cheap, high-deductible policies, allaying some of the resentment that the ObamaCare mandate provokes among the young, healthy and footloose affluent.

    These folks could buy the minimalist coverage that (for various reasons) makes sense for them. They wouldn’t be forced to buy excessive coverage they don’t need to subsidize the old and sick.

    Who knows? Maybe Jenkins reads this blog. It’s so obvious that the solution should be apparent even to Democrats.

    We are now learning that a large share of the Obamacare structure is still unbuilt. This is not the website but the guts of the system.

    The revelation came out of questioning of Mr. Chao by Rep. Cory Gardner (R., Colo.). Gardner was trying to figure out how much of the IT infrastructure around the federal insurance exchange had been completed. “Well, how much do we have to build today, still? What do we need to build? 50 percent? 40 percent? 30 percent?” Chao replied, “I think it’s just an approximation—we’re probably sitting between 60 and 70 percent because we still have to build…”

    Gardner replied, incredulously, “Wait, 60 or 70 percent that needs to be built, still?” Chao did not contradict Gardner, adding, “because we still have to build the payment systems to make payments to insurers in January.”

    This is the chief IT guy for CMS.

    If the system that pays the insurance companies has not yet been written, how can anybody sign up?

    Gardner, a fourth time: “But the entire system is 60 to 70 percent away from being complete.” Chao: “There’s the back office systems, the accounting systems, the payment systems…they still need to be done.”

    Gardner asked a fifth time: “Of those 60 to 70 percent of systems that are still being built, how are they going to be tested?”

    The answer: the same way the rest was tested.

    Read the rest of this entry »

    Posted in Big Government, Health Care, Medicine, Obama, Politics, Systems Analysis | 8 Comments »

    Musings on Tyler’s Technological Thoughts

    Posted by David Foster on 18th November 2013

    Tyler Cowen, in his recent book Average Is Over, argues that computer technology is creating a sharp economic and class distinction between people who know how to effectively use these “genius machines” (a term he uses over and over) and those who don’t, and is also increasing inequality in other ways. Isegoria recently excerpted some of Tyler’s comments on this thesis from a recent New Yorker article.

    I read the book a couple of months ago, and although it’s worth reading and is occasionally thought-provoking, I think much of what Tyler has to say is wrong-headed. In the New Yorker article, for example, he says:

    The first (reason why increased inequality is here to stay) is just measurement of worker value. We’re doing a lot to measure what workers are contributing to businesses, and, when you do that, very often you end up paying some people less and other people more.

    The second is automation — especially in terms of smart software. Today’s workplaces are often more complicated than, say, a factory for General Motors was in 1962. They require higher skills. People who have those skills are very often doing extremely well, but a lot of people don’t have them, and that increases inequality.

    And the third point is globalization. There’s a lot more unskilled labor in the world, and that creates downward pressure on unskilled labor in the United States. On the global level, inequality is down dramatically — we shouldn’t forget that. But within each country, or almost every country, inequality is up.

    Taking the first point: Businesses and other organizations have been measuring “what workers are contributing” for a long, long time. Consider piecework. Sales commissions. Criteria-based bonuses for regional and division executives. All of these things are very old hat. Indeed, quite a few manufacturers have decided that it is unwise to take the quantitative measurement of performance down to an individual level, in cases where the work is being done by a closely-coupled team.

    It is true that advancing computer technology makes it feasible to measure more dimensions of an individual’s work, but so what? Does the fact that I can measure (say) a call-center operator on 33 different criteria really tell me anything about what he is contributing to the business?

    Anyone with real-life business experience will tell you that it is very, very difficult to create measurement and incentive plans that actually work in ways that are truly beneficial to the business. This is true in sales commission plans, it is true in manufacturing (I talked with one factory manager who said he dropped piecework because it was encouraging workers to risk injury in order to maximize their payoffs), and it is true in executive compensation. Our blogfriend Bill Waddell has frequently written about the ways in which accounting systems can distort decision-making in ultimately unprofitable ways. The design of worthwhile measurement and incentive plans has very little to do with the understanding of computer technology; it has a great deal to do with understanding of human nature and of the deep economic structure of the business.

    Read the rest of this entry »

    Posted in Book Notes, Business, Economics & Finance, Management, Systems Analysis | 14 Comments »

    On Being an IT Project Manager

    Posted by Jay Manifold on 23rd October 2013

    My profession is much in the news at the moment, so I thought I would pass along such insights as I have from my career, mostly from a multibillion-dollar debacle which I and several thousand others worked on for a few years around the turn of the millennium. I will not name my employer, not that anyone with a passing familiarity with me doesn’t know who it is; nor will I name the project, although knowing the employer and the general timeframe will give you that pretty quickly too.
    We spent, I believe, $4 billion, and garnered a total of 4,000 customers over the lifetime of the product – a product aimed not at large organizations likely to spend millions on it, but at consumers and small businesses that would spend thousands, and that amount spread out over a period of several years. From an economic transparency standpoint, therefore, it would have been better to select 4,000 people at random around the country and cut them checks for $1 million apiece. Also much faster. But that wouldn’t have kept me and lots of others employed, learning whatever it is we learn from a colossally failed project.
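    The economics are stark enough to check on the back of an envelope. Here is a minimal sketch in Python using the figures from the paragraph above; the $5,000 lifetime revenue per customer is my own assumption, standing in for the “thousands” each customer would have spent:

```python
# Back-of-the-envelope check of the project economics described above.
total_spend = 4_000_000_000   # ~$4 billion spent on the project
customers = 4_000             # total customers over the product's lifetime

cost_per_customer = total_spend / customers
print(f"Cost per customer acquired: ${cost_per_customer:,.0f}")

# Even a generous assumed lifetime revenue per customer recovers
# only a sliver of the spend.
revenue_per_customer = 5_000  # assumed figure, per "thousands" apiece
recovery = customers * revenue_per_customer / total_spend
print(f"Fraction of spend recovered: {recovery:.3%}")
```

    At $1,000,000 acquired per customer against perhaps $5,000 recovered, the check-cutting alternative wasn’t hyperbole; it was the cheaper option.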
    So, a few things to keep in mind about a certain spectacularly problematic and topical IT effort:

    • Large numbers of reasonably bright and very hard-working people, who have up until that point been creating significant wealth, can unite in a complete flop. Past performance is no guarantee, and all that. Because even reasonably bright, hard-working people can suffer from failures of imagination, tendencies to wishful thinking, and cultural failure in general.
    • Morale has got to be rock-bottom for anybody with any degree of self-awareness working on this thing. My relevant moment was around the end of ’99 when it was announced, with great fanfare, at a large (200+ in attendance) meeting to review progress and next steps, that we had gotten a single order through the system. It had taken various people eight hours to finish the order. As of that date, we were projecting that we would be doing 1,600 orders a day in eight months. To get an idea of our actual peak rate, note the abovementioned cumulative figure of 4,000 over the multi-year lifespan of the project.
    • Root cause analysis is all very well, but there are probably at least three or four fundamental problems, any one of which would have crippled the effort. As you may infer from the previous bullet point, back-office systems was one of them on that project. Others which were equally problematic included exposure to the software upgrade schedule of an irreplaceable vendor who was not at all beholden to us to produce anything by any particular date, and physical access to certain of our competitors’ facilities, which they were legally required to allow us into exactly two (2) days per year. See also “cultural failure,” above; most of us were residing and working in what is one of the most livable cities in the world in many ways, but Silicon Valley it ain’t.
    • Not to overlook the obvious, there is a significant danger that the well-advertised difficulties of the website in question will become a smokescreen for the fundamental contradictions of the legislation itself. The overall program cannot work unless large numbers of people act in a counter-incentived (possibly not a word, but I’m groping for something analogous to “counterintuitive”) fashion which might politely be termed “selfless” – and do so in the near future. What we seem likely to hear, however, is that it would have worked if only certain IT architectural decisions had been better made.

    This thing would be a case study for the next couple of decades if it weren’t going to be overshadowed by physically calamitous events, which I frankly expect. In another decade, Gen-X managers and Millennial line workers, inspired by Boomers, all of them much better at things than they are now, “will be in a position to guide the nation, and perhaps the world, across several painful thresholds,” to quote a relevant passage from Strauss and Howe. But getting there is going to be a matter of selection pressures, with plenty of casualties. The day will come when we long for a challenge as easy as reorganizing health care with a deadline a few weeks away.

    Posted in Big Government, Book Notes, Commiserations, Current Events, Customer Service, Health Care, Internet, Law, Medicine, Personal Narrative, Politics, Predictions, Systems Analysis, Tech, USA | 6 Comments »

    Upcoming talk at the Chicago Council on Global Affairs: STRATEGY: FROM THE WAR ROOM TO THE BOARD ROOM Sir Lawrence Freedman

    Posted by onparkstreet on 12th October 2013

    STRATEGY: FROM THE WAR ROOM TO
    THE BOARD ROOM

    Sir Lawrence Freedman, Professor of War Studies, and Vice-Principal, King’s College London

    What do modern military and corporate strategy have in common with Achilles, Sun Tzu, and primates? The answer is fluidity, flexibility, and pure unpredictability. Every day we make decisions that are built on our theory of what will give us the outcome we want. Sir Lawrence Freedman proposes that throughout history strategy has very rarely gone as planned, and that constant evaluation is necessary to achieve success—even today. Join The Chicago Council for a centuries-spanning discussion explaining how the world’s greatest minds navigate toward success.

    For interested parties. Sir Lawrence Freedman has quite a few talks posted on YouTube too. Worth checking out.

    Posted in History, Military Affairs, Systems Analysis | 2 Comments »

    Social-Media Corruption

    Posted by Jonathan on 14th August 2013

    What proportion of all social-media communication is by bots, spammers, people with agendas who misrepresent themselves, or severely dysfunctional people who pass as normal online? I suspect it’s a large proportion.

    There’s not much hard evidence, but every once in a while something like this turns up. I’m guessing it’s the tip of an iceberg. See also this. And who can overlook the partisan trolls who show up on this and other right-of-center blogs before elections. Where do they come from?

    None of this apparently widespread Internet corruption should come as a surprise. Given the low costs and lack of barriers to entry it would be surprising if attempts to game the system were less frequent than they appear to be. Nonetheless it’s prudent to keep in mind that a lot of what appears online is probably fake and certainly misleading.

    (Via.)

    Posted in Business, Human Behavior, Internet, Systems Analysis | 14 Comments »

    “Studies Show” – Widespread Errors in Medical Research

    Posted by Jonathan on 17th June 2013

    Much of what medical researchers conclude in their studies is misleading, exaggerated, or flat-out wrong. So why are doctors—to a striking extent—still drawing upon misinformation in their everyday practice?

    The arguments presented in this article seem like a good if somewhat long presentation of the general problem, and could be applied in many fields besides medicine. (Note that the comments on the article rapidly become an argument about global warming.) The same problems are also seen in the work of bloggers, journalists and “experts” who specialize in popular health, finance, relationship and other topics and have created entire advice industries out of appeals to the authority of often poorly designed studies. The world would be a better place if students of medicine, law and journalism were forced to study basic statistics and experimental design. Anecdote is not necessarily invalid; study results are not necessarily correct and are often wrong or misleading.

    None of this is news, and good researchers understand the problems. However, not all researchers are competent, a few are dishonest and the research funding system and academic careerism unintentionally create incentives that make the problem worse.

    (Thanks to Madhu Dahiya for her thoughtful comments.)

    Posted in Academia, Medicine, Science, Statistics, Systems Analysis, Video | 13 Comments »

    Institutions, Instruments, and the Innovator’s Dilemma

    Posted by T. Greer on 16th June 2013

    I have written several posts that use Carroll Quigley’s “institutional imperative” as a lens for understanding contemporary events. [1] Mr. Quigley suggests that all human organizations fit into one of two types: instruments and institutions. Instruments are those organizations whose role is limited to the function they were designed to perform. (Think NASA in the 1960s, defined by its mission to put a man on the moon, or the NAACP during the same timeframe, instrumental to the civil rights movement.) Institutions, in contrast, are organizations that exist for their own sake; their prime function is their own survival.

    Most institutions start out as instruments, but as with NASA after the end of the Cold War or the NAACP after the victories of the civil rights movement, their instrumental uses are eventually eclipsed. They are then left adrift, in search of a mission that will give new direction to their efforts, or as happens more often, these organizations begin to shift their purpose away from what they do and towards what they are. Organizations often betray their nature when called to defend themselves from outside scrutiny: ‘instruments’ tend to emphasize what their employees or volunteers aim to accomplish; ‘institutions’ tend to emphasize the importance of the heritage they embody or even the number of employees they have.

    Mr. Quigley’s institutional imperative has profound implications for any democratic society – especially a society that hosts as many publicly funded organizations as ours. Jonathan Rauch’s essay “Demosclerosis” is the best introduction to the unsettling consequences that come when public organizations transform from instruments into institutions. [2] While Mr. Rauch does not use the terminology of the Institutional Imperative, his conclusions mesh neatly with it. Describing the history and growth of America’s bureaucratic class, Mr. Rauch identifies its greatest failing: a bureaucracy, once created, is hard to get rid of. To accomplish whatever mission it was originally tasked with, a bureaucracy must hire people. It must have friends in high places. The number of people who have a professional or economic stake in the organization’s survival grows. No matter what else it may do, it inevitably becomes a publicly sponsored interest group. Any attempt to reduce its influence, power, or budget will be fought with ferocity by the multitude of interests who now depend on it. Even when it becomes clear that an institution is no longer an instrument, the political capital needed to dismantle it is just too high to make the attempt worth a politician’s time or effort. So the size and scope of bureaucracies grow, encumbering the country with an increasing number of regulations it cannot change, employees it does not need, and organizations it cannot get rid of.

    I used to think that the naked self-interest described by Mr. Rauch was the driving force behind the Institutional Imperative. It undoubtedly plays a large role (particularly when public funds are involved), but there are other factors at play. One of the most important of these is what business strategists call Marginal Thinking.
    Read the rest of this entry »

    Posted in Business, Markets and Trading, Politics, Systems Analysis | 16 Comments »

    Top-Down Failure, and the Alternative

    Posted by Jonathan on 6th April 2013 (All posts by )

    Wretchard discusses recent notorious Type II system failures. The Colorado theater killer’s shrink warned the authorities to no avail. The underwear bomber’s father warned the authorities to no avail. The Texas army-base jihadist was under surveillance by the authorities, who failed to stop him. Administrators of the Atlanta public schools rigged the academic testing system for their personal gain at the expense of students and got away with it for years. Wretchard is right to conclude that these failures were caused by hubris, poor institutional design and the natural limitations of bureaucracies. The question is what to do about it.
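
    The “Type II” label comes from statistics: a Type I error is a false alarm, a Type II error is a missed detection. Each of the cases above is a miss. A minimal sketch, where the toy classifier and case labels are mine, not Wretchard’s:

    ```python
    # Hedged sketch: Type I vs. Type II errors in a warning system.
    # "Type II" here means a missed detection -- a real threat the
    # system was warned about but failed to act on.

    def classify(alarm_raised: bool, threat_real: bool) -> str:
        """Classify one screening outcome."""
        if alarm_raised and not threat_real:
            return "Type I (false alarm)"
        if not alarm_raised and threat_real:
            return "Type II (miss)"
        return "correct"

    # Each case above follows the same pattern: a warning existed,
    # but the system raised no effective alarm.
    cases = [
        ("theater killer", False, True),
        ("underwear bomber", False, True),
        ("army-base jihadist", False, True),
    ]

    misses = sum(classify(alarm, real) == "Type II (miss)"
                 for _, alarm, real in cases)
    print(misses)  # -> 3
    ```

    The asymmetry matters: a bureaucracy graded on false alarms will tune itself to avoid Type I errors and quietly accumulate Type II failures, which is exactly the pattern in the cases above.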

    The general answer is to encourage the decentralization of important services. If government institutions won’t reform themselves, individuals should develop alternatives outside of those institutions. The underwear bomber’s fellow passengers survived because they didn’t depend on the system; they took the initiative. That’s the right approach in areas as diverse as personal security and education. It’s also the approach most consistent with American cultural and political values. It is not the approach of our political class, whose interests are not aligned with those of most members of the public.

    The Internet is said to route itself around censorship. In the coming years we are going to find out if American culture can route itself around the top-down power grabs of our political class and return to its individualistic roots. Here’s hoping.

    Posted in America 3.0, Human Behavior, National Security, Political Philosophy, RKBA, Society, Systems Analysis, Terrorism, Tradeoffs | 7 Comments »

    Quote of the Day

    Posted by Jonathan on 25th March 2013 (All posts by )

    The low rate of overt accidents in reliable systems may encourage changes, especially the use of new technology, to decrease the number of low consequence but high frequency failures. These changes may actually create opportunities for new, low frequency but high consequence failures. When new technologies are used to eliminate well understood system failures or to gain high precision performance they often introduce new pathways to large scale, catastrophic failures. Not uncommonly, these new, rare catastrophes have even greater impact than those eliminated by the new technology. These new forms of failure are difficult to see before the fact; attention is paid mostly to the putative beneficial characteristics of the changes. Because these new, high consequence accidents occur at a low rate, multiple system changes may occur before an accident, making it hard to see the contribution of technology to the failure.

    From:

    How Complex Systems Fail (pdf)
    (Being a Short Treatise on the Nature of Failure; How Failure is Evaluated; How Failure is Attributed to Proximate Cause; and the Resulting New Understanding of Patient Safety)
    Richard I. Cook, MD
    Cognitive Technologies Laboratory, University of Chicago

    Posted in Quotations, Systems Analysis | 5 Comments »

    “Gawande’s Kitchen”

    Posted by Jonathan on 24th August 2012 (All posts by )

    An insightful critique:

    But there is a much more important question being ignored by Gawande — How well does The Cheesecake Factory analogy really apply to health care? We can see how similar the kitchen is to an operating room — lots of busy people rushing about in a sterile environment, each concentrated on a task. But what about the rest of the “system?”
     
    At The Cheesecake Factory, the customer is the diner. That’s who orders the service, pays the bill, and comes back again if he is happy. That is who all of the efficient, standardized food preparation is designed to please.
     
    In Gawande’s ideal health care model, however, the customer isn’t the patient, but the third-party payer, be it an insurer or government. Let’s call that entity the TPP. The TPP never enters the kitchen. The TPP has no idea what happens in there, and doesn’t really care as long as the steak is cooked to his satisfaction and the tab is affordable.
     
    In this model, the patient is actually the steak. It is the steak who is processed in the kitchen. It is the steak that is cut and cooked and placed on a platter. The steak doesn’t get a vote. Nobody cares if the steak is happy. The steak doesn’t pay the bill. The steak isn’t coming back again.
     
    So here we are in Dr. Gawande’s kitchen, where you and I are slabs of meat and Chef Gawande will cook us to the specifications of his TPP customers — satisfaction guaranteed.

    Worth reading in full.

    (Via The Right Coast.)

    Posted in Management, Medicine, Politics, Systems Analysis | 3 Comments »

    TSA fail: the end of the road is now visible

    Posted by TM Lutas on 2nd August 2012 (All posts by )

    There was an attack in Saudi Arabia using explosives concealed in the attacker’s lower GI tract. Such explosives cannot be detected by pat-downs, metal detectors, or millimeter-wave machines; detecting them would require much more powerful scanning machines, or a cavity search. Yet no follow-up bombings using this method have occurred, and I’d always wondered why. Now things are becoming clear. Apparently there’s been something of a theological problem: it appears that butt bombs are not permitted due to Islam’s prohibition of sodomy. But that prohibition seems to be loosening.

    It will take years for the theologians to digest this new complication, but once the idea has been let loose, it is clearly foreseeable that some portion of Islamic scholars will hold this position. The consequences for our travel security regime are rather scary. We will have reached the end of the line, because routine x-rays at each flight segment are just not going to happen: the accumulated radiation would cause too many cancers. And cavity searches are simply unreasonable. So where does that leave TSA’s current security strategy?

    Like most of their terror innovations, I expect this one will take some time to organize. It looks like they’ve already put four years into it. It may take them another four before they’ve worked out the theological problems sufficiently to recruit bombers. But then what?

    Posted in Systems Analysis, Terrorism | 19 Comments »

    (A) Quote of the Day

    Posted by Jonathan on 4th July 2012 (All posts by )

    Wretchard:

    But though they may hate the Pax Americana, the Greens probably can’t live without it. Can’t live without the Ipods, the connectivity, the store-bought food, the cafe-bought lattes — all the ugly things made by private industry. And by paring down the redundancies in the system as wasteful and unsightly; by reducing the energy reserves of the system in favor of such fairy schemes as windmills and carbon trading the Greens have made the system far less robust than it could have been. Because they are never going to need the Design Margin. Ever. Until they do.

    Posted in Civil Society, Economics & Finance, Human Behavior, Leftism, Management, Quotations, Systems Analysis, Tradeoffs | 14 Comments »

    Estimating Odds

    Posted by Jonathan on 22nd March 2012 (All posts by )

    From a comment by “Eggplant” at Belmont Club:

    Supposedly the US has war gamed this thing and the prospects look poor. A war game is only as good as the assumptions programmed into it. Can the war game be programmed to consider the possibility that a single Iranian leader has access to an ex-Soviet nuke and is crazed enough to use it?
     
    Of course the answer is “No Way”.
     
    A valid war game would be a Monte Carlo simulation that considered a range of possible scenarios. However the tails of that Gaussian distribution would offer extremely frightening scenarios. The Israelis are in the situation where truly catastrophic scenarios have tiny probability but the expectation value [consequence times probability] is still horrific. However “fortune favors the brave”. Also being the driver of events is almost always better than passively waiting and hoping for a miracle. That last argument means the Israelis will launch an attack and probably before the American election.

    These are important points. The outcomes of simulations, including the results of focus groups used in business and political marketing, may be path-dependent. If they are, the results of any one simulation may be misleading, and it may be tempting to game the starting assumptions in order to nudge the output in the direction you want. It is much better if you can run many simulations using a wide range of inputs. Then you can say something like: We ran 100 simulations using the parameter ranges specified below and found that the results converged on X in 83 percent of the cases. Or: We ran 100 simulations and found no clear pattern in the results as long as Parameter Y was in the range 20-80. And by the way, here are the data. We don’t know the structure of the leaked US simulation of an Israeli attack on Iran and its aftermath.

    It’s also true, as Eggplant points out, that the Israelis have to consider outlier possibilities that may be highly unlikely but would be catastrophic if they came to pass. These are possibilities that might show up only a few times or not at all in the output of a hypothetical 100-run Monte Carlo simulation. But such possibilities must still be taken into account because 1) they are theoretically possible and sufficiently bad that they cannot be allowed to happen under any circumstances and 2) the simulation-based probabilities may be inaccurate due to errors in assumptions.
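
    Eggplant’s “expectation value [consequence times probability]” point can be illustrated with a toy Monte Carlo run. This is a hedged sketch with invented numbers, not a reconstruction of any actual wargame: a 1-percent catastrophic tail dominates the expected loss even though it barely appears in the run-by-run results.

    ```python
    # Hedged sketch: a rare, catastrophic outcome dominating the
    # expected loss. All numbers are invented for illustration.
    import random

    random.seed(42)  # reproducible toy run

    def one_scenario() -> float:
        """Return the loss from one simulated scenario (arbitrary units)."""
        # Assumed toy numbers: a 1% chance of a catastrophic outlier.
        if random.random() < 0.01:
            return 1_000_000.0  # catastrophic tail event
        return 10.0             # routine outcome

    N = 100_000
    losses = [one_scenario() for _ in range(N)]
    expected_loss = sum(losses) / N

    # The tail event contributes roughly 10,000 of the ~10,010 expected
    # loss, even though 99% of individual runs look entirely routine.
    print(round(expected_loss))
    ```

    This is why looking at the typical run of a simulation understates the problem: the decision-relevant number lives almost entirely in the tail.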

    Posted in Human Behavior, Iran, Israel, National Security, Predictions, Quotations, Statistics, Systems Analysis, War and Peace | 16 Comments »

    “Statistical Quality Control Meets the NYPD”

    Posted by Jonathan on 20th March 2012 (All posts by )

    An excellent post by Mark Draughn that reminds us that we get the behavior we incentivize. In this case the NYC government incentivized its police to ignore violent crimes and to make bogus arrests in order to boost their cleared-case stats:

    This is a standard recipe for disaster in quality control — and CompStat is at heart a statistical quality control program. Take a bunch of people doing a job, make them report quality control data, and put pressure on them to produce good numbers. If there is little oversight and lots of pressure, then good numbers is exactly what they’ll give you. Even if they’re not true.

    Worth reading in full.
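
    Draughn’s recipe (self-reported numbers, heavy pressure, little oversight) can be caricatured in a few lines of code. A hedged sketch with invented parameters; `pressure`, `oversight`, and the 0.95 target are mine, not CompStat’s:

    ```python
    # Hedged sketch: a toy model of self-reported quality-control data
    # under pressure and without oversight. All numbers are invented
    # for illustration, not drawn from CompStat.

    def reported_clearance(true_rate: float, pressure: float,
                           oversight: float) -> float:
        """Reported rate drifts toward the desired target as pressure
        rises and oversight falls."""
        target = 0.95  # the number the bosses want to see
        fudge = pressure * (1 - oversight)
        return true_rate + fudge * (target - true_rate)

    true_rate = 0.55
    honest = reported_clearance(true_rate, pressure=0.0, oversight=1.0)
    gamed = reported_clearance(true_rate, pressure=0.9, oversight=0.1)

    print(round(honest, 2), round(gamed, 2))  # -> 0.55 0.87
    ```

    The point of the caricature: nothing about the underlying work changed between the two calls; only the reporting incentives did.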

    Posted in Human Behavior, Law Enforcement, Management, Systems Analysis | 15 Comments »

    Adventure

    Posted by Jonathan on 13th March 2012 (All posts by )

    Turner River Kayak Trip

     

    Many people canoe and kayak in the Florida Everglades’ extensive inland waterways, which are beautiful, full of interesting plants and animals and easily accessible. I couldn’t refuse an invitation to join friends for a day trip down the Turner River in the Big Cypress area. My friends arranged for me to borrow a kayak but its owner backed out of the trip at the last minute. Fortunately, the guy who organized the trip offered me the use of a kayak that he owns.

    Read the rest of this entry »

    Posted in Human Behavior, Personal Narrative, Systems Analysis | 20 Comments »

    Mapping our interdependencies and vulnerabilities [with a glance at Y2K]

    Posted by Charles Cameron on 28th September 2011 (All posts by )

    [ cross-posted from Zenpundit -- mapping, silos, Y2K, 9/11, rumors, wars, Boeing 747s, Diebold voting machines, vulnerabilities, dependencies ]



    The “bug” of Y2K never quite measured up to the 1918 influenza bug in terms of devastating effect — but as TPM Barnett wrote in The Pentagon’s New Map:

    Whether Y2K turned out to be nothing or a complete disaster was less important, research-wise, than the thinking we pursued as we tried to imagine – in advance – what a terrible shock to the system would do to the United States and the world in this day and age.

    1.

    My own personal preoccupations during the run-up to Y2K had to do with cults, militias and terrorists — any one of which might have tried for a spectacle.

    As it turned out, though, Al Qaida’s plan to set off a bomb at Los Angeles International Airport on New Year’s Eve, 1999 was foiled when Ahmed Ressam was arrested attempting to enter the US from Canada — so that aspect of what might have happened during the roll-over was essentially postponed until September 11, 2001. And the leaders of the Ugandan Movement for the Restoration of the Ten Commandments of God, acting on visionary instructions (allegedly) from the Virgin Mary, announced that the end of the world had been postponed from Dec 31 / Jan 1 till March 17 — at which point they burned 500 of their members to death in their locked church. So that apocalyptic possibility, too, was temporarily averted.

    2.

    Don Beck of the National Values Center / The Spiral Dynamics Group, commented to me at one point in the run-up:

    Y2K is like a lightning bolt: when it strikes and lights up the sky, we will see the contours of our social systems.

    – and that quote from Beck, along with Barnett’s observation, pointed strongly to the fact that we don’t have anything remotely resembling a decent global map of interdependencies and vulnerabilities.

    What we have instead is a PERT chart for this or that, Markov diagrams, social network maps, railroad maps and timetables… oodles and oodles of smaller pieces of the puzzle of past, present and future… each with its own symbol system and limited scope. Our mapping, in other words, is territorialized, siloed, and disconnected, while the world system which is integral to our being and survival is connected, indeed, seamlessly interwoven.

    I’ve suggested before now that our mapping needs to pass across the Cartesian divide from the objective to the subjective, from materiel to morale, from the quantitative to the qualitative, and from rumors to wars. It also needs a uniform language or translation service, so that Jay Forrester system dynamic models can “talk” with PERT and Markov and the rest, Bucky Fuller’s World Game included.
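
    To make the mapping idea concrete: even a crude interdependency map can be represented as a directed graph, with a naive failure cascade computed over it. A minimal sketch using only the standard library; the node names and edges are invented for illustration, not taken from any real infrastructure map:

    ```python
    # Hedged sketch: a toy interdependency map as a directed graph,
    # with a naive cascade -- if a node fails, everything that depends
    # on it fails too. Node names and edges are invented.
    from collections import deque

    # edges: dependency -> the systems that rely on it
    depends_on_me = {
        "power grid": ["water pumps", "telecom"],
        "telecom": ["banking", "SCADA links"],
        "water pumps": [],
        "banking": [],
        "SCADA links": ["power grid"],  # note the feedback loop
    }

    def cascade(initial_failure: str) -> set[str]:
        """Return every system knocked out by one initial failure
        (breadth-first traversal of the dependency graph)."""
        failed = {initial_failure}
        queue = deque([initial_failure])
        while queue:
            node = queue.popleft()
            for dependent in depends_on_me.get(node, []):
                if dependent not in failed:
                    failed.add(dependent)
                    queue.append(dependent)
        return failed

    print(sorted(cascade("power grid")))  # everything fails
    print(sorted(cascade("banking")))     # nothing else depends on banking
    ```

    Even this toy reveals the kind of thing a real map would: the SCADA-to-grid feedback loop means a grid failure eventually touches everything, while a banking failure stays contained.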

    I suppose some of all this is ongoing, somewhere behind impenetrable curtains, but I wonder how much.

    3.

    In the meantime, and working from open source materials – the only kind to which I have access – here are two data points we might have noted a little earlier, if we had decent interdependency and vulnerability mapping:

    [image: quo-vulnerabilities.gif]

    Fear-mongering — or significant alerts? I’m not tech savvy enough to know.

    4.

    Tom Barnett’s point about “the thinking we pursued as we tried to imagine – in advance – what a terrible shock to the system would do to the United States and the world in this day and age” still stands.

    Y2K was what first alerted me to the significance of SCADAs.

    Something very like what Y2K might have been seems to be unfolding — but slowly, slowly.

    Are we thinking yet?

    Posted in Predictions, Systems Analysis, Tech | 7 Comments »