Stalling Progress in Aviation — It’s Time for a Breakthrough

Six Hour Radius for Commercial Airliners, 1940-1990

This diagram shows the stalling progress in the speed of air travel.

The inner ring, the range of a DC-3 in 1940, was substantially improved upon by the Lockheed Constellation in 1950, and much more so with the Boeing 707 in 1960. That was twenty years. But from 1960 to 1990, only the small outer circle was gained. And in the quarter century since, it has not expanded at all.

Technology has advanced in small things — small in size, not in importance — like electronics. But in big, macroscopic things, the world of “stuff”, it seems that there has been stasis for two generations. In a recent post, I linked to a video where Peter Thiel made this point. Thiel may have overstated his case, but in the case of aviation he certainly appears to be correct. (Incidentally, my copy of Thiel’s new book, Zero to One: Notes on Startups, or How to Build the Future, arrived yesterday.)

One theory is that only defense-related spending is sufficiently large and removed from market considerations to lead to truly massive breakthroughs in technology. This view is espoused by Peter J. Hugill, in his book World Trade Since 1431: Geography, Technology, and Capitalism, a brilliant book which I heartily commend to you. However, I am not convinced that this is true in every case. In the case of aviation, the basic scientific insights exist, so government-financed development may not be necessary to reach the next breakthrough in aircraft performance.

My coauthor Jim Bennett notes:

We may soon see transonic aircraft operating commercially. These will fly just above the speed of sound, where the sonic boom can be minimized by a number of design tricks. These could operate at airspeeds of around 700 knots, compared to the 500-550 knots at which most airliners are operated today. They could go faster, but they are deliberately slowed down to reduce fuel consumption.

According to Jim, true supersonic or hypersonic aircraft “will be limited to transoceanic routes by sonic boom restrictions, or depend on new approaches which have yet to be fully tested.”

While a 20% increase in speed will be nice to have, I am eager to see these massive, disruptive changes in aviation speed — multiples of the present speed, not just incremental increases.

In America 3.0 we predict a breakdown of the regulatory machinery that is stalling technological progress in many areas, including improved aircraft performance. We speculate about what much faster commercial air travel will allow in terms of, for example, locating retirement housing in Cuba and Mexico, with rapid access by air.

Seniors are able to stay at home, both with mechanical assistance and with many people specializing in providing elder care, or move into modularized units easily attached to their adult children’s homes. Retirement communities in Cuba, the Central Highlands of Mexico and the Mexican border zone are becoming popular. Hypersonic air travel, until recently only used by the very wealthy or government officials, is slowly coming down in price, as aerospacelines compete for business, thus making visits back and forth to visit Grandma far easier.

Just as driverless cars will make exurban development feasible, as we describe in America 3.0, routine, affordable supersonic air travel will make remote locations useable for business and housing that are not feasible now.

A world that is half or a third the size it is now, in terms of travel time, opens up opportunities that we cannot even conceive of now.

(The map above is from Prime Movers of Globalization: The History and Impact of Diesel Engines and Gas Turbines by Vaclav Smil.)

54 thoughts on “Stalling Progress in Aviation — It’s Time for a Breakthrough”

  1. The laws of physics in an atmosphere will preclude routine high speed air travel. Drag is not linear with speed; it goes up with the square of the velocity. There are other terms that can be fiddled with, like density, but there is a limit to how high a jet engine can efficiently operate, and it costs fuel to get to higher altitudes.

    I remember as a kid in the mid 1960’s, I would hear AF T-38’s from Webb AFB creating sonic booms several times a day. During the oil embargo in 1973, that suddenly stopped as fuel costs became a large part of operating costs. Now a sonic boom is a rarity. Experience has shown that running between 500-550 mph at about 20000 to 30000 ft is the best compromise between speed and fuel economy. Unless some as-yet-undiscovered law of physics is found, I expect this to hold in the future. SST’s did not die because of the enviro-whacko opposition or immature technology, they died because of economics.
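    The quadratic drag relation described above can be sketched in a few lines; the density, drag coefficient, and reference area below are illustrative assumptions, not figures from the comment:

```python
# Standard aerodynamic drag equation: D = 0.5 * rho * v^2 * Cd * A.
# Doubling speed quadruples drag (at fixed density, Cd, and area).

def drag_force(rho, v, cd, area):
    """Drag force in newtons; rho in kg/m^3, v in m/s, area in m^2."""
    return 0.5 * rho * v**2 * cd * area

# Illustrative, assumed airliner-like numbers:
rho_cruise = 0.4       # kg/m^3, roughly the density near 30,000 ft
cd, area = 0.03, 400   # drag coefficient (dimensionless), reference area (m^2)

d_slow = drag_force(rho_cruise, 250, cd, area)   # roughly Mach 0.85 cruise
d_fast = drag_force(rho_cruise, 500, cd, area)   # double the speed
print(d_fast / d_slow)  # prints 4.0: drag quadruples
```

    Since power required is drag times speed, fuel burn grows roughly with the cube of speed, which is the economic wall the comment describes.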

  2. I am still sort of blown away that you can fly from Chicago to Grand Cayman in around 3 or 4 hours, pretty cheaply to boot. Cayman is openly discussed as part of my retirement plan, but that is a long time from now. Not much beats getting on a plane when it is zero degrees F and getting off when it is 80.

  3. Growing up in coastal New England one could hear the twice a day boom of the Concorde leaving JFK. I heard that was a quick trip over the Atlantic. Why did they cancel that again?

  4. Transportation is a series of nodes, faster aircraft still need to arrive at an airport; land, taxi, unload, and let the cattle proceed through the lovely security and airport facilities. The benefit of significantly faster aircraft will be muted in that the air transport time is only a single factor in the logistical realities of personal travel.

  5. Growing up in coastal New England one could hear the twice a day boom of the Concorde leaving JFK. I heard that was a quick trip over the Atlantic. Why did they cancel that again?

    The Concorde was very expensive to operate, mostly due to high fuel consumption. The engines were optimized for supersonic flight, which in itself has a very high specific fuel consumption, and this did not improve in subsonic flight. To cover fuel costs, ticket prices were very high, at least $2k for a round-trip ticket even with the British/French government subsidy. The planes were old and becoming high-maintenance items, and after the disaster at de Gaulle Airport it was decided to retire them.

  6. Technology has advanced in small things, electronics, but in big, macroscopic things, the world of “stuff”, it seems that there has been stasis for two generations

    There are sound engineering and economic reasons for this “stasis”.

    That is mainly due to no dramatic material breakthroughs. For example, large steam boilers have not made it past 1050 deg F in 40 years due to the lack of economically viable materials for the tubes. Potomac Power built an experimental coal-fired station in the early 1960’s that used 1100 F temperatures, but it suffered from early boiler tube failures that caused them to back down to 1050 F. We have made progress in getting pressures up to where some newer plants are operating at 4000 psi.

    We could make a large heavy lift rocket like the Saturn 5 even more powerful now with slightly better metals and drastically better electronics, but it would only be an incremental improvement in performance. SpaceX has done this with their Falcon rockets. The basic operating parameters of the LOX/kerosene engines are no different from those built in the 1960’s, but he’s made operations cheaper with modern electronics and manufacturing techniques.

    Nuclear plants are also in a similar situation. We are still using PWR/BWR uranium/plutonium types because the infrastructure for supporting them exists, mainly due to the military sharing the costs of building it. There are some tweaks made for the new plants being built, but an engineer from the 1960’s would still recognize them. Thorium reactors, HTGR pebble bed reactors, and other types have had prototypes built, but no one is willing to build the infrastructure to support scaled up utility size projects. Fusion power plants would be the next big thing, but they always seem to be 20 years out and have been at that point since I graduated from engineering school in 1979.

    In transportation, nothing has proven more economical and efficient than the gasoline/diesel engine system we currently use. Much money has been spent since the 1970’s on developing alternatives, and after over 40 years, nothing comes close; hell, nothing is even 50% as effective. By any metric you care to use, cars nowadays are much, much better than those built 40 years ago. Railroads still exist because nothing has proven cheaper for overland transport of freight.

    The utility electric grid system is still the cheapest way to transmit electricity to your home/business. It does suffer from a lack of investment in new facilities, but that is from the political interference from a flawed attempt at de-regulation, not technical capabilities.

  7. So here’s a question I have posited for some time now –

    Go back 25 years or so. You have $20,000 to buy the best sounding stereo you can find.. speakers, amp, woofers…. all that stuff. It’s 25 years ago.

    Now, do the same thing today. Is your product even slightly better sounding? At all? Oh, the storage, data transmission, size of the product and the like – its physical properties – have changed, but does it sound even slightly better, from speaker to ear — even the tiniest bit? And if it does sound better, is it a “better” that could only really be identified by an audio technician or similar expert?

    And if it has really not improved….. well…. are we done? Is that it? Other than plugging the sound directly into your neurons, have we now completed that particular technology, and is there nowhere else to go? Period?

    What happens when we hit that ceiling in other technologies, or even in art and culture conceivably?

    Such would be a rather unprecedented state of affairs for the human race.

    I have heard very little discussion of this, and when I bring it up, I find great emotional and little factual resistance to the idea. I am interested in any thoughts others might have on this.

  8. On that, you have reached the limits of the human brain to process the sound signals, which is another thing entirely from hitting a material limit of steel or silicon. The only way around that would be to get the ability to implant and integrate a computer processor chip to assist the brain, which given the way hackers get into my PC, I’d be leery of them hacking into my brain via a networked processor implanted into and connected to my brain.

  9. JW – Agreed.

    Which means we are now “done” with that particular technology, and can put it on the shelf for daily use, but direct improvement efforts elsewhere.

    But that’s kind of a weird and unprecedented place to be, and I have noted how many are passionately resistant to the idea when I bring it up.

  10. “Thorium reactors, HTGR pebble bed reactors, and other types have had prototypes built, but no one is willing to build the infrastructure to support scaled up utility size projects.”

    This is an example of where the technology is available but the regulatory and political regimes are holding back progress. Thorium reactors were built in the 1960s but were abandoned because the fissile byproducts couldn’t easily be made into weapons-grade material. Myopic inertia is the reason why safe, proven designs aren’t considered now.

    Actually, the Chinese are working on one that will be complete in a couple years. Like in many other areas, the innovation is happening someplace else.

  11. I wonder if at one point people wondered whether the experience of listening to a full symphony had reached its peak – all the possible musical combinations had been tried, all the possible instruments had been used, all the great composers had lived and died.

  12. Grurray,

    What the Chinese and Indians are doing with the Thorium reactors is not innovation. They are conducting narrowly focused R&D on scaling up the small reactor built at Oak Ridge back in the late 1950’s using data from the short lived conversion of the old Shippingport plant to Thorium back in the late 1970’s. I would expect the first reactor to be in the 300-500 MWth range. It will be operated for at least 5 years to see how it scales up, then a bigger one will be built in the 1000-2000 MWth range which will be the true prototype commercial version, producing 300-600 MWe. It will then be operated for 5 or so years before scaling up even more. They’ll probably build 4 to 5 units of this one. If they decide in the meantime to go this route, then this 5 year period will be when they build the rudimentary infrastructure to support commercialization. This will include the fuel and component manufacturing, training of operations, engineering and maintenance staffs, the training centers, and the regulatory infrastructure.

    The same will hold true for the pebble bed reactors if they decide to go this route too. They won’t get developed and commercialized overnight. You proceed conservatively in scaling up. You never know which little or non-existent problems at small scale will become big ones as you increase the scale. Space prevents me from listing the ones that bit GE, Combustion Engineering, B&W, and Westinghouse in the 1970’s as they quickly went from 400-500 MWe plants to 1000 to 1200 MWe units.

  13. The benefit of significantly faster aircraft will be muted in that the air transport time is only a single factor in the logistical realities of personal travel.

    For long distance travel, exo-atmospheric transportation provides speed and range. We don’t care about atmospheric drag when we’re outside the atmosphere. The issue is the fuel cost and carried fuel weight to get outside the atmosphere. There it seems like very long electromagnetic rails might provide an answer to a high speed, low gee launch.

    Here’s an Electromagnetic Aircraft Launch System throwing an F-18 into the air:
    This is designed to fit on an aircraft carrier deck, on the Ford class and beyond.

    That’s a fairly short rail. Imagine one several miles long that sloped up a hillside or mountainside. Build a dedicated nuclear reactor to provide the juice for the whole facility.

    Maglev track could launch spacecraft into orbit:

    Now all that is well and good for the military and NASA style customers. What I want to see is these technologies applied to commercial transportation.
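    A rail of that scale can be sanity-checked with constant-acceleration kinematics; the exit velocity and g limit below are illustrative assumptions, not figures from the comment:

```python
# Constant-acceleration kinematics: v^2 = 2 * a * L, so L = v^2 / (2 * a).
# This gives the minimum rail length for a given exit velocity and g limit.

G = 9.81  # m/s^2, one standard gravity

def rail_length_m(v_exit, g_limit):
    """Minimum rail length in meters to reach v_exit (m/s) at g_limit gees."""
    return v_exit**2 / (2 * g_limit * G)

# Assumed, illustrative numbers: a 3 g limit for passengers and a
# 1 km/s exit velocity as a first-stage boost.
length = rail_length_m(1000, 3)
print(round(length / 1000, 1))  # about 17 km of track
```

    Even a modest 1 km/s boost at passenger-tolerable acceleration needs a track on the order of ten miles, which is why a mountainside installation comes up.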

  14. Suborbital rocket transport will be expensive. Probably on the order of Concorde expensive, even with the EM rail launcher. That BTW coupled with the dedicated nuke reactor would make enviro-whacko heads explode all over the world……

  15. “That BTW coupled with the dedicated nuke reactor would make enviro-whacko heads explode all over the world……”

    So the Chinese will do it first. Then we will race to catch up and have a second-mover advantage.

  16. Lex, the Chinese have serious quality control problems in all their industries. The rampant corruption makes it hard to get quality products all the time. They cannot even make good copies of Russian jet engines and their heavy forged products are suspect. I have a firm suspicion that the country will experience severe upheaval as the commies running the place are not able to control the rampant corruption within the very heavy bureaucracy. I think this upheaval will take place sometime within the next 5 years, and in fact the current problems in Hong Kong may be the precursor.

    They do not have the experience and technical ability to build a steam catapult, much less an EM one, and the current government will not have the time to get there before it falls.

  17. Joe, joking about the Chinese, more or less. My point is that the USA may have to embark on technical improvements because foreigners are challenging us by building their own, first.

    We shall see about China falling apart. The corruption and shoddy products and fake statistics and all of that have been known for many years yet they keep chugging along.

    We shall also see how the Hong Kong episode plays out. My guess is the Communists will escalate the use of force until they regain control. They cannot compromise with this sort of resistance. They will pay the price in unpopularity and verbal criticism from foreigners rather than risk their regime. So, if the protesters don’t disperse soon, expect a much heavier-handed response. They are more sophisticated than they were at Tiananmen, but they will use lethal force if necessary. They learned from Russia — Perestroika, yes; Glasnost, NEVER.

  18. I haven’t heard anything that makes me think that fusion power will ever be more than a theoretical possibility. If I’m surprised someday, terrific, that will be a “nice to have.”

  19. I thought a version of the 777 had a 12,000 mile range – but 9,300 was all I could find in a quick search. We really need more speed – a breakthrough in supersonic design.

  20. “I have heard very little discussion of this, and when I bring it up, I find great emotional and little factual resistance to the idea. I am interested in any thoughts others might have on this.”

    The next big technological breakthrough, I think, will be in vision. Macular degeneration is a huge aging problem. Present therapy is invasive and expensive. The next step will be stem cells.

    The next step beyond that may be visual prostheses. There is a lot of work going on and some of it is quite novel.

    The composite device has a miniature video camera mounted on the patient’s eyeglasses, which captures images and passes them to a microprocessor that converts the data to an electronic signal. This signal, in turn, is transmitted to an array of electrodes placed on the retinal surface, which transmits the patterned signal to the remaining viable secondary neurons. These neurons (ganglion, bipolar cells, etc.) begin processing the signal and pass it down the optic nerve to the brain for final integration into a visual image. Many groups in different countries have different versions of the device, including brain implants and retinal implants, the latter having epiretinal or subretinal placement. The device furthest along in development is an epiretinal implant sponsored by Second Sight Medical Products (SSMP). Their first-generation device had 16 electrodes with human testing in a Phase 1 clinical trial beginning in 2002. The second-generation device has 60+ electrodes and is currently in Phase 2/3 clinical trial. Increased numbers of electrodes are planned for future versions of the device. Testing of the device’s efficacy is a challenge since patients admitted into the trial have little or no vision.

    Moore’s Law should be in effect once the concept is proven.

  21. MikeK – this being the only venue I can reach you, and you being interested in all things medical, have you read the new Forbes article on Patrick Soon-Shiong? Title of the piece is Medicine’s Manhattan Project

    The long and the short of it is that he has a lot of innovative things but is almost perceived as a huckster – but if just a few of his things come to fruition it will really help the state of medicine.

    Oh, and he is a billionaire – the richest doctor in the world.

  22. My hypothesis is that every specific technology goes through an S-shaped yeast-growth curve. It begins with a take-off phase that may be prolonged.

    Just think about how long inventors took to build a flying machine that worked at all. The technology then goes into an explosive exponential growth phase. That phase has a notional time span of about 60 years. In that time, airplanes went from the Wright Flyer to transatlantic passenger jets and the SR-71. The first 747s flew in 1968 a mere 65 years after Kitty Hawk.

    As Mr. Wooten points out above, further development of aircraft is constrained by physical trade-offs between the various parameters that govern flight. I am sure that there will never be a civilian airliner that can perform like an SR-71. Conditions at that speed and altitude are far too extreme to contemplate subjecting civilians to them.

    Time frames for development can be lengthened or shortened by external factors. Defense spending in the World Wars and Cold War undoubtedly accelerated airplane development, but the same factors probably retarded automobile development.

    The automobile was first instantiated in the mid 1880s, but I would say that the type that showed almost every feature one finds in current production models was the 1957 Cadillac Eldorado, even things like memory seats. That is a 70+ year period.

    The first integrated circuit was invented in 1958. We are now at the so-called 14 nm process. I would expect us to hit the wall around 2020, when companies are projecting the 5 nm process to be available. At some point around then, Moore’s law will come to an end and the integrated circuit will be a mature technology.

    A final reference for this comment will be on energy technology. Clearly, nuclear technology has been strangled in its cradle by the combined efforts of the Soviet Union’s dezinformatsia campaigns and its useful American Idiots — the Watermelons — who want to stop global warming by imposing communism. If they really wanted to control CO2, they would insist on nuclear energy. A few true believers, such as James Hansen, press for nuclear, but the real core of the movement is “¡Socialismo O Muerte!“. Not abundant clean energy through atomic power. So, yes, current developments are using proposals from the 1950s, but given the rancid politics, that is as much as we can expect.

    Under my analysis, expect no meaningful technological changes from windmills (ancient technology), solar cells (1954), or electro-chemical batteries (~1800). In other words the entire “renewable energy” package is DOA. OTOH, fusion is still in the pre-take-off stage. We can see that there is a physical phenomenon that we could harness, but no one has built the Wright Flyer yet. Perhaps ITER will turn the trick. But if it does, that is a development for the twenty-second century, and it won’t get us through the Obama Mal-Administration.
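    The S-shaped growth curve this comment describes is commonly modeled with a logistic function; a minimal numerical sketch, where the ceiling, rate, and midpoint are illustrative assumptions:

```python
import math

def logistic(t, ceiling, rate, midpoint):
    """Logistic S curve: slow take-off, steep middle, saturation at the ceiling."""
    return ceiling / (1 + math.exp(-rate * (t - midpoint)))

# Assumed, illustrative parameters: a technology maturing over roughly
# 60 years, with its steepest growth 30 years after take-off.
for t in (0, 15, 30, 45, 60):
    print(t, round(logistic(t, 100.0, 0.2, 30), 1))
```

    By construction the curve sits at half its ceiling at the midpoint and flattens at both ends, matching the take-off, explosive-growth, and maturity phases described above.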

  23. I really do not see a fusion plant coming in my lifetime. If ITER works out as promised, it will result in a capital-intensive power plant that will make the current fission power plant technology look like a natural gas plant. It will not be cheap to build at all, and since power costs have to include capital recovery, it will not be any cheaper than a fission plant. Fuel costs will be slightly cheaper, but in a fission plant fuel is about 5% or less of annual operating costs.

    As far as space travel goes, we can build a fusion driven ship right now, using the Orion concept, but the test ban treaty would forbid using it even in space.

  24. Mr. Wooten:

    You are quite right about fusion. Consider the ITER plan.

    From the Wikipedia article “ITER”:

    The facility is now expected to finish its construction phase in 2019. It will start commissioning the reactor that same year and initiate plasma experiments in 2020, but there is no plan to begin full deuterium-tritium fusion until 2027—if the ITER team can solve the technical challenges involved. … The first commercial demonstration fusion power plant, named DEMO, is proposed to follow on from the ITER project.

    From the Wikipedia article “DEMO”:

    The following timetable was presented at the IAEA Fusion Energy Conference in 2004 … for the construction and operation of DEMO:

    Conceptual design is to be complete by 2017
    Engineering design is to be complete by 2024
    The first construction phase is to last from 2024 to 2033
    The first phase of operation is to last from 2033 to 2038
    The plant is then to be expanded/updated
    The second phase of operation is to last from 2040 onwards

    OTOH, something better might come along to pull fusion out of its place at the foot of the S curve.

  25. Only a government bureaucracy would come up with a development schedule that lasts 25 years. And that’s probably optimistic for them.

  26. It is the nature of most technologies to reach a design optimum where further improvement in one characteristic only comes at the expense of others. At that point there will tend to be an economically optimal balance, which may shift over time. Improvements past that point tend to be in terms of cost, rather than performance. For airplanes, the 727 was actually faster than most of its successors – because of the fuel tradeoffs.

    How long it takes to reach this point depends on the complexity of the technology and the resources available to work on it. As the global economy grows the resources available to work on new technologies are large enough that most are pretty thoroughly developed within a couple of decades.

    Most of what you handle on a day-to-day basis is technologically mature. Food, clothing, shelter – there have really been only a handful of new “niche” technologies in these areas in my lifetime. Costs have changed, some dramatically, changing what people actually consume, but the basic elements haven’t changed since the development of synthetic fibers.

    So what you get with new technologies as they hit their ceilings will be much the same as what you see for the bulk of what you currently have. Steady, but gradually slowing declines in cost and once costs get low enough, a profusion of design and style elements to allow products to be customized to the needs/preferences of ever smaller market segments.

  27. Costs can matter as much as speed. In 1920, you could have easily shipped freight across the oceans at speeds equal to or better than today’s container ships…turbine-powered liners clipped along at 20 knots or so, and could have been reconfigured for all-freight rather than mostly-passengers. But you couldn’t have achieved the low costs of today’s container freight, both because of ship/fuel/crew cost on the voyage itself, and loading/unloading costs and delays in a pre-containerization era. It was the lowered costs of transport that enabled today’s extensive globalization of supply chains.

  28. Improved telecommunications also had much to do with supply-chain globalization, but once again this was largely a matter of costs. In 1920, telegraph service was available in all commercial centers, but it wasn’t cheap.

  29. “Only a government bureaucracy would come up with a development schedule that lasts 25 years.”

    A good example was the Human Genome Project, which was a classic federal bureaucracy run by James Watson, who had little to do with Crick’s discovery of the double helix.

    The project was proposed and funded by the US government; planning started in 1984, the project got underway in 1990, and was declared complete in 2003. A parallel project was conducted outside of government by the Celera Corporation, or Celera Genomics, which was formally launched in 1998. Most of the government-sponsored sequencing was performed in twenty universities and research centers in the United States, the United Kingdom, Japan, France, Germany, and China.[3]

    If you read the Wiki entry, you would not know that the “parallel project by Celera” was actually the one that deciphered the Human Genome. It was run by Craig Venter.

    In 1998, a similar, privately funded quest was launched by the American researcher Craig Venter, and his firm Celera Genomics. Venter was a scientist at the NIH during the early 1990s when the project was initiated. The $300,000,000 Celera effort was intended to proceed at a faster pace and at a fraction of the cost of the roughly $3 billion publicly funded project.

    Notice the 14 years between the start of the federal project and Venter’s.

    Many believe that the competition proved to be very good for the project, spurring the public groups to modify their strategy in order to accelerate progress. The rivals at UC Santa Cruz initially agreed to pool their data, but the agreement fell apart when Celera refused to deposit its data in the unrestricted public database GenBank. Celera had incorporated the public data into their genome, but forbade the public effort to use Celera data.

    Clinton announced that a patent on genes would not be allowed although he didn’t specify any legislation. He was unhappy with Venter after Celera analyzed the semen stains on the Lewinsky dress.

    Craig Venter is an amazing guy and was not “one of the first to sequence the human genome,” but the first. Clinton fuzzed the role of the federal project the way he fuzzed the discoverer of the AIDS virus when a French team had actually discovered it and then was cheated by an NIH researcher who stole their cultures.

    Years later, the French team was awarded the Nobel Prize and the NIH researcher who stole their culture was finally excluded.

    In 1989, the investigative journalist John Crewdson[23] suggested that Gallo’s lab might have misappropriated a sample of HIV isolated at the Pasteur Institute by Montagnier’s group.[24] Investigations by the National Institutes of Health (NIH) and the HHS ultimately cleared Gallo’s group of any wrongdoing [22][25] and demonstrated that they had numerous isolates of HIV of their own. As part of these investigations, the United States Office of Research Integrity at the National Institutes of Health commissioned Hoffmann–La Roche scientists to analyze archival samples established at the Pasteur Institute and the Laboratory of Tumor Cell Biology (LTCB) of the National Cancer Institute between 1983 and 1985. They concluded that the virus used in Gallo’s lab had come from Montagnier’s lab

    It was not until Clinton was gone that the controversy was finally solved. Government science, as we see with Global Warming, is shot through with conflicts of interest and self dealing.

  30. Robert Schwartz, your hypothesis has been proposed before: that technology diffusion follows the S curve common to population growth.

    However, twenty years ago or so, Clayton Christensen found the S curve to be firm-specific rather than a uniform industry phenomenon.

    If he’s correct, our supposed stagnation may be more of a problem of political structure. This goes along with another theory of diffusion, the Fisher-Pry model, which states that the rate of substitution of the new technology for the current one is proportional to the remaining amount of the old left to be substituted. The resulting curve would look like many smaller S curves interlocking to make up one big S curve.

    There are a lot of other theories on diffusion and also on transition, which is what we appear to be in now.
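    The Fisher-Pry rule as stated above is the logistic differential equation; a small sketch that integrates it numerically, with the initial share and substitution rate chosen purely for illustration:

```python
# Fisher-Pry substitution: the new technology's market share f grows at a
# rate proportional to both its own share and the old share left to capture:
#   df/dt = b * f * (1 - f)
# Integrating this differential equation produces the S curve.

def substitute(f0, b, years, dt=0.01):
    """Euler-integrate the substitution share f from f0 over `years`."""
    f = f0
    for _ in range(int(years / dt)):
        f += b * f * (1 - f) * dt
    return f

# Assumed, illustrative parameters: a 1% initial share and a
# substitution rate b of 0.3 per year.
for yr in (0, 10, 20, 30):
    print(yr, round(substitute(0.01, 0.3, yr), 3))
```

    Growth is slow while the new technology's share is tiny, fastest near 50%, and slow again as the old technology is nearly displaced, which is what stitches the smaller curves into one big S.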

  31. Another notable theory is the sailing ship effect, so named for the clipper ships that were developed after steam ships hit the scene.
    It occurs when new technologies, while still immature, threaten the incumbent enough that they shift the S curve over in a burst of late stage innovation. This could be what’s happening in the fossil fuel industry, among others.

  32. Bill, I read the Patrick Soon-Shiong article (“Medicine’s Manhattan Project”) and there might be something to it, but it is a long way from anything I think is practical. In the 1980s, when I was still in practice, I was very enthusiastic about electronic medical records. I thought computer-aided decision support would be the Next Big Thing in critical care medicine.

    This is that story.

    Now, I try to help medical students deal with clunky and counter-intuitive EMR systems that require you to enter a diagnosis before it will accept any data. When you finally figure out the real diagnosis, it won’t let you delete the first guess.

    Transplanting islet cells, for example, is useless because type I diabetes is an autoimmune disease. They are quickly destroyed unless immunosuppression is used. Insulin is safer.

    Computers in medicine have been way over-hyped.

  33. The numbers you’re using are wrong.

    707*: 5,700 nmi

    747-400*: 7,700 nmi

    777*: 9,400 nmi

    In fact, the jump from the Constellation (4,700 nmi) to the longest-ranged 707 version is smaller than the jump from the 707 to the longest-ranged 747-400 version.

    The 747-400 circle should extend well below Brasília into southern Brazil. The 777 circle, by comparison, should extend beyond Buenos Aires into the center of Argentina.

    And these planes can do the trip faster, with much higher cargo capacity. I don’t see any stagnation here.

    * Maximum take-off weight for all, comparing only the longest-ranged versions of each type.
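    The comparison is easy to check from the figures cited in this comment; a minimal sketch:

```python
# Longest-ranged variants, using the figures cited in the comment (nmi).
ranges_nmi = {
    "Constellation": 4_700,
    "707": 5_700,
    "747-400": 7_700,
    "777": 9_400,
}

piston_to_jet = ranges_nmi["707"] - ranges_nmi["Constellation"]  # 1,000 nmi
jet_to_widebody = ranges_nmi["747-400"] - ranges_nmi["707"]      # 2,000 nmi

# The post-1960 jump is twice the size of the earlier one.
print(piston_to_jet, jet_to_widebody)
```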

  34. I flew a 747SP on a trip to Australia in 1988. It was always a puzzle why that airplane was smaller than the standard 747 and had less range (we had to stop in Tahiti for fuel) but was considered an improvement. At about that time, Qantas had a 747 using naval aviation fuel that could go from Sydney to London nonstop, a flight of about 20 hours. When we came back from Sydney to LA, it was about 16 hours, and, a few years later, even going to Sydney was nonstop at 14 hours in a 747-400.

  35. “The caption says it’s the range after six hours of average cruising speed, not maximum range.
    The increase in max range is squeezing more blood from a stone, or process innovation rather than product innovation.”

    If it’s only flight range after 6 hours of flight time, then it’s meaningless: that’s simply a function of cruise speed. There’s no reason to expect a 747 to be much faster than a 707. The 707 had jet engines, while the Constellation didn’t; hence the greater gain in speed in that earlier interval.

    Airlines figured out long ago that speed wasn’t as important as range and carrying capacity. Hence, they focus on improving those metrics rather than speed.

    It certainly isn’t “process innovation”, however. It’s product innovation. It’s an improvement in the technical capabilities. Process innovation applies to the production methods of the aircraft, which have also seen considerable improvement.

    Either way, I fail to see the technological stagnation here. It’s “stagnation” along only one parameter, speed, and there’s no reason to assume that speed is an important parameter.
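    Since a six-hour radius is just cruise speed times six hours, the point is easy to illustrate. A minimal sketch (the cruise speeds are rough illustrative values, not official figures):

```python
def six_hour_radius_nmi(cruise_speed_knots: float) -> float:
    """Distance covered in six hours at a constant cruise speed.
    Knots are nautical miles per hour; climb, descent, and winds
    are ignored."""
    return cruise_speed_knots * 6.0

# Rough, illustrative cruise speeds in knots -- not official figures.
for aircraft, kt in (("707", 480), ("747", 490), ("transonic concept", 700)):
    print(f"{aircraft}: about {six_hour_radius_nmi(kt):,.0f} nmi in six hours")
```

    On this metric a 747 barely beats a 707, while range and payload, the metrics airlines actually pay for, improved enormously.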

  36. “I flew a 747SP on a trip to Australia in 1988. It was always a puzzle why that airplane was smaller than the standard 747 and had less range (we had to stop in Tahiti for fuel) but was considered an improvement. At about that time, Qantas had a 747 using naval aviation fuel that could go from Sydney to London nonstop, a flight of about 20 hours. When we came back from Sydney to LA, it was about 16 hours, and, a few years later, even going to Sydney was nonstop at 14 hours in a 747-400.”

    The 747SP was an improvement in range over the earlier 747-100 versions; it gained that range at the expense of carrying capacity. The 747-400 was a much later development that provided the same range as the 747SP without sacrificing carrying capacity.

    Of course, by 1988 the 747SP was already outdated and other 747 versions outperformed it. But that’s why the 747SP had a very short production run.

    PS: To understand why speed is not an important performance metric for airlines, look at the dismal commercial failure that was the Concorde. The newest aircraft developed, the 787 and A350, both focus on increasing fuel efficiency, i.e., carrying capacity and range. Airlines focus on metrics that make them money, and speed isn’t one of them.

  37. “The easy way to save time is to abolish the security theatre.”

    We have a winner! However, time lost to security theater is a constant, independent of distance, so it has a much bigger effect on short-haul trips than on long-haul ones.

    “But that’s kind of a weird and unprecedented place to be.”

    Not at all; we’re there in optics too, where diffraction, not the quality of the glass, limits the achievable resolution.

    “That BTW coupled with the dedicated nuke reactor would make enviro-whacko heads explode all over the world…”

    Enough of the benefits; surely there are some costs, too?

    “…there’s no reason to assume that speed is an important parameter.”

    Apparently you never travel.

  38. “That BTW coupled with the dedicated nuke reactor would make enviro-whacko heads explode all over the world…”
    “Enough of the benefits; surely there are some costs, too?”

    Brains all over bystanders’ clothing????

  39. Gurray, thanks for the link. I shall have to read it. OTOH, I think the examples of airplanes and automobiles capture what I was looking at, and both of them have had many industrial exponents across many countries.

    Clearly, my theory only applies to single technologies. Real disruption comes from out-of-the-box applications. I wonder when people first realized that the computer would kill off the typewriter. Even there, there are limits to technology imposed by natural law. We have exploited many of the phenomena discovered by scientists over the last couple of centuries, and new areas may be hard to come by.

    I developed my theory in response to the multitudinous claims that energy supply problems could be solved by more research into, you name it, batteries, solar cells, or wind turbines, and that we needed a new Manhattan Project or a new Apollo Program.

    I think those claims are delusional. The only energy technologies with unexplored areas are fission and fusion. The reason they were not explored is political. What the sun worshipers and wind bags are trying to do is distract the peasants long enough so that they can cement their political power, and leave us with a pre-industrial world. Preferably one where billions of brown people have to die. (Environmentalism is the last socially acceptable form of racism.)

  40. I really think we are circling the subject of technological pessimism about future economic progress. I would like to recycle my comment from a thread discussing that subject on this blog 11 months ago:

    I personally (IANAE) am inclined not to believe the tech-pessimists. I think that there is a decent argument that technology must reach the top part of its S curve. However, I think there are more pressing reasons for sub-par growth, including:

    1. Fiscal incontinence: Enormous deficits with no clear path to bring them under control.

    2. Tax increases. Not only did the tax law adopted last New Year (1/1/13) increase the top marginal tax rates, but Obamacare imposed new taxes on capital income.

    3. Regime uncertainty. Well, the country is being run by an utter incompetent, who is all the more dangerous because he is in the grip of the Dunning–Kruger effect. His signature accomplishments are Obamacare and Dodd-Frank. Obamacare has gone from regulatory nightmare to complete fiasco. Regulators are years behind on Dodd-Frank, and there is no reason to believe it will work any better than Obamacare. Further, the EPA is still honing the axe with which to shut down industries all over the country.

    4. The Fed, which has continued its uncertain foray into expanding its balance sheet like the universe expanding after the Big Bang. The Federal Reserve System is now carrying $50 of assets for each $1 of capital, a ratio that would cause it to put a commercial bank into receivership. Further, its ultra-low interest-rate policy acts as a confiscatory tax on savers. How they can climb down from this situation without a catastrophe is not apparent.

    5. Institutional sclerosis in major chunks of the economy. Health care spending is more than one sixth of GDP; in no other OECD country does it exceed one eighth. Nor is it a function of wealth: by most measures Switzerland has a per-capita GDP equal to or greater than that of the US, but its health care system is only 11.5% of GDP. Bringing the US back into line with the rest of the OECD means either an enormous cutback in health care expenditures or an enormous expansion of growth in other sectors. Neither seems likely. Education, law, and government all show symptoms of institutional problems ($1T of student loans, 40% of law grads unemployed, the bankruptcy of Detroit).

    I am not an economist, but John Taylor is. Here is his comment about slow growth on his blog today [11/18/2013]:

    In my view, deviations from good economic policy have been responsible for the very poor performance over the past decade. Such policy deviations created a boom-bust cycle, and were a significant factor in the crisis and slow recovery.

    Examples include the Fed’s low interest rate policy in 2003-2005 and the lax enforcement of financial regulations—both deviations from rules-based policies that had worked in the past. These were largely responsible for the boom and the high level of risk taking, which ended in the bust in 2007 and 2008.

    Other more recent examples are the hundreds of new complex regulations under Dodd-Frank, the vast government interventions related to the new health care law, the temporary stimulus packages such as cash for clunkers which failed to sustain growth, the exploding federal debt that raises questions about how it will be stopped, and a highly discretionary monetary policy that has generated distortions and uncertainty.

  41. “the Fed’s low interest rate policy in 2003-2005 and the lax enforcement of financial regulations—both deviations from rules-based policies that had worked in the past. These were largely responsible for the boom and the high level of risk taking, which ended in the bust in 2007 and 2008.”

    Not just the 2008 collapse. Read about the 1920s and Benjamin Strong’s policy of low interest rates. The low rates were intended to aid European recovery, but they led to 1929, a year after Strong died with his policies unmodified.

    Strong’s new monetary policies not only stabilized U.S. prices, they encouraged both U.S. and world trade by helping to stabilize European currencies and finances. However, with virtually no inflation, interest rates were low and the U.S. economy and corporate profits surged, fueling the stock market increases of the late 1920s. This worried him, but he also felt he had no choice because the low interest rates were helping Europeans (particularly Great Britain) in their effort to return to the gold standard.

    Economic historian Charles P. Kindleberger states that Strong was one of the few U.S. policymakers interested in the troubled financial affairs of Europe in the 1920s, and that had he not died in 1928, just a year before the Great Depression, he might have been able to maintain stability in the international financial system, although economist Murray Rothbard claimed that it was Strong’s manipulations that caused the Depression in the first place.

    President Coolidge recognized the dangers but believed that the President had no power; in his view, control of the New York stock market should reside with the Governor of New York, Franklin Roosevelt.

  42. Regarding the physical and resulting economic obstacles to higher airspeeds, one area of possible innovation is making in-flight time more productive, so that it becomes less isolated and less wasted. This is an area where information technology has made, and will continue to make, in-flight time less of an issue. Terminal time, and transit time to the terminal, are becoming the bigger issue even in global travel. Being productive while undergoing security screening is probably an inherent contradiction. Can’t we have a real fast track for those who qualify? We could call it Affirmative Screening or some such progressive sounding moniker. Just kidding. Equal persecution before the law is our guiding principle.


  43. “We could call it Affirmative Screening or some such progressive sounding moniker. Just kidding. Equal persecution before the law is our guiding principle.”

    Elites, politicians, and the rich have just such a system, called “private jets.” Some of our favorite politicians, like Mary Landrieu, have been particularly successful in getting taxpayers to pay for them.

Comments are closed.