"Restore(s) a little sanity into current political debate" - Kenneth Minogue, TLS "Projects a more expansive and optimistic future for Americans than (the analysis of) Huntington" - James R. Kurth, National Interest "One of (the) most important books I have read in recent years" - Lexington Green
Chicago Boyz is a member of the Amazon Associates, B&H Photo, Newsmax and other affiliate programs. Your Amazon and B&H purchases made after clicking those businesses' links, and your clicks on Newsmax links, help to support this blog.
Some Chicago Boyz advertisers may themselves be members of the Amazon Associates and/or other affiliate programs and benefit from any relevant purchases you make after you click on an Amazon or other link on their ad on Chicago Boyz or on their own web sites.
Chicago Boyz occasionally accepts direct paid advertising for goods or services that in the opinion of Chicago Boyz management would benefit the readers of this blog. Please direct any inquiries to
Chicago Boyz is a registered trademark of Chicago Boyz Media, LLC. All original content on the Chicago Boyz web site is copyright 2001-2017 by Chicago Boyz Media, LLC or the Chicago Boyz contributor who posted it. All rights reserved.
Chinese Premier Li Keqiang has lamented China’s inability to “make ballpoint pens with a smooth writing function.” After five years of research, a state-owned steel company now says it can.
WSJ notes that 80% of the world’s ballpoint pens are made in China…but that thus far, China has not been making all of the pen’s components. Specifically:
The tip of a high-quality ballpoint demands metal work involving high-precision machinery and very hard, ultrathin steel plates. So 90% of pens made in China have imported tips. China’s leaders want “self-sufficiency,” in pens as in semiconductors. Now they claim they’ll have it.
This little story is interesting from at least three angles.
First–as the WSJ story points out, China’s desire to control the entire ballpoint pen supply chain indicates that its leaders still value economic autarky, and that Chinese leaders’ denunciations of President Trump on grounds of his insufficient respect for free trade carry more than a whiff of hypocrisy.
Second–the ballpoint pen example makes the point that the apparent simplicity of a product does not necessarily reflect the complexity or lack thereof involved in manufacturing it. American economic commentators often fail to grasp this point when they assert that America’s future must lie in producing “advanced high-technology products.”
Third–the example should also clarify the point that the highest value in a product supply chain does not necessarily lie in the assembly of the final product. The final product assembly is usually the most visible part of the supply chain, but very often the creation of components that go into that chain involves more complexity and requires more skill than the final assembly process itself. It’s considerably more difficult to make integrated circuits, for example, than to assemble those chips onto circuit boards and to assemble the boards into a plastic or metal case.
Edward Porter Alexander, who was Lee’s artillery commander at Gettysburg, became a railroad president after the war. His experiences in running a major transportation system probably had something to do with the evolution of his thoughts regarding states’ rights:
Well that (state’s rights) was the issue of the war; & as we were defeated that right was surrendered & a limit put on state sovereignty. And the South is now entirely satisfied with that result. And the reason of it is very simple. State sovereignty was doubtless a wise political instution for the condition of this vast country in the last century. But the railroad, and the steamboat & the telegraph began to transform things early in this century & have gradually made what may almost be called a new planet of it… Our political institutions have had to change… Briefly we had the right to fight, but our fight was against what might be called a Darwinian development – or an adaptation to changed & changing conditions – so we need not greatly regret defeat.
I think a lot of the belief in unlimited globalization is implicitly driven by an extension of Alexander’s argument, with the jet plane, the container ship, and the Internet taking the place of the railroad, steamboat, and telegraph.
How far does this extension make sense? If the ability of locomotives to pull trains across the United States in three days meant that full sovereignty for individual states was obsolete, does the ability of jet airplanes to carry passengers and freight anywhere in the world in less than one day similarly imply that full sovereignty for nations is obsolete?
I suspect that most people at this site will not agree with a transportation-based argument for the elimination of national sovereignty. So, what is valid and what is invalid about Alexander’s analysis, and what are the limits for the extension of its geographical scope? Discuss.
In a recent post I discussed the spate of updates that have occurred in my Apple products including a new iOS for my work and home phone, a new iOS for my iPad, a new iOS for my Apple Watch, and a new operating system for my Mac.
Let’s start with the Apple Watch. The Apple Watch is an evolutionary product, and the jury is out on whether it will become a giant part (“move the needle”) of the Apple portfolio. Personally, I find the Apple Watch very useful because I get notifications when big events occur (for instance, I was the first to say “Prince is dead” in a big meeting) or am reminded when texts arrive and I don’t have my phone on me. It is also good for sports-score notifications and tracking workouts. Finally, you can always know if someone is calling you even if the ringer on your phone is off, and you can answer it “Dick Tracy style” on your wrist (if you want to annoy everyone around you). Here is my review of the Apple Watch from 2015 when I bought it.
watchOS 3.0 is OK. The watch seems a bit faster. Apple made it easier to use popular apps like the workout app and incorporated some other improvements here and there. I can’t take advantage of all the watchOS 3.0 features because my older Apple Watch lacks some hardware, like the built-in GPS that comes with the new watch.
Mac OS Sierra
There has been a lot of noise in the press about Apple not updating their core computers and even letting Microsoft steal their thunder with the new Surface tablet. However, Apple deserves immense credit for making their OS upgrades work effectively even on older model machines – for instance, the MacBook that I am writing this blog post on is from 2011 (my friend Brian installed an SSD and more memory, which I documented here).
The most important elements from my perspective are the continued integration of the Mac OS with the iPad and iPhone devices. With this upgrade I now can easily share a single photo stream (which will get its own post since it is so complicated), use Apple Music easily across devices, and use key apps like Messages, Notes, iBooks, Contacts and FaceTime (mostly) seamlessly. Siri also works on the Mac now, which is fine for most people, but I don’t use Siri much so it is irrelevant to me.
Gone too is the iconic firm’s appliances business, which was sold to Chinese firm Haier. This is really a progression of the economic cycle. While folks like President-elect Donald Trump and financial provocateur Peter Schiff lament that Americans just don’t make stuff anymore, at a certain point, advanced economies should outsource physical work to less-advanced countries. It’s not so much a matter of ability as it is financial efficiency.
Does this writer believe that GE should also divest the jet engine business, the power generation business, and the transportation (locomotive) business? All of these businesses make physical things, and make substantial amounts of those physical things in the US.
The idea that manufacturing is devoid of intellectual content and hence unworthy of advanced economies is fallacious and has done serious harm–see my post Faux Manufacturing Nostalgia. Happily, this attitude has turned around substantially since I wrote the linked post…to the point that manufacturing is practically being over-romanticized…but islands of the “who needs it?” view still exist.
GE’s reasoning for divesting Appliance seems to have been centered on a desire to focus the company on business-to-business markets rather than consumer markets, and also, I think, on a perception that there was not sufficient room in the appliance world for product differentiation and a technology edge. “Technology edge,” rightly understood, includes the complexity/difficulty of manufacturing something, not just the intellectual property embedded in the product itself. It certainly did not reflect any conclusion that manufacturing is inherently a low-value function.
It would be silly to argue that a computer programmer in a bank is a “knowledge worker” and a programmer in manufacturing is not. It would be equally silly to argue that a bank branch manager is inherently performing a more highly-skilled job than a shift supervisor in a factory, or that a first-level customer service rep for Amazon is performing a more advanced kind of work than an assembly line worker, or that an operations research expert doing inventory studies for a manufacturing firm is less of a knowledge worker than his equivalent doing inventory studies for Target. But this is implicitly the argument that many of the ‘we don’t need manufacturing here’ crew have been making.
This dismissive attitude toward a vast and complex industry which supports millions of people represents one more example of the constellation of attitudes against which many people rebelled when choosing to vote for Donald Trump.
In my previous post of this series, I remarked that most discussion of the employment effects of robotics/artificial intelligence/etc seems to be lacking in historical perspective…quite a few people seem to believe that the replacement of human labor by machinery is a new thing.
This post will attempt to provide some historical perspective on today’s automation technologies by sketching out some of the past innovations in the mechanization of work, focusing on “robots,” broadly-defined…ie, on technologies which to some degree involve the replacement or augmentation of human mind/eye/hand, rather than those that are primarily concerned with the replacement of human and animal muscular energy…and will discuss some of the political debate that took place on mechanization & jobs in the 1920s through 1940s.
Throughout most of history, the production of yarn for cloth was an extremely labor-intensive process, done with a device called a distaff, almost always employed by women, and requiring many hours per day to generate a little bit of product. (There even exists a medieval miniature of a woman spinning with the distaff while having sex…whether this is a comment on the burdensomeness of the yarn-making process, or a slam at the love-making skills of medieval men, I’m not sure – probably both.) Eventually, probably around 1400-1500 in most places in Europe, the spinning wheel came into use, improving the productivity of yarn-making by a factor estimated at anywhere from 3:1 to 10:1 or more.
Gutenberg’s printing press was invented somewhere around 1440. I haven’t seen any estimates of its effect on labor productivity, compared with the then-prevailing method of hand copying of manuscripts, but surely it was at least 1000 to 1 or more.
The era from 1700-1850 was marked by tremendous increases in the productivity of the textile trades. The flying shuttle and other advances greatly improved the weaving process; this created a bottleneck in the supply of yarn, which was partly addressed by the invention of the Spinning Jenny–a foot-powered device that could improve the yarn production of one person by 5:1 or better. Power spinning and power looms yielded considerable additional productivity improvements.
An especially interesting device was the Jacquard Loom (1802), which used punched cards to direct the weaving of patterned fabrics. In its initial incarnation, the Jacquard was a hand loom: its productivity did not come from the application of mechanical power but rather from the automation of the complex thread-selection operations previously carried out by a “Draw Boy.”
Turning now to woodworking: in 1818, Blanchard’s Copying Lathe automated the production of complex shapes–a prototype was automatically traced and copied. It was originally intended for making gunstocks, but also served in producing lasts for shoemakers, and I believe also chair and table legs.
Another major advancement in the clothing field was the sewing machine. French inventor Barthelemy Thimonnier invented a machine in 1830, but was driven out of the country by enraged tailors and political instability. The first commercially successful machines were invented/marketed by Americans Walter Hunt, Elias Howe, and Isaac Singer, and were in common use by the 1850s.
By the late Victorian period the sewing machine had been hailed as the most useful invention of the century, releasing women from the drudgery of endless hours of sewing by hand. Factories sprang up in almost every country in the world to feed the insatiable demand for the sewing machine; Germany alone had over 300 factories, some working 24 hours a day, producing countless numbers of sewing machines.
The beginnings of data communications could be seen in gold ticker and stock ticker systems created by Edison and others (circa 1870), which relayed prices almost instantaneously and eliminated the jobs of the messenger boys who had previously been the distribution channel for this information. Practical calculating machines also appeared in the 1870s. But the big step forward in mechanized calculation was Hollerith’s punched card system (quite likely inspired in part by the Jacquard), introduced in 1890 and used for the tabulation of that year’s census. These systems were quickly adopted for accounting and record keeping purposes in a whole range of industries and government functions.
Professor Amy Sue Bix, in her book Inventing Ourselves out of Jobs?, describes the fear of technological unemployment as silent movies were replaced by the ‘talkies’. “Through the early 1920s…local theaters had employed live musicians to provide accompaniment for silent pictures. Small houses featured only a pianist or violinist, but glamorous ‘movie places’ engaged full orchestras.” All these jobs were threatened when Warner Brothers introduced its Vitaphone technology, with prerecorded disks synchronized to projectors. “Unlike other big studios, Warner did not operate its own theater chains and so had to convince local owners to screen their productions. Theater managers would be eager to show sound movies, Harry Warner hoped, since they could save the expense of hiring musicians.”
The American Federation of Musicians mounted a major PR campaign in an attempt to convince the public that ‘living music’ was better than ‘canned sound.’ A Music Defense League was established, with membership reaching 3 million…but the ‘talkies’ remained popular, and the AFM had to admit defeat. A lot of musicians did lose their jobs.
Here’s a new factory for making automobile frames, specifically designed to minimize the need for human labor. The CEO of the company that built it actually said, “We set out to build automobile frames without people.”
At the start of the process, rough steel plates are inspected by electronic sensors, automatically pushing aside any that deviate from tolerances. Conveyors take the plates through punching, pressing, assembling, and nailing machines, as well as a machine that can insert 60 rivets simultaneously in each frame. A set of finishing machines then rinse, dry, spray-paint, and cool the frames. Aside from a few men moving frames between conveyor belts, the floor routine of the plant requires almost no hand labor.
And today’s robotics and artificial-intelligence advances go far beyond automating routine manufacturing labor and take over the kind of cognitive functions once thought to be exclusive to human beings. Here, for example, is a new AI-based system that displaces much of the thought-work which has been required of the people operating railway switch and signal installations:
The NX control machine is in effect the “brain” of the system. It automatically selects the best optional route if the preferred route is occupied. It will allow no conflicting routes to be set up. It eliminates individual lever control of each switch and signal.
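The core safety rule the quoted description implies, that no two conflicting routes may ever be set at once, can be sketched in a few lines of code. This is a hypothetical simplification of interlocking logic, not the NX machine's actual relay design: routes are modeled as sets of track segments, and a candidate route may be lined only if it shares no segment with any route already in use.

```python
# Hypothetical sketch of NX-style route selection. A route is a set of
# track segments; it may be lined only if it conflicts with no route
# already set. If the preferred route is blocked, fall back to the next
# candidate; if nothing safe exists, signals stay at stop (return None).

def select_route(candidates, routes_in_use):
    """candidates: routes (frozensets of segment ids), best-preference first.
    routes_in_use: routes already lined."""
    occupied = set().union(*routes_in_use) if routes_in_use else set()
    for route in candidates:
        if not route & occupied:        # no shared segments -> no conflict
            return route
    return None                         # no safe route available

in_use = [frozenset({"S3", "S4"})]
preferred = frozenset({"S1", "S3", "S5"})   # conflicts with in_use on S3
alternate = frozenset({"S1", "S2", "S5"})
print(select_route([preferred, alternate], in_use))  # the alternate route
```

The segment names are invented for illustration; the point is only that "allow no conflicting routes" reduces to a set-intersection check performed before any switch is moved.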
Pretty scary from the standpoint of maintaining anything like full employment, don’t you think?
As smartphones become more powerful and more connected, there are subtle but very powerful phenomena that can go unnoticed. For years I either walked to work or took public transit, but now in the Pacific Northwest I commute by car. Since the surroundings are new, I pay much more attention to what is going on than I used to in Chicago.
In Chicago, there aren’t many opportunities to optimize your travel if you are driving along major roads such as I-290 or the Dan Ryan. Unless you really, really know what you are doing, it is not recommended to get off the highway in many Chicago neighborhoods and follow your mobile navigation blindly. Thus in Chicago, when I was in bad traffic, it pretty much looked like this – a speed of zero, stuck crawling ahead.
The first generation of car navigation tools told you how to get somewhere with the most efficient route, taking standard traffic into account. The new generation of navigation apps, however, has real-time information and continuously re-adjusts the “recommended” route based on traffic, accidents and construction.
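Under the hood, continuous re-routing amounts to re-running a shortest-path computation whenever edge costs (travel times) change. A minimal sketch using Dijkstra's algorithm over a hypothetical road network (the road names and travel times are invented for illustration; real apps use far larger graphs and live traffic feeds):

```python
import heapq

def fastest_route(graph, start, goal):
    """Dijkstra over a dict graph: node -> {neighbor: travel_minutes}.
    Returns (path, total_minutes)."""
    dist, prev = {start: 0}, {}
    pq = [(0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue                      # stale queue entry
        for nbr, minutes in graph[node].items():
            nd = d + minutes
            if nd < dist.get(nbr, float("inf")):
                dist[nbr], prev[nbr] = nd, node
                heapq.heappush(pq, (nd, nbr))
    path, node = [], goal
    while node != start:                  # walk predecessors back to start
        path.append(node)
        node = prev[node]
    return [start] + path[::-1], dist[goal]

# Hypothetical network: highway vs. side streets.
roads = {"home": {"highway": 5, "side": 8},
         "highway": {"work": 10},
         "side": {"work": 9},
         "work": {}}
print(fastest_route(roads, "home", "work"))  # highway route, 15 minutes
roads["highway"]["work"] = 40                # accident slows the highway
print(fastest_route(roads, "home", "work"))  # re-routed via side, 17 minutes
```

The "re-adjustment" the apps perform is essentially the second call: same algorithm, updated travel times.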
Writing in today’s WSJ, Peggy Noonan says: “This year I am seeing something, especially among the young of politics and journalism. They have received most of what they know about political history through screens. They’re college graduates…they’re bright and ambitious, but they have seen the movie and not read the book….They learned through sensation, not through books, which demand something deeper from your brain. Reading forces you to imagine, question, ponder, reflect…Watching a movie about the Cuban Missile Crisis shows you a drama. Reading about it shows you a dilemma.”
The article reminded me of Neal Stephenson’s book and of this post, which I originally ran in late 2007.
My post today is inspired by In the Beginning was the Command Line, by Neal Stephenson, a strange little book that will probably be found in the “computers” section of your local bookstore. While the book does deal with human interfaces to computer systems, its deeper subject is the impact of media and metaphors on thought processes and on work.
Stephenson contrasts the explicit word-based interface with the graphical or sensorial interface. The first (which I’ll call the textual interface) can be found in a basic UNIX system or in an old-style PC DOS system or timesharing terminal. The second (the sensorial interface) can be found in Windows and Mac systems and in their respective application programs.
As a very different example of a sensorial interface, Stephenson uses something he saw at Disney World–a hypothetical stone-by-stone reconstruction of a ruin in the jungles of India. It is supposed to have been built by a local rajah in the sixteenth century, but to have since fallen into disrepair.
The place looks more like what I have just described than any actual building you might find in India. All the stones in the broken walls are weathered as if monsoon rains had been trickling down them for centuries, the paint on the gorgeous murals is flaked and faded just so, and Bengal tigers loll among stumps of broken columns. Where modern repairs have been made to the ancient structure, they’ve been done, not as Disney’s engineers would do them, but as thrifty Indian janitors would–with hunks of bamboo and rust-spotted hunks of rebar.
In one place, you walk along a stone wall and view some panels of art that tell a story.
…a broad jagged crack runs across a panel or two, but the story is still readable: first, primordial chaos leads to a flourishing of many animal species. Next, we see the Tree of Life surrounded by diverse animals…an obvious allusion (or, in showbiz lingo, a tie-in) to the gigantic Tree of Life that dominates the center of Disney’s Animal Kingdom…But it’s rendered in historically correct style and could probably fool anyone who didn’t have a PhD in Indian art history.
The next panel shows a mustachioed H. sapiens chopping down the Tree of Life with a scimitar, and the animals fleeing every which way. The one after that shows the misguided human getting walloped by a tidal wave, part of a latter-day Deluge presumably brought on by his stupidity.
The final panel, then, portrays the Sapling of Life beginning to grow back, but now man has ditched the edged weapon and joined the other animals in standing around to adore and praise it.
Clearly, this exhibit communicates a specific worldview, and it strongly implies that this worldview is consistent with traditional Indian religion and culture. Most viewers will assume the connection without doing further research as to its correctness or lack thereof.
I’d observe that as a general matter, the sensorial interface is less open to challenge than the textual interface. It doesn’t argue–doesn’t present you with a chain of facts and logic that let you sit back and say, “Hey, wait a minute–I’m not so sure about that.” It just sucks you into its own point of view.
I started out as a Windows user and was actually a Windows programmer (using MS Access) for quite a long time. I resisted the siren call of Apple products and stuck with Windows for years and years, for work and for personal use.
Finally, I gave in and bought a MacBook Pro in 2011 which turned out to be a great purchase (and got rid of my Windows Desktop PC). I always had an iPhone for my personal cell phone and when I turned in my work Blackberry (a sad day at the time) for an iPhone, that meant that I had two iPhones. For a while I also used a Mac at work, although I ended up switching back to a Windows laptop because password resets, system upgrades and a lack of compatibility for applications built for Windows made it too much of a pain in the rear. Mac laptops still struggle in the corporate world.
Then over the years I of course bought an iPad and then upgraded that iPad, and an Apple Watch, which I really like (although the jury is mixed on that one). Here is an Apple Watch article and review that I wrote.
Thus I now have five (5) Apple products – a MacBook Pro, an iPad, an Apple Watch, and two iPhones. And now it is time for all the updates… iOS 10 is out now, which means I need to update my iPad and both iPhones. watchOS 3 is also out and I am downloading it right now (downloading the operating system onto the watch, from the iPhone, seems to take a long time). My MacBook Pro will get updated to the new Sierra OS when it comes out on Tuesday, September 20th.
There probably aren’t too many TV series centered around a CNC machine shop…but there’s at least one, and it’s called Titans of CNC. The producer and central figure, Titan Gilroy–yes, that’s his real name–grew up in rough circumstances, spent some time in prison, and eventually learned machine-tool operation and CNC programming. With these skills in hand, he built a pretty substantial business, Titan America, which is focused on precision machining, mainly producing components of products being made by larger companies.
The program is about the challenges involved in the operation of Titan America and a portrait of some of its employees and customers. It is also a passionate argument for the importance of manufacturing in America. Sponsors include Autodesk, IMCO Carbide Tools, Haas Automation and GoEngineer.
The series was made for a cable channel called MATV, which is owned by Lucas Oil Products and is targeted towards car people. It’s available on Amazon streaming, which is where I’ve been watching it.
Because computing power continues to increase exponentially, devices that once were out of reach for the general population are becoming mainstream. I wrote about Netatmo, a device that measures temperature, humidity and sound (indoor and outdoor), here. Thanks to the internet, these devices can also be connected together to build a real-time weather picture of the country, without having to look at a weather forecast.
Recently I saw an article in an MIT journal about indoor air quality which described how cooking eggs aggravated the author’s asthma; he was able to take specific action because he could pinpoint the source of the spike in unclean air. The monitor in question, called Speck, sold for approximately $200, which I thought was a decent price point for me to join the air-quality-monitoring revolution. I am specifically most interested in INDOOR air quality, but I will explain the broader context and then come back to the specific items I am reviewing (basically, you can get official measurements of outdoor air quality in the US from public sources).
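Pinpointing an episode like the egg-cooking one from a stream of sensor readings is essentially spike detection. Here is a minimal, hypothetical sketch (not how Speck actually works internally): flag any reading that is far above the average of the readings just before it.

```python
def find_spikes(readings, window=5, factor=2.0):
    """Flag indexes whose value exceeds `factor` times the mean of the
    previous `window` readings -- a crude baseline-comparison detector."""
    spikes = []
    for i in range(window, len(readings)):
        baseline = sum(readings[i - window:i]) / window
        if readings[i] > factor * baseline:
            spikes.append(i)
    return spikes

# Hypothetical particulate readings (ug/m3) with a cooking spike at index 7.
pm = [9, 10, 11, 10, 9, 10, 11, 48, 30, 12]
print(find_spikes(pm))  # [7]
```

Matching the timestamp of a flagged index against what was happening in the house at that moment is what lets you say “it was the eggs.”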
If you want to slay the mistaken talk about the end of human employment, hold a contest. Come up with labor-demand-boosting ideas that we do not pursue today because we either don’t have enough people or don’t have enough money to do them. Weigh jobs that don’t require much intelligence or education as more valuable than those requiring high education/intelligence. Within a year, I predict, enough entries will be submitted to put the entire world to work multiple times over.
It is a bit embarrassing to think about things we are too poor to do. This makes these jobs invisible to us today. By creating a contest and an artificial market for these ideas, they become visible and we turn from despair at the jobless future to wondering how we can become efficient enough to afford to do all these wonderful things.
Let’s prototype the contest here, among friends (and a few special adversaries and maybe even some enemies), and maybe we can roll it out later on a larger scale. The winner will receive a microscopic amount of fame, and also a virtual certificate, not suitable for framing.
What are the things that we collectively and individually can’t afford–but might be able to afford given higher levels of productivity and national income–that would meaningfully affect well-being and human satisfaction? Define “things” as broadly as you like. Consider both things that could become more affordable due to productivity improvements in a specific industry, and things whose creation might not by itself be meaningfully improvable from a productivity standpoint but which people could better afford given an upward trend in overall productivity and income.
Every day, there are articles and blog posts about how quickly robots are replacing jobs, particularly in manufacturing. These often include assertions along the lines of “robots are replacing human labor so rapidly and so completely that it doesn’t really matter whether the factories are in the US or somewhere else.” There are also many assertions that robotics and artificial intelligence will triumph so completely that we must accept that we will permanently have a huge unemployed population who will need to be paid a “basic income” of some sort from the government.
This May, there were breathless headlines about how Foxconn, which is Apple’s primary contract manufacturer, was replacing 60,000 workers with robots–indeed, in some tellings, had already replaced them. If you google “foxconn 60000 workers”, you will get about 130,000 hits.
The story, however, is false; indeed, it did not even originate with Foxconn but rather with some local Chinese government officials who wanted to promote their area as “innovative.”
There has also been a lot of coverage of robotics at Adidas, which is trying to use automation to improve the labor productivity of shoe-making to the point that it can be done economically in high-wage countries such as Germany. This article on Adidas also cites the Foxconn “60,000 jobs” assertion.
One key pair of numbers is missing from the stories I’ve seen on the Adidas project: the ratio of human workers to shoes produced, with and without the addition of the robotics. You can’t really judge the labor-reducing impact of the project without these numbers. In this Financial Times article, Adidas is quoted as saying, entirely reasonably, that they will need to get further into production with their new factory before developing meaningful productivity numbers. The article also cites Boston Consulting Group as estimating that by “2025 advanced robots will boost productivity by as much as 30 per cent in many industries.” Thirty percent is a very significant number, but it’s a long, long way from a productivity increase that would imply that factory jobs don’t matter, or that we’re going to inevitably have a very large permanently-unemployed population.
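For scale, the BCG figure can be translated into an annual growth rate. Assuming the 30 percent cumulative gain accrues over roughly the nine years from 2016 to 2025 (my assumption; the article gives only the endpoint), the implied annualized productivity growth is modest:

```python
# Back-of-the-envelope: a 30% cumulative productivity gain by 2025,
# assumed here to accrue over the nine years from 2016, implies an
# annualized rate of (1.30)**(1/9) - 1, i.e. roughly 3% per year --
# meaningful, but well within the range of historical productivity trends.
years = 2025 - 2016
annual_rate = 1.30 ** (1 / years) - 1
print(f"{annual_rate:.1%} per year")  # 3.0% per year
```

Which reinforces the point in the text: "as much as 30 per cent" by 2025 is a continuation of familiar trends, not a discontinuity that abolishes factory work.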
There are a lot of very significant innovations taking place in robotics and AI, but the hype level is getting a little out of hand. And it’s important to remember that automation is not a new phenomenon. For example, a CNC (computer numerically controlled) machine tool is a robot, although it may not look like the popular conception of one, and these machines, together with their predecessor NC (numerically controlled) machines, have been common in industry since the 1970s. One thing that articles and blog posts on the topic of robotics/AI/jobs could benefit from is a little historical perspective: do today’s innovations really represent a sharp break upwards in labor productivity, or are they more of a continuation of a long-term trend? And how, if at all, is the effect of these technologies appearing in the productivity statistics?
Automated systems need to be supervised by humans, and not just any humans, as Stanislav Petrov’s story makes clear. Individuals and bureaucracies that themselves behave in a totally robotic fashion cannot be adequate supervisors of the automation. See also my post Blood on the tracks for an additional example.
Posted by Trent Telenko on 10th June 2016
It is amazing the things you find out while writing a book review. In this case, a review of Phillips Payson O’Brien’s How the War Was Won: Air-Sea Power and Allied Victory in World War II. The book is thoroughly revisionist in that it posits that there were no real decisive land battles in WW2. The human and material attrition in those “decisive battles” was so small relative to major combatants’ production rates that losses from them were easily replaced until Anglo-American air-sea superiority — starting in the latter half of 1944 — strangled Germany and Japan. Coming from the conservative side of the historical ledger, I had a lot of objections to O’Brien’s book starting with some really horrid mistakes on WW2 airpower in the Pacific.
However, my independent research on General MacArthur’s Section 22 radar hunters in the Philippines proved one of the corollaries of O’Brien’s thesis – namely, that Imperial Japan was a fell WW2 high-tech foe, punching in a weight class above the Soviet Union. This was fully validated by a digitized microfilm from the Air Force Historical Research Agency (AFHRA) at Maxwell AFB, Alabama, detailing the size, scope and effectiveness of the radar network Imperial Japan deployed in the Philippines.
The composite below, assembled from three scanned images on microfilm reel A7237, is an early December 1944 radar coverage map of the Philippines. It shows 32 separate Imperial Japanese military radar sites that usually had a pair of Japanese radars each (at least 64 radars total), based upon the Japanese deployment patterns found and documented in Section 22 “Current Statements” from January through March 1945 elsewhere in the same reel.
This is an early December 1944 Japanese radar coverage map made by Section 22, GHQ, South West Pacific Area. It was part of the Section 22 monthly report series.
This Section 22-created map — compiled from dozens of 5th Air Force and US Navy aerial electronic reconnaissance missions — showed Japanese radar coverage at various altitudes. It was used by Admiral Halsey’s carrier fleet (see route E – F in the North Eastern Luzon area of the map) to strike both Clark Field and Manila Harbor, and by all American & Australian military services to avoid Japanese radar coverage when striking the final Japanese military reinforcement convoys of the Leyte campaign, “Operation TA”. Read the rest of this entry »
Over at The Lexicans, Bill Brandt posted an item about an 8-part TV series titled ‘American Genius’…it is about a selection of inventors and entrepreneurs who have had a major impact on technology, society, and history. It sounded worthwhile and I’ve watched about half of the episodes–thanks, Bill!…definitely worth watching, but OTOH I think there are a few things in the series that should have been covered a little differently.
Edison vs Tesla is about the AC-vs-DC power wars, and correctly reports on the sleazy fearmongering tactics that Edison used in his unavailing attempt to maintain DC’s dominance. The show referred to George Westinghouse, who was Tesla’s sponsor in this battle, as “sort of a railroad baron,” completely ignoring the fact that Westinghouse was himself a major American inventor. Most people would think of a ‘railroad baron’ as someone who owns or manages railroads, not someone who invented the air brake.
Farnsworth vs Sarnoff is about the battle to dominate the emerging television industry. It was presented as a David-versus-Goliath story–though Goliath was in this case named David (Sarnoff)–individual inventor versus ruthless tycoon. Sarnoff was indeed ruthless, indeed could be fairly referred to as a prototypical crony capitalist…but it would have been interesting to point out that he wasn’t always a Goliath, wasn’t born to that position, but had in fact come to this country as an impoverished Russian Jewish immigrant and had encountered severe and career-threatening anti-Semitism on his path to Goliath-dom.
Space Race is focused on two individuals, the German/American Wernher von Braun and the Soviet rocket designer Sergei Korolev. Korolev was played by an actor who looked a little too young for the role in the period portrayed. More importantly, it should have been mentioned that Korolev had been arrested and sent to the Gulag, where he lost most of his teeth due to the brutal labor-camp conditions. There were psychological scars as well — Boris Chertok, who worked closely with Korolev for years, said that there was only one single time that he saw the man really happy. In a series focused primarily on the leading characters and their conflicts rather than on technical details, these things deserved to be covered.
The program refers to a successful Soviet test in 1957 of a missile with intercontinental range, shortly before the launch of Sputnik. Actually, the test was a failure because the warhead disintegrated on reentry…and reentry, while a critical factor for ICBMs, is not important at all for one-way satellite launches. The American belief that Sputnik meant all of our cities were vulnerable to Soviet missiles was a little premature–but not by much.
I thought Wernher von Braun got off too easily in this program. The show did mention that the V-2 missile was assembled by slave labor in an underground factory adjacent to a concentration camp: the truly horrific nature of V-2 manufacturing (this was possibly the only weapons system ever made that killed more people in its making than in its employment) could have gotten more emphasis, and the evidence is that von Braun was fully aware of what was going on in this place.
I’m also not convinced that von Braun was as absolutely critical to US missile and space programs as the show implies. The program to build the Atlas missile, which was developed in roughly the same time period as Korolev’s R-7, was directed by USAF General Bernard Schriever, with technology expertise provided largely by the newly-formed Ramo-Wooldridge Corporation and by Convair. I see no reason why this team could not also have conducted a Moon program, had they been so chartered.
The show does point out that von Braun, in addition to his technical and management contributions, played an important role in popularizing the ideas of rocketry and space travel…I had been unaware of his work with Disney to this end. So, in addition to being a genuine rocket scientist (and, arguably, a war criminal in at least a moral sense), von Braun was also one of the great PR men of the century.
Even with the omissions and missed opportunities, the series is still very much worth watching.
In broken-windows policing the cops go after guys who jump subway turnstiles and commit other minor crimes. This is because the policing of low-level crimes tends to lead to reductions in serious crimes. Not only are minor criminals disproportionately responsible for felonies as compared to the general population, but the fact that the police are seen not to ignore the small stuff creates a virtuous cycle by deterring other crimes and increasing the public’s confidence in civic authority.
I thought of this issue when I noticed that a sophisticated Java program that I use on my PC has serious bugs that are never corrected. For example, opening an Excel tie-in in the Java program kills all of the open Excel processes on my PC. I’ve complained several times but nothing gets fixed. Meanwhile there are simple apps on my phone that get updated frequently so that annoying little problems disappear over time. The fancy Java software has many more features but which software would I rather use?
Another Chicagoboy adds: The problem is that many companies view software updates as a cost rather than a feature. Software upgrades in response to customer complaints should be a trumpeted feature, because they are a way of convincingly communicating that the company shares its customers’ values about what matters, and therefore that it’s safe for the customers to invest their time in the company’s products as opposed to competing products.
It’s steps like this that move the space program forward. Notice this wasn’t done by NASA or ULA or the ESA. It was done by a private company that didn’t exist 15 years ago. 37 minutes, including the launch, recovery of the 1st stage, and deployment of the Dragon capsule.
BTW, very cool to me that SpaceX did not require the help of a traditional media company for any of this. And the webcast is actually much better than anything traditional media typically produce. In addition, the people in this video are in SpaceX’s Hawthorne, California, facility where these rockets are designed and produced. They designed and built this rocket. And they’re watching it perform in almost real time. How amazing is that?
They are mostly Sanders supporters. And they feel oppressed by the industry that they are in, and especially by the VCs who fund the companies where they work. Here’s the complaint of a 26-year-old software engineer:
“They sell you a dream at startups – the ping-pong, the perks – so they can pull 80 hours out of you. But in reality the venture capitalists control all the capital, all the labor, and all the decisions, so yeah, it feels great protesting one.”
“Tech workers are workers, no matter how much money they make,” said another guy, this one a PhD student at Berkeley.
Now, one’s first instinct when reading this story–at least my first instinct–is to feel contempt for these whiners. Most of them are far better off financially than the average American, even after adjusting for the extremely high costs of living in the Bay Area. And no one forced any of them to work at startups, where the pressures are well-known to be extreme. They could have chosen IT jobs at banks or retailers or manufacturing companies or government agencies in any of a considerable number of cities.
Looked at from a broader perspective, though, the story reminded me of something Peter Drucker wrote almost 50 years ago:
I’ve previously written about the failure of the “Advanced Automation System,” an FAA/IBM effort to create a new-generation system for air traffic control: the story of a software failure. (The post excerpts the thoughts of Robert Britcher, who was deeply involved in the effort and is an excellent writer–very much worth reading.) The AAS project has been called “the greatest debacle in the history of organized work”–there are a lot of contenders for that honor, though, and here’s another one…
I have been considering “disruption”, including what is hype and what is real. I have looked at the cab industry, where it occurred; the electric and gas utility industry, which has proven resilient in its current business model; and retail, which is in the process of being disrupted.
My working theory in these posts is that increasing supply (broadly defined) has been the key to whether “disruption” is truly occurring or not. I don’t know if it will play out that way in the end, but this is a starting point.
I have been interested in the airline industry for decades… in high school for my statistics class I built a model which correlated the profits of United Airlines with the price of oil. As an auditor and consultant I spent hours every week on a plane crossing the country serving utilities. And ever since I have traveled at least ten times a year for business or pleasure. So while I would not call myself an expert on the airline industry, I am certainly an interested observer.
The airline industry famously de-regulated in 1978. From 1978 to 2010 the airline industry added myriad new entrants and saw them fail along with much of the old guard. Wikipedia summarized this era here. In recent years, through bankruptcy and mergers, the US airline industry consolidated into four major carriers – American, United, Delta and Southwest. These four carriers control the vast majority of gates at major cities and effectively operate as an oligopoly. Now these four carriers are in rude health, as you can see in the stock chart below. Their stock prices have increased between 135% and 355% over the last 5 years. As an investor I bought Southwest after 9/11 and held on to it for years as the price languished; unfortunately I exited the stock before they became today’s oligopoly.
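For context, cumulative gains like those translate into hefty annualized rates. Here is a quick back-of-the-envelope conversion using the standard compound-growth identity (the 135% and 355% endpoints are the figures from the stock chart above; the function name is mine):

```python
def cagr(total_return_pct: float, years: float) -> float:
    """Annualized growth rate implied by a cumulative percentage gain."""
    return ((1.0 + total_return_pct / 100.0) ** (1.0 / years) - 1.0) * 100.0

low = cagr(135.0, 5)   # cumulative +135% over 5 years -> about 18.6% per year
high = cagr(355.0, 5)  # cumulative +355% over 5 years -> about 35.4% per year
```

In other words, even the laggard of the four carriers compounded at nearly 19% a year over the period.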
Another contributor to these gains is the collapse in oil prices. During the “peak oil” era, the airlines’ profits were strangled by the high cost of fuel – today they benefit immensely from the commodity price crash. This article describes how lower fuel costs saved them $4.3B in the third quarter of 2015 alone. These lower costs have generally not come through to end users as price decreases – the airlines have banked the money or used it for dividends and capital improvements.
D-Wave Systems, located in British Columbia, builds commercial quantum computers. Its machines store quantum bits as magnetic directions that can be clockwise, counterclockwise, or a superposition of both simultaneously. The math and physics are far beyond me, but the company claims to solve certain sets of optimization problems up to 100,000,000 times faster than classical computers. Customers for its computers, which cost $10 million apiece, include Lockheed Martin, an unnamed intelligence agency (NSA?), Google, JPL and NASA Ames Research Center.
Applications appear to be computationally intensive problems with lots of variables. The solution involves a process called quantum annealing, in which an optimal approach is found by exploring millions of candidate solutions simultaneously to find the most efficient solution path.

I’m reminded of a discussion on the famous double slit experiment, a classic physics experiment that demonstrates photons displaying behaviors of both waves and particles, known as wave-particle duality. Most interesting is that quantum probabilistic behaviors are also observed, in that the experiment functions differently when the particle paths are observed and when they are not. When the photons in the experiment are observed, the probability function collapses and the photons behave like particles. If they are not observed, the photons take many paths through the slits and create a dispersed pattern on the target. That behavior has been described as “spooky”, because the particles seem to know when they are being observed. Weird, I know.

It’s been said that anyone who claims to understand quantum mechanics is lying. But that doesn’t mean we can’t describe its behavior. Richard Feynman explained that at the quantum level, every possible path a photon can take is considered, and the path chosen is a probability function, like a bell curve. As photons are emitted from a source, the most likely path is taken most often, but some photons will take slightly less probable paths, still others even less probable paths, and so on. Quantum annealing seems to be a form of that, where many paths are simultaneously considered until a most probable path emerges, then it is chosen.
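For readers who want a feel for the idea without the quantum mechanics: the closest classical cousin is simulated annealing, which swaps quantum tunneling for thermal randomness. The search accepts occasional uphill moves while the “temperature” is high, so it can escape local minima, then cools until it settles into a deep one. The sketch below is only that classical analogue, not D-Wave’s algorithm; the test function and all the parameters are invented for illustration.

```python
import math
import random

def simulated_annealing(f, x0, steps=30000, t_start=2.0, t_end=1e-3, seed=0):
    """Minimize f starting from x0 using classical simulated annealing."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best_x, best_fx = x, fx
    for i in range(steps):
        # Exponential cooling schedule from t_start down toward t_end.
        t = t_start * (t_end / t_start) ** (i / steps)
        cand = x + rng.gauss(0.0, 0.5)   # propose a nearby candidate
        fc = f(cand)
        # Metropolis rule: always accept improvements; accept worse moves
        # with probability exp(-delta / t), which shrinks as t cools.
        if fc <= fx or rng.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < best_fx:
                best_x, best_fx = x, fx
    return best_x, best_fx

# A bumpy 1-D landscape: global minimum at x = 0 (value 0), with many
# shallow local minima created by the cosine term.
def bumpy(x):
    return x * x + 3.0 * (1.0 - math.cos(3.0 * x))

x, fx = simulated_annealing(bumpy, x0=8.0)
```

A greedy downhill search started at x0=8.0 would get stuck in the nearest dimple; the annealer’s early high-temperature phase lets it hop the ridges and find the deep central basin, which is the behavior quantum annealing is claimed to deliver (via tunneling) on much larger problems.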