Mark Zuckerberg as Political and Social Philosopher

A long essay by the founder of Facebook includes this assertion:

History is the story of how we’ve learned to come together in ever greater numbers — from tribes to cities to nations.

To which Steve Sailer responds:

As we all know, independence and diversity have always been the enemy of progress.

For example, that’s why Thomas Jefferson wrote The Declaration of Dependence, submitting the American colonies to the British Empire.

Similarly, the father of history, Herodotus, wrote to celebrate the mighty Persian Empire’s reduction of the various Greek city-states to a satrapy ruled from Babylon.

Likewise, every year Jews gather to admit that their stiff-neckedness provoked the Roman Empire into, rightfully, smashing the Temple in Jerusalem on the holy day of We-Had-It-Coming.

And, of course, who can forget Shakespeare’s plays, such as Philip II and Admiral-Duke of Medina Sidonia, lauding the Spanish Armada for conquering the impudent English and restoring to Canterbury the One True Faith?

Similarly, Oswald Mosley’s prime ministership (1940-1980) of das englische Reich is justly admired for subordinating England’s traditional piratical turbulence to the greater good of Europe.

Likewise, who can not look at the 49 nations currently united by their adherence to the universalist faith of Islam and not see that submission is the road to peace, prosperity, and progress? If only unity had prevailed at Tours in 732 instead of divisiveness. May that great historical wrong be swiftly rectified in the decades to come!

(links via Isegoria)

Zuckerberg’s assertion that history is the story of “coming together in ever greater numbers”…with the implication that this is inherently a good thing…is quite reminiscent of the views of Edward Porter Alexander, a Confederate general and later a railroad president…as excerpted in my post What are the limits of the Alexander analysis?

Following his initial snarkiness, Steve Sailer goes on to point out that “consolidation is sometimes a good thing, and other times independence or decentralization is a better thing. Getting the scale of control right all depends upon the circumstances. It’s usually a very interesting and complicated question that is the central issue of high statesmanship.”

Thoughts?

Seth Barrett Tillman: Eisenhower (WWII) and MacArthur (Korea): the Limits of Civilian Control

Excerpt:

At the very outset of creating the first integrated Anglo-American command structure in 1942, Eisenhower made it clear that he would not tolerate any diminution of his own authority and responsibility as supreme commander. The British War Office had issued its own directive to General Sir Kenneth Anderson, the British land force commander, which simply repeated the terms of that given to Haig in the Great War, authorising Anderson to appeal to his own government if and when he believed that an order from Eisenhower endangered his army. Such a directive stood in blatant contradiction to the new integrated command structure, whereby Eisenhower was serving as an Allied commander responsible to an Allied authority, the combined chiefs of staff, and thence to the prime minister and president jointly.

[Emphasis in original.]

Read the whole thing.

History Friday — Imperial Japan’s Philippine Radar Network 1944-45

It is amazing the things you find out while writing a book review. In this case, a review of Phillips Payson O’Brien’s How the War Was Won: Air-Sea Power and Allied Victory in World War II. The book is thoroughly revisionist in that it posits that there were no real decisive land battles in WW2. The human and material attrition in those “decisive battles” was so small relative to the major combatants’ production rates that losses from them were easily replaced, until Anglo-American air-sea superiority, starting in the latter half of 1944, strangled Germany and Japan. Coming from the conservative side of the historical ledger, I had a lot of objections to O’Brien’s book, starting with some really horrid mistakes on WW2 airpower in the Pacific.

You can see a pretty good review of the book at this link — How the War Was Won: Air-Sea Power and Allied Victory in World War II, by Phillips Payson O’Brien

However, my independent research on General MacArthur’s Section 22 radar hunters in the Philippines validated one of the corollaries of O’Brien’s thesis: that Imperial Japan was a fell WW2 high-tech foe, punching in a weight class above the Soviet Union. The proof came in a digitized microfilm from the Air Force Historical Research Agency (AFHRA) at Maxwell AFB, Alabama, detailing the size, scope, and effectiveness of the radar network Imperial Japan deployed in the Philippines.

The image below, a Photoshop composite of three scanned images from microfilm reel A7237, is an early December 1944 radar coverage map of the Philippines. It shows 32 separate Imperial Japanese military radar sites, each of which usually hosted a pair of radars (at least 64 radars total), based upon the Japanese deployment patterns found and documented in Section 22 “Current statements” from January through March 1945 elsewhere in the same reel.

This is an early December 1944 Japanese radar coverage map made by Section 22, GHQ, South West Pacific Area. It was part of the Section 22 monthly report series.

This Section 22 map, compiled from dozens of 5th Air Force and US Navy aerial electronic reconnaissance missions, showed Japanese radar coverage at various altitudes. It was used by Admiral Halsey’s carrier fleet (see route E – F in the northeastern Luzon area of the map) to strike both Clark Field and Manila Harbor, and by all American and Australian military services to evade Japanese radar coverage while striking the final Japanese reinforcement convoys of the Leyte campaign, “Operation TA”.

Read more

Socio-Economic Modeling and Behavioral Simulations

In his Foundation series of books, Isaac Asimov imagined a science, which he termed psycho-history, that combined elements of psychology, history, economics, and statistics to predict the behavior of large populations over time under a given set of socio-economic conditions. It’s an intriguing idea. And I have no doubt it is much, much more difficult to do than it sounds, and it doesn’t sound particularly easy to begin with.

Behavioral modeling is currently being used in many of the science and engineering disciplines. Finite element analysis (FEA), for example, is used to model electromagnetic effects, thermal effects, and structural behaviors under varying conditions. The ‘elements’ in FEA are simply building blocks, maybe a tiny cube of aluminum, that are given properties like stiffness, coefficient of thermal expansion, thermal resistivity, electrical resistivity, flexural modulus, tensile strength, mass, etc. Then objects are constructed from these blocks and, under stimulus, they take on macro-scale behaviors as a function of their micro-scale properties. There are a couple of key ideas to keep in mind here, however. The first is that inanimate objects do not exercise free will. The second is that the equations used to derive effects are based on first principles, which is to say basic laws of physics, which are tested and well understood.

A similar approach is used for computational fluid dynamics (CFD), which is used to model the atmosphere for weather prediction, the flow of water over a surface for dam design, or the flow of air over an aircraft model. The power of these models lies in the ability of the user to vary both the model and the input stimulus parameters and then observe the effects. That’s assuming you’ve built your model correctly. That’s the crux of it, isn’t it?
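To make the element idea concrete, here is a minimal sketch in Python of a 1-D, FEA-style thermal model: small elements, each carrying its own material property (thermal conductivity here), assembled into a global system whose solution is the macro-scale temperature profile along a bar. The element count, conductivities, and boundary temperatures are invented illustrative values, not taken from any real design.

```python
# Minimal 1-D finite-element-style heat conduction sketch.
# A bar is built from small elements; each element has its own conductivity.
# Assembling the per-element matrices into a global system and solving it
# yields the macro-scale temperature profile from the micro-scale properties.
import numpy as np

n_elements = 10                 # ten small "building block" elements
length = 1.0                    # total bar length, meters
dx = length / n_elements        # each element's length

# Per-element thermal conductivity (W/m-K): half aluminum-like, half steel-like
k = np.array([200.0] * 5 + [50.0] * 5)

# Assemble the global conductance matrix from 2x2 element matrices
n_nodes = n_elements + 1
K = np.zeros((n_nodes, n_nodes))
for e in range(n_elements):
    ke = (k[e] / dx) * np.array([[1.0, -1.0], [-1.0, 1.0]])
    K[e:e + 2, e:e + 2] += ke

# Boundary conditions: fixed temperatures at the two ends of the bar
T_left, T_right = 100.0, 20.0
F = np.zeros(n_nodes)
K[0, :] = 0.0;  K[0, 0] = 1.0;   F[0] = T_left
K[-1, :] = 0.0; K[-1, -1] = 1.0; F[-1] = T_right

# Solve for the macro-scale behavior: the nodal temperature distribution
T = np.linalg.solve(K, F)
print(np.round(T, 1))
```

Change the per-element conductivities or the boundary temperatures and the macro-scale profile changes accordingly, which is the whole point of this kind of modeling: vary the micro-scale properties or the stimulus and observe the large-scale effect.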

I was listening the other day to a lecture, called the Quantum Origins of Space and Time, on the work of a Swiss team of astrophysicists. They made an interesting prediction based on the modeling they’ve done of the structure of spacetime. In a result sure to disappoint science fiction fans everywhere, they predict that wormholes do not exist. The reason for the prediction is simply that when they allow wormholes to exist at the quantum level, they cannot get a large-scale universe to form over time. When wormholes are disallowed, the same models create de Sitter universes like the one we have.

It occurred to me that it would be interesting to have the tools to run models of societies. Given the state of a society X, what is the economic effect of tax policy Y? More to the point, what is the cumulative effect of birth rate A, distribution of education levels B, distribution of personal debt C, distribution of state tax rates D, federal debt E, total cost to small business types 1-100 in tax and regulations, etc.? This would allow us to test the effects of our current structure of tax, regulation, education, and other policies. Setting up the model would be a gargantuan task. You would need to dedicate the resources of an institute-level organization with expertise across a wide range of disciplines. Were we to succeed in building even a basic functioning model, its usefulness to the larger society would be beyond estimation.
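For flavor, here is a toy sketch, again in Python, of the kind of parameter-sweep experiment such a tool would enable. Every behavioral rule and number in it is an invented placeholder (a flat tax, a fixed savings rate, a made-up growth rule), not a validated economic model; the point is only the workflow: set a policy input, run the model, observe the aggregate outcome.

```python
# Toy "vary a policy input, observe the aggregate effect" sketch.
# Households earn income, pay a flat tax, save a fixed fraction of the rest,
# and aggregate savings feed next year's income growth. All rules and numbers
# are invented placeholders for illustration only.
import numpy as np

def simulate(tax_rate, years=30, n_households=10_000):
    """Return mean household income after `years` under a flat `tax_rate`."""
    # Same seed each run so every policy is tested on the same starting population
    rng = np.random.default_rng(42)
    income = rng.lognormal(mean=10.5, sigma=0.6, size=n_households)
    for _ in range(years):
        after_tax = income * (1.0 - tax_rate)
        savings = 0.1 * after_tax                      # assumed 10% savings rate
        # Assumed growth rule: growth rises with average savings per household
        growth = 0.01 + 0.5 * savings.mean() / income.mean()
        income = income * (1.0 + growth)
    return income.mean()

# Sweep the policy input and observe the modeled aggregate outcome
for tax_rate in (0.20, 0.30, 0.40):
    print(f"tax rate {tax_rate:.0%}: mean income after 30 years "
          f"= ${simulate(tax_rate):,.0f}")
```

A real tool of the kind described above would need far richer behavioral rules, empirically calibrated parameters, and relentless validation, which is exactly why it would take an institute-level effort to build.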

It’s axiomatic that anything powerful can and will be weaponized. It is also completely predictable that the politically powerful would see this as a tool for achieving their agenda. Simply imagine the software and data sets under the control of a partisan governing body. How might they bias the data to skew the output to a desired state? How might they bias the underlying code? Might an enemy state hack the system with the goal of having you adopt damaging policies, doing the work of social destruction at no expense or risk to them?

Is this achievable? I think yes. All or most of the building blocks exist: computational tools, data, statistical mathematics and economic models. We are in the state we were in with regard to computers in the 1960s, before microprocessors. All the building blocks existed as separate entities, but they had not been integrated in a single working unit at the chip level. What’s needed is the vision, funding and expertise to put it all together. This might be a good project for DARPA.

Worthwhile Reading

Content abundance and curation in the media industry

18th-century Scotland had an interesting system for paying for college

Has getting things done in business…hiring new employees, finalizing business-to-business sales deals…become slower?

Rejecting one’s country for aesthetic reasons

Overconfident students major in political science

This should be obvious, but to many people it’s unfortunately not: why the best hire might not have the perfect resume

Interesting thoughts: how debt/equity mix affects the trajectory of oil prices

This writer is pessimistic about pessimism