Worthwhile Reading and Viewing

Much political anger is based on attributing to opponents views that they don’t actually hold, according to this study, summarized and discussed on twitter here.

Paul Graham, who himself writes some interesting essays, says:

No one who writes essays would be surprised by this. When people attack an essay you’ve written, 95% of the time they do it by making up something you didn’t actually say, and then attacking that.

The skill of surgeons varies tremendously, with bottom-quartile surgeons having over 4x as many complications as the best surgeons in the same hospital…so says this study. And surgeons are keenly aware of who is good & who is bad – their rankings of others are very accurate. Summarized and discussed on twitter here, where there is also a reference to the classic study showing a 10X range among programmers, and another study measuring the impact of managers on revenue performance in the game industry.

Some innovation stories from small US manufacturers, and a shop-floor driven tooling innovation at GE Aviation.

Speaking of tools, here’s a study suggesting that using mechanical tools improves language skills.

The limits of narrative, at Quillette.

Ryan Peterson, CEO of the digital freight forwarder Flexport, discovered an AI tool that lets you create art without being an artist, and has been having fun with it.

The Dictatorship of Theory

(Here’s something I wrote several years ago…I was reminded of it by a current post and discussion at Quillette, so thought I’d post it here and link it there. The references aren’t current, but the issues raised remain very real.)

Professor “X” teaches at a prominent private university. Recently, he taught a course on “Topics in Theory and Criticism.” He thought the class was going poorly–it was difficult to get the students to talk about the material–but on the last day of class, he received an ovation.

“I didn’t understand what was going on until a few days later,” he writes (in an e-mail to Critical Mass). “Several students came to see me during office hours to tell me that they had never taken a course quite like this one before. What they had expected was a template-driven, “here’s how we apply ****ist theory to texts” approach, because that is how all of their classes are taught in the English department here…Not a single one of these students had ever read a piece of theory or criticism earlier than the 1960s (with the exception of one who had been asked to read a short excerpt from Marx). They simply had never been asked to do anything other than “imitate without understanding.””

In university humanities departments, theory is increasingly dominant–not theory in the traditional scholarly and scientific sense of a tentative conceptual model, always subject to revision, but theory in the sense of an almost religious doctrine, accepted on the basis of assertion and authority. To quote Professor “X” once again: “Graduate “education” in a humanities discipline like English seems to be primarily about indoctrination and self-replication.”

The experiences of Professor “X” are far from unique. Professor “Y,” chair of an English department, describes his experiences in interviewing for a new job (also in an e-mail to  Critical Mass). “How truthful could I afford to be about my growing dissatisfaction with theory? Should I trump up some ghastly theoretical allegiances, or should I just come clean about my desire to leave theory behind to try to become genuinely learned?” He decided to do the latter, cautiously. In his job talk, he said:

“The writings I’ve published draw on a number of different theoretical perspectives…the overarching goal I’ve set for myself in my scholarship, though is gradually to lessen my reliance on the theories of others…” He sensed at this point that he had lost the support of about three quarters of his audience, and he was not offered the job. Those who did like the statement were older faculty members–one of whom later told Prof “Y” that she hadn’t heard anyone say something like this in  twenty years.

Why is theory (which would often more accurately be called meta-theory) so attractive to so many denizens of university humanities departments? To some extent, the explanation lies in simple intellectual fad-following. But I think there is a deeper reason. Becoming an acolyte of some all-encompassing theory can spare you the effort of learning about anything else. For example: if everything is about (say) power relationships–all literature, all history, all science, even all mathematics–you don’t need to actually learn much about medieval poetry, or about the Second Law of Thermodynamics, or about isolationism in the 1930s. You can look smugly down on those poor drudges who do study such things, while enjoying “that intellectual sweep of comprehension known only to adolescents, psychopaths and college professors” (the phrase is from Andrew Klavan’s unusual novel True Crime).

The dictatorship of theory has reached its greatest extremes in university humanities departments, but is not limited to them. Writing 50 years ago, C S Lewis said the following about his sociologist hero in the novel That Hideous Strength:

“..his education had had the curious effect of making things that he read and wrote more real to him than the things he saw. Statistics about agricultural labourers were the substance: any real ditcher, ploughman, or farmer’s boy, was the shadow…he had a great reluctance, in his work, to ever use such words as “man” or “woman.” He preferred to write about “vocational groups,” “elements,” “classes,” and “populations”: for, in his own way, he believed as firmly as any mystic in the superior reality of the things that are not seen.”

It’s unlikely that the phenomenon Lewis describes has become any less prevalent in the intervening half-century. But in the social sciences, there is at least some tradition of empiricism to offset an uncontrolled swing to pure theory.

The theoretical obsession has even made the transition from academia into the business world, via MBA programs. Many newly-graduated MBAs have in their heads some strategic “paradigm,” into which they will fit any business reality like a Procrustean bed. The 4X4 strategic grid, or the mathematical decision tool, is far more real to them than the actual details of manufacturing and selling a particular product. Like Lewis’s sociologist, they believe in “the superior reality of things not seen.” The attractions of this theory-driven kind of thinking in business are similar to those that make it attractive in university humanities departments. By emphasizing theoretical knowledge, an MBA with little experience can convince himself (and possibly others) that he deserves more authority than those with broad experience and “tacit knowledge” in a particular business.

I’m not arguing that theory is useless in business management, any more than I’m arguing that it’s useless in academia. I am arguing that theory should be balanced by factual knowledge and empiricism, and that it should never be allowed to degenerate into dogma.

There’s an old saying:  when the only tool you have is a hammer, everything looks like a nail. In today’s world, we have an epidemic of people metaphorically trying to use hammers to drive nails, or to use saws to weld metal. Academia bears a grave responsibility for this situation. Too often, professors have acted not like true scholars, but like preachers believing that their salvation lies in getting people to accept the One True Doctrine, entire and unmodified–or like salesmen who have only one product to sell and will do their best to sell it to you, regardless of whether it has anything to do with your actual needs or not.

See also Studying ‘Frankenstein’ Without Reading ‘Frankenstein’.

The French Army in 1940…and the American CDC in 2021

Andre Beaufre, later a general, was in 1940 a young Captain on the French general staff.  He had been selected for this organization a few years earlier, and had originally been very pleased to be in such elevated company…but:

I saw very quickly that our seniors were primarily concerned with forms of drafting. Every memorandum had to be perfect, written in a concise, impersonal style, and conforming to a logical and faultless plan–but so abstract that it had to be read several times before one could find out what it was about…”I have the honour to inform you that I have decided…I envisage…I attach some importance to the fact that…” Actually no one decided more than the barest minimum, and what indeed was decided was pretty trivial.

The consequences of that approach became clear in May 1940.

It is interesting that Picasso had somehow observed the same problem with French military culture that then-captain Beaufre had seen. As the German forces advanced with unexpected speed, Picasso’s friend Matisse was shocked to learn that the enemy had already reached Reims.

“But what about our generals?” asked Matisse. “What are they doing?”

Picasso’s response: “Well, there you have it, my friend. It’s the Ecole des Beaux-Arts.”

…ie, formalists who had learned one set of rules and were not interested in considering deviations from same.

I was reminded of this history by a sequence of posts at twitter.  Joanna Masel, a theoretical biologist, says the CDC contacted her (following an NYT story) about an app she helped develop to notify people (anonymously) about possible covid-19 exposure. Her group  put a very informal preprint on github nearly immediately, and a more formal one on medrxiv soon after. A CDC coauthor was added to shepherd it through MMWR, which is described as “CDC’s primary vehicle for scientific publication of timely, authoritative, and useful public health information and recommendations.”

The preproposal was rejected. Informal feedback was that they liked it but were so backlogged that a peer reviewed journal was likely faster. This initiated 6 months of clearance procedures needed for CDC coauthor to stay on paper.

What CDC staff spend a LOT of time on: rewriting manuscripts with meticulous attention to style guides. Eg, Methods must follow exactly the order they are used in Results, all interpretation must be in Discussion not in Results, etc. to a point truly unimaginable in my field.

and

6 months and endless CDC work hours later, after new CDC edits overclaimed efficacy in ways we deny, at CDC’s urging we removed the CDC coauthor in order to terminate clearance to instead make the deadline for a relevant CDC-run special issue…On top of minor revisions from reviewers, more style guide edits required by CDC journal editors. Eg because style bans reference to an individual as a primary or secondary case, we now refer to individuals who test positive v. infected individuals v. those infected by each. After resubmission in <30 days, rejected months later despite green light from peer reviewers. Bottom line from CDC editor: because our data is now too old, we no longer conform with journal guidelines….

So after the manuscript spent the vast majority of the previous 12 months on CDC desks, not ours, we were rejected by the CDC because the data had become >12 months old.

Doesn’t this sound like a replay of what Andre Beaufre observed?

I saw very quickly that our seniors were primarily concerned with forms of drafting. Every memorandum had to be perfect, written in a concise, impersonal style, and conforming to a logical and faultless plan–but so abstract that it had to be read several times before one could find out what it was about…”I have the honour to inform you that I have decided…I envisage…I attach some importance to the fact that…” Actually no one decided more than the barest minimum, and what indeed was decided was pretty trivial.

See the costs of formalism and credentialism.

1/4/2022:   Updated to correct name of Picasso’s artist friend.

Worthwhile Reading & Viewing

Use of mechanical tools may improve language skills.

The logistics crisis as an introduction to concentration risk.

Thousands of Chinese photographs saved from recycling.

The Two Countercultures.

Green shoots for nuclear energy–in the EU?

Why the ‘woke’ won’t debate.

Teams solve problems faster when they are more cognitively diverse.

The psychological and social costs of the college admissions game and the college treadmill.


Jobs Without Workers

It’s well-known that there are currently a lot of jobs going begging, even as employers offer higher pay; see for example this article. Bernie Sanders offers his explanation: he suggests that the problem lies in the ratio of CEO pay growth to worker pay growth since 1978: “Maybe the problem isn’t a so-called ‘worker shortage.’ Maybe — just maybe — the working class of this country has finally had enough.”

I don’t think Bernie Sanders has a whole lot of experience with this whole ‘working’ thing, so it seems unlikely that he really understands what is going on.

Not very common, I think, for someone to turn down a job because someone at a level stratospherically above him makes a whole lot more money than what he is being offered.   Do people really decide against a job at Wal-Mart because Doug McMillon got paid $20.9 million in 2020?   Or decide not to go workin’ on the CSX railroad because of James Foote’s compensation package of $15.3 million? While people are very concerned with comparative pay levels, they are usually most concerned about the pay of people doing comparable work or those one or two levels above them (or below them) organizationally.

So what are the factors that are actually keeping so many jobs from attracting workers?

One factor, I think, is simple inertia: people who have been out of the workforce for several months during Covid lockdowns may be delaying going back to work, even though they know they will need to eventually. Another factor is the difficulty of child care / education…even when schools are physically open, it’s hard to know how long it will be until they are locked down again, so you can’t count on them for a predictable schedule…and also, there are probably a fair number of people not very enthused about sending their kids back to public school at all, given what they’ve learned about them over the past year.

There are also people who are doing work off-books, and may find that by avoiding FICA and taxes…and any reduction in means-tested benefits…they can do better than they’d do at a full-time job.

Certainly one factor in reluctance to go back to work lies in the unnecessarily unpleasant nature of too many jobs…I’m not talking about jobs that, say, involve working in foundries in high temperatures or working outdoors on commercial fishing boats in winter, but rather retail and customer service jobs that feature extreme micromanagement plus schedules that change from week to week.   See Zeynep Ton’s book The Good Jobs Strategy for more on this point. (my review here)   And the enforcement of political correctness, also, makes quite a few workplaces unpleasant places to be.

And there is a feeling on the part of many people that they can’t get ahead, because of the importance of credentialism and contacts. I’m sure there are a lot of people in low-level positions in banks who would make excellent branch managers, but are not considered for these jobs because they don’t have college degrees…also, branch managers who are not considered for region executive jobs because they don’t have MBAs, and people who do have MBAs who can’t break into investment banking because their MBA is not from a ‘top’ school. The importance of credentialism varies widely by industry and by specific company within an industry, of course, but I suspect many people think it’s more all-encompassing than it actually is, and this is demoralizing to them and creates a ‘why bother?’ mentality.

Finally, there is the problem of skill mismatch: the jobs that are open will often require skills that the potential applicants don’t have, even when unnecessary credential requirements are eliminated. (Although one would think that the trend toward jobs that can be done remotely would mitigate this problem to a considerable extent, by broadening the geography from which people can be drawn.)

What else?