A Sketchy History of America’s Conservative Media Insurgency

This is perpetually a dream project for me. There is absolutely no way I will have the time in the next year or two to devote to it and really do it right, and even after that it remains unlikely. But I’ve given this a great deal of thought over a fairly long period; in some ways this is the topic I have remained most continuously engaged with for the longest span, investing here and there in reading, writing, discussing, and refining little by little.

So I’m going to sketch out the basic outline of how I think about this here, in a blog post, for future reference and hopefully to flesh out into something properly scholarly at some point.

In an important sense, midcentury media was both centralized and wildly competitive and creative. This is because even then, even in the absolute heyday of mass media, “media” was not one single monolithic thing. Magazines and club periodicals and other niche audience publications occupied a space that was energetic as well as ideologically and stylistically diverse.

Mass media, however, was a cartel through and through. From newspapers to book publishers to broadcast television, a very small number of people who all ran in the same social circles set the agenda and got to decide who could and who could not have access to national audiences. The entire niche publishing market combined was a drop in the bucket compared to the enormous audiences enjoyed by the midcentury media cartel.

I would like to pause here for a moment and say that midcentury media is long overdue for a serious demystification. The problem is that right now that entire discursive space is monopolized by right-wing narratives and chock full of bad actors. On that, more in a moment. But the main thing I wish to say is that like any cartel which is able to get away with a lot of bad bullshit, midcentury mass media did, indeed, engage in a lot of bad bullshit, and get away with it.

Contra the aforementioned narratives, I think that most of the bad bullshit was not political in nature, but was just about drumming up interest—strong-arming people to get a story, selectively and dishonestly quoting people to achieve a certain effect, and so on. In other words, by and large the bad shit they got away with was out of indifference to the harm they caused in pursuit of their bottom line, or sloppiness plain and simple. Because as the only game in town, they could afford to be so callous and so sloppy.

But I don’t want to overstate this, as I often did back in the old bloggers vs journalists days. These were also serious institutions, in every sense of the word. It’s just that institutions of that kind are a lot more shot through with bad behavior than we like to think, especially when dealing more immediately with the faults of their replacements.

On the question of bias: I do think they had it, but here I think the leftist critics are more on the money than the rightist ones. The midcentury cartel had a conventionalist bias. It’s just that they hardened into a cartel at a moment when New Deal Liberalism was at its apotheosis, enjoying extremely broad public support. And so it was biased in that direction. Later, the conservative media insurgency would define itself against that very bias. Later still, as that insurgency began winning major victories, the institutions of the old mass media would begin to define themselves against the insurgents—the beginning of a long slide into the partisan press we live with today.

But I’m getting ahead of myself. Again if I had the time and resources I’d flesh out the good and the bad more thoroughly, along with the exact nature of media bias at particular moments. But I don’t, so let’s move on to the next part: the dawn of the conservative media insurgency.

National Review holds a certain pride of place in movement conservatism, but it is and has always been in the niche audience market. The real conservative insurgency began with Talk Radio. As is typical of entrants seeking to compete with big, established players, conservative media entrepreneurs went into a close substitute field that was basically unoccupied, creating a cottage industry overnight.

Fox News was the next big accomplishment of this movement; while cable news was not new by the time it launched in 1996, it was still not as pervasive as it has now become. And with the comparative wealth of channels cable TV offered relative to broadcast, there was more room for entrants in general.

The Internet was naturally the last big piece, and it has been big indeed, though we mustn’t overstate it—the communities created by Talk Radio and Fox News are probably the biggest pieces of the conservative insurgency’s positive contribution to its final victory. The rise of the Internet probably contributed to that victory most directly by bleeding newspapers dry of classifieds revenue, and by suddenly putting every form of media into more direct national, even global competition with one another. In other words, the Internet ruptured a number of contingent business arrangements and market structures that had buoyed the established media’s profitability for a long time.

But of course, alongside these blows to the entrenched players, the Internet also opened the market up to all comers. The excitement around so-called “Web 2.0” centered specifically on the development of tools and platforms that non-technical users could easily employ to publish themselves online. This very blog, obscure as it is, could have millions of viewers tomorrow under a completely conceivable (not to say likely) set of circumstances. The birth of the big social media platforms reduced the number of steps you needed to receive such sudden and massive attention, which is so common these days we coined a phrase for it (“going viral”) for convenience’s sake.

Unlike Talk Radio, but like Fox News, the conservative insurgency was not the lone pioneer in online media. Especially because the blogosphere really lit up under a Republican administration, the leftist and liberal blogosphere was active and energetic. But the conservative one was equally so, precisely because the mainstream press, though flagging, was still dominant, and compared with the genuinely conservative voices conservatives could now find online (and in Talk Radio and on Fox News), the hated Mainstream Media (“MSM”) seemed intolerably left-biased. Out of the conservative blogosphere was birthed many an online conservative publication, Breitbart being among the most significant.

All right. Now let us return to the question of quality, raised ever so fleetingly during the discussion of the midcentury cartel.

I mentioned that a lot of bad bullshit occurred under the midcentury cartel because they could get away with it. Well, the conservative insurgency has engaged in a hundred times as much bad bullshit. They have done so because, in general, it is not the nice guys who are able to unseat a truly entrenched power. There is an important parallel in the history of the partisan polarization of Congress, discussed by political scientist Frances Lee and summarized by Vox here. In that telling of history, Democrats dominated Congress for decades, and in order to overturn that, the Republicans in effect declared all-out war, putting all tactics on the table and steamrolling past established norms.

The conservative insurgency is a phrase I picked precisely because I think it is best seen in such total war terms. Talk Radio played for keeps, and Breitbart definitely does. The more they succeed, the more their enemies sink into the same total war mindset.

Frances Lee thinks that the war of the congressional parties cannot be settled until one dominates again and the other accepts defeat for a while. Here the metaphor becomes hazy, for congressional seats are determined by first-past-the-post voting and other particularities that do not apply to the media landscape. We have lived through times when the press was explicitly partisan before, and that may be our new normal again. A conservative media victory on par with the midcentury cartel is simply impossible under present technological circumstances, and unlikely given the notably unconservative bent of the typical resident of America’s large urban population centers.

Whatever the future holds for American media, the present seems relatively clear to me: at a high level, the big presses are transparently aligned with one major party or the other. This does not, again, make them symmetrical: the conservative media ecosystem is truly a cesspool of the worst sort. And that, again, does not get the liberal press off the hook: its sins range from the laughable mediocrity of Glenn Kessler-style “fact checking” and the Washington Post’s #resistance theater of a new motto, to a growing dependence on pumping up the very tendencies it is ostensibly aligned against.

But I don’t mean to do any serious evaluations here. I just wanted to sketch out that broad history: from midcentury media split between the creative niches and the conventionalist cartel, to the conservative media insurgency spearheaded through Talk Radio and Fox News, to the Internet’s gutting of mainstream media’s business models and empowering of the online conservative media ecosystem, to our current partisan press.

That’s the gist. To be fleshed out and defended (and modified based on research, of course) at some future time. Please do not hesitate to steal it if you intend to do the scholarly legwork (not that I would mind a link or a shout-out should you do so!).

The Storyteller

There was once a boy who wanted to tell stories. These were exactly the sorts of stories you would expect a little boy to tell; they involved video game and cartoon characters that he thought were cool. They spent most of their time getting into fights that the boy would contrive some reason for. Sometimes the boy could be bothered to write these stories on paper, but most of the time he simply acted them out with an ever-growing army of action figures.

As the boy grew, he aspired to write down more of these stories, but was never very dedicated to the task. Every so often he would spend a lot of energy on an idea for a story and then drop it before he was finished. Long after he had become a man—to the extent that one could call him such a thing with a straight face—the number of completed works of fiction he had seen through in his lifetime could be counted on his hands.

However, he had been far from idle as a storyteller. At some point late in his childhood, he began to tell stories about things in the world he had heard about, and read about, and talked about, particularly the things he had talked with his father about. Current events, history, philosophy, and every sort of idea filled the thousands of stories that he told. Though he had never had any trouble writing essays for a grade, he was quite bad at telling a good story at first. But he enjoyed it, and he kept at it. After more than a decade of such storytelling, he began to find that the number of stories he produced that he was proud of was beginning to exceed the number he considered duds.

Yet the fact that he never spent any of that energy telling the kinds of stories that excited him as a boy ate at him. He thought, surely, as I have been telling stories all along, it would be a simple matter to switch over and start doing the other sort. But his skill turned out to be far more domain-dependent than he had thought; writing fiction was hard work, while writing nonfiction came as easily to him as breathing.

Finally he came to terms with the fact that there was no shortcut for rounding out his abilities as a storyteller. If he wanted to tell the stories he had so loved growing up, he would have to start in the same place that he had started with the stories he could now tell so effortlessly. He would have to commit to stories that he could finish, no matter how small. He would have to keep at it, rather than writing one story and then not doing another for months or years. He would have to accept that it would be a long time before he was ready to write stories that he could be proud of on a regular basis.

So, afraid that he might be setting himself up for another false start, he began. He began with the simple story of how he got here, because it’s a story he already knew well and knew how to tell. And that is the story you have just finished reading.

An Homage to the Wimp Turned Badass


The eldest of the superhero icons, Superman and Batman, are badasses through and through. Superman was born stronger and faster than is humanly possible, along with having the ability to fly (among other things). Batman has no powers, but is well-rounded in his mortal badassness—not only is he fit and strong and capable of kicking your ass with ten different types of martial arts, but he’s also smart and mega-rich.

Spiderman came later than these two, and he is emblematic of a different sort of hero. Peter Parker was scrawny, a nerd, and—let’s be honest—something of a loser. How did he gain his powers? He was hanging around some boring science demonstration and pretty much got bit by it. Peter Parker was a wimp, but his powers turned him into a badass.


Spiderman was my hero of choice growing up. It was the ’90s, and comics in general and Marvel in particular were going through a weird time. But Spiderman nevertheless remained true to Stan Lee’s original vision of a superhero who had a lot of problems. Most of them were boring, normal human problems.

Perhaps it is because I grew up on Spiderman, but I am a huge sucker for the wimp to badass genre. After my consumption of Spiderman and American (non-web) comics in general fell, my main source for these stories has been Japanese manga and Korean manhwa.

Manga has a specific genre called shounen, targeted at teenage boys, that is rife with wimp to badass stories. The most emblematic is probably Naruto, current king of the manga mountain, about a ninja who has no skills at all but manages to achieve greatness through hard work.

As my teenage years are now nearly ten years behind me, it’s a bit embarrassing how addicted I can get to this formula in these settings. My current obsession is a manhwa series called The Breaker that comes out every Friday. Come Friday morning, I am eager to wake up so I can read the next 18 or so pages of Shioon Lee’s adventures. I sometimes get so impatient waiting for the next installment that I go back and reread a few hundred pages of the earlier parts!

By far my favorite work in this genre is the manga Holyland. Holyland has a great deal to recommend it: it doesn’t use super powers to spice up the fights, the artist actually knows a great deal about boxing and martial arts and the human body, and the female characters are actually proportioned like human beings, rather than some teenage boy’s deranged idea of a sexpot.

But the real draw is the main character, Kamishiro Yuu. Yuu is not like Spiderman. He doesn’t crack jokes, and his rise isn’t what you’d call a feel good story—though it has an excellent resolution. Bullied and marginalized socially, Yuu becomes stronger purely to overcome the feeling he has of being utterly pathetic. Once he is strong, the resentment he felt from being made to feel like garbage does not simply go away.


Mori Koji (the artist) introduces this darker element quite slowly and tactfully. At first it truly appears that Yuu is just an innocent bystander who is defending himself from people who underestimate him. As the story progresses, however, it becomes clear that he takes a dark enjoyment in beating the shit out of people. The manga follows his character as he grows stronger, discovers this side of himself, is terrified by it, and fights against being consumed by it entirely. At the same time, the character is genuinely likable, as are the friends he makes along the way who help him resist falling into violence entirely.

I am not ashamed to say that I find it to be a masterpiece in the wimp to badass genre, and anyone who enjoys stories along this line owes it to themselves to give it a read.

Read from right to left

I have to say, as I get older I really wish that there were more stories in this genre that were set somewhere other than a high school. Even Spiderman started out there! But perhaps there’s something inherently juvenile about wanting to see scrawny nerds go around beating people up.

If so, my content consumption would seem to imply…a rather juvenile tendency in my tastes.

Oh well. If growing up means giving up my wimp to badass stories, then I don’t want to grow up.

Stories About Education

My piece this week at The Umlaut was inspired by the ongoing debate about online education. I say “inspired by” because, while it was my intention to write about online education at the outset, that’s not where I ended up at all. I came to feel that the whole debate wasn’t really about Udacity or any of the new sexy education tech of the moment, but rather about a general sentiment that something has gone horribly amiss in the American system of higher education.

Moreover, it became clear to me that there isn’t anything particularly special about the latest online offerings. Cheap, practical alternatives to the college path have existed for a long time now in the form of professional development courses, industry certifications, and vocational schools. For some reason, people tend to look down their noses at these options, if they even acknowledge them as options at all. I decided to make our weird priorities, and the consequences of them, the main thrust of my piece.

The different stories we tell about why we go through this crazy 16-year process called formal education have always fascinated me. One thing I noticed is that proponents of the “online education is going to change everything” point of view all tended to subscribe to the notion that education is about information transfer. Their critics, on the other hand, were much more ambiguous about what they thought education was for—and seemed to lean towards some sort of cultural, rite-of-passage argument.

Meanwhile, in economics, you have the signaling theory of education. The short version of this is that the content of your education is more or less worthless; it’s really just about sending a signal to the market about what kind of worker you are. One of the biggest proponents of this point of view is Bryan Caplan, who is quite skeptical about online education’s ability to make a dent in the establishment. Unlike most of online education’s critics, he is arguing from a place of cynicism rather than idealism about the nature of education in general.

Information Transmission

The productivity of teaching, measured in, say, kilobytes transmitted from teacher to student per unit of time, hasn’t increased much. As a result, the opportunity cost of teaching has increased, an example of what’s known as Baumol’s cost disease. Teaching has remained economic only because the value of each kilobyte transmitted has increased due to discoveries in (some) other fields. Online education, however, dramatically increases the productivity of teaching.
-Alex Tabarrok, Why Online Education Works
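As an aside, here is a toy sketch of the cost-disease logic in that quote. Every number below is invented for illustration; the point is only the mechanism: if the hours needed to teach a course stay flat while wages rise with productivity elsewhere in the economy, the cost of the course rises even though nothing about the course has changed.

# Toy illustration of Baumol's cost disease (all numbers invented)
lecture_hours_per_course = 40   # "kilobytes transmitted" per course stays flat
hourly_wage = 30.0              # teachers' wages track economy-wide wages

for decade in range(5):
    print(f"decade {decade}: cost per course = ${lecture_hours_per_course * hourly_wage:,.0f}")
    hourly_wage *= 1.5          # productivity gains elsewhere pull all wages up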

The whole point of learning is that you learn something, right? It’s all about imparting information upon the student. Whether we’re talking about multiplication tables or the date and consequences of the Battle of Hastings, students are—in theory—supposed to walk away from the school year with more information in their brains than they had at the beginning of the year.

If this is your story of education, then brick-and-mortar education must surely be doomed. In the essay linked to above, Tabarrok points out three reasons why this would be so:

I see three principle advantages to online education, 1) leverage, especially of the best teachers; 2) time savings; 3) individualized teaching and new technologies.

The first point goes to the fact that a single recorded lecture or piece of writing can now be viewed by anyone anywhere in the world that has access to the Internet. Tabarrok’s TED talk has been watched 700,000 times, several hundred thousand times more than his non-recorded, un-uploaded lectures ever will be. This is the blockbuster effect. In theory, the very best lectures by the very best teachers can now dominate the education of everyone in the world.

The time savings comes from the fact that with a recorded lecture, you can be as concise as possible, since people who don’t get it the first time have the luxury of rewatching it as many times as they want. Meanwhile, the people who get it the first time can move right on to the next lecture, a convenience not afforded students in a classroom who have to wait while the teacher answers their classmates’ questions.

The individualized teaching comes from the fact that teachers can outsource the lecture part of education to online resources and spend the time they would have been lecturing answering individual questions instead, and talking one on one with students. This is what is called flipping the classroom.

Clay Shirky also subscribes to the education as information transmission story. In his post which kicked off a huge debate about online education and education in general, he compares Udacity and MOOCs to Napster and the MP3. Infinite copies can be made, it can be transmitted over the Internet, and it’s available at no charge. In a response to critics of the piece, he bluntly states what he believes to be the chief purpose of education:

What we do is run institutions whose only rationale—whose only excuse for existing—is to make people smarter.

I am highly skeptical of the information transmission story of education. I’m sure that some information does get transmitted, though, as Caplan points out, most students forget most of it, and it doesn’t even take very long. Moreover, as I outline in my Umlaut article, there have been more cost-effective methods for transmitting information to students for decades, and these have only multiplied in quantity and variety, and lowered in cost.

Yet still we treat the 16-year path from K-12 to a bachelor’s degree as the proper way of doing business. Does it really take 16 years for us to convey all the information we want conveyed to our youth, even without digital technology? I find this story hard to swallow. Something else must be going on here.

Manufacturing Persons of Quality

The classroom has rich value in itself. It’s a safe, almost sacred space where students can try on ideas for size in real time, gently criticize others, challenge authority, and drive conversations in new directions.

-Siva Vaidhyanathan, A New Era of Unfounded Hyperbole

My suspicion is that this whole formal education thing is just a case of cultural snobbery. K-12 makes a certain sense—there’s certainly a lot of value in promoting literacy and basic math skills. I don’t think there’s any reason why that should take until we’re 18, but there you go.

But college in particular was never about information transmission, back before the modern push to universalize attendance to it. College was where Persons of Quality went to learn how to sound intelligent when talking with other Persons of Quality.

We talk about college as if it’s the only thing standing between the average student and a lifetime of unemployment—or worse, a lifetime as a cashier or burger flipper at McDonald’s. But I think on some deeper level, people just think there is something wrong with the kind of people who don’t go to college. Or that college imbues its students with something glorious and unquantifiable that it is unjust to deny anyone access to.

But if you don’t want to work at McDonald’s, you could become, say, an electrician. According to the BLS, this requires 144 hours of technical training and then four years of paid apprenticeship, after which the median electrician makes $48,250 a year–enough to live comfortably. And this is just one example–there are tons of paths that cost enormously less in both money and time to avoid the burger-flipping or gas station clerk outcome, if avoiding that sort of work is your goal.

But if it’s not a lawyer or a doctor, we sneer at vocational education.

In the week leading up to submitting my piece at the Umlaut, I read a lot of responses to Tabarrok and Shirky’s arguments. One thing I found odd was that these critics seemed to have a less clear idea of just what education was for than Shirky or Tabarrok did. However, I detected cultural snobbery in the background. Take the Siva Vaidhyanathan quote above. Or the following:

As a student, when I was at Ohio State I took a class with Jennifer Cognard-Black, a graduate student. I had been reading George Orwell’s letters. I just went to her office hours and I was like, I’ve got these letters, aren’t they cool? And I had nothing to say! I was really just thrashing around, [it was] incoherent excitement. And she said, “So, what are you interested in, which part of it?” I don’t even remember what we said. It wasn’t that this was an intellectually transformative experience; it was that I was taken seriously as a thinker, and it validated the entire idea of being excited about George Orwell’s letters. It sounds like a small thing, but it wasn’t; it was huge.

That’s Aaron Bady, quoted in the Awl. Unlike most of the participants in this debate, Bady seems refreshingly clear that we don’t really know what this is all for:

The thing is, when you frame this as, “what does this give them for the rest of their lives?” one never really knows, and I think that’s the point; there is something, but it’s something we’re all discovering together. When we reduce education to job training; when we reduce it to, “we need X skills, so let’s do whatever causes X skill to come out,” you really close down all the possibilities.

So college is a place where you can be taken seriously as a thinker, but we don’t really know what value that will have for the rest of your life. But if you home in on one particular thing, you’re being closed-minded about all the other possibilities.

As someone who spends a lot of time being excited by any number of nerd-equivalents to George Orwell, I feel confident saying that I’ve been able to live Bady’s experience over and over for something like half of my life. I did it online. When I was a teenager, I went from forum to forum, raging about politics and philosophy to anyone who would engage. And engage they did. I found plenty of people to share my excitement over esoteric intellectual subjects with over the years. After forums, it was blogs, which are obviously still a big part of it. Then Facebook and Twitter and the new wave of social tools grew up and it became that much easier to connect with others who would share my excitement.

So finding a group where you can be “taken seriously as a thinker” is easier than it has ever been. And I’m not sure that keeping an open mind about what college might be for is worth billions of dollars in subsidies and encouraging people to take on hundreds of thousands of dollars in student loans.

It would be unfair not to link to Bady’s own critique of Shirky here, which is much more targeted to Shirky’s specific arguments.

But from Bady, Vaidhyanathan, the author of the Awl piece, and elsewhere, I’ve sensed an implicit cultural judgment in the same family as complaints that we’re reading tweets rather than Tolstoy. I always wonder–why Tolstoy? A lot of people are reading Harry Potter, for instance. Are they somehow spiritually inferior if they haven’t also read Tolstoy, or some great classic?

I don’t mean to imply that there is no value in Tolstoy or in the great classics. I do mean to imply that obtaining that sort of value probably isn’t actually worth the enormous amount of money that is currently being spent on it by governments, charities, and private individuals. Especially when you can read Tolstoy for free online!

Signaling Theory

According to the signaling model, employers reward educational success because of what it shows (“signals”) about the student. Good students tend to be smart, hard-working, and conformist – three crucial traits for almost any job. When a student excels in school, then, employers correctly infer that he’s likely to be a good worker. What precisely did he study? What did he learn how to do? Mere details. As long as you were a good student, employers surmise that you’ll quickly learn what you need to know on the job.

-Bryan Caplan, The Magic of Education

Signaling theory in economics was pioneered by Michael Spence. The basic idea is that there are people with qualities employers want and people without them, but on the surface they seem identical. However, it turns out that obtaining a college degree costs less for the people with the desired qualities than for the people without them. Maybe this is because those people tend to come from middle class families, and therefore have the financial support of their families. Or maybe this is because the people without the desirable qualities don’t have the discipline to make it through four years of coursework.

Whatever the reason, the cost differential is all that matters. The students could learn nothing but garbage for four years, but if they can get to the diploma at a lower cost than people without the qualities that are valued in the market, they will increase their lifetime earnings by getting the diploma.

Note that education policy understandably aims to lower the cost of access for everyone. If education is largely signaling, then this is extremely wasteful: since the cost differential is what matters, lowering costs for everyone just raises the bar for obtaining the differential. In practice this means spending more years in college than people would have under a less generous policy. So if signaling explains most of why people go to college, our current policy is wasteful both in the money spent and in the extra years it encourages people to burn.
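To make that logic concrete, here is a minimal sketch in code. The wage premium and cost figures are made up, and nothing below comes from Spence or Caplan; it just shows how a degree can separate worker types purely through cost differences, and how lowering costs for everyone pushes the required years upward.

# Minimal signaling sketch: the diploma pays off only for the type with lower costs.
def degree_pays_off(wage_premium, annual_cost, years):
    return wage_premium > annual_cost * years

WAGE_PREMIUM = 400_000    # lifetime earnings bump attached to the diploma (invented)
HIGH_TYPE_COST = 40_000   # yearly cost of school for the "desirable" worker
LOW_TYPE_COST = 120_000   # yearly cost for the worker employers want to screen out

print(degree_pays_off(WAGE_PREMIUM, HIGH_TYPE_COST, 4))   # True: the high type buys the signal
print(degree_pays_off(WAGE_PREMIUM, LOW_TYPE_COST, 4))    # False: the low type does not, so the degree separates

# Subsidize everyone so each year costs half as much: four years no longer separates,
# and the signal only works again if the bar rises to more years of school.
print(degree_pays_off(WAGE_PREMIUM, LOW_TYPE_COST / 2, 4))  # True
print(degree_pays_off(WAGE_PREMIUM, LOW_TYPE_COST / 2, 7))  # False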

Caplan brings a lot of empirical arguments to bear to defend the signalling theory of education. Most of these are intended to demonstrate how worthless an education actually would be in the market, if all we cared about was the actual content of it. Consider the following:

Yes, I can train graduate students to become professors. No magic there; I’m teaching them the one job I know. But what about my thousands of students who won’t become economics professors? I can’t teach what I don’t know, and I don’t know how to do the jobs they’re going to have. Few professors do.

Many educators soothe their consciences by insisting that “I teach my students how to think, not what to think.” But this platitude goes against a hundred years of educational psychology. Education is very narrow; students learn the material you specifically teach them… if you’re lucky.

Other educators claim they’re teaching good work habits. But especially at the college level, this doesn’t pass the laugh test. How many jobs tolerate a 50% attendance rate – or let you skate by with twelve hours of work a week? School probably builds character relative to playing videogames. But it’s hard to see how school could build character relative to a full-time job in the Real World.

Caplan makes strong, provocative arguments, and I look forward to his book on the subject. I tend to think that at least part of education must be explained by the signaling model. On the ground, this was certainly a story that my fellow students would often pay lip service to. The story was not so systematic or formal as the actual economic theory of signaling; instead it took the form of the belief that all we really got out of college was a piece of paper that for some reason bestowed magical qualities upon us in the job market. Whether or not anyone really believed that depended on the mood you caught them in, but it was a well-circulated story nonetheless.

I also wonder if there isn’t some marriage of the signalling story and the Person of Quality story to be found. What if what employers really want are people raised with a certain set of values, and going to college demonstrates a commitment to those values?

In the diffusion of innovations literature, new ideas and products spread lightning fast when they reach the big chunk of the population (labeled the “early majority” and “late majority”) whose members have very similar characteristics. This sets them apart from the “innovators” and “early adopters”, who tend to be richer or of higher status on some margin than the majority, and the “laggards”, who tend to be poorer and of lower status than the majority.

What if the chief benefit of universalizing formal western education in this country was that it made everyone a lot more like one another? Just as we’re more likely to marry or befriend people who are more like us, we also may be more likely to hire someone who is more like us, or invest in a company run by someone who is more like us, and so on. Maybe education has almost nothing to do with information transmission, but instead is some mixture of acculturation and signalling?

How Education Has Changed and Will Continue To

The bottom line is that we don’t really know what function education serves. There are a lot of stories, and you can put the evidence together in various ways to defend many of them, including many that contradict one another.

But the way education will change, and has been changing, seems clear to me regardless of which story you choose to believe.

It will change in the way that all things have changed since the onset of the Industrial Revolution–we will see bigger blockbusters and longer tails.

Let’s say you believe the information transmission story. Then, as Tabarrok pointed out, you will get blockbuster lectures and educational materials; stuff that is seen by an unprecedented number of people around the world who are eager to learn. You will also get long tail effects–a huge amount of variety, some of which only gets seen by perhaps a handful of people but which may nevertheless enrich them intellectually.

Let’s say you’re a believer that the world has been going to hell in a handbasket ever since we all stopped reading Tolstoy. Well, as I mentioned before, now anyone anywhere in the world with an Internet connection can access Tolstoy’s works, for free. And anyone anywhere in the world can write about Tolstoy, and Shakespeare, and how society is going to hell in a handbasket since there are people who would rather read Harry Potter. There will be a long tail of communities populated by people who subscribe to the culture of the Person of Quality.

Caplan is extremely skeptical that online education will have much of an impact if the signaling theory is correct. But there has been a long tail of credentialing for a long time–consider project management certification, or SAS certification, or any number of other industry-specific certifications. And Russ Roberts has pointed out that homeschooling went from being a marginal activity to gaining acceptance.

Moreover, there’s an argument to be made that our current way of paying for higher education is simply fiscally unsustainable–Shirky makes this case at length. So the nature of the average education may end up changing due to some combination of financial implosion in the traditional sector and innovation on the outside.

Education is already a power law industry, and it will always remain one. It will probably grow even more skewed than it is today. But the particulars are going to change, and the long tail will get longer. On the whole, I am optimistic.

PostScript

After posting these, I received a couple of responses that tell a story of a different sort.

Along the same lines, my father added:

I think Shirky’s right: higher education is like the daily newspaper, a bundle of unrelated stuff. It all makes cultural sense, until it doesn’t. College was a place for the Great Middle Class to park their kids until they figured life out. The cost-benefit of that makes the commitment increasingly untenable…

Unleash the Practitioners

Richard Dawkins is famously optimistic about human knowledge, especially within the confines of science. He is–understandably–allergic to the brand of postmodernist who believes that reality is simply a matter of interpretation, or cultural narrative. He has a much-repeated one-liner that comes off as quite devastating: “There are no postmodernists at 30,000 feet.”

It’s quite convincing. Engineers were able to make airplanes because of knowledge that was hard-won by the scientific community. The latter developed and tested theories, which the former could then put to use in order to get us moving about in the air at 30,000 feet. Right?

Wrong.

Historian Philip Scranton has done extensive work demonstrating that the original developers of the jet engine had no idea of the theory behind it, which was only developed after the fact. The jet engine was arrived at through tinkering and rote trial and error.

Dawkins was correct that there is a hard reality that is undeniable, one that led to many failed prototypes. But the background story of science that he subscribes to is simply incorrect in this instance. Scientists didn’t develop theory which practitioners could apply; the practitioners invented something that scientists then felt the need to explain.

What’s amazing is how often this turns out to be the case, once you start digging.

Practitioners Elevated Us to New Heights

If there is one book that should be mandatory reading for every student of history, it is Deirdre McCloskey’s Bourgeois Dignity. It lays out in stark fashion just how little we know about what caused the enormous explosion in our standard of living that started over two hundred years ago. She systematically works through every attempted explanation and effectively eviscerates them. Issues of the day seem small when put in the perspective of a sixteen-fold growth in our standard of living (conservatively measured), and the utter inability of theorists to explain this phenomenon is humbling.

For our purposes here we focus on Chapter 38: “The Cause Was Not Science”.

We must be careful when throwing around words like science, as it means many things to many people. What McCloskey is referring to is the stuff that generally gets grouped into the Scientific Revolution; the high theory traded by the Republic of Letters.

The jet engine example I mentioned earlier is exactly the sort of thing McCloskey has in mind. Take another example, from the book:

“Cheap steel,” for example, is not a scientific case in point. True, as Mokyr points out, it was only fully realized that steel is intermediate between cast and wrought iron in its carbon content early in the nineteenth century, since (after all) the very idea of an “element” such as carbon was ill-formed until then. Mokyr claims that without such scientific knowledge, “the advances in steelmaking are hard to imagine.” I think not. Tunzelmann notes that even in the late nineteenth century “breakthroughs such as that by Bessemer in steel were published in scientific journals but were largely the result of practical tinkering.” My own early work on the iron and steel industry came to the same conclusion. Such an apparently straightforward matter as the chemistry of the blast furnace was not entirely understood until well into the twentieth century, and yet the costs of iron and steel had fallen and fallen for a century and a half.

This story plays out over and over again–the hard work of material progress is done by practitioners, but everyone assumes that credit belongs to the theorists.

It turns out that it isn’t even safe to make assumptions about those industries where theory seems, from the outside, to really dominate practice. What could be more driven by economic and financial theory than options trading? Surely this must be a case more in line with traditional understandings of the relationship between theory and practice.

And yet Nassim Taleb and Espen Gaarden Haug have documented how options traders do not use the output of theorists at all, but instead have a set of practices developed over time through trial and error.

Back to McCloskey:

The economic heft of the late-nineteenth-century innovations that did not depend at all on science (such as cheap steel) was great: mass-produced concrete, for example, then reinforced concrete (combined with that cheap steel); air brakes on trains, making mile-long trains possible (though the science-dependent telegraph was useful to keep them from running into each other); the improvements in engines to pull the trains; the military organization to maintain schedules (again so that the trains would not run into each other: it was a capital-saving organizational innovation, making doubletracking unnecessary); elevators to make possible the tall reinforced concrete buildings (although again science-based electric motors were better than having a steam engine in every building; but the “science” in electric motors was hardly more than noting the connection in 1820 between electricity and magnetism-one didn’t require Maxwell’s equations to make a dynamo); better “tin” cans (more electricity); asset markets in which risk could be assumed and shed; faster rolling mills; the linotype machine; cheap paper; and on and on and on. Mokyr agrees: “It seems likely that in the past 150 years the majority of important inventions, from steel converters to cancer chemotherapy, from food canning to aspartame, have been used long before people understood why they worked…. The proportion of such inventions is declining, but it remains high today.”

In 1900 the parts of the economy that used science to improve products and processes-electrical and chemical engineering, chiefly, and even these sometimes using science pretty crudely-were quite small, reckoned in value of output or numbers of employees. And yet in the technologically feverish U.K. in the eight decades (plus a year) from 1820 to 1900, real income per head grew by a factor of 2.63, and in the next eight “scientific” decades only a little faster, by a factor of 2.88. The result was a rise from 1820 to 1980 of a factor of (2.63) • (2.88) = 7.57. That is to say-since 2.63 is quite close to 2.88-nearly half of the world-making change down to 1980 was achieved before 1900, in effect before science. This is not to deny science its economic heft after science: the per capita factor of growth in the U.K. during the merely twenty years 1980 to 1999 was fully 1.53, which would correspond to an eighty-year factor of an astounding 5.5. The results are similar for the United States, though as one might expect at a still more feverish pace: a factor of 3.25 in per capita real income from 1820 to 1900, 4.54 from 1900 to 1980, and about the same frenzy of invention and innovation and clever business plans as Britain after 1980.
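To spell out the compounding in that passage (nothing but a check of the arithmetic, using her own numbers):

# McCloskey's growth factors, multiplied out
print(2.63 * 2.88)        # ~7.57: the full 1820-1980 factor for the U.K.
print(1.53 ** (80 / 20))  # ~5.5: the 1980-1999 pace, compounded over eighty years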

Note that McCloskey is not saying that science hasn’t made any contribution at all, or that the contribution is small. Taleb does not make that claim either. What is at issue here is that the contribution of science to our material well being is not just overblown, but overblown by several orders of magnitude. McCloskey ultimately concludes that “We would be enormously richer now than in 1700 even without science.”

Yet They Are Everywhere in Chains

Alex Tabarrok thinks the road to the innovation renaissance runs through focusing education subsidies on STEM majors and tailoring our patent system so that it only provides protection for industries like pharmaceuticals, where it appears to make the biggest positive difference. Even Michele Boldrin and David Levine, who otherwise believe in abolishing intellectual property entirely, agree with Tabarrok’s exception. And Tyler Cowen believes that part of what we need to do in order to climb out of the Great Stagnation is elevate the status of science and scientists.

With all due respect to these distinguished gentlemen, I disagree. The road to greater prosperity lies in breaking the shackles we have increasingly put around practitioners, and elevating their work, and their status.

Whether or not the specific skills implied by a STEM career contribute to progress, it is quite clear that what is taught in the classroom is unlikely to be what is practiced in the field–since the teaching is done by teachers, who are not as a general rule practitioners. And let us return to Scranton, McCloskey, and Taleb: the vast majority of our material wealth came from tinkering that is decidedly non-STEM.

If you want to make progress in pharmaceuticals, don’t do it by enforcing (or worse, expanding) patents, which inhibit trial and error by those who do not hold the patent. Instead, remove the enormous impediments we have put up to experimentation. The FDA approval process imposes gigantic costs on drug development, including the cost of delaying when a drug comes to market and greatly reducing the number of drugs that can be developed. There is an entire agency whose sole purpose is to regulate medical trials.

It is all futile–as I have said before, in the end, the general market becomes the guinea pig for many years after the drug is available, and no conceivable approval process can change that fact. But if you think differently–if you think theorists can identify what treatments are likely to succeed ahead of time, and are capable of designing experiments that will detect any serious side-effects, then our current setup makes a lot of sense.

But that is not the reality. Nassim Taleb argued in his latest book that we should avoid treating people who are mostly healthy, because of the possibility of unknown complications. On the other hand, we should take way more risks with people who are dangerously ill than our current system allows.

The trend is going the other way. Because we have made developing drugs so expensive, it is much more profitable to try to come up with the next Advil, which will be used to ease symptoms of a mild disease but purchased by a very wide market, than a cure for rarer but more deadly diseases. And it doesn’t matter what they try to do, because the ultimate use of a drug is discovered through practice, not through theory. But it does matter, in the sense that we’re currently wasting many rounds of trial and error, and putting people at risk, to attempt to make small gains.

Thalidomide remains the iconic example of how this works. It was marketed as an anti-nausea drug but caused birth defects when pregnant women took it. Yet it is widely used today, for treating far more serious problems than nausea.

You Cannot Banish Risk

Aside from overestimating the abilities of theorists, the reason the discovery process of practitioners has been so hamstrung is that people are afraid of the errors inevitable in a process of trial and error. Thalidomide babies were an error, a horrible one. But there is no process, no theory, that will allow us to avoid unforeseen mistakes. The only path to the drug that cures cancer or AIDS or malaria is one that involves people being hurt by unforeseen consequences. As Neal Stephenson put it, some people have to take a lot of risks in order to reduce the long-run risk for all of us.

And along with the unforeseen harms, there are unforeseen gains as well. Penicillin, arguably the single greatest advancement in medicine in the 20th century, was an entirely serendipitous discovery.

I do not know if the stories of a great stagnation are accurate, but I agree with Peter Thiel that our regulatory hostility towards risk taking impoverishes us all, and allows many avoidable deaths every year.

The only way to start pushing the technological frontier again like we did at the peak of the Industrial Revolution is to empower the practitioners rather than impair them.

Unleash the practitioners and progress will follow.