How to allocate subsidies most effectively

Sometimes you hear something that sounds so much like common sense that you end up missing how it overturns everything you were actually thinking, and points in a far more interesting and disturbing direction. That’s how I’m feeling about the coverage of a recent paper on student loans and college tuition coming out of the New York Federal Reserve, “Credit Supply and the Rise in College Tuition: Evidence from the Expansion in Federal Student Aid Programs,” by David Lucca, Taylor Nadauld, and Karen Shen.
They find that “institutions more exposed to changes in the subsidized federal loan program increased their tuition”: for every dollar of increased student loan availability, colleges raised the sticker price of tuition by 65 cents. Crucially, they find that the effect is stronger for subsidized student loans than for Pell Grants. When they go further and control for additional variables, Pell Grants lose their significance in the study, while student loans become more important.
There’s been a lot of debate over this research, with Libby Nelson at Vox providing a strong summary. I want to talk about the theory of the paper. People have been covering this as a normal debate about whether subsidizing college leads to higher tuition, but this is a far different story. It actually overturns a lot of what we believe about higher education funding, and means that the conservative solution to higher education costs, going back to Milton Friedman, will send tuition skyrocketing. And it ends up providing more evidence of the importance of free higher education.

Thus begins this piece by Mike Konczal, fascinating throughout. This is a true mystery: why does tuition rise when more student loans are available, and why doesn't it rise just as much when funding comes in the form of Pell Grants? Konczal explains why this is strange:

David Boaz at the Cato Institute has a snarky post in response to the study, saying that “[u]nderstanding basic economics” would have predicted it. This is false, because economics 101 would have predicted the opposite. Economists fight a lot about this [1], but the simple economics story is clear. According to actual economics 101, letting students borrow against future earnings should have no effect on prices.
This derives from something called the Modigliani-Miller Theorem (MM), the frustrating staple of corporate finance 101 courses. A quick way of understanding MM is that how much you value an asset or investment, be it a factory or higher education, should be independent of how you finance it. Whether you pay cash, a loan, your future equity, a complicated financial product, or some other means that doesn’t even exist yet, you ultimately value the asset by how profitable and productive it is. In this story, which requires abstract and complete markets, expanding credit supply won’t drive tuition higher.
Now what would change your valuation, according to this theorem, is getting subsidies, say in the form of Pell Grants. This would make you willing to buy more and pay a higher price. This is one of the reasons why so much of the economics research focuses on Pell Grants instead of student loans: the story about what is happening is clearer. But, again, extensions of the credit supply, not subsidies, are doing the work here.
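The MM intuition can be checked with a toy calculation (the numbers are my own illustration, not from the paper): if credit is priced at the market rate, the financing choice doesn't change what the degree is worth, so expanding loan access alone shouldn't move willingness to pay.

```python
# Toy check of the Modigliani-Miller intuition, with made-up numbers:
# a degree costs 100 today and pays off 150 next year; the market
# discount (and borrowing) rate is 10%.
r = 0.10
cost, payoff = 100.0, 150.0

# Option 1: pay cash now, collect the payoff next year.
npv_cash = -cost + payoff / (1 + r)

# Option 2: borrow the full cost at the market rate, repay next year.
npv_loan = (payoff - cost * (1 + r)) / (1 + r)

assert abs(npv_cash - npv_loan) < 1e-9  # identical value either way
```

Only a subsidy, such as a below-market rate or an outright grant, changes the buyer's valuation in this framework, which is why economics 101 expects Pell Grants, not loans per se, to move willingness to pay.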

Conservatives position an increase in the student credit supply as enabling students to borrow against their future earnings. I even read somewhere last year about a company that wanted to allow actors or other celebrities to sell ownership of their future earnings. You could become a shareholder of, say, Jennifer Lawrence by fronting her some cash now in exchange for her take from future Hunger Games and X-Men movies and whatever else she does.

In the case of education, this entire proposal doesn't work if the increase in credit supply is met with an equal increase in tuition. Why does tuition increase in lockstep with credit supply? Konczal isn't sure, and that's the central mystery.

Note that it isn’t clear why students borrowing more against their future is driving increases in tuition they’ll pay. It could be “rational” under arcane definitions of that word. It could be that in a winner-take-all economy, in which those at the top do fantastically and those who don’t make it do not make it at all, leveraging up and swinging for the fences is a smart play. It could be that liquidity and credit are important determinants of the economy as a whole rather than a neutral veil over real resources. It could be as simple as the fact that 18-year-olds aren’t highly calculating supercomputers solving thousands of Euler equations of their future earnings into an infinite future, but instead a bunch of kids jacked up on hormones doing the best they can with the world adults provide them.

This article on public options dives in deeper on the topic.

So far, so familiar. The interesting question is what happens when we generalize this logic to other areas, like higher education. Imagine a state that's considering a choice between spending, let's say, $1 million either subsidizing its public university system, enabling it to keep tuition down, or as grants to college students to help them pay tuition. On the face of it, you might think there's no first-order difference in the effect on access to higher ed -- students will spend $1 million less on tuition either way. The choice then comes down to the grants giving students more choice, fostering competition among schools, and being more easily targeted to lower-income households; versus whatever nebulous value one places on the idea of public institutions as such. Not surprisingly, the grant approach tends to win out, with an increasing share of public support for higher education going to students rather than institutions.

But what happens when you bring price effects in? Suppose that higher education is supplied inelastically, or in other words that there are rents that go to incumbent institutions. Then some fraction of the grant goes to raise tuition for existing college spots, rather than to increase the total number of spots. (Note that this must be true to at least some extent, since it's precisely the increased tuition that induces colleges to increase capacity.) In the extreme case -- which may be nearly reached at the elite end -- where enrollment is fixed, the entire net subsidy ends up as increased tuition; whatever benefit the grant recipients enjoy comes at the expense of other students who didn't receive them.

Conversely, when public funds are used to reduce tuition at a public university, they don't just lower costs for students at that particular university. They also lower costs at unsubsidized universities by forcing them to hold down tuition to compete. So while each dollar spent on grants to students reduces final tuition costs less than one for one, each dollar spent on subsidies to public institutions reduces tuition costs by more.
The same logic applies to public subsidies for any good or service where producers enjoy significant monopoly power: Direct provision of public goods has market forces on its side, while subsidies for private purchases work against the market. Call it progressive supply-side policy. Call it the general case for public options. The fundamental point is that, in the presence of inelastic supply curves, demand-side subsidies face a headwind of adverse price effects, while direct public provision gets a tail wind of favorable price effects. And these effects can be quite large.
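The pass-through argument above can be sketched with a textbook linear supply-and-demand calculation (the slopes below are hypothetical, not estimates): the more inelastic supply is, the more of a demand-side grant ends up as higher sticker price rather than expanded capacity.

```python
# Standard incidence arithmetic in a linear model (illustrative slopes).
# Inverse demand: P = a - d*Q; inverse supply: P = b + s*Q.
# A per-student grant g shifts the demand curve up by g; equating the
# curves shows the sticker price rises by g * s / (s + d).
def tuition_rise(grant, demand_slope, supply_slope):
    return grant * supply_slope / (supply_slope + demand_slope)

# Perfectly elastic supply (flat curve): none of the grant hits price.
print(tuition_rise(1.0, demand_slope=1.0, supply_slope=0.0))   # 0.0
# Very inelastic supply (near-vertical): almost the whole grant does.
print(tuition_rise(1.0, demand_slope=1.0, supply_slope=99.0))  # 0.99
```

In this toy model, the 65-cents-on-the-dollar finding from Lucca, Nadauld, and Shen would correspond to a pass-through ratio s / (s + d) of 0.65.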

Our precious soil

Top cities became hotbeds of innovative activity against which other places could not easily compete. The people clustered together boosted each other's employment opportunities and potential income. From Bangalore to Austin, Milan to Paris, land became a scarce and precious resource as a result; the economic potential of a hectare of a rural Kentucky county is dramatically lower than that of a hectare in Silicon Valley’s Santa Clara county. And there is only so much of Santa Clara to go around.
Yet more Santa Clara could be built, were it not for the second and more distressing factor behind land’s return: the growing constraint imposed by land-use regulation. The Santa Clara town of Mountain View, for instance, is home to some of the world’s leading technology firms. Yet nearly half of the city’s homes are single-family buildings; the population density is just over 2,300 per square kilometre, three times lower than in none-too-densely populated San Francisco.
The spread of land-use regulation is not hard to understand. The clustering that adds to local economic vibrancy has costs, too, as the unregulated urban booms of the 19th century made clear. Crowded slums were fertile soil for crime and epidemics; filthy air and water afflicted rich and poor alike. Officials began imposing new rules on those building in cities and, later, on those extending them: limiting heights and building designs; imposing maximum densities and minimum parking requirements; setting aside “green belts” on which development was prohibited. Such regulations have steadily expanded in scope and spread to cities around the world.
As metropolitan economies recovered from their mid-20th-century slump, populations began growing again. The numbers of people living in the central parts of London and New York have never been higher. And as demand for quality housing increased, the unintended consequences of the thicket of building regulation that had grown up in most cities became apparent.

Great piece in the Economist about how land has become a constrained commodity, a brake on our economic growth. A lot of wealth has been made off of rent capture, and most of it accrues to the already wealthy while the middle class are saddled with mortgage debt and the poor, who rent, are just priced out of desirable regions.

This is a nightmarish scenario for the economy. It's bad any time you have any constraint on growth, but when that scarce commodity is land, it seems particularly difficult to remove because it has to be done through a political system that is in thrall to the moneyed, connected, real-estate-owning class.

The ugliest effect of the return of land, though, may be the brake it applies to the economy as a whole. One of the main ways economies increase worker productivity, and thus grow richer, is through the reallocation of people and resources away from low-productivity segments to more efficient ones. In business this means that bad firms go bust and good ones grow to great size. Something similar should hold for cities. Where workers can be put to use at high levels of productivity, labour scarcity will lead to fast-growing pay packets. Those pay packets will attract workers from other cities. When they migrate and find new, high-paying work, the whole economy benefits.
Mediterranean Avenue to Boardwalk
But that process is now breaking down in many economies. For workers to move to the high wages on offer in San Francisco, they must win an auction for a home that provides access to the local labour market. The bidding in that auction pushes up housing costs until there are just enough workers interested in moving in to fill the available housing space. Salaries that should be sending come-hither signals are ending up with rentiers instead, and the unfairness can trigger protest, as it has in San Francisco. Many workers will take lower-paying jobs elsewhere because the income left over after paying for cheaper housing is more attractive. Labour ends up allocating itself toward low-productivity markets, and the whole economy suffers.
Chang-Tai Hsieh, of the University of Chicago Booth School of Business, and Enrico Moretti, of the University of California, Berkeley, have made a tentative stab at calculating the size of such effects. But for the tight limits on construction in California’s Bay Area, they reckon, employment there would be about five times larger than it is. In work that has yet to be published they tot up similar distortions across the whole economy from 1964 on and find that American GDP in 2009 was as much as 13.5% lower than it otherwise could have been. At current levels of output that is a cost of more than $2 trillion a year, or nearly $10,000 per person.

First and foremost, let's acknowledge that this is all solvable if we just relax land-use regulations, build more housing, and increase the population density of our urban centers. Supply and demand still works in this universe. For a variety of reasons, some clear to me, like NIMBYism, some not, that seems intractable. Does the new David Simon show explain how hopeless it all is? I need to watch it so I can at least find some entertainment value in my despair.

If that's a road to nowhere, can the germ of a solution come from the private sector? Tech companies, which spring from a culture of ignoring accepted impediments, tend to be creative about solving problems that constrain their growth. For now, though, they haven't made a ton of progress on this issue. At most they've turned to providing shuttle services with wi-fi that widen the geographic footprint in which their employees can live and still get to work.

But that doesn't work if real estate is expensive everywhere within that radius. What's next? Perhaps a deeper investment in conferencing or virtual reality technology to amplify the sensation of proximity and intimacy for even the furthest-flung workers? It's long been a promise of technology, but it has yet to be realized in full.

What about turning office space into living space at night when it lies idle? It sounds ridiculous, but a company did that with the slack time of cars and seems to raise a billion dollars of capital every other week. I know, it sounds terrible, living at the office, but I'm wary of making paternalistic prescriptions about how people should spend their free time. Some of my closest friends in life are people I spent a lot of time with at the office during my years at Amazon.

Maybe tech companies will start their own housing developments? The economic productivity of the average tech worker likely still exceeds their compensation, and tech companies, led by Google in particular, have been aggressive in pushing into that gap with increased salary, benefits, stock options, etc. Housing is just another form of investment, and they need not provide it for free; it could just be subsidized. Of course, that means they'd need to acquire land, and that, paired with a thicket of land-use regulations, still restricts the human density achievable with even the most aggressive development.

Finally, solving the problem for just tech workers doesn't feel like a path toward solving it for the rest of the population. Frankly, no one feels much sympathy for tech workers these days (with the exception of some sympathy for Amazonians working long hours, though I'm still suspicious; a lot of it feels like Schadenfreude in disguise). Bay Area complaints about high real estate costs are going to fall on deaf ears, even if those costs are symptomatic of a dangerous trend for our economy, as the Economist article notes.

Group together all the things I miss most about NYC, and at the root of them all is the city's unmatched human density. It's not just the visceral sensation of the people around you (which some dislike), but the diversity of businesses and communities that can sprout and thrive when the potential customer base is so large and tightly packed: the variety of entertainment, like theater and museums, the variety of local cuisine, the ability to find someone to share almost any interest, from curling to revivalist arthouse cinema to hip-hop dancing.

I have no answers, only a longing for the Bay Area to experience a liveliness of density in the physical world to match that of the networks and communities it has built in the virtual space, where real estate is cheap and plentiful.

Q&A with Tyler Cowen

My intellectual spirit animal Tyler Cowen did a Q&A on Product Hunt. I'm not sure what Q&As have to do with Product Hunt's mission, but as always Cowen delivered a wide-ranging, high-entropy intellectual joyride. Some of my favorite questions and responses:

Ben Casnocha
How do you think your career and life would have been different if blogging, Twitter, and digital media had been ubiquitous in your teens and 20s? Would you have still pursued an academic path or would you have become a full-time columnist/commentator/speaker earlier on? I seem to recall you saying at one point that you're glad the internet didn't exist early on in your life, as it gave you the time to read the classics and develop a substantive base of knowledge.
tylercowen— Professor, George Mason University
@bencasnocha I am glad I was forced to live in "book culture" and "meat space" for my first forty years. Or maybe thirty-five years would have been enough. People these days have lost the sense of information being scarce, and counterintuitively that makes it harder for them to develop profound thoughts. It's like practicing chess by asking the computer right away, all the time, what the right move is. If I were starting today, probably I would not be an academic. The seductions of the on-line world would be too great, I am pretty sure.

Erik Torenberg— Product Hunt

You mentioned travel is more beneficial than reading. Is this almost always true? Does one of travel or reading have more diminishing marginal returns than the other?

tylercowen— Professor, George Mason University

@eriktorenberg well, you have to go somewhere good and go with an open mind. But most places are good if you visit them in the right manner. Reading has strongly diminishing returns once you have, say, read half of Bloom's list in *The Western Canon* and achieved a reasonable understanding of some of the social sciences.


Cole Unger

@tylercowen Underrated or overrated: professionals in America working ever-longer hours per week?

tylercowen— Professor, George Mason University

@ungercole it depends at what age. Do it for the first stage of your career, for sure. A lot of the alternative ways of spending time are overrated! Who wants to go out drinking? Not I. After that, see how you like it. Clearly it's not for everyone but in some key ways I think it is underrated. It is sometimes worth pushing yourself to some limits.


Haris H.

Counterfactual: the US never builds the Interstate system. Are we better off today (more urbanization, railroads, less carbon dependence & its side effects) or were the gains from increased mobility large enough to offset costs? Will this still be true in 50 years? 100?

tylercowen— Professor, George Mason University

@kingharis Huge gains, most of all social. USA could never have been such a railroads country, just look at population density. Carbon is a global problem, we are only a small part of it. Think how backward the American south would still be today without interstates.

Cowen recently linked to my post on Cuisine and Empire and it was one of the highlights of my month.

Respecting the preferences of the poor

One feature, in particular, stands out. The life of the rural poor is extremely boring, with repetitive back-breaking tasks interrupted by periods of enforced idleness; it is far removed from Marie-Antoinettish idylls of Arcadia. As the authors remark, villages do not have movie theatres, concert halls, places to sit and watch interesting strangers go by and frequently not even a lot of work. This may sound rather demeaning to the poor, like Marx's comment about “the idiocy of rural life”.
But it is important to understand because, as the authors remark, “things that make life less boring are a priority for the poor”. They tell the story of meeting a Moroccan farmer, Oucha Mbarbk. They ask him what would he do if he had a bit more money. Buy some more food, came the reply. What would he do if he had even more money? Buy better, tastier food. “We were starting to feel very bad for him and his family when we noticed a television, a parabolic antenna and a DVD player.” Why had he bought all this if he didn't have enough money for food? “He laughed and said ‘Oh, but television is more important than food.'”
Nutritionists and aid donors often forget this. To them, it is hard to imagine anything being more important than food. And the poorer you are, surely, the more important food must be. So if people do not have enough, it cannot be because they have chosen to spend the little they have on something else, such as a television, a party, or a wedding. Rather it must be because they have nothing and need help. Yet well-intentioned programmes often break down on the indifference of the beneficiaries. People don't eat the nutritious foods they are offered, or take their vitamin supplements. They stick with what makes life more bearable, even if it is sweet tea and DVDs.

From a piece at the Economist kicking off a discussion of the book Poor Economics by Abhijit Banerjee and Esther Duflo. When people throw around the phrase “first world problem,” the presumption is that the poor are so many rungs down on Maslow's Hierarchy of Needs that they couldn't possibly have the mindshare to contemplate such a frivolous dilemma. In fact, though, the marginal value of something we consider frivolous may be greater for the poor than for the wealthy. This has been one of the greatest breakthroughs in my understanding of the poor and how they think about where to allocate their next dollar.

I've written about this previously in The Psychological Poverty Trap and The Persistence of Poverty. The latter looked at the work of Charles Karelis, who believes our economic models of the poor are broken.

When we're poor, Karelis argues, our economic worldview is shaped by deprivation, and we see the world around us not in terms of goods to be consumed but as problems to be alleviated. This is where the bee stings come in: A person with one bee sting is highly motivated to get it treated. But a person with multiple bee stings does not have much incentive to get one sting treated, because the others will still throb. The more of a painful or undesirable thing one has (i.e. the poorer one is) the less likely one is to do anything about any one problem. Poverty is less a matter of having few goods than having lots of problems.
Poverty and wealth, by this logic, don't just fall along a continuum the way hot and cold or short and tall do. They are instead fundamentally different experiences, each working on the human psyche in its own way. At some point between the two, people stop thinking in terms of goods and start thinking in terms of problems, and that shift has enormous consequences. Perhaps because economists, by and large, are well-off, he suggests, they've failed to see the shift at all.
If Karelis is right, antipoverty initiatives championed all along the ideological spectrum are unlikely to work - from work requirements, time-limited benefits, and marriage and drug counseling to overhauling inner-city education and replacing ghettos with commercially vibrant mixed-income neighborhoods. It also means, Karelis argues, that at one level economists and poverty experts will have to reconsider scarcity, one of the most basic ideas in economics.
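Karelis's bee-sting logic can be made concrete with a toy model (my sketch, not his formal treatment): if total misery is concave in the number of untreated problems, the relief from fixing any single problem shrinks as problems pile up.

```python
import math

# Illustrative assumption: total discomfort from n untreated problems
# grows with n but is concave, so each extra problem adds less misery
# than the one before it.
def discomfort(n):
    return math.sqrt(n)

# Marginal relief from treating exactly one of n problems.
def relief_from_fixing_one(n):
    return discomfort(n) - discomfort(n - 1)

print(relief_from_fixing_one(1))            # 1.0   one sting: treating it helps a lot
print(round(relief_from_fixing_one(9), 3))  # 0.172 nine stings: barely worth the effort
```

Under this assumption, the person with one sting rationally rushes to treat it while the person with nine rationally shrugs, which is exactly the inversion of standard diminishing-returns logic that Karelis argues economists have missed.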

Karelis' thinking is summarized in his book The Persistence of Poverty: Why the Economics of the Well-Off Can't Help the Poor.

Karelis' ideas are one possible explanation behind the effectiveness of Sam Tsemberis' approach towards solving chronic homelessness. Pathways to Housing, the organization Tsemberis founded, believes in giving the homeless housing first, no strings attached, rather than forcing the homeless to jump through a series of hoops before they qualify.

Housing First was developed to serve the chronically homeless who suffer from serious psychiatric disabilities and addictions. Traditionally, the chronically homeless live in a cycle of surviving on the street, being admitted to hospitals, shelters, or jails and then going back to the street. The stress of surviving each day in this cycle puts a tremendous amount of pressure on the individual’s psychiatric and physical health. “Living in the street,” one Pathways to Housing client said, “It makes you crazy.”
The traditional structures in place to “help” the homeless population often make things worse, particularly for those who suffer from mental illness. Shelters and transitional living programs often require people to pass sobriety tests and other hurdles before they can be considered for housing programs. Housing is considered a reward for good behavior instead of a tool to help stabilize a homeless person’s mental health. This attitude cuts out the people who need the support the most, effectively punishing them for their conditions.

Respecting the preferences of the poor means understanding that the logic behind many of their purchase decisions may be very rational under a happiness-maximization framework. That we judge them to be otherwise is more a failure of empathy than anything else.

Cuisine and Empire

Great episode of the podcast EconTalk featuring guest Rachel Laudan, author of Cuisine and Empire, an instant purchase for me. Host Russ Roberts has a fascinating diversity of guests on his show, but always takes an interesting angle into the conversation, one driven largely, though not entirely, by economics.

Some of my favorite moments from this episode. First, on the history of the potato, and just think about how many ideas are packed into just this short exchange.

Russ: And, just to stick with basics for a minute: At one point, quite surprising to me, quite late in the book, you mention the potato. I think of the potato as a very basic foodstuff. But you point out that the potato is a relatively late invention. Talk about its cultural significance and a little bit about its history. 
Guest: Well, the potato is one of a series of roots--roots in a culinary sense, that is, underground bits of plants that can be cooked into edible foods. They have--the roots have always been of less interest to civilized societies because they are so wet and heavy you cannot provision them [?] fit to use with roots. Now, the one exception or partial exception to this is the high Andes mountains where they did grow potatoes and use them from early on. But they developed an incredibly elaborate way of freeze drying them to make them light enough and storable enough to go into cities as well as combining them with maize, which by then was down there. So when the potato comes into Europe, it's an enormous cultural effort to integrate the potato into the European food system, because for anyone who lives in a settled society with cities, root-eating is a sign of basically being more like animals. Roots were animal food in Europe. And so basically the poor of Europe had to be bludgeoned into adopting the potato in the 17th and 18th century. 
Russ: It's a little hard to understand because I really love French fries, and it's hard to imagine how someone could resist this. But they didn't have French fries. Talk about what they had. 

Guest: Well, basically, fat is very expensive for most people. So French fries, until the 1960s, 1970s, well they weren't invented until the middle of the 19th century, late 19th century. But until the invention of frozen French fries in the 1960s and 1970s, French fries were for the elite. Only the richest people could afford the potatoes that were cooked in that much fat. And double-cooked in that fat--which is what you have to do for French fries. What you find in the 19th century, as fats become more available for a large bulk of the population is that potatoes become more acceptable. Because you can put butter on your boiled potatoes; you can layer potatoes with milk and cheese and make a gratin; you can bake them and add butter. And that fat makes them much, much more palatable. 

Russ: But the point you make in the book is that the potato that was first introduced--I think in the early 18th century-- 

Guest: Right. 

Russ: was bitter, and nothing like the Idaho baked potato that we might envision at a potato bar. 

Guest: No. I've been concentrating in talking to you on the cooking and processing side, but there was also this agricultural trick they had to pull off to turn a plant that lived 8,000, 10,000 feet in the Andes, where seasons are reversed from Northern Europe, into a plant that would grow successfully and be palatable in Europe and the United States. And that took 100 plus years. 

Russ: And that's true of a lot of the things that we eat, I assume. I assume that if we went back to the 15, 16, 1700s and looked at what they called a 'blank'--whatever blank is, we would find it almost unrecognizable and very unattractive. Is that fair? Or am I being too harsh? 

Guest: Yes. Very few fruits--there are a few: dates, grapes--are palatable [?] without breeding. But most fruits have been systematically bred over the centuries. Animals have been bred. Probably the only things that we regularly eat but taste as they would have done hundreds of years ago are fish of various kinds. But everything else is the result of human breeding. 

Russ: Yeah, the goal of fruit has been to make fruit more like an M&M, and it's working evidently. 

Guest: Exactly.

Many parallels to the invention and then diffusion of technology. First, it's available only to the aristocrats, the wealthy, and/or the elite. The cost of production comes down with scale, and then it's brought to the masses. Finally, the wealthy go in search of some other way to signal their status, to differentiate. It's one reason behind the rise of extravagant $250 prix fixe menus in which guests photograph each dish as if it were their newborn child.

Food has replaced music at the heart of the cultural conversation for so many, and I wonder if it's because food and dining still offer true scarcity, whereas music is so freely available everywhere that it's become a poor signaling mechanism for status and taste. If you've eaten at Noma, you've had an experience only a tiny fraction of the world will be lucky enough to have, whereas if you name any musical artist, I can likely find their music and be listening to it within a few mouse clicks. Legally, too, which removes even more of the cachet that came with illicit downloading, the thrill of being a digital bootlegger.

Once, it felt like watching music videos on MTV was a form of rebellion in plain sight. Nowadays, the channel doesn't play any music videos. Instead, we have dozens of food and cooking shows, even entire channels like The Food Network dedicated to the topic. Chefs have been elevated to the status of master craftsmen, with names that have risen above those of their restaurants, and diners revere someone like Jiro of Jiro Dreams of Sushi fame the way a previous generation worshipped the guitar sound of a rock god like Jimi Hendrix.

The food scene today offers a seemingly never-ending supply of scarce experiences, ingredients, and dishes. Cronuts you have to wait in line a few hours to get your hands on. Pop-up restaurants that serve only a few nights a week for a few weeks, then disappear forever. Restaurants that ask you to sacrifice a goat just to get a reservation, and then they'll actually take that goat you killed and prepare your entire dinner from it, nose to tail. A white truffle add-on that tacks $80 onto a single piece of cured hamachi, and oh, the truffle is only available four weeks a year and came over on a gondola from Alba, Italy, and the hamachi is one of the last three members of its species, so you know, you should probably try it before...oops, sorry, the chef says someone just ordered the last of it. Yep, it's that couple at the corner table, and that's the last plate that she's Instagramming right now.

It's not just the scarcity of the actual food that offers such signaling opportunities. You can generate your own scarcity just by having a broad palate. When it comes to dining, many people still have narrow bands of taste, so if you're from the Jonathan Gold school of adventurous dining, you can easily set yourself apart by ingesting something exotic, like tripe stew, or some part of an animal that most people didn't even know was edible and certainly wouldn't dream of consuming.

In more recent history, the tech world has spawned yet another branch of food religion with the invention of Soylent, representing the polar opposite of the foodie religion, with its reverence for organic ingredients, elaborate preparation, and theatrical plating. Soylent is the food for people who find cooking and eating to be a waste of time, a complex job in need of simplification. It is dining function over form, with Soylent promising to deliver an exact and efficient dose of the nutrition we need as humans. I love food and dining with others too much to ever be an acolyte of this school, but that it exists is proof enough of how broad and diverse the world of food has become.

In contrast, punk rock and other formerly edgy genres of music have been assimilated by the mainstream to such an extent that flying your freak flag through your musical tastes is harder and harder. Ironically loving Taylor Swift or pop music has somehow become more iconoclastic than listening to some indie band. That Apple has so dominated the act of musical consumption (through ubiquitous iPhones and iPods and white earbuds, TV commercials for said devices and the music services accessible through them, and the massively popular bands that play at their keynotes) has mainstreamed the very idea of listening to music.

Back to the Laudan episode of EconTalk. Here she is on how French high cuisine came to be the world's standard of fine dining.

Russ: You are what you eat, I guess, is an appealing idea, and to some extent true. But maybe not to the extent they used to believe. So, we have the British having a big influence on world cuisine at the end of the 19th century. Somehow, French cuisine becomes the standard of sophistication and high dining. How did that happen? And it still persists, to some extent. It's lost some of its cachet, I'd say in the last 50 years. But it still remains a standard of high dining. How did that come about and why was it important?
Guest: I think it's first important to say it's French high cuisine, because the high cuisine of France that became the international standard was something that most French people had never seen and never ate. It did not come, swell up from the peasantry. There's a slightly complicated story about what happened around 1650 when you get a rapid political change and the establishment of, after the Peace of Westphalia, a series of nations in Europe, on supposedly equal terms, combined with a shift of the scientific revolution and the Protestant revolution. And in complicated ways these would act together to produce a new cuisine that the world had never seen before. It's a really striking example of radical and rapid culinary change. The old cuisines of spiced food that--ultimately stemming from Persia but that had really influenced China, dominated in the high cuisine of India, right across to Southern Europe--were displaced by this new Northern European cuisine. And the people who developed it in its most elaborate form, because they had the greatest resources--the richest courts--were the French. And they developed it really terribly rapidly between 1650 and 1700. And that's the point where diplomacy becomes important because of this national state system. And the national state system needs something to use for diplomatic dinners, to demonstrate modernity, Europeanness against the Persian-type cuisines that existed before. And so French high cuisine becomes the cuisine of European diplomacy in the 18th century, and then of international diplomacy and the international elite in the 19th century. So that by 1880 you could go to Tokyo, you could go to Santiago de Chile, you could go to Sydney, you could go to San Francisco, and the thing to be eating, if you were really rich or you were really high in politics, was high French cuisine.

I think French high cuisine is on the decline from its perch atop the world's dining hierarchy, at least among the most passionate U.S.-based food lovers. Japanese cuisine is a strong competitor, and the overfishing of the world's oceans has added an element of barbarism, but also of scarcity, to eating sushi that gives it an extra thrill. That it is perceived as healthier than French high cuisine, with its butter-based sauces and rich, fatty cuts of meat, makes Japanese cuisine better suited to carry the banner forward in this decade, in which a healthy diet has become a high-status problem. For a variety of reasons, at its high end sushi can also justify the nosebleed prices that people expect from high-status symbols and pastimes.

In the first world, access to enough food to survive is no longer an issue, and so cooking has ascended into the realm of art. Some meals I've had in the past few years are as much performance and theater as they are a way of refueling. With our insatiable desire for narrative, we've enlisted the meal out as a story we first consume and then tell about ourselves.

Given America's relative youth as a nation, our national dining habits have always been a cultural battleground. The U.S. came about long after the age when food was a scarce commodity, so most of our wrestling with its meaning has been first around its symbolic value, and now more recently about how to optimize our relative consumption of different types of food like carbohydrates, fat, and protein.

If any food symbolizes American dining today, it's the hamburger. Laudan can take such a humble food item and connect it forward and back through our nation's culinary history.

Guest: Well, if I may I'd like to back up a tiny bit about presidents serving French dinners, because the American presidency has had a terrible time deciding what to do at diplomatic dinners from the get-go. There were those, like Jefferson, who said we've got to be part of international culture as well as the economy, and we should go with high French cuisine. But there is also this extraordinarily strong republican--with a small 'r'--tradition in America that's part of what the Revolution is about. And the republican strain in American thought said very emphatically that, 'No, we do not want high French cuisine. We do not want aristocratic dining. That is not appropriate.' And they looked back to the Roman republic and to the Dutch republic and to other republican movements in Europe and said, 'What we need is a decent cuisine for all citizens.' And that is very much the origin of Thanksgiving, which is not a fancy French dinner for diplomats but a dinner that essentially all Americans can afford and can cook, of American ingredients. It's a kind of striking symbol of the republican tradition exemplified in an American custom, and was deliberately designed to be so. But what happened--I mean the hamburger is just sort of amazing. People say, 'Well, the British had fish and chips.' Well, fish and chips don't cut it, because fish and chips are not this beef, bread, French fry phenomenon.
And what Americans managed to do beginning with White Tower but pulled off triumphantly by McDonald's is to make the food of aspiration worldwide something that in America everybody can afford, and in much of the rest of the world the middle class can afford, namely a kind of ersatz piece of roast beef or steak that is a beef hamburger on a piece of white bread with a bit of fresh vegetable out of season, even in the winter, with a sauce which is part of high cuisine, with French fries, which, you know, are popular--which become really widespread with McDonald's and the frozen French fry, which Simplot perfects--until then the French had said it was the apex of French civilized food--and washed down either with a sparkling cold drink or with a milkshake, sweet and rich and cold and foamy. That is just--it makes the food of aspiration accessible to all, and you have it in this brightly lit dining room that is clean, that you have access to. I think only if we understand how McDonald's taps into all these competing traditions that go back so deep in our culture can we understand why it became such a kind of fire point for and against modern American food.

If McDonald's has been the 600 lb. gorilla of American dining in the past several decades, perhaps Chipotle is a more suitable totem of this current age of our culinary anxieties and obsessions, if one even exists anymore given our increasingly diverse dining habits. Chipotle serves an ethnic food, derived from one of our largest immigrant populations, but transformed into something palatable for the masses, claiming to be sourced with only GMO-free, organic ingredients, served in franchises that are clean if somewhat generic, available in the places we inhabit, from cities to suburbs to highway stops. It's not food prepared by ourselves, and we eat our burrito bowls at our desks, alone, a fork in one hand, a smartphone in the other, scrolling one of our many feeds while we feed our stomachs. Chipotle represents something America does better than any other country: this assimilation of the world's people and ideas and then a subsequent radiation of that back out to the world in a form more agreeable to the masses. Harvey Weinstein used to do the same to niche independent films.

Laudan's most controversial opinion, though one that is likely quite widespread among economists, is that our reverence for natural food and distrust of industrialized, processed food is the reverse of what it should be. Her piece In Praise of Fast Food was published years ago but is as relevant as ever given current attitudes.

As a historian I cannot accept the account of the past implied by this movement: the sunny, rural days of yore contrasted with the gray industrial present. It gains credence not from scholarship but from evocative dichotomies: fresh and natural versus processed and preserved; local versus global; slow versus fast; artisanal and traditional versus urban and industrial; healthful versus contaminated. History shows, I believe, that the Luddites have things back to front.
That food should be fresh and natural has become an article of faith. It comes as something of a shock to realize that this is a latter-day creed.

Eating fresh, natural food was regarded with suspicion verging on horror; only the uncivilized, the poor, and the starving resorted to it. When the ancient Greeks took it as a sign of bad times if people were driven to eat greens and root vegetables, they were rehearsing common wisdom. Happiness was not a verdant Garden of Eden abounding in fresh fruits, but a securely locked storehouse jammed with preserved, processed foods.

As for slow food, it is easy to wax nostalgic about a time when families and friends met to relax over delicious food, and to forget that, far from being an invention of the late 20th century, fast food has been a mainstay of every society. Hunters tracking their prey, shepherds tending their flocks, soldiers on campaign, and farmers rushing to get in the harvest all needed food that could be eaten quickly and away from home. The Greeks roasted barley and ground it into a meal to eat straight or mixed with water, milk, or butter (as Tibetans still do), while the Aztecs ground roasted maize and mixed it with water (as Mexicans still do).

What about the idea that the best food was country food, handmade by artisans? That food came from the country goes without saying. The presumed corollary—that country people ate better than city dwellers—does not. Few who worked the land were independent peasants baking their own bread and salting down their own pig. Most were burdened with heavy taxes and rents paid in kind (that is, food); or worse, they were indentured, serfs, or slaves. They subsisted on what was left over, getting by on thin gruels and gritty flatbreads.

The dishes we call ethnic and assume to be of peasant origin were invented for the urban, or at least urbane, aristocrats who collected the surplus. This is as true of the lasagna of northern Italy as it is of the chicken korma of Mughal Delhi, the moo shu pork of imperial China, and the pilafs, stuffed vegetables, and baklava of the great Ottoman palace in Istanbul. Cities have always enjoyed the best food and have invariably been the focal points of culinary innovation.

I excerpt Laudan heavily, but it's only a fraction of her output, not just on the podcast episode but in her writing. All of what I've read thus far is fascinating.

This tendency to romanticize the past, to imagine it as a pastoral paradise of harmony between people and nature and each other, is an odd human trait. Dissatisfied with the present, we look to the past for an answer, as far back as our caveman days when it comes to things like the paleo diet, even if we hardly realize just how much harder and more treacherous and brutal life was back then. Your pastoral fantasy? Here's how it ends: with you stepping in cow shit, contracting cholera, and dying after several feverish nights in an unheated bedroom, at the age of 20.

As for the future? Well, most of our popular visions of the distant future are dystopic, either tales of stragglers trying to survive in a post-apocalyptic wasteland or prophecies of human enslavement by AI run amok. When our society does survive into the next century, it has morphed into a nightmarish surveillance state, where all human diversity in thought and being has been stamped out. You are both funny and headstrong? You are...divergent. Still one of the silliest ideas for a book and movie in recent memory.

All this despite a steady rise in the quality of life throughout human history, with increasing tolerance and leisure and life expectancy. All evidence is that the arc of the moral universe is long, but it bends towards justice. From scarcity to abundance. And taller, healthier people. I myself am very happy I wasn't born in an age when I'd have to be a farmer just to feed myself. 

Watch enough movies, though, and you'd think that the arc of human life bends into a black hole, an apocalypse, from which we have to start over again, with the seeds of the rebirth of civilization being resown by a lone hero, most often a white male, and the beautiful woman who grew to love him sometime during the second of the three acts of the script.

Beware your nostalgia for an age you never lived in. It was probably worse then than it is now. Given our increasing resolution of detail in recorded history, I wonder if future generations will be more immune to nostalgia. I'm somewhat hopeful. I told my nephew recently that when I was his age, I had no iPhone or iPad. I hadn't even seen a desktop computer yet, let alone the web.

“Whaaaaaaaat?” he said. “That sounds terrible.” Then he went back to playing a game on his mom's iPhone.

Age of abundance, #hashtag edition

People are appending anything up to 50 hashtags to their Instagram posts, carefully researching the most popular hashtags, or formulating individual strategies (here’s a travel blogger explaining hers).
Hashtags are a search tool, providing a way to make your content discoverable by people who don’t already know or follow you. In this way, they’re a means of getting attention – and therefore status – in the endless popularity contest that’s metric-driven social media. Excessive hashtag use may be a bid for Instacelebrity, and the ensuing Instacash – with reports of top style bloggers earning $1m per year, and an estimated $1 billion sponsored Instagram post economy - or a sheer addiction to the dopamine hit of the ‘like’ count ticking upward.
But as a matter of taste, it all looks… a little grasping.

Anyone well versed in social media understands hacks like these to gain distribution for their content. This piece, whose opening is cited above, is much more interesting for its analysis of hashtag use in conveying and reinforcing status.

Let’s start from the principle that hashtag usage is often a bid for attention – you want your content to be discoverable, for more people to see it (and hopefully like it). But visibly betraying a desire for attention is a sign of neediness – and neediness is low status (you are dependent on other people’s behaviour to define your self worth). Therefore:
Hypothesis: High status brands don’t use hashtags extensively
Evidence:  We find @ChanelOfficial using hashtags, but with two constraints:
· A maximum of three per post, often only one
· Almost entirely ‘owned’ hashtags based on their campaign names

Whole thing isn't that long, all worth a read.

I recall being a kid in school, struggling to learn, often painfully, about how my words and clothing and haircut and actions affected how people perceived me, what circles I could enter and which were closed off. A terrifying crucible.

What must it be like to grow up today, not only having to learn real-world signaling prices but also the values of strategies and cultural assets and selfie poses in the social media market? I've heard from many people that if they post something to social media and it doesn't garner a certain volume of likes within some period of time, they pull it down immediately. Oh, the horror of changing your Facebook profile photo and not getting enough likes within the first hour. Every one of these kids is a William Masters or Virginia Johnson of social media, exploring the boundaries of what is or isn't acceptable to local and global tribes.

From my limited sample set of observation (yep, it's still a sample set of one), a lot of social media usage cuts along a generational line demarcated by whether you grew up in the age of scarcity or in the internet-driven age of abundance. I don't have data to back this up, but if someone out there does, please let me know.

Older people, who largely grew up in an age of scarcity, publish content to social media and interact or affirm such content carefully. A like from such a person is difficult to earn because they treat it as something that must be earned. The act of giving out such a like also conveys something about the giver, so it is a considered action.

Younger people seem to be more generous and prolific with content, likes, etc. They've grown up in an age where everything digitizable is available on demand, from TV shows and movies to music to photos to articles. Their likes are freely given, and plentiful, often used more as a read receipt than a standing ovation.

It makes sense if viewed from an abundance economic framework. Likes are an infinitely replenishable virtual good, and if it adds some happiness to the recipient, what's the harm? Perhaps everyone would be happier if we all liked and favorited more frequently, more generously. Social media need not be a zero sum game.

The other view, that of scarcity, is that we'd just be reinforcing coddled millennials who, in receiving affirmation for everything, receive it for nothing. Damn these sensitive unemployed self-promoting kids with their need for trigger warnings and their impulse to take offense at even the most harmless of jokes!

The piece quoted up top comes full circle by the end.

High status social media usage often demands that the labour of working at one’s social media persona be concealed. As with beauty, status is something one is supposed to attain effortlessly – and should the frantic paddling below the surface be revealed, that is vulgar, a faux pas.
This is why Kim Kardashian is so interesting – because she, almost uniquely, does not pretend she #wokeuplikethis, but instead makes the artifice of her social media persona not only evident but into a published art photography book, the brilliantly entitled ‘Selfish’. In this way, Kardashian (and also Amalia Ulman) make the ‘Oh me? I’m not self-promoting’ hashtaglessness of Chiara Ferragni et al. look like the studied pose it really is.
Hyperproliferating hashtag usage is thus interesting as one potential tactic to invert social media ‘good taste’.

What more suitable patron saint of the age of abundance than Kim Kardashian, who finds every opportunity to shove her ample, or shall we say abundant, derriere in the public's face through all possible social media channels?

The scarcest play she's made is releasing an actual physical coffee table book that costs $9.97, at last count, on Amazon, and includes photos not released on Instagram before. I suspect these first several customer reviews are from the scarcity school of thought.

Expense reports I'd want to audit

It might be incongruous to think of spies having to account for expenses, like any old suit on a business trip, but in reality, people working for intelligence services do have to keep track of the money they're spending, file expense reports, and even hound their company (the Company, in this case) to reimburse them. "They're the same as the reports any businessman would submit after meeting a client," says Chris Lynch, former FBI and CIA counterintelligence officer and author of The C.I. Desk. "Meals, miles, parking, small gifts, other expenses, receipts if they had them, some kind of 'certification' if they didn’t."
Information about expense reports for intelligence operations is somewhat hard to come by, both because it's mundane and potentially revealing. Spy memoirs don't spend a lot of time recalling the hours spent on filling out paperwork, but, on the other hand, boring paperwork, if it included line-by-line accountings of expenses, could show how an officer operates—and how lavishly he or she spends. The expenses for setting up an operation might include sourcing equipment, creating supply caches, arranging safe houses, and training people; one court case in Italy revealed records of U.S. intelligence officers staying at luxury hotels and spending as much as $500 a day eating out.
But some of the most intriguing expenses that intelligence operations rack up come from the requests of agents—the well-placed people that intelligence officers recruit to secretly pass along valuable information. Some agents simply want to be paid for their efforts. But some have much more unusual requests. 

Spies have to submit expense reports, too.

Tumblr idea: imagined renderings of James Bond's expense reports from each of the movies.

Payroll matters in MLB

FiveThirtyEight notes that despite some small budget successes in MLB like the Houston Astros or Kansas City Royals, money correlates as strongly to winning as ever.

J.C. Bradbury, an economics professor at Kennesaw State University, found that winning more increases revenue exponentially. “Going from 85 wins to 90 is worth more than 80 wins to 85,” he says. As a result, while it might cost more per win for a team that wins 90 games than 85, it makes financial sense because the revenue reward will be higher as well. This leads to a self-perpetuating cycle. Additionally, fans of teams that win frequently expect them to continue winning, and management pays more to do so. For a team like the New York Yankees, paying 10 percent more than anyone else for a second baseman who is only 5 percent better than his closest peer is worth the money (and they can afford it).
Perhaps one reason for the renewed focus on the success of small-budget teams is the importance of playoff success versus the regular season. Postseasons in American sports offer a smaller sample size than, say, soccer’s English Premier League, where the winner is determined by 38 games. In baseball, the better team (the one with the higher payroll) is less likely to prevail over the course of a short playoff series than they would be over an entire season. That, combined with the expansion of the playoffs, means it’s easier for a small-budget team to reach the World Series, as the Kansas City Royals did in 2014, losing to the San Francisco Giants in Game 7. Winning a playoff series can come down to a few factors — a couple of good pitchers and luck — that are less important during the regular season. “The formula seems to be: limp through regular season, get into playoffs, then win,” said Rodney Fort, professor of sport management in the School of Kinesiology at the University of Michigan.
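
Bradbury's point about wins being worth more at the margin amounts to saying revenue is a convex function of wins. A toy sketch of that idea (the quadratic curve and dollar figures below are invented for illustration, not estimates from the article):

```python
# Toy illustration of a convex revenue curve: under convexity,
# each additional win is worth more than the last. The curve
# itself is made up and carries no empirical weight.

def revenue(wins: int) -> float:
    """Hypothetical team revenue in dollars as a convex function of wins."""
    return 1_000_000 * (wins / 10) ** 2

gain_80_to_85 = revenue(85) - revenue(80)  # $8.25 million
gain_85_to_90 = revenue(90) - revenue(85)  # $8.75 million

# Going from 85 to 90 wins is worth more than going from 80 to 85,
# which is what makes the spending cycle self-perpetuating.
assert gain_85_to_90 > gain_80_to_85
```

So even if the 90th win costs more to buy than the 85th, a convex revenue curve can make the extra spending pay for itself, which is the cycle the quote describes.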

That's the compromise at the heart of MLB's design. It's harder for a small-budget team to make the playoffs, but once they're there, the odds of them winning it all are better than they are in, say, the NBA. Much of the design of sports is arbitrary; you can set things up to increase or decrease the role of randomness to suit your own narrative goals. If you're uncomfortable with the idea that you can just buy wins, you're not going to root for teams like the Dodgers, Yankees, or Red Sox.

I'm not a huge soccer fan, but it seems there are no salary caps for UEFA teams in Europe. Do fans there feel similar reservations about the effective monopoly on success for those with deep pockets?

I'm of mixed emotions on the topic. On the one hand, a salary cap that puts all teams on equal footing seems equitable. On the other hand, its larger effect is to suppress player salaries, shifting those dollars into owners' pockets. Oddly, most sports fans I know seem more sympathetic to owners than players, not what you'd expect from people who are themselves laborers. That is, if their team gets a bargain on a star player, they're happy.

I generally side with players, even if their salaries are already high, because I like seeing people achieve fair market value for their contributions. I wonder if the prevalence of fantasy sports has made fans more sympathetic to ownership than to players, since such games generally put fans in the position of being a general manager.