Why there are faux Irish pubs everywhere

Ireland, as much of the world knows it, was invented in 1991. That year, the Irish Pub Company formed with a mission to populate the world with authentic Irish bars. Whether you are in Kazakhstan or the Canary Islands, you can now hear the lilt of an Irish brogue over the sound of the Pogues as you wait for your Guinness to settle. A Gaelic road sign may hang above the wooden bar and a fiddle may be lying in a corner. As you gaze around, you might think of the Irish—O, that friendly, hard-drinking, sweater-wearing people!—and smile. Your smile has been carefully calculated.
In the last 15 years, Dublin-based IPCo and its competitors have fabricated and installed more than 1,800 watering holes in more than 50 countries. Guinness threw its weight (and that of its global parent Diageo) behind the movement, and an industry was built around the reproduction of "Irishness" on every continent—and even in Ireland itself. IPCo has built 40 ersatz pubs on the Emerald Isle, opening them beside the long-standing establishments on which they were based. 
IPCo's designers claim to have "developed ways of re-creating Irish pubs which would be successful, culturally and commercially, anywhere in the world." To wit, they offer five basic styles: The "Country Cottage," with its timber beams and stone floors, is supposed to resemble a rural house that gradually became a commercial establishment. The "Gaelic" design features rough-hewn doors and murals based on Irish folklore. You might, instead, choose the "Traditional Pub Shop," which includes a fake store (like an apothecary), or the "Brewery" style, which includes empty casks and other brewery detritus, or "Victorian Dublin," an upscale stained-glass joint. IPCo will assemble your chosen pub in Ireland. Then they'll bring the whole thing to your space and set it up. All you have to do is some basic prep, and voilà! Ireland arrives in Dubai. (IPCo has built several pubs and a mock village there.)

The strange true-life story of how Ireland packaged and exported a version of its culture all over the world, and how it boomeranged back to Ireland in the form of added tourism revenue. As with technology apps these days, it's all about marketing and distribution.

Mathematics of why hipsters all dress the same

Here comes the crucial twist. In all of the examples so far, we assumed that everyone had instant knowledge of what everyone else was wearing. People knew exactly what the mainstream trend was. But in reality, there are always delays. It takes time for a signal to propagate across a brain; likewise it takes time for hipsters to read Complex or Pitchfork or whatever in order to figure out how to be contrarian.

So Touboul incorporated a delay into the model. People would base their decisions not on the current state of affairs, but on the state of affairs some number of turns prior.

What Touboul noticed is that if you increase the delay factor past a certain point, something amazing happens. Out of what appears to be random noise, a pattern emerges. All of the hipsters start to synchronize, and they start to oscillate in unison.
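Touboul's actual model is a system of stochastic equations, but the delayed-contrarian dynamic he describes can be sketched with a toy simulation. Everything below (the majority-vote rule, the agent counts, the parameter names) is an illustrative assumption of mine, not taken from his paper: each turn, a random subset of contrarians adopts the opposite of the majority state as it appeared `delay` turns ago.

```python
import random

def simulate(n_agents=500, delay=5, p_update=0.5, steps=120, seed=42):
    """Toy delayed-contrarian dynamic. Each agent holds a state of +1 or -1.
    Every turn, a random subset of agents flips to oppose the population
    mean as it looked `delay` turns earlier. Returns the mean state per turn."""
    rng = random.Random(seed)
    states = [rng.choice([-1, 1]) for _ in range(n_agents)]
    means = [sum(states) / n_agents]
    for t in range(steps):
        lagged = means[max(0, t - delay)]   # stale view of the mainstream
        target = -1 if lagged > 0 else 1    # contrarian: oppose that view
        for i in range(n_agents):
            if rng.random() < p_update:     # reaction times vary by agent
                states[i] = target
        means.append(sum(states) / n_agents)
    return means

means = simulate()
tail = means[-30:]  # late in the run, the mean swings far in both directions
```

With `delay=0` the mean just jitters around zero, but with a longer delay every agent is reacting to the same stale signal, so the whole population swings together: the "random noise to synchronized oscillation" transition described above.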

This mathematician must be an early frontrunner for the Nobel Prize.

In all seriousness, though, the model has a certain explanatory elegance, akin to Schelling's Segregation Model.

A theory of jerks

Picture the world through the eyes of the jerk. The line of people in the post office is a mass of unimportant fools; it’s a felt injustice that you must wait while they bumble with their requests. The flight attendant is not a potentially interesting person with her own cares and struggles but instead the most available face of a corporation that stupidly insists you shut your phone. Custodians and secretaries are lazy complainers who rightly get the scut work. The person who disagrees with you at the staff meeting is an idiot to be shot down. Entering a subway is an exercise in nudging past the dumb schmoes.

We need a theory of jerks. We need such a theory because, first, it can help us achieve a calm, clinical understanding when confronting such a creature in the wild. Imagine the nature-documentary voice-over: ‘Here we see the jerk in his natural environment. Notice how he subtly adjusts his dominance display to the Italian restaurant situation…’ And second – well, I don’t want to say what the second reason is quite yet.

From Eric Schwitzgebel over at Aeon. He defines a jerk thus:

I submit that the unifying core, the essence of jerkitude in the moral sense, is this: the jerk culpably fails to appreciate the perspectives of others around him, treating them as tools to be manipulated or idiots to be dealt with rather than as moral and epistemic peers. This failure has both an intellectual dimension and an emotional dimension, and it has these two dimensions on both sides of the relationship. The jerk himself is both intellectually and emotionally defective, and what he defectively fails to appreciate is both the intellectual and emotional perspectives of the people around him. He can’t appreciate how he might be wrong and others right about some matter of fact; and what other people want or value doesn’t register as of interest to him, except derivatively upon his own interests. The bumpkin ignorance captured in the earlier use of ‘jerk’ has changed into a type of moral ignorance.

At some point in the technology world, it became fashionable to reject the jerk. Perhaps the first meaningful example was the great Netflix culture deck, with its rejection of the “brilliant jerk.” I don't know if any studies led to this movement or whether it was anecdotal, but somewhere along the line, the prevailing strategy shifted from trying to hire and isolate brilliant jerks to just rejecting them outright, perhaps because of a perceived increase in the need for effective collaboration among team members to ship important work.

Foodism, the new cultural signifier

Food may well have replaced high art, as Deresiewicz argues, but it has also replaced popular culture. People talk about food now the way they used to talk about bands. Music has become too fractured and diverse to provide the necessary combination of accessibility and specificity for self-definition. If you liked Kiss or The Eagles in the 1970s, the fact established part of who you were: your class affiliations, where you came from, what drugs you liked to take, whether you were urban or rural. Today, if you like Grizzly Bear or Kanye West, it virtually means nothing. You could be a banker or a member of Occupy Wall Street. You could be eighty or eighteen. You could live in East Texas or the Upper East Side. I mean, Marco Rubio's favorite group is NWA.
Today, your attitude toward pork belly is a clearer statement of who you are and where you come from than any television show you watch or band you follow. Tell me what you know about pasta, and I'll tell you how much your parents made, how much education you managed, how much is in your savings account. Unlike other cultural phenomena, which are more or less generationally undefined now, food explicitly identifies youthfulness. The younger you are, the more you know about food, generally speaking.

I've had this tab open since last Thanksgiving (I'm not joking; I know I have a problem), but a year later it still applies: On Thanksgiving, the Foodies Should Shut Up. Anecdotally, it does seem as if books, music, and movies have receded as cultural touchstones in favor of food and television.

The rises of TV and food in the pop culture pantheon are related. Television turned chefs (and by association their restaurants) into celebrities. Both television and food have also benefited from the internet. Television episode recaps and reviews, food and restaurant blogs: everywhere there is a boom in information about a small number of shows and restaurants that we once read about only in the local newspaper or in books like the Zagat guides.

The number of items in each category matters. The number of restaurants in a city has stayed largely fixed, and the number of TV shows I watch has increased, but perhaps only by 2X; the number of bands and musicians I can follow has increased by 5 to 10X. Almost everyone I know can easily name 10 bands they love that I've never heard of, but it's rare for someone to mention a restaurant in SF or a TV show I don't already know. Having a small number of items of shared devotion creates a sense of communal power. Music still matters, but more people I know share a greater overlap with my TV and restaurant favorites than with my music universe.

Not everyone is happy with the foodism bubble. Writes William Deresiewicz:

But what has happened is not that food has led to art, but that it has replaced it. Foodism has taken on the sociological characteristics of what used to be known — in the days of the rising postwar middle class, when Mortimer Adler was peddling the Great Books and Leonard Bernstein was on television — as culture. It is costly. It requires knowledge and connoisseurship, which are themselves costly to develop. It is a badge of membership in the higher classes, an ideal example of what Thorstein Veblen, the great social critic of the Gilded Age, called conspicuous consumption. It is a vehicle of status aspiration and competition, an ever-present occasion for snobbery, one-upmanship and social aggression. (My farmers’ market has bigger, better, fresher tomatoes than yours.) Nobody cares if you know about Mozart or Leonardo anymore, but you had better be able to discuss the difference between ganache and couverture.
But food, for all that, is not art. Both begin by addressing the senses, but that is where food stops. It is not narrative or representational, does not organize and express emotion. An apple is not a story, even if we can tell a story about it. A curry is not an idea, even if its creation is the result of one. Meals can evoke emotions, but only very roughly and generally, and only within a very limited range — comfort, delight, perhaps nostalgia, but not anger, say, or sorrow, or a thousand other things. Food is highly developed as a system of sensations, extremely crude as a system of symbols. Proust on the madeleine is art; the madeleine itself is not art.
A good risotto is a fine thing, but it isn’t going to give you insight into other people, allow you to see the world in a new way, or force you to take an inventory of your soul.
Yes, food centers life in France and Italy, too, but not to the disadvantage of art, which still occupies the supreme place in both cultures. Here in America, we are in danger of confusing our palates with our souls.

There's no doubt foodism has become a totem of class, and I would be happy never to see another photo of someone's dinner plate or lunch on a social network again. That includes pictures of my own food, which I've been guilty of inflicting on others in the past. (Unless you're a chef or a serious food journalist, in which case it's expected.)

But the one aspect of foodism I do enjoy is the deep fetishism of the craft of cooking, as epitomized in documentaries like Jiro Dreams of Sushi and Kings of Pastry or cookbooks like The French Laundry Cookbook or Modernist Cuisine. Reading about the conception of a dish and then the process of perfecting it until it becomes a recipe for maximizing the chances of reproducing the best version of that dish is no more ridiculous, to me, than an article discussing the redesign of a website, a book about how the Mac was made, or a piece in American Cinematographer about how a movie like Gravity was shot. The more complex and audacious the better. Craftsmanship is sexy.

Why we sign our emails "Thank you"

...the need for those sorts of rituals remains important, particularly in electronic communication where tone is hard to read. We end our communiques with “talk later,” “talk 2 u tomorrow,” or even simply “bye.” “Thanks” and “Thank you” have worked their way into this portion of the formula particularly in emails. More traditional valedictions have been replaced with “Thank you” so subtly that it’s now a common sign-off in this medium. But what does it mean? And why is it more acceptable than “Sincerely” or “Yours truly”?
It is in part a reflection of our times. Email offers a speedier means of contact than an actual letter (and in some cases, a telephone), but that speed also means we're sending more messages through this medium both for personal and professional reasons, and reading and responding to these messages requires a commitment of time. So it's more important that the sender recognize the burden that they've placed on the recipient. In a time when letters took time to write, send, and respond to, it was important for the sender to attest to her reliability. Responses and actions were not so easy to take back. "Sincerely" and "Yours truly" were meant to build trust between communicants. Credibility was an important determinant of whether a response would be issued. Today, as the web enables strangers to contact each other with little effort, credibility is less of a factor in determining responses (spam mail aside) when weighed against time.

From Scientific American.

I disagree with the end of the article, though, in which the author argues that affectionate closings are "vital to the continuation of the relationship."

The line between email and messaging (SMS, Facebook, Twitter DMs, WhatsApp, etc.) has blurred. In professional settings, you're taught that shorter emails are better, and that has removed one thing that differentiated email from messages. Since no one puts valedictions or even greetings in messages, they're starting to disappear from emails as well. Most of the email I receive no longer begins with a greeting like "Eugene", and most of it doesn't even end with a signature, since it's clear from the From: line who the email is from.

Hollywood's cultural responsibility

Still cleaning out browser tabs from way back. This one is from August of last year, a Jonathan Chait essay on how liberal views dominate Hollywood. Just as conservatives dominate Fox News, liberal viewpoints tend to dominate most other channels and most Hollywood movies.

Few will find that surprising. What's more important is Chait's survey of the available evidence on the power of movies and TV over society, and the evidence says it's powerful.

Several years ago, a trio of researchers working for the Inter-American Development Bank set out to help solve a sociological mystery. Brazil had, over the course of four decades, experienced one of the largest drops in average family size in the world, from 6.3 children per woman in 1960 to 2.3 children in 2000. What made the drop so curious is that, unlike the Draconian one-child policy in China, the Brazilian government had in place no policy to limit family size. (It was actually illegal at some point to advertise contraceptives in the overwhelmingly Catholic country.) What could explain such a steep drop? The researchers zeroed in on one factor: television.

Television spread through Brazil in the mid-sixties. But it didn’t arrive everywhere at once in the sprawling country. Brazil’s main station, Globo, expanded slowly and unevenly. The researchers found that areas that gained access to Globo saw larger drops in fertility than those that didn’t (controlling, of course, for other factors that could affect fertility). It was not any kind of news or educational programming that caused this fertility drop but exposure to the massively popular soap operas, or novelas, that most Brazilians watch every night. The paper also found that areas with exposure to television were dramatically more likely to give their children names shared by novela characters.

Novelas almost always center around four or five families, each of which is usually small, so as to limit the number of characters the audience must track. Nearly three quarters of the main female characters of childbearing age in the prime-time novelas had no children, and a fifth had one child. Exposure to this glamorized and unusual (especially by Brazilian standards) family arrangement “led to significantly lower fertility”—an effect equal in impact to adding two years of schooling.


A trio of communications professors found that watching Will & Grace made audiences more receptive to gay rights, and especially viewers who had little contact in real life with gays and lesbians. And that one show was merely a component of a concerted effort by Hollywood—dating back to Soap in the late seventies, which featured Billy Crystal’s groundbreaking portrayal of a sympathetic gay character, through Modern Family—to prod audiences to accept homosexuality. Likewise, the political persona of Barack Obama attained such rapid acceptance and popularity in part because he represented the real-world version of an archetype that, after a long early period of servile black stereotypes, has appeared in film and television for years: a sober, intelligent African-American as president, or in some other position of power.

Hollywood industry insiders deny a liberal bent to their work, claiming they're only following market demand, leading to an ironic configuration of the conservative-liberal debate on the topic: 

The denials generally take the form of a simple economic aphorism. The entertainment business is a business, so if its product leans left, it must reflect what the audience wants. One oddity of the Hollywood-liberalism debate is that it makes liberals posit the existence of a perfect, frictionless market, while conservatives find themselves explaining why a free market is failing to function as it ought to.

The power of popular culture is often underestimated, as is the value of art in general. Some of the examples noted above are why I've become less tolerant of lazy stereotypes or tropes in even the lightest of Hollywood movies. Argo was a skillfully crafted movie, but it played so loose with the facts that I can't in my heart embrace it as fully as I would had it been brave enough to stand by the truth. (Spoiler alert here) When the crowd I saw the movie with burst into applause as the plane lifted off the tarmac at the end, just eluding dozens of angry Iranian guards with guns giving chase in jeeps, I couldn't help feeling saddened that once again Hollywood felt the need to conform the truth to a more cartoonish narrative arc for commercial reasons. There's an entire section of the Argo Wikipedia entry titled Historical Accuracy which addresses the discrepancies. I felt the same unease with A Beautiful Mind, which scrubbed many less savory elements from John Nash's life to make him a more palatable hero: anti-Semitism, homosexuality, a child he fathered with a nurse and then abandoned, and his divorce, among others.

[The phrase "based on a true story" is one of the great cheats in filmmaking. It licenses you to bend the facts while still capturing the amazement of a credulous audience that will never bother to investigate what liberties the filmmakers took.]

Even movies which aren't based on historical facts bother me when they grab and reuse defective modules from the Hollywood library. Pitch Perfect was a cinematic stick of sugar, but it resorted to several popular stereotypes of Asians that made me cringe (I refer not to the quiet Asian singer with the bangs, but the no-fun bookworm of a roommate and her stone-faced friends).

The most common retort to all of this is that it's just a movie, and nobody cares or should care. But it's not just a movie, as Chait notes. These works of art carry cultural tropes that burrow into our brains and leave germs that sprout at an almost subconscious level.

And that's how I sense Hollywood's cultural influence operates, not as overtly as, say, those who claim that the use of guns in movies leads to shootings in direct emulation, but at a more subtle level. In a recent post I discussed two versions of the Sapir-Whorf hypothesis, strong and weak. If there were a theory of Hollywood's influence on our cultural norms, I'd subscribe to a weak version of it.