10 more browser tabs

Still trying to clear out browser tabs, though it's going about as well as my brief flirtation with inbox zero. At some point, I just decided inbox zero was a waste of time, solving a problem that didn't exist, but browser tab proliferation is a problem I'm much more complicit in.

1. Why the coming-of-age narrative is a conformist lie

From a more sociological perspective, the American self-creation myth is, inherently, a capitalist one. The French philosopher Michel Foucault theorised that meditating and journalling could help to bring a person inside herself by allowing her, at least temporarily, to escape the world and her relationship to it. But the sociologist Paul du Gay, writing on this subject in 1996, argued that few people treat the self as Foucault proposed. Most people, he said, craft outward-looking ‘enterprising selves’ by which they set out to acquire cultural capital in order to move upwards in the world, gain access to certain social circles, certain jobs, and so on. We decorate ourselves and cultivate interests that reflect our social aspirations. In this way, the self becomes the ultimate capitalist machine, a Pierre Bourdieu-esque nightmare that willingly exploits itself.
 
‘Growing up’ as it is defined today – that is, as entering society, once and for all – might work against what is morally justifiable. If you are a part of a flawed, immoral and unjust society (as one could argue we all are) then to truly mature is to see this as a problem and to act on it – not to reaffirm it by becoming a part of it. Classically, most coming-of-age tales follow white, male protagonists because their integration into society is expected and largely unproblematic. Social integration for racial, sexual and gender minorities is a more difficult process, not least because minorities define themselves against the norm: they don’t ‘find themselves’ and integrate into the social context in which they live. A traditional coming-of-age story featuring a queer, black girl will fail on its own terms; for how would her discovering her identity allow her to enter a society that insists on marginalising identities like hers? This might seem obvious, but it very starkly underscores the folly of insisting on seeing social integration as the young person’s top priority. Life is a wave of events. As such, you don’t come of age; you just age. Adulthood, if one must define it, is only a function of time, in which case, to come of age is merely to live long enough to do so.
 

I've written about this before, but almost always, the worst type of film festival movie is about a young white male protagonist coming of age. Often he's quiet and introverted, but he has a sensitive soul. As my first-year film school professor said, these protagonists are inert, but they just "feel things." Think Wes Bentley in American Beauty filming a plastic bag dancing in the wind for fifteen minutes with a camcorder, then showing it to a girl as if it's Citizen Kane.

If they have any scars or wounds, they are compensated for with extreme gifts. Think Ansel Elgort in Baby Driver; cursed with tinnitus since childhood, he listens to music on a retro iPod (let's squeeze some nostalgic product placement in here, what the hell, we're also going to give him a deaf black foster father to stack the moral cards in his favor, might as well go all the way) and is, that's right, the best getaway driver in the business.

Despite having about as much personality as a damp elephant turd, these protagonists have their beautiful souls recognized and extracted by a trope this genre of film invented just for the purpose: the manic pixie dream girl.

[Nathan Rabin, who coined the term manic pixie dream girl, has since disavowed it as sometimes misogynist, and it can be applied too broadly, a hammer seeking nails. But that doesn't undo the reality that the largely white male writing blocs, from guilds to writers' rooms, aren't great at writing women or people of color with deep inner lives.]

This is tangential to the broader point: the coming-of-age story as a genre is, in and of itself, a lie. It reminds me of the distinction in Finite and Infinite Games, the classic book from James Carse. The Hollywood film has always promised a finite game, and thus a story that must have an ending. Coming-of-age is an infinite game, or at least one that runs until death, and so we should all be skeptical of its close-ended narrative.

(h/t Michael Dempsey)

2. Finite and Infinite Games and Confederate

This isn't a browser tab, really, but while I'm on the topic of Carse's Finite and Infinite Games, a book that provides a framework for bifurcating so much of the world, and while I'm thinking about the white-male-dominated Hollywood profession, I can't help but think of Confederate, the TV project from the showrunners of Game of Thrones.

“White people” is seen by many whites as a pejorative because it lowers them to a racial class, whereas before they were simply the default. They are not accustomed to being named as a race in almost every piece of culture, the way women, people of color, and the union of the two are named, every single day, by society and culture.

The All Lives Matter retort to Black Lives Matter pretends that we're all playing the same finite game, when almost everyone on the losing end of that game knows it is not true. Blacks do not feel like they “won” the Civil War; every day they live with the consequences and the shadow of America's founding racism, and every day they continue to play a game that is rigged against them. That is why Ta-Nehisi Coates writes that the premise of Confederate is a lie, and that only the victors of this finite game of America would want to relitigate the Civil War in some alt-history television show for HBO. It's as if a New England Patriots fan asked an Atlanta Falcons fan to watch last year's Super Bowl again, with Armie Hammer playing Tom Brady.

“Give me your tired, your poor, your huddled masses” is a promise that the United States is an infinite game, an experiment that struggles constantly toward bettering itself, evening the playing field, such that even someone starting poor and huddled might one day make a better life and escape their beginning state. That is why Stephen Miller and other white nationalists spitting on that inscription on the Statue of Liberty is so offensive, so dangerous.

On society, Carse writes:

The prizes won by its citizens can be protected only if the society as a whole remains powerful in relation to other societies. Those who desire the permanence of their prizes will work to sustain the permanence of the whole. Patriotism in one or several of its many forms (chauvinism, racism, sexism, nationalism, regionalism) is an ingredient in all societal play. 
 
Because power is inherently patriotic, it is characteristic of finite players to seek a growth of power in a society as a way of increasing the power of a society.
 

Colin Kaepernick's refusal to stand for the national anthem is seen as unpatriotic by many in America, including the wealthy white owners of NFL teams. That is not surprising: per Carse, racism is a form of patriotism, and part and parcel of American society when defined as a finite game.

Donald Trump and his large adult sons are proof of just how powerful the inheritance of title and money is in America, and the irony that they were elected by those who feel that successive rounds of finite games have started to be rigged against them is not lost on anyone, not even, I suspect, them. One could argue they need to take a lesson from those oppressed for far longer as to how a turn to nihilism works out in such situations.

Those attacking Affirmative Action want to close off the American experiment and turn it into a series of supposedly level finite games because they have accumulated a healthy lead in this game and wish to preserve it in every form.

White nationalists like Trump all treat America as not just a finite game, but a zero sum finite game. The idea of immigrants being additive to America, to its potential, its output, is to treat America as an infinite game, open-ended. The truth lies, as usual, between the poles, but closer to the latter.

Beware the prophet who comes with stories of zero-sum games, or as Jim Collins once wrote, beware the "tyranny of the or." One of my definitions of leadership is the ability to turn zero-sum into positive sum games.

3. Curb Your Enthusiasm is Running Out of People to Offend

Speaking of fatigue with white male protagonists:

But if Larry David’s casual cruelty mirrors the times more than ever, the show might still fit awkwardly in the current moment. Watching the première of Season 9 on Sunday night, I kept thinking of a popular line from George Costanza, David’s avatar on “Seinfeld”: “You know, we’re living in a society!” Larry, in this first episode of the season, seems to have abandoned society altogether. In the opening shot, the camera sails over a tony swath of L.A., with no people and only a few cars visible amid the manicured lawns and terra-cotta roofs. It descends on Larry’s palatial, ivy-walled house, where he showers alone, singing Mary Poppins’s “A Spoonful of Sugar” and bludgeoning a bottle of soap. (Its dispenser pump is broken—grounds for execution under the David regime.) He’s the master of his domain, yes, but only by default: no one else is around.
 
“Curb” has always felt insulated, and a lot of its best jokes are borne of the fact that Larry’s immense wealth has warped his world view over the years. (On the most recent season he had no compunction about spending a princely sum on Girl Scout Cookies, only to rescind the order out of spite.) But the beginning of Season 9 offers new degrees of isolation. Like a tech bro ensconced in a hoodie and headphones, Larry seems to have removed himself almost entirely from public life. Both “Curb” and “Seinfeld” like to press the limits of etiquette and social mores, but the latter often tested these on subway cars and buses, in parks or on the street. Much of “Curb,” by contrast, unfolds in a faceless Los Angeles of air-conditioned mansions, organic restaurants, and schmoozy fund-raisers, a long chain of private spaces. The only time Larry encounters a true stranger, it’s in the liminal zone between his car and the lobby of Jeff’s office. She’s a barber on her way to see Jeff at work—even haircuts happen behind closed doors now.
 

Groundhog Day, one of the great movies, and perhaps my favorite Christmas movie of all time, has long been regarded as a great Buddhist parable:

Groundhog Day is a movie about a bad-enough man—selfish, vain, and insecure—who becomes wise and good through timeless recurrence.
 

If that is so, then Curb Your Enthusiasm is its doppelganger, a parable about the dark secret at the heart of American society: that no person, no matter how selfish, vain, and petty, can suffer the downfall necessary to achieve enlightenment, if he is white and a man.

In this case, he is a successful white man in Hollywood, Larry David, and each episode of Curb Your Enthusiasm is his own personal Groundhog Day. Whereas Bill Murray wakes up each morning to Sonny and Cher, trapped in Punxsutawney, Pennsylvania, around small-town people he dislikes, in a job he feels superior to, Larry David wakes up each morning in his Los Angeles mansion, with rewards seemingly only proportionate to the depths of his pettiness and ill humor. Every episode, he treats all the friends and family around him with thinly disguised disdain, and yet the next episode, he wakes up in the mansion again.

Whereas Bill Murray eventually realizes the way to break out of his loop is to use it for self-improvement, Larry David seems to be striving to fall from grace by acting increasingly terrible and yet finds himself back in the gentle embrace of his high thread count sheets every morning.

Curb Your Enthusiasm has its moments of brilliance in its minute dissection of the sometimes illogical and perhaps fragile bonds of societal goodwill, and its episode structure is often exceedingly clever, but I can't help watching it now as nothing more than an acerbic piece of performance art, with all the self absorption that implies.

Larry David recently complained about the concept of first world problems, which is humorous, as it's difficult to think of any single person who has done as precise a job educating the world on what they are.

[What about Harvey Weinstein and Louis C.K., you might ask? Aren't they Hollywood royalty toppled from lofty, seemingly untouchable perches? The story of how those happened will be the subject of another post, because the mechanics are so illuminating.]

4. Nathan for You

I am through season 2 of Nathan for You, a Comedy Central show that just wrapped its fourth and final season. We have devalued the term LOL with overuse, but no show has made me literally laugh out loud, by myself on the sofa, the way this one has, though I've grinned in pleasure at certain precise bits of stylistic parody in American Vandal.

Nathan Fielder plays a comedic version of himself. In the opening credits, he proclaims:

My name is Nathan Fielder, and I graduated from one of Canada's top business schools with really good grades [NOTE: as he says this, we see a pan over his transcript, showing largely B's and C's]. Now I'm using my knowledge to help struggling small business owners make it in this competitive world.
 

If you cringed while watching a show like Borat or Ali G, if you winced a bit whenever one of the correspondents on The Daily Show went to interview some stooge, you might believe Nathan For You isn't, well, for you. However, the show continues to surprise me.

For one thing, it's a deeply useful reminder of how difficult it is for physical retailers, especially mom and pop entrepreneurs, to generate foot traffic. That they go along with Fielder's schemes is almost tragic, but all the more instructive.

For another, while almost every entrepreneur is the straight person to Fielder's clown, I find myself heartened by how rarely one of them just turns him away outright. You can see the struggle on each of their faces, as he presents his idea and then stares at them for an uncomfortably long silence, waiting for them to respond. He never breaks character. Should they just laugh at him, or throw him out in disgust? It almost never happens, though one private investigator does chastise Fielder for being a complete loser.

On Curb Your Enthusiasm, Larry David's friends openly call him out for his misanthropy, yet they never abandon him. On Nathan For You, small business owners almost never adopt Fielder's ideas at the end of the trial. However, they almost never call him out as ridiculous. Instead, they try the idea with a healthy dose of good nature at least once, or at least enough to capture an episode's worth of material.

In this age of people screaming at each other over social media, I found this reminder of the inherent decency of people in face-to-face situations comforting, almost reassuring. Sure, some people are unpleasant both online and in person, and some people are pleasant in person and white supremacists in private.

But some people try to see the best in each other, give others the benefit of the doubt, and on such bonds a civil society is maintained. That this piece of high concept art could not fence in the humanity and real emotion of all the people participating, not even that of Fielder, is a bit of pleasure in this age of eye-rolling cynicism.

[Of course, these small business owners are aware a camera is on them, so the Heisenberg Principle of reality television applies. That a show like this, which depends on its subjects not knowing about the show, lasted four full seasons is a good reminder of how little-watched most cultural products are in this age of infinite content.]

BONUS CONTENT NO ONE ASKED FOR: Here is my Nathan for You idea: you know how headline stand-up comedians don't come on stage to perform until several lesser known and usually much lousier comics are trotted out to warm up the crowd? How, if you attend the live studio taping of a late night talk show like The Daily Show or The Tonight Show, some cheesy comic comes out beforehand to get your laugh muscles loose, your vocal cords primed? And when the headliner finally arrives, it comes as sweet relief?

What if there were an online dating service that provided such a warm-up buffoon for you? That is, when you go on a date, before meeting your date, first the service sends in a stand-in who is dull, awkward, a turn off in every way possible? But a few minutes into what seems to be a disastrous date, you suddenly show up and rescue the proceedings?

It sounds ridiculous, but this is just the sort of idea that Nathan for You would seem to go for. I haven't watched seasons 3 and 4 yet, so if he does end up trying this idea in one of those later episodes, please don't spoil it for me. I won't even be mad that my idea was not an original one, I'll be so happy to see actual footage of it in the field.

5. The aspect ratio of 2.00:1 is everywhere

I first read the case for 2.00:1 as an aspect ratio when the legendary cinematographer Vittorio Storaro advocated for it several years ago. He anticipated a world where most movies would have a longer life viewed on screens at home than in movie theaters, and 2.00:1, or Univisium, sits halfway between the typical 16:9 HDTV aspect ratio and 2.35:1 Panavision widescreen.

So many movies and shows use 2.00:1 now, and I really prefer it to 16:9 for most work.
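The "halfway" claim checks out with a little arithmetic of my own (not Storaro's):

```python
# Back-of-the-envelope check: 2.00:1 sits almost exactly halfway
# between the HDTV frame and classic anamorphic widescreen.
hdtv = 16 / 9          # ~1.78:1, the standard TV and laptop frame
panavision = 2.35      # 2.35:1, anamorphic widescreen
univisium = 2.00       # Storaro's proposed compromise

midpoint = (hdtv + panavision) / 2
print(f"midpoint of {hdtv:.2f}:1 and {panavision:.2f}:1 is {midpoint:.2f}:1")
print(f"Univisium {univisium:.2f}:1 is within {abs(midpoint - univisium):.2f} of it")
```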

6. Tuning AIs through captchas

Most everyone has probably encountered the popular new captcha that displays a grid of photos and asks you to identify which ones contain a storefront. I experienced it recently signing up for HQ Trivia. This breed of captcha succeeds the wave of captchas that showed photos of short strings of text or numbers and asked you to type what you saw, helping to train AIs learning to read them. There are variants of the storefront captcha: some ask you to identify vehicles, others street signs, but the speculation is that Google uses these to train the "vision" of its self-driving cars.

AI feels like magic when it works, but underrated is the slow slog of taking many AIs from stupid to competent. It's no different from training a human. In the meantime, I'm looking forward to being presented with the captcha that shows two photos, one of a really obese man, the other of five school children, with this question above them: "If you had to run over and kill the people in one of these photos, which would you choose?"

7. It's Mikaela Shiffrin profile season, with this one in Outside and this in the New Yorker

I read Elizabeth Weil's profile of Shiffrin in Outside first:

But the naps: Mikaela not only loves them, she’s fiercely committed to them. Recovery is the most important part of training! And sleep is the most important part of recovery! And to be a champion, you need a steadfast loyalty to even the tiniest and most mundane points. Mikaela will nap on the side of the hill. She will nap at the start of the race. She will wake up in the morning, she tells me after the gym, at her house, while eating some pre-nap pasta, “and the first thought I’ll have is: I cannot wait for my nap today. I don’t care what else happens. I can’t wait to get back in bed.”
 
Mikaela also will not stay up late, and sometimes she won’t do things in the afternoon, and occasionally this leads to more people flipping out. Most of the time, she trains apart from the rest of the U.S. Ski Team and lives at home with her parents in Vail (during the nine weeks a year she’s not traveling). In the summers, she spends a few weeks in Park City, Utah, training with her teammates at the U.S. Ski and Snowboard Center of Excellence. The dynamic there is, uh, complicated. “Some sports,” Mikaela says, “you see some athletes just walking around the gym, not really doing anything, eating food. They’re first to the lunchroom, never lifting weights.”
 

By chance, I happened to be reading The Little Book of Talent: 52 Tips for Improving Your Skills by Daniel Coyle, and had just read tips that sounded very similar to what is described here.

More echoes of Coyle's book in The New Yorker profile:

My presumption was that her excellence was innate. One sometimes thinks of prodigies as embodiments of peculiar genius, uncorrupted by convention, impossible to replicate or reëngineer. But this is not the case with Shiffrin. She’s as stark an example of nurture over nature, of work over talent, as anyone in the world of sports. Her parents committed early on to an incremental process, and clung stubbornly to it. And so Shiffrin became something besides a World Cup hot shot and a quadrennial idol. She became a case study. Most parents, unwittingly or not, present their way of raising kids as the best way, even when the results are mixed, as such results usually are. The Shiffrins are not shy about projecting their example onto the world, but it’s hard to argue with their findings. “The kids with raw athletic talent rarely make it,” Jeff Shiffrin, Mikaela’s father, told me. “What was it Churchill said? Kites fly higher against a headwind.”
 

So it wasn't a real surprise to finally read this:

The Shiffrins were disciples of the ten-thousand-hours concept; the 2009 Daniel Coyle book “The Talent Code” was scripture. They studied the training methods of the Austrians, Alpine skiing’s priesthood. The Shiffrins wanted to wring as much training as possible out of every minute of the day and every vertical foot of the course. They favored deliberate practice over competition. They considered race days an onerous waste: all the travel, the waiting around, and the emotional stress for two quick runs. They insisted that Shiffrin practice honing her turns even when just skiing from the bottom of the racecourse to the chairlift. Most racers bomb straight down, their nonchalance a badge of honor.
 

Coyle's book, which I love for its succinct style (it could almost be a tweetstorm if Twitter had slightly longer character limits; each tip averages one or two paragraphs), is the book I recommend to all parents who want their kids to be really great at something, and not just sports.

Much of the book is about the importance of practice, and what types of practice are particularly efficient and effective.

Jeff Shiffrin said, “One of the things I learned from the Austrians is: every turn you make, do it right. Don’t get lazy, don’t goof off. Don’t waste any time. If you do, you’ll be retired from racing by the time you get to ten thousand hours.”
 
“Here’s the thing,” Mikaela told me one day. “You can’t get ten thousand hours of skiing. You spend so much time on the chairlift. My coach did a calculation of how many hours I’ve been on snow. We’d been overestimating. I think we came up with something like eleven total hours of skiing on snow a year. It’s like seven minutes a day. Still, at the age of twenty-two, I’ve probably had more time on snow than most. I always practice, even on the cat tracks or in those interstitial periods. My dad says, ‘Even when you’re just stopping, be sure to do it right, maintaining a good position, with counter-rotational force.’ These are the kinds of things my dad says, and I’m, like, ‘Shut up.’ But if you say it’s seven minutes a day, then consider that thirty seconds that all the others spend just straight-lining from the bottom of the racecourse to the bottom of the lift: I use that part to work on my turns. I’m getting extra minutes. If I don’t, my mom or my coaches will stop me and say something.”
 

Bill Simmons recently hosted Steve Kerr for a mailbag podcast, and in part I it's fun to hear Kerr tell stories about Michael Jordan. Like so many greats, Jordan understood that the contest is won in the sweat leading up to the contest, and his legendary competitiveness elevated every practice and scrimmage into gladiatorial combat. As Kerr noted, Jordan was single-handedly a cure for complacency for the Bulls.

He famously broke down some teammates with such intensity in practice that they were driven from the league entirely (remember Rodney McCray?). Everyone knows he once punched Steve Kerr and left him with a shiner during a heated practice. The Dream Team scrimmage during the lead-in to the 1992 Olympics, in which the coaches made Michael Jordan one captain, Magic Johnson the other, is perhaps the single sporting event I most wish had taken place in the age of smartphones and social media.

What struck me about the Shiffrin profiles, something Coyle notes as well, is how unusually solitary the lives of the great ones are, spent in deliberate practice on their own, apart from teammates. It's obviously amplified for individual sports like tennis and skiing and golf, but even in team sports, the great ones have their own routines. Not only is it lonely at the top, it's often lonely on the way there.

8. The secret tricks hidden inside restaurant menus

Perhaps because I live in the Bay Area, it feels as if the current obsession is with the dark design patterns and effects of social apps. But in the scheme of things, many other fields whose work we interact with daily have many more years of experience designing for human nature. In many ways, the people designing social media have a naive and incomplete view of human nature, but the distribution power of ubiquitous smartphones and network effects has elevated them to the forefront of the conversation.

Take a place like Las Vegas. Its entire existence is testament to the fact that the house always wins, yet it could not exist if it could not convince the next sucker to sit down at the table and see the next hand. The decades of research into how best to part a sucker from his wallet make the volume of research among social media companies look like a joke, even if the latter isn't trivial.

I have a sense that social media companies are headed to where restaurants already are with menu design. Every time I sit down at a new restaurant, I love examining the menu and puzzling over all the choices with fellow diners, as if having to sit with me over a meal weren't punishment enough. When the waiter comes and I ask for an overview of the menu, and recommendations, I'm wondering what dishes the entire experience is meant to nudge me to order.

I'm awaiting the advent of digital, and eventually holographic or AR, menus to see what experiments we'll get. When will we have menus that are personalized? Based on what you've enjoyed here and at other restaurants, we think you'll love this dish. When will we see menus that use algorithmic sorting: these are the most ordered dishes all-time, this week, today? People who ordered this also ordered that? When will we see editorial endorsements? "Pete Wells said of this dish in his NYTimes review..."
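The "people who ordered this also ordered that" feature is, at its simplest, co-occurrence counting over order history. A minimal sketch, with made-up dishes and a function name of my own invention:

```python
from collections import Counter
from itertools import combinations

# Toy order history; each order is the set of dishes on one check.
# Entirely made-up data, purely for illustration.
orders = [
    {"burrata", "cacio e pepe", "tiramisu"},
    {"burrata", "margherita"},
    {"cacio e pepe", "tiramisu"},
    {"burrata", "cacio e pepe"},
]

# Count how often each pair of dishes appears on the same check.
pair_counts = Counter()
for order in orders:
    for pair in combinations(sorted(order), 2):
        pair_counts[pair] += 1

def also_ordered(dish, top_n=3):
    """Dishes most often ordered alongside `dish`, most frequent first."""
    co = Counter()
    for (a, b), n in pair_counts.items():
        if dish == a:
            co[b] += n
        elif dish == b:
            co[a] += n
    return [d for d, _ in co.most_common(top_n)]

print(also_ordered("burrata"))  # "cacio e pepe" tops the list here
```

Real menus would presumably weight by recency and filter by what's in season, but the core signal is just this kind of counting.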

Not all movies are worth deep study because not all movies are directed with intent. The same applies to menus, but today, enough menus are put through a deliberate design process that it's usually a worthwhile exercise to put them under the magnifying glass. I would love to read some blog that just analyzes various restaurant menus, so if someone starts one, please let me know.

9. Threat of bots and cheating looms as HQ Trivia reaches new popularity heights

When I first checked out HQ Trivia, an iOS live video streaming trivia competition for cash prizes, the number of concurrent players, displayed on the upper left of the screen, numbered in the hundreds. Now the most popular games, which occur twice a day, attract over 250K players. In this age where we've seen empires built on exploiting the efficiencies gained from shifting so much of social intimacy to asynchronous channels, it's fun to be reminded of the unique fun of synchronous entertainment.

What intrigues me is not how HQ Trivia will make money. The free-to-play game industry is one of the most savvy when it comes to extracting revenue, and even something like podcasts points the way to monetizing popular media with sponsorships, product placement, etc.

What's far more interesting is where the shoulder of the S-curve lies. Trivia is a game of skill, and with that come two longstanding issues. I've answered, at most, 9 questions in a row, and it takes 12 consecutive right answers to win a share of the cash pot. I'm like most people in probably never being able to win any cash.
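The skill wall is easy to quantify. Assuming, as a simplification of my own, that each question is answered correctly and independently with probability p, the chance of a 12-question streak collapses fast:

```python
# Probability of surviving 12 consecutive questions at per-question
# accuracy p. (My own simplification; real questions ramp in difficulty.)
for p in (0.70, 0.80, 0.90, 0.95):
    streak = p ** 12
    print(f"p = {p:.2f} per question -> {streak:.1%} chance of a 12-streak")
```

Even at 90% accuracy per question, a player survives the full run barely a quarter of the time; at 80%, fewer than one game in fourteen.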

This is an issue faced by Daily Fantasy Sports, where the word "fantasy" is the most important word. Very soon after they became popular, DFS were overrun by sharks submitting hundreds or thousands of lineups with the aid of computer programs, and some of those sharks worked for the companies themselves. The "fantasy" being sold is that the average person has a chance of winning.

As noted above in my comment about Las Vegas, it's not impossible to sell people on that dream. The most beautiful of cons is one the mark willingly participates in. People participate in negative expected value activities all the time, like the lottery, and carnival games, and often they're aware they'll lose. Some people just participate for the fun of it, and a free-to-play trivia game costs a player nothing other than some time, even if the expected value is close to zero.

A few people have asked me whether that live player count is real, and I'm actually more intrigued by the idea it isn't. Fake it til you make it is one of the most popular refrains of not just Silicon Valley but entrepreneurs everywhere. What if HQ Trivia just posted a phony live player count of 1 million tomorrow? Would their growth accelerate even more than it has recently? What about 10 million? When does the marginal return to every additional player in that count go negative because people feel that there is so much competition it's not worth it? Or is the promise of possibly winning money besides the point? What if the pot scaled commensurate to the number of players; would it become like the lottery? Massive pots but long odds?
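One way to see why a pot that scaled with the player count would feel lottery-like: if every player has the same chance of running the table and the winners split the pot, the expected value per player works out to roughly the pot divided by the player count, regardless of how long the odds are. A toy calculation, with all numbers and function names my own:

```python
# Expected value per player if n_players each finish a 12-streak
# independently with probability q, and winners split the pot evenly.
# Ignores rollovers when nobody wins (a simplifying assumption of mine).
def expected_winners(n_players, q):
    return n_players * q

def payout_per_winner(pot, n_players, q):
    return pot / expected_winners(n_players, q)

pot, players, q = 7_500, 250_000, 0.9 ** 12  # q ~28% at 90% accuracy
print(f"expected winners: {expected_winners(players, q):,.0f}")
print(f"payout per winner: ${payout_per_winner(pot, players, q):,.2f}")
# Per-player EV is q * payout, which simplifies to pot / players:
# about three cents here, no matter the skill level.
```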

The other problem, linked to the element of skill, is cheating. As noted in the article linked above, and in this piece about the spike in Google searches for answers during each of the twice-a-day games, cheating is always a concern in games, especially as the monetary rewards increase. I played the first game when HQ Trivia had a $7,500 cash pot, and the winners each pocketed something like $575 and change. Not a bad payout for something like 10 minutes of fun.

Online poker, daily fantasy sports, all are in constant battle with bots and computer-generated entries. Even sports books at casinos have to wage battle with sharks who try to get around betting caps by sending in all sorts of confederates to put down wagers on their behalf.

I suspect both of these issues will be dampeners on the game's prospects, but more so the issue of skill. I already find myself passing on games when I'm not with others who also play or who I can rope into playing with me. That may be the game's real value, inspiring communal bonding twice a day among people in the same room.

People like to quip that pornography is the tip of the spear when it comes to driving adoption of new technologies, but I'm partial to trivia. It is so elemental and pure a game, with such comically self-explanatory rules, that it stands as one of the primal forms or genres of gaming, just as HQ Trivia host Scott Rogowsky is some paragon of the game-show host, mixing just the right balance of cheesiness and snarkiness and effusiveness needed to convince all the players that any additional irony would be unseemly.

10. Raising a teenage daughter

Speaking of Elizabeth Weil, who wrote the Shiffrin profile for Outside, here's another of her pieces, a profile of her daughter Hannah. The twist is that the piece includes annotations by Hannah after the fact.

It is a delight. The form is perfect for revealing the dimensions of their relationship, and that of mothers and teenage daughters everywhere. In the interplay of their words, we sense truer contours of their love, shaped, as they are, by two sets of hands.

[Note: Esquire has long published annotated profiles; you can Google for them, but they are now all locked behind a paywall]

This format makes me wonder how many more profiles would benefit from letting the subject of a piece annotate it after the fact. It reveals so much about the limitations of understanding between two people, the unwitting and witting lies at the heart of journalism, and what Janet Malcolm meant when she wrote, in the classic opening paragraph of her book The Journalist and the Murderer, "Every journalist who is not too stupid or too full of himself to notice what is going on knows that what he does is morally indefensible."

Interview with Matthew Gentzkow

Due to this work, we now know that newspaper media slant is driven mostly by the preferences of readers, not newspaper owners. And by examining browser data, he discovered that people don’t largely live in internet “echo chambers”—that is, they don’t exclusively visit sites that align with their political bent. Product brand preferences, he found, are established early in life and endure long after exposure to essentially identical, less expensive alternatives.
 

That's from the introduction to this interview with 2014 Clark Medal winner Matthew Gentzkow. It immediately caught my eye because it echoes some ideas of my own (which is perhaps ironic, considering one of those points is about the overestimation of Internet echo chambers).

I think the Internet has expanded, on balance, the volume of ideas on all sides that most people are exposed to, offsetting the echo chamber effect. What should concern us is how people have reacted to that broadened exposure; instead of pushing people to the center, it has increased polarization. That may say more about how we receive ideas that threaten our worldviews and tribal affiliations than it does about the inherent nature of the internet. 

Like Gentzkow, I also believe the reason so much advertising targets young people, even though it's the adults who have the money, is to lock in consumer preference for life. In that respect, much of that advertising is more efficient than it appears.

Frankly, this interview contains so much high-quality material that I could excerpt it all day and still barely make a dent, so do read the whole thing.

Good news for parents who, on occasion, let their kids watch a bit of TV just to get a respite from caretaking duties.

This reflects what I think is an important conceptual point—that took a while to really sink in for us—which is that you can’t talk about the effect of TV without thinking about what it’s crowding out. TV viewing is shifting time around. And, really, for any new technology, any change that is shifting the allocation of time, its effect is the effect of that technology relative to whatever you would have been doing otherwise. 
 
That has pretty important implications for this question because if you think about children of different backgrounds and what else they might be doing with their time, it’s easy to imagine that for some kids, watching television is a much richer source of input than a lot of what it might be crowding out. TV has lots of language; it exposes them to lots of different people and ideas. 
 
It’s also easy to imagine kids for whom it could be a lot worse than whatever else they would have been doing. Educated, wealthy parents or parents with a lot of time to invest in their kids might be taking them to museums and doing math problems with them and so forth. I think part of the reason so many people writing about this assume TV is bad is that they themselves are in the latter group.
 

We have a strong norm in America about the corrupting influence of TV on children. I'm not sure how it arose or where it came from, but I'd love to know the history of that meme.

Regardless, what it means is that TV is often underrated for its positive aspects. I saw a paper once, though I can't seem to track it down, that showed that the introduction of TV in different countries and societies correlated with a strong rise in equality for a variety of groups including women and minorities.

That's not so surprising when you consider just how efficient television is at transmitting cultural norms. Humans love stories, and in this age those stories travel most efficiently to more people when encoded in the form of television and film narratives.

That said, TV has had a negative effect on political turnout.

On the other hand, TV isn’t just political information; it’s also a lot of entertainment. And in that research, I found that what seemed to be true is that the more important effect of TV is to substitute for—crowd out—a lot of other media like newspapers and radio that on net had more political content. Although there was some political content on TV, it was much smaller, and particularly much smaller for local or state level politics, which obviously the national TV networks are not going to cover. 
 
So, we see that when television is introduced, indeed, voter turnout starts to decline. We can use this variation across different places and see that that sharp drop in voter turnout coincides with the timing of when TV came in.
 

This reminds me of an idea I've written about before, that in this age of near infinite content, we now gravitate towards an information diet that is much more reflective of our daily preferences than in the past. Newspapers of old started with the front page and included editorially prescribed sections in equal volume: World, Business, Sports, Entertainment, Autos, and so on.

I was always skeptical those sections merited equal surface area, but it wasn't until readers could actually consume anything they wanted that we had a true view of their preferences. The internet is perhaps history's great lab on consumer choice, and what it shows is that most people generally only want small doses of the main entree of hard news, and a lot more appetizers and dessert: sports, entertainment, celebrity gossip, clickbait self-help, pornography. 

That's why the addition of The Ringer is valuable for Medium. There are only so many tech confessional pieces even the most ardent tech enthusiast can handle in the Silicon Valley bubble chamber; scatter a few copies of US Weekly on the coffee table and hang a flatscreen TV tuned to ESPN, and more people will visit more often.

My co-authors, Bart and JP, along with Sanjay Dhar, another co-author of theirs, had written a really important paper in the Journal of Political Economy a couple of years earlier that documented huge differences across U.S. cities in which brands are popular. They showed that that actually is correlated with the timing of which brands were introduced first in those cities, even though all of those introductions happened, for the most part, 50 or 100 years ago and few people remember a time when you couldn’t buy both. Say, for example, that we have two brands that have both been in a particular city for 50 years. If one was introduced 70 years ago and the other 50 years ago, you can predict that the one that’s been there for 70 years is going to have a much bigger market share.
 

We often think of first-mover advantage in sectors with network effects, perhaps none more clearly so than in messaging, with the odd geographically clustered favorites around the world. What Gentzkow notes here is that first-mover advantage can apply in consumer packaged goods, too. 

It's not that surprising, though I point it out for those who always question why brands aim advertising at unemployed millennials or kids without any income. Think about the loyalty fans have to sports teams from their childhood hometowns, long after they've moved elsewhere.

Our research results push back on that and say that, at least in this particular context, ownership is not really the key driver of slant and, in fact, a lot of the driver is actually coming from consumer demand. Not only does that say that you might not need to be as worried about ownership, but it also says that the welfare implications of this are a little more complicated because now consumers are getting what they want. 
 
We might think from a political, democratic point of view that it would be better if the public got different, more diverse information. But there’s going to be a welfare trade-off because we would be giving them content they would prefer less. If we want to give people diverse content that we think is good for democracy, then we have to get them to actually read, watch or consume it. And, you know, giving a bunch of people in conservative places some liberal newspaper—well, our results would suggest they’re not going to read it. So, that seems to have important implications for policy. 
 
But it comes with a really important caveat. The finding that ownership doesn’t matter in terms of a newspaper’s political slant is not a universal result. It doesn’t apply everywhere. It’s a statement about newspaper markets in the United States—a highly commercialized, relatively competitive setting, and a place where the political returns to manipulating the average content of a newspaper might not be all that big.
 

The chicken and egg question: did Fox News come along and satisfy a market need that conservatives weren't aware of, or did the market need summon Fox News out of nothingness?

If the filter bubble is not the internet's creation, but inherent to human nature, that argues for a much different solution than just exposing people to more ideas. Perhaps it's how the ideas are framed? How people are educated? Do we need to instill different mental models?

I'm fairly certain that taking an angry Trump supporter, cuffing them to a chair, locking their eyes open like Alex undergoing the Ludovico technique in A Clockwork Orange, and forcing them to watch Rachel Maddow for days on end isn't going to have the salutary effect one might suppose (and neither would force-feeding a liberal Fox News).

Chinese robber fallacy

Given the recent discussion of media bias here, I wanted to bring up Alyssa Vance’s “Chinese robber fallacy”, which she describes as:
 
...where you use a generic problem to attack a specific person or group, even though other groups have the problem just as much (or even more so)
 
For example, if you don’t like Chinese people, you can find some story of a Chinese person robbing someone, and claim that means there’s a big social problem with Chinese people being robbers.
 
I originally didn’t find this too interesting. It sounds like the same idea as plain old stereotyping, something we think about often and are carefully warned to avoid.
 
But after re-reading the post, I think the argument is more complex. There are over a billion Chinese people. If even one in a thousand is a robber, you can provide one million examples of Chinese robbers to appease the doubters. Most people think of stereotyping as “Here’s one example I heard of where the out-group does something bad,” and then you correct it with “But we can’t generalize about an entire group just from one example!” It’s less obvious that you may be able to provide literally one million examples of your false stereotype and still have it be a false stereotype. If you spend twelve hours a day on the task and can describe one crime every ten seconds, you can spend four months doing nothing but providing examples of burglarous Chinese – and still have absolutely no point.
 
If we’re really concerned about media bias, we need to think about Chinese Robber Fallacy as one of the media’s strongest weapons. There are lots of people – 300 million in America alone. No matter what point the media wants to make, there will be hundreds of salient examples. No matter how low-probability their outcome of interest is, they will never have to stop covering it if they don’t want to.
 

A fantastic and important post by Scott Alexander of the great Slate Star Codex: Cardiologists and Chinese Robbers.

This is why I'm so suspicious of anecdote-based journalism, especially when it comes from an outlet with a hallowed reputation. Think back to the piece on Amazon working conditions in the NYTimes, and ask how much actual data backs up the generalizations made in the piece. I'm not saying the individual stories of terrible managers don't matter; each of those in and of itself was terrible and worth deep investigation.

Many people I know just take it for granted that it's like that throughout the company, though. Take this op-ed from Joe Nocera. He felt comfortable enough, after reading that piece, to make sweeping statements like this:

It’s an enormously adversarial place. Employees who face difficult life moments, such as dealing with a serious illness, are offered not empathy and time off but rebukes that they are not focused enough on work. A normal workweek is 80 to 85 hours, in an unrelenting pressure-cooker atmosphere.
 

I will bet Joe Nocera his net worth that the average workweek at Amazon is not 80 to 85 hours. I don't think any company in the world with over 170,000 employees has an average workweek anywhere near 80 to 85 hours. But hey, it's just a NYTimes op-ed; let's just throw a crazy figure like that out there with no sourcing whatsoever. Who's going to fact-check an op-ed anyhow?

What 170,000 employees, and who knows how many former employees, provide a reporter is a lot of people to mine for Chinese robbers.

[Incidentally, that large a sample should also provide plenty of counter-examples, but Amazon's restrictive, and in my opinion short-sighted, social media policy prevents folks like that from speaking out. One employee couldn't take the piece lying down and wrote a rebuttal on LinkedIn, and later other former employees came out in the company's defense, including one who felt her story was used in the piece in a misleading way. It doesn't have to work just in the company's favor; other stories like this one have surfaced and added to some of the terrible anecdotes in the original NYTimes piece. But since the social media policy restricts current employees from speaking out, it likely mutes the largest population of people who enjoy working there.]

I don't mean to wade back into the Amazon debate here; parts of the piece, even if rhetorically framed with bias, struck me as reasonably accurate. It just happens to be the most prominent recent example of the Chinese robber fallacy that came to mind. Anyone who's been the subject of an anecdote-based journalistic piece should be suspicious of such pieces, yet so many people in and outside of tech took the Amazon piece as gospel.

The fact is, the Chinese robber fallacy really works. It must be so satisfying, as a reporter, to come across a source willing to go on the record with a dramatic narrative, even if it isn't statistically significant. That source also has spent their life looking for narrative patterns, and soon it's Chinese robbers all the way down.

Humans are wired to respond to narratives, to draw conclusions based on insufficient data. We're all looking for narrative shortcuts to the truth. When reporters give us a few carefully chosen examples, it's game over, regardless of whether or not it's a statistically significant sample, or whether or not the sample was plagued by selection bias.

Such journalism can be hugely important. It can move people's hearts, and that's often what's needed to change the world. But it's also a dangerous weapon. Recall Janet Malcolm's opening line to her classic book The Journalist and the Murderer:

Every journalist who is not too stupid or full of himself to notice what is going on knows that what he does is morally indefensible.
 

She meant it in a different context, but it echoes here.

Journalism with lots of data and statistics isn't sexy. It may not even require as much legwork as interviewing lots of people over a long period of time, and it's not the type of journalism that gets dramatized in the movies. But there's a reason science isn't based on a few good stories.

Interview with Jon Stewart

A few bits I dug from this good mini-profile of Jon Stewart:

Instead, he describes his decision to quit The Daily Show, the American satirical news programme he has hosted for 16 years, as something closer to the end of a long-term relationship. “It’s not like I thought the show wasn’t working any more, or that I didn’t know how to do it. It was more, ‘Yup, it’s working. But I’m not getting the same satisfaction.’” He slaps his hands on his desk, conclusively. 
 
“These things are cyclical. You have moments of dissatisfaction, and then you come out of it and it’s OK. But the cycles become longer and maybe more entrenched, and that’s when you realise, ‘OK, I’m on the back side of it now.’”
 
...
 
Like every TV celebrity, in person, Stewart is both better-looking than you expect and smaller, with his long torso making up most of his 5ft 7in, giving the illusion of height from behind his studio desk. He is dressed casually, and after years of watching him on TV wearing a suit, seeing him in a T-shirt and casual trousers feels almost like catching my father half-undressed.
 
...
 
Now that he is leaving The Daily Show, is there any circumstance in which he would watch Fox News again? He takes a few seconds to ponder the question. “Umm… All right, let’s say that it’s a nuclear winter, and I have been wandering, and there appears to be a flickering light through what appears to be a radioactive cloud and I think that light might be a food source that could help my family. I might glance at it for a moment until I realise, that’s Fox News, and then I shut it off. That’s the circumstance.”

Facebook hosting doesn't change things, the world already changed

Like any industry, the media loves a bit of navel-gazing (what is the origin of this phrase, because I don't enjoy staring at my own navel; maybe mirror-preening instead?). When Facebook announced they were offering to host content from media sites like The New York Times, the media went into a frenzy of apocalyptic prediction, with Mark Zuckerberg in the role of Mephistopheles.

All this sound and fury, signifying nothing. Whether media sites allow Facebook to host their content or not won't meaningfully change things one way or the other, and much of the FUD being spread would be energy better spent focused on other far larger problems.

Let's just list all the conditions that exist and won't change one bit whether or not you let Facebook host your content:

  • News is getting commodified. The days of being special just for covering a story are over. Beyond the millions of citizen journalists the internet has unleashed, you're competing with software that can do basic reporting. The tech press inadvertently furnished evidence of the commodification of news when, in the past few years, they all played a giant game of musical chairs, with seemingly everyone picking up and moving from one site to the next. Are these sites anything more than a collection of their reporters? If so, did the brands meaningfully change when everyone switched seats? I love and respect many tech reporters, but a lot of others seem interchangeable (though I like some of them, too). Instead of just reporting news, what matters is how you report it: your analysis, the quality of your writing and infographics, the uniqueness of your perspective. The bar to stand out is higher, as it tends to be when...
  • ...distribution is effectively free. Instead of pulp, our words take the form of bits that are distributed across...oh, you know. As the Unfrozen Caveman might say, “Your packets of data frighten and confuse me!” The Internet: seventh wonder of the world. This must be how it felt when electricity first became widespread. Or sewer systems. Okay, maybe not as great as sewer systems; I don't know how people lived before that.
  • Marketing is cheaper. You can use Twitter or Facebook or other social media to make a name for yourself. Big media companies can take advantage of that, too, but the incremental advantage is greater for the indies. Ben Thompson is one of my favorite examples, an independent tech journalist/writer living in Taiwan who built up his brand online to the point that I pay him $10 a month to have him send me email every day, and it's worth every penny. He is smarter about the tech industry than just about every “professional” journalist covering tech, and he's covered a lot of what I'm covering here already. He's just one example of how...
  • ...competition for attention is at an all-time high and getting worse. Facebook already competes with you, whether you let them host your content or not. So do Snapchat, Instagram, Twitter, IM, Yik Yak, television, cable, Netflix, video games, Meerkat/Periscope, movies, concerts, Spotify, podcasts, and soon VR. When it comes to user attention, the one finite resource left in media, most distractions are close substitutes.
  • Facebook will continue to gain audience. Even if Facebook pauses for a rest after having gained over 1 billion users, they also own Instagram, which is growing, and WhatsApp, which will likely hit 1 billion users in the near future, and Oculus, which is one part of the VR market which is one portion of the inception of the Matrix that we will all be living in as flesh batteries for Colonel Sanders in the medium-range future. If you think withholding your content from Facebook will change their audience meaningfully one way or the other, you really may be an unfrozen caveman from journalism's gilded age. The truth is...
  • Facebook and Twitter and other social media drive a huge % of the discovery of content. Media companies can already see this through their referral logs. This isn't unique to the text version of media. Facebook drives a huge share of YouTube video streams, which is why they're building their own video service, because why send all that free ad revenue to a competitor when you can move down the stack and own it yourself. And also, YouTube's ad model is not that great: those poorly targeted banner ads that pop up and cover the video in a blatant show of disrespect for the content, those pre-rolls you have to wait 5 seconds to skip...wait a minute, this sounds a lot like how...
  • ...media ad experiences are awful. I wonder sometimes if folks at media companies ever try clicking their own links from within social media like Twitter or Facebook, just to experience what a damn travesty of a user experience it is. Pop-ups that hide the content and that can't be scrolled in an in-app browser so you effectively can't ever close them to read the article. Hideous banner ads all over the page. Another pop-up trying to get you to sign up for a newsletter for the site when you haven't even read the article to see if you'd even want to get that newsletter (the answer is no, by the way). Forced account creation or login screens, also before you read a word of content. An interstitial ad that tries to load video for a few seconds while you wait patiently for a countdown timer or X button to close it out as quickly as possible. Short articles spread across 3 pages for no reason other than to inflate page views. Articles that take so long to load that you just click away because in-app browsers are already disadvantaged from a speed perspective, and media sites compound the problem by loading a ton of cruft like ad tracking and other crap all over the place, reducing the content to just a fraction of the total payload. It's the reading equivalent of being a beautiful girl at a New York bar, getting hit on by dozens of obnoxious first year investment banking analysts in pinstripe suits and Hermès ties. This is what happens when you treat your reader like a commodified eyeball to monetize and not a living, breathing human whose patronage you appreciate and wish to nurture. And this is why I'm happy when services like Flipboard or Facebook transform content into a more friendly reading experience. Chris Cox of Facebook said that reading on mobile is still a crummy experience, and amen to that. The poor media ad experience is a symptom of the fact that...
  • ...media business models are not great. Monopolies don't have to have great business models, because as Peter Thiel will tell you, being a monopoly is itself a great business model. For the longest time at media sites, and this probably still happens, the reporters sat on a different floor from the ad sales folks. This meant that the way the company made money was divorced from the product people (to use a more techie term). This works great when there isn't a lot of user choice (“No one ever got fired for buying IBM”) and the ad sales people can throw their weight around (before), but not so great when ad buyers suddenly have a whole lot more choice in where to spend their money (now). It turns out that having your best product people separate from your ad team is a dangerous game and leads to a terrible ad experience, which should come as a surprise to no one. Many still defend this practice as a way to preserve journalistic integrity, a separation of church and state that keeps the money from corrupting the writing, but the Internet has other ways to defend against that now. It's great that the New York Times has a public editor in Margaret Sullivan, but today the eyeballs of the world on your content serve as one giant collective public editor, like some human blockchain of integrity. I sympathize with media companies, though, because even if they wanted to improve on this front...
  • ...tech companies have better ad platforms than media companies. Facebook's native ad unit may not be perfect, but it's leaps and bounds better than the god awful ad experience on almost any media site. It's better not just for readers, but likely for advertisers, too. At Flipboard, we went with full-page ads a la glossy fashion magazines because our philosophy was that when content is on the screen, it deserves your full attention, and the same with ads; never the twain shall meet. This is exacerbated by the smaller screen sizes of mobile phones and tablets. Trying to split user attention with banner ads is a bad idea for both readers and advertisers, and most every study on ad recall and effectiveness that I've seen bears this out. Because of tech companies' scale and technology advantage, as noted in the previous bullet, their ad platforms will continue to get better and scale, while those at media companies will not. When I was at Hulu, we shopped around for an ad platform that could meet all our needs and couldn't find one, so we just rolled our own. That's possible if you can hire great developers, but if you're a media company, it's not easy, and that's because...
  • ...tech companies have a tech hiring advantage on non-tech companies. This sounds like it's self-evident, but it's critical and worth emphasizing. It's not just media but other businesses that suffer from this (which is particularly awful for consumers when it comes to information security). At this hour of the third industrial revolution, software is eating the world, but we still have a scarcity of software developers, let alone great ones. The ones that are blessed to live in this age want to work with other great developers at cool technology companies where the lunches are free, the dress codes are flexible, the hours vampiric, and ping pong tables abound. It's like being a free range chicken, but with stock options and before the death and refrigeration. Companies like that include Facebook, Google, Apple, Amazon, and so on, but they don't include most media companies, even though most of those also allow you to dress how you want, I think. Maybe someday the market will overcorrect itself and everyone will know how to program, but by that point we will probably all be living lives of leisure while AI software and robots take care of everything while we just lounge around experiencing a never-ending stream of personalized VR pleasure. If David Foster Wallace were alive to rewrite Infinite Jest, VR would be the infinite jest.
  • Design skill is not equally distributed. In an age when software comes to dominate more of the world, the returns to being great at user interface design are still high and will continue to be for some time. It's no wonder that Apple is the world's largest company now given their skill at integrated software and hardware design. That's become the most valuable user experience in the world to dominate. It's not going to let up, either. Every day I still experience a ton of terrible user experiences, from government to healthcare to education to household appliances to retail to you name it. The number of great product and design people in the world is still much too finite, and it happens that a lot of them work for tech companies. Not for companies in all the other industries I named above. Even in tech, the skills are too finite, which is why enterprise software is being disrupted by companies like Dropbox and Slack and others that simply bring a better user experience than the monstrosities that pass for most enterprise software. And yes, these people tend not to work for media companies.
  • Tech companies are rich. Take all the factors above, add it up, and it comes down to the fact that we're living through another gold rush, and this time most of the wealth is flowing into Silicon Valley. Take a bunch of companies that are extremely wealthy and employ great software developers and designers at a time when software is eating the world, add in a healthy dose of world-changing ambition, and you get companies that keep expanding their footprints, to the point where they are all competing in almost every business. People wonder why Apple might build a car, but I say why not? Above all, they are great at building computers, and what is a Tesla other than another portable computer (“The first one is an oversized iPad. The second is a revolutionary transport vehicle. And the third is a portable air conditioner. So, three things: an oversized iPad, a revolutionary transport vehicle, and a portable air conditioner. An iPad, a vehicle, and an air conditioner. An iPad, a vehicle…are you getting it? These are not three separate devices, this is one device, and we are calling it Apple Car.”)? Facebook, Apple, Google, Amazon, et al all continue to compete directly in more and more spaces because at their heart they are all software companies. I suppose they could have all decided not to compete with each other, but companies looking to maximize return in free markets usually don't behave that way, and so we'll see all of them trying to do more and more of the same things, like stream music and video, build smart phones, deliver stuff, etc. That's how a nuclear arms race happens. Your neighbor has the bomb, it's pointed at some part of your business, you get one too, if for no other reason than defensive purposes. 
Meanwhile, you also try to do some virgin land grabs, because networked businesses tend to reward first movers, and that's how you end up with tech companies trying to colonize space, build self-driving cars, float balloons around the world to bring the Internet to everyone, and, to bring it full circle, be the new front page for every user.

It's worth repeating: all the things above have been happening, are happening, and will continue to happen whether or not Facebook hosts your content.

By the way, you can still host your content yourself, even if you also let Facebook host it. Getting set up to host content on Facebook is largely a one-time fixed cost: some time spent providing them with a feed. It was the same at Flipboard, though some companies took longer than expected because they couldn't output an RSS feed of their content from legacy CMS systems. It was shocking to learn that a random blogger on Squarespace or Wordpress or Tumblr could syndicate their content more easily than a famous media company, but that was often the case, and it speaks to the tech deficit in play here.
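To give a sense of how small that fixed cost is, here's a minimal sketch of the sort of RSS 2.0 feed a CMS needs to emit, using only Python's standard library. The blog name, URLs, and post data are made-up placeholders, not any real publisher's feed:

```python
# Minimal RSS 2.0 feed built with the standard library, to illustrate
# the one-time syndication work described above. All data is hypothetical.
import xml.etree.ElementTree as ET

posts = [
    {"title": "First post", "link": "https://example.com/1",
     "description": "Hello, world."},
]

rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
ET.SubElement(channel, "title").text = "Example Blog"
ET.SubElement(channel, "link").text = "https://example.com"
ET.SubElement(channel, "description").text = "An example feed."

# One <item> per post, with the three required-ish child elements.
for post in posts:
    item = ET.SubElement(channel, "item")
    for field in ("title", "link", "description"):
        ET.SubElement(item, field).text = post[field]

feed_xml = ET.tostring(rss, encoding="unicode")
print(feed_xml)
```

A real feed would add dates, GUIDs, and full content, but the point stands: this is an afternoon of work for a competent developer, not a strategic commitment.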

This may all sound grim for media companies, but here's the kicker: it really is that grim. Wait, are kickers supposed to be positive? Maybe I meant kick in the butt.

Okay, I can offer some positives. A media company may not be able to be world class at every layer of the full stack, from distribution and marketing to ad sales and producing great content, but it doesn't have to be. Far better to be really good at one part of that, the one that tech companies are least likely to be good at, and that's producing great, differentiated content.

The fact is, great content is not yet commodified. That may sound like Peter Thiel's advice to be a monopoly: self-evident, non-trivial, not useful. But much of the best advice is just that, as banal as a fortune cookie prescription but no less true.

Let's take The New Yorker as an example. They don't try to compete on breaking news, though they have beefed up on that front with their online blogs. They hire great writers who go long on topics, and thus they can charge something like $50 a year for a subscription because their content is peerless. I'm subscribed through something like 2020 (so please stop mailing me renewal solicitations, New Yorker, please!?).

Look at Techmeme. They provide value by curating all the tech news out there, using a mix of humans and algorithms to prioritize the tech news stream into Silicon Valley's front page at any given moment. Curation is a key part of discovery; you don't have to produce content yourself to add value. A daily visit for me.

Look at HBO. A media company with great content that you can't easily find a substitute for, with a smart content portfolio strategy that minimizes subscriber churn. They surprised me recently by announcing they were going to launch HBO Now, ahead of when I anticipated, at the same price it costs to add it to a cable package. Kudos to them for not letting the innovator's dilemma handcuff them for too long.

Look at Buzzfeed. Ignore the jealous potshots from their elders and marvel at their ability to create content you can't easily find elsewhere. That's right, I said it. Despite being the company that everyone says just rips off other people's content, Buzzfeed actually has more content I can't find substitutes for than most tech news sites. It's not just their original content and reporting, which is good and getting better. Like Vox trying to make the top news stories of the day digestible for more people, Buzzfeed takes fun content and packages it in a really consumable way. It turns out in a world of abundance, most people would prefer just a portion of their media diet from the heavy news food group. More of their daily diet is from the more fun food groups, and Buzzfeed owns a ton of shelf space in that aisle. It's something other sites can do, but many avoid because they're too proud or because it isn't part of their brand. I saw white and gold, BTW.

Look at Grantland. They also hit the fun part of the daily diet by targeting pop culture and sports with great writers and new content daily. People jab at Bill Simmons a lot now that he is in the media penthouse, but he started as a blogger for AOL, and he was the first writer to really channel the fan's voice and point of view. It could've been you, perhaps, but it wasn't.

Look at Gruber, or Ben Thompson, or Marc Maron, or Serious Eats, or The Wirecutter, or Adam Carolla. Hell, even look at Fox News (just don't look too long). It turns out that differentiated content is differentiated. When the world's an all-you-can-eat buffet of information, you want to be the king crab legs, not the lettuce bowl.

Being a generalist reporter, someone who just shows up, asks questions, and transcribes the answers into a summary article, is not that valuable. If you cover an industry, do you understand that industry? Take tech reporters: many of them don't understand the underlying technology they write about. That may have sufficed in a bygone age, but it no longer does, which is good for Gruber's claim chowder business but bad for theirs. Taking the time to become an expert in a domain still has value because it takes hard work, and hard work is not a resource that's equally distributed in the world.

Some companies try to tackle more than one part of the stack, with some success. Look at MLBAM. They have managed to hire some strong technologists and build such a powerful platform that other media companies are syndicating it for their own use. Yeah, it's great to have content from a legally sanctioned monopoly to bootstrap your business, but credit to them for embracing the future and leveraging that content goldmine to build a differentiated technology platform.

Is it easy to replicate any of those? No, but your mother should have taught you that lesson long ago. At least what they're doing is clear and understandable to any outside observer.

If you've stuck with me this long, you may still think that hosting your content on Facebook is a Faustian bargain. Maybe Facebook changes their News Feed algorithm and your traffic vanishes overnight, like Zynga. Or maybe Facebook holds you hostage and asks for payment to promote your content more in the News Feed.

It's possible, but that risk exists whether your content is hosted there or not. Maybe hosting minimizes that risk a bit, but Facebook's first priority will always be to keep their users' attention and engagement because that's how they keep the lights on (and pay for the free lunches). If your content is engaging, it will keep a News Feed roof over its head, and if it doesn't, it won't.

Does that mean you have to write clickbait headlines and package stories up into listicles with animated GIFs? I don't think so, and if that's not your brand then by all means steer clear. That doesn't mean you shouldn't write a compelling headline, though. I despise clickbait headlines that try to coax a cheap page view out of content with barely any substance, but I appreciate a well-written headline over a dull one. Jeff Bezos used to caution us against the “tyranny of the or,” or false tradeoffs. This is one example. I also believe Zuckerberg and other Facebook execs when they say they'd like to weed out the more egregious clickbait from the News Feed. I understand if others don't, but my general belief about most tech companies is that they're just semi-evil.

Let's go deeper into the FUD. What if Facebook decides to go into the media business themselves? What if, instead of hosting your content, they produce their own and prefer it in the News Feed?

First of all, if that ever happens, it won't happen anytime soon. When you're in the phase of convincing folks to hop aboard your platform, you have to remove that possibility or no one will join.

Secondly, content production isn't generally a business that tech companies love. The margins aren't great, it's a headache to manage creative types, content production is messy and labor intensive, and tech companies prefer playgrounds where software economics play better.

It's far more likely that tech companies use their ample cash to license content. Remember how I said tech companies are rich? It turns out they are richer than movie studios and TV networks and newspapers and book publishers and music labels, and it turns out that writing a check for exclusive content hurts in the short term but is great in the long run when paired with the right business model, regardless of whether that's subscription or subsidized by ads. If you have the best ad units and platform, the marginal return on user attention is higher for you than for the next competitor, and that means licensing can make sense. You also get to meet some celebrities, who are beautiful and charming.

Lastly, if Facebook wanted to go into the media business, they could do it now, or they could do it in the future, and your Facebook hosting abstinence wouldn't matter one bit. They already have all the eyeballs they need; it's not like Netflix in its early days, which had to build a subscriber base before it could consider producing original content (thank you, First Sale Doctrine!). Long before Facebook even had a News Feed where your articles were shared, hundreds of millions of people already tuned in to see what that cute guy or girl was up to, to see their friends' latest selfies, and for other forms of ambient intimacy. I could even craft an argument that if all the sites out there stood on the sidelines, it might accelerate Facebook's move into the space.

And if Facebook did, if they decided to compete with The New York Times and Grantland and all the other media companies, or to buy one or more of them, is that so bad? Maybe you could work for them, if you're unique and differentiated. If you are, you'll do just fine, in this world and the next.

Did I mention they have free lunches?

The secret technology of The Daily Show

Many have mourned Jon Stewart's announcement that he'll be leaving The Daily Show this year. Count me among those dressed in black; Stewart has felt like my cool, whip-smart Jewish uncle for the past 16 years. One can claim that sometimes it's the format of a program that endures, and not the bodies filling the seats (Saturday Night Live, for example), but with both The Daily Show with Jon Stewart and The Colbert Report, that's just wishful thinking. These two shows, like the late night talk shows, have long carried their hosts' very names in their titles, and for good reason; without Stewart and Colbert, the shows will become something different out of both necessity and circumstance.

Emily Nussbaum wrote a wonderful appreciation of Stewart's legacy, and one piece of it caught my eye for pointing out what I consider the show's most undervalued skill.

The truth is that Stewart was often at his most exciting when he got down in the dirt, instead of remaining decent and high-minded, your twinkly-eyed smartest friend. Five years ago, when he confronted CNBC’s financial reporter Jim Cramer over his coverage of Wall Street, Stewart refused to be collegial. He nailed Cramer on his manipulations, airing clip after damning clip, and shouting “Roll 212!” with prosecutorial glee. He was a good interviewer with people he admired, but in some of the show’s most memorable segments he relied on search technology—in particular, his staff’s ability to cull clips and spin them into brutal montages—to expose lies that might have gone unremarked upon. Over time, he became not merely a scourge of phonies but the nation’s fact checker, training others in the craft. You can see that influence not only among hosts who started out on “The Daily Show,” including Colbert, John Oliver, and Larry Wilmore, but everywhere online. Twitter, on its best nights (and they do exist, doubters), can feel like a universe of sparkling mini-Stewarts, cracking wise but also working to mob-solve the latest crisis, and providing access to a far wider array of perspectives than any one comic could.

That kind of digging, of disrespecting authority, was a model for reinventing journalism, not comedy.

The secret technology behind The Daily Show was search.

Any viewer is by now familiar with the show's format. The opening third, almost always my favorite, would feature Stewart tackling a variety of the most prominent current events in politics and society and putting either some of the protagonists or the media on trial. Sometimes he'd dissect them himself, like a gifted if somewhat smug trial lawyer, but more often than not, he won by jiu-jitsu: he let witnesses hang themselves on a rope of their own words.

I've never read how they do it, but the Daily Show seems to have catalogued every piece of video from every politician and reporter in the history of television. Did a politician claim one thing? Here's a clip of them from another time, contradicting themselves. Did Fox News castigate Obama for his decisiveness on a piece of foreign policy? Here's a clip of their anchors praising Bush for the same quality when it came to a similar situation. Often that opening portion of The Daily Show felt like a Three Stooges clip, with hapless politicians slapping themselves in the face, Stewart and his writing staff pulling the strings.

Do they have banks and banks of cable boxes and DVRs, recording every minute of CSPAN, Fox News, CNN, and MSNBC, converting all the dialogue to text, labeling every moment with row after row of metadata? How many researchers do they have on staff? How do they retrieve clips so quickly each day, and what is the interface for that system? Can they run searches by simply stringing together words like "Bill O'Reilly" "hypocrisy" "Iraq War"? Or is there a giant dropdown box with a bunch of predefined categories like "old white senators saying racist things"?
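I have no idea what their actual system looks like, but the retrieval half is easy to sketch: treat every word of a clip's transcript, plus its hand-applied metadata, as terms in an inverted index, then intersect the postings lists for the query terms. Here's a toy version in Python; the clips, speakers, tags, and quotes are all invented for illustration:

```python
from collections import defaultdict

# Hypothetical clip library: transcript text plus hand-applied metadata.
clips = [
    {"id": "clip-2003-03-20", "speaker": "anchor a",
     "tags": ["iraq war", "praise"],
     "transcript": "the president showed real decisiveness tonight"},
    {"id": "clip-2011-03-19", "speaker": "anchor a",
     "tags": ["libya", "criticism"],
     "transcript": "this rush to act shows a reckless lack of judgment"},
]

# Inverted index: every word, tag, and speaker maps to a set of clip ids.
index = defaultdict(set)
for clip in clips:
    terms = clip["transcript"].split() + clip["tags"] + [clip["speaker"]]
    for term in terms:
        index[term.lower()].add(clip["id"])

def search(*terms):
    """Return ids of clips matching every query term (AND semantics)."""
    postings = [index.get(t.lower(), set()) for t in terms]
    return set.intersection(*postings) if postings else set()
```

A real system would need speech-to-text (or closed caption ingestion), phrase queries, and timestamps within clips, but the core data structure probably wouldn't look much different, whatever the interface on top of it is.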

In turning what seems like the entire history of television news into a deeply catalogued primary source, The Daily Show lifted the medium's journalistic standard. This isn't a new phenomenon. The internet is, above all else, the greatest information distribution technology in history, and many a writer or journalist has realized too late that it's not their immediate fact checker or editor whose standards matter but those of millions of internet-connected people with lots of time and Google as their default homepage. Eric Raymond, naming it Linus's Law, wrote that “given enough eyeballs, all bugs are shallow.” I propose a corollary: “given enough eyeballs and enough metadata, all lies become public.”

In cycling, drug testing authorities keep blood samples for years after events so they can retest them as better drug-detection tests are devised. Why, in the age of the internet, people continue to plagiarize is beyond me. Even if one can get away with it for the moment, everything ever written and posted online lives on until the day the original text is indexed and made searchable, and detecting the crime becomes a matter of a trivial exact-match query.
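To see just how trivial that check becomes once text is indexed, here's a toy exact-match detector in Python. It breaks documents into overlapping five-word windows ("shingles") and flags any passage that shares a window with the indexed corpus; the corpus and suspect passages are made up:

```python
def shingles(text, n=5):
    """Break text into overlapping n-word windows, lowercased."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

# Hypothetical corpus of previously published text.
corpus = [
    "the quick brown fox jumps over the lazy dog near the river bank",
    "to be or not to be that is the question",
]
index = set()
for doc in corpus:
    index |= shingles(doc)

def looks_copied(passage, index, n=5):
    """Flag a passage if any n-word window matches the index exactly."""
    return bool(shingles(passage, n) & index)
```

Production plagiarism detectors are fancier (hashed shingles, fuzzy matching, citation whitelists), but the core idea really is this simple once the original text is in the index.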

Video is late to this game, though, because it's much harder to index the spoken dialogue in video. Some companies have solutions; I've seen many a demo at trade shows, and we indexed closed caption files at Hulu, too. However, it's still not available to consumers for a significant percentage of video online. Yet. That's what makes The Daily Show's presumed video catalog, or index, so remarkable.

The third episode of Season 1 of Black Mirror, “The Entire History of You,” postulated a world in which The Daily Show's technology for trapping people with evidence of their own hypocrisy existed in our personal lives. An implant in our brains would record and index every moment of our lives, allowing us to put each other on trial for the rest of our days. It's a common downside scenario for total recall technology, mentioned in almost any article that has experimented with prototypes.

That episode of Black Mirror ends badly, as is common in this age of somewhat bleak science fiction. Real world evidence isn't so conclusive yet. Despite the almost nightly prosecution of The Daily Show with Jon Stewart, politicians and media like Fox News don't seem to have changed their behavior much, at least not to any level I can detect. Not even the rich and powerful are above shame, but it's safe to say many of them have a higher than average tolerance.

As for having our personal hypocrisies made shallow, I suspect a greater leniency towards each other would win out over continual witch hunting. Furthermore, a mutually assured destruction of reputation might naturally produce a bottom-up detente. After all, who among us hasn't said something we later wished to expunge or walk back? Some people point to internet trolling as a counter-example, but I suspect that's largely over-indexing on the loud minority over the reasonable silent majority, as our human brains love to do.

Even if such technology were widespread and forced us all to be more considered before we wrote or spoke, is that so bad? Taken to an extreme, that's a terrifying Orwellian scenario, but when Nussbaum writes that “[Stewart's] brand was decency,” she understands that much of the show's appeal was his own reasonable nature. Stewart often seemed exasperated at the rigid rhetorical stances in American politics, but it's difficult to believe he would have lasted 16 years at the desk if he didn't believe, deep down, that if we just hold up a mirror to ourselves, not a black mirror, nor one ringed with flattering warm lights, but just the clearest one available, we'd grow the hell up.

Conversation with Adam Curtis

Jon Ronson interviewed Adam Curtis over email. Good stuff.

On social networks as echo chambers (a common lament about the internet):

But I do really agree with you about Twitter domestically. Twitter – and other social media – passes lots of information around. But it tends to be the kind of information that people know that others in that particular network will like and approve of. So what you get is a kind of mutual grooming. One person sends on information that they know others will respond to in accepted ways. And then, in return, those others will like the person who gave them that piece of information.

So information becomes a currency through which you buy friends and become accepted into the system. That makes it very difficult for bits of information that challenge the accepted views to get into the system. They tend to get squeezed out.

I think the thing that proves my point dramatically are the waves of shaming that wash through social media – the thing you have spotted and describe so well in your book. It's what happens when someone says something, or does something, that disturbs the agreed protocols of the system. The other parts react furiously and try to eject that destabilising fragment and regain stability.

...

I have this perverse theory that, in about ten years, sections of the internet will have become like the American inner cities of the 1980s. Like a John Carpenter film – where, among the ruins, there are fierce warrior gangs, all with their own complex codes and rules – and all shouting at each other. And everyone else will have fled to the suburbs of the internet, where you can move on and change the world. I think those suburbs are going to be the exciting, dynamic future of the internet. But to build them I think it will be necessary to leave the warrior trolls behind. And to move beyond the tech-utopianism that simply says that passing information around a network is a new form of democracy. That is naive, because it ignores the realities of power.

On the failings of modern journalism:

The thing that fascinates me about modern journalism is that people started turning away from it before the rise of the internet. Or, at least, in my experience that's what happened. Which has made me a bit distrustful of all that "blame the internet" rhetoric about the death of newspapers.

I think there was a much deeper reason. It's that journalists began to find the changes that were happening in the world very difficult to describe in ways that grabbed their readers' imagination.

It's intimately related to what has happened to politics, because journalism and politics are so inextricably linked. I describe in the film how, as politicians were faced with growing chaos and complexity from the 1980s onwards, they handed power to other institutions. Above all to finance, but also to computer and managerial systems.

But the politicians still wanted to change the world – and retain their status. So in response they reinvented other parts of the world they thought they could control into incredibly simplistic fables of good versus evil. I think Tony Blair is the clearest example of this – a man who handed power in domestic policy making over to focus groups, and then decided to go and invade Iraq.

And I think this process led journalism to face the same problem. They discovered that the new motors of power – finance and the technical systems that run it, algorithms that try and read the past to manage the future, managerial systems based on risk and "measured outcomes" – are not just obscure and boring. They are almost impossible to turn into gripping narratives. I mean, I find them a nightmare to make films about, because there is nothing visual, just people in modern offices doing keystrokes on computers.

Where I'm often most frustrated with modern journalism is in its coverage of areas it does not understand well, technology being one of them. I'm not saying you have to be a programmer to be a tech journalist or a filmmaker to be a movie critic, but not having domain knowledge limits the scope of your critique. One more layman's point of view isn't all that useful at the margins, and as with things like the last financial crisis, the lack of understanding from the financial press removed what we think of as one of the watchdogs of democracy, the fourth estate.

The one saving grace of the internet is that many technology domain experts can chime in. Still, for many reasons, most do not. They may be too busy, or they may bite their tongue for competitive or political reasons (technology is a heavily connected industry).

Given technology's growing political, economic, and cultural power, a vigilant and independent check is needed. A Gawker or Valleywag picks off just the most egregious and obvious of moral failings, but much of that is distraction from far more complex and significant issues.

Buzzfeed: giving you what you'll share

The heavyweight personalization that social networks do either through something like the follow graph on Twitter or through a more algorithmic approach on Facebook is designed to give people the stuff that they want. What we're focused on is giving people stuff that they think is worth sharing with other people. We want the stories we're doing to have the biggest possible impact. So if we do personalization, it would be more of a personalization about what you’re most likely to share or discuss with your friends.

That's Jonah Peretti on what type of personalization Buzzfeed focuses on, emphasis mine on what is a subtle but important focusing distinction for the service.

In the traditional news bundle, say in old school print newspapers, the mix of serious versus entertaining content was weighted much more to the former. Now that technology has allowed the creation of more personalized bundles of information, sites like Buzzfeed and the social networks like Facebook and Twitter are showing us people's natural preference in the mix of heavy versus light, and it turns out that ratio is much more weighted towards the fun.

As in the newspaper days, the entertaining content still subsidizes, to a large extent, the serious journalism. Buzzfeed is starting to do some original reporting, but more likely than not it's ad revenue from listicles and the more “frivolous” content that will pick up the tab for both. Plus ça change.