Federer and the paradox of skill

[CORRECTION: I originally titled this Federer and the paradox of luck, but it's actually more correctly termed the paradox of skill, so I've amended the title of this post. It's a term I first read in Michael Mauboussin's The Success Equation: Untangling Skill and Luck in Business, Sports, and Investing, a book I highly recommend.]

I was curious about a moment in the Federer-Murray Australian Open semifinal when the commentators and cameras caught Federer saying something to Murray and giving him a brief menacing stare after a long rally at 6-5 in the fourth set.

Without microphones on court, the commentators weren't sure what he said or why, but they briefly showed Murray responding with an exaggerated nod and smirk. The commentators did detect and remark on that brief moment of tension, and given how rarely we see tennis antagonism manifest itself in a visible way on court, it stuck in my brain as a curious mystery.

My buddy Ken sent me this article which clarifies the incident a bit.

Murray prevailed in a 15-stroke rally with a forehand winner, with both players finishing the point near the net. But Federer, on the brink of defeat, appeared to have taken issue with a slight mid-rally [hesitation] by Murray, and shouted “you [expletive]-ing stopped!” across the net. Murray appeared at first surprised, then amused, twisting his face into an exaggeratedly satisfied smirk, laughing and nodding toward his player’s box.

Federer was known for being a hothead early in his career, but I never saw much of it firsthand. Since his ascension into tennis immortality, he's largely been seen as a very level-headed sportsman.

One thing I've noticed a few times is that when Federer plays one of the other members of the Big Four (Djokovic, Murray, Nadal), he seems particularly sensitive to any points they win by luck. The article above mentions that BBC commentators had to apologize on air for audible obscenities from Federer during the semi against Murray.

Federer’s first clearly audible obscenity in his semifinal loss to Andy Murray came with Murray serving at 4-5, 15-30. Murray fired a body serve which Federer could just get his backhand in front of, and the exchange sent him into mostly indistinguishable muttering, punctuated with a loud, hard expletive in the middle.

Federer’s second audible offense came with Murray serving at 3-4, 40-40, in the fourth set. Murray won a 17-shot rally, and Federer exclaimed that his opponent had been “lucky,” preceding that word with a choice adverb.

I suspect most of you are thinking of the same adverb I am, so if I don't write it out I hope you don't see it as "ducking" the question [rimshot].

But a more memorable example is that extraordinary forehand return Djokovic hit against Federer in the 2011 U.S. Open semifinal. Down match point at 3-5 in the fifth set, Djokovic crushed a sideline-grazing crosscourt winner off a Federer first serve (you can see it at 8:12 of this video).

In the press conference after that match, which Federer eventually lost, he was unusually testy when asked about that Djokovic shot.

"It's awkward having to explain this loss," a tetchy Federer said, "because I feel like I should be doing the other press conference."

There followed a string of excuses and justifications which not only were barely sustainable given the evidence but seriously disrespected the winner.

Asked about the quite remarkable forehand winner Djokovic hit to save match point, Federer reckoned the Serb did not look at that point like someone "who believes much anymore in winning. To lose against someone like that, it's very disappointing, because you feel like he was mentally out of it already. Just gets the lucky shot at the end, and off you go."

Djokovic was honest enough to admit the shot was a gamble – but Federer was reluctant to give him credit even for that courage in a crisis, preferring to regard it as desperate.

"Confidence? Are you kidding me?" he said when it was put to him the cross-court forehand off his first serve – described by John McEnroe as "one of the all-time great shots" – was either a function of luck or confidence.

"I mean, please. Some players grow up and play like that – being down 5-2 in the third, and they all just start slapping shots. I never played that way. I believe hard work's going to pay off, because early on maybe I didn't always work at my hardest. For me, this is very hard to understand. How can you play a shot like that on match point? Maybe he's been doing it for 20 years, so for him it was very normal. You've got to ask him."

Translated, Federer hates that tennis might be decided in any way by luck rather than skill. It makes sense that someone who might be the most skilled tennis player of all time would be disgusted that luck plays any part in the outcomes of majors.

It will be fascinating to see if Federer alters his game in any way this next year or two given his age and the competition from his three chief rivals. I suspect deep down Federer has always believed he is more skilled than any of his opponents, and that might explain one of his chief weaknesses, an unwillingness to be more aggressive on service returns. If you believe you are better than your opponent in every aspect of the game, it's sufficient to put the ball back in play on the return because you believe you'll win the subsequent point more often than not.

But the paradox of skill is that the more evenly matched opponents are in skill, the more of a role luck plays in determining the final outcome. As beautiful as Federer's game remains (in a sense, the continued aesthetic beauty of his shots makes it hard to measure his decline), in today's power baseline game, his rivals are a close match to him in both movement and groundstrokes. You can make a strong case that one or more of them are superior to him in areas like serve, return, footspeed, and the backhand.
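The paradox is easy to see in a toy simulation. Here is a minimal sketch, assuming each player's performance in a match is just skill plus a normally distributed luck term; every number is invented for illustration:

```python
import random

def better_player_win_rate(skill_gap, luck_spread=1.0, trials=100_000):
    """Simulate many matches where performance = skill + random luck;
    return how often the more skilled player wins."""
    wins = 0
    for _ in range(trials):
        stronger = skill_gap + random.gauss(0, luck_spread)  # more skilled player
        weaker = random.gauss(0, luck_spread)                 # less skilled player
        if stronger > weaker:
            wins += 1
    return wins / trials

# As the skill gap shrinks relative to the luck spread, results approach a coin flip.
for gap in (2.0, 1.0, 0.5, 0.1):
    print(f"skill gap {gap}: stronger player wins {better_player_win_rate(gap):.0%}")
```

With a wide skill gap the stronger player wins almost every time; narrow the gap to a small fraction of the luck term and the match is close to a coin flip, with luck deciding nearly everything.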

Given that he no longer has that discernible skills gap to his chief rivals, a healthier acceptance of the role of luck might shift his strategy in ways that help him capture that next major. For example, it wouldn't hurt him to be more aggressive on return, to take some chances to go for the big winner and shorten some points. Can someone who is still so good and who can still recall with vivid detail the time when he had no rival be self-aware enough to change?

Self-help posits dual selves

If you, too, have reckoned with the size and scope of the self-help movement, you probably share my initial intuition about what it has to say about the self: lots. It turns out, though, that all that surface noise is deceptive. Underneath what appears to be umptebajillion ideas about who we are and how we work, the self-help movement has a startling paucity of theories about the self. To be precise: It has one.

Let us call it the master theory of self-help. It goes like this: Somewhere below or above or beyond the part of you that is struggling with weight loss or procrastination or whatever your particular problem might be, there is another part of you that is immune to that problem and capable of solving it for the rest of you. In other words, this master theory is fundamentally dualist. It posits, at a minimum, two selves: one that needs a kick in the ass and one that is capable of kicking.

This model of selfhood is intuitively appealing, not least because it describes an all-too-familiar experience. As I began by saying, all of us struggle to keep faith with our plans and goals, and all of us can envision better selves more readily than we can be them. Indeed, the reason we go to the self-help section in the first place is that some part of us wants to do something that some other part resists.

That's Kathryn Schulz in New York Magazine in one of the articles I've been catching up on from the holiday break. It's a thought-provoking read.

Schulz questions whether this idea of two selves, the metaphor at the heart of the self-help industry, is inherently flawed. Could that explain why the self-help industry continues to fail to solve so many of our problems, thus perpetuating its own existence? 

My personal survey of recent popular self-help books seems to indicate that the current predominant model of self is that we are fundamentally defective or prone to self-destructive behavior in ways we cannot overcome, and so the best way to help ourselves is to hack our environment. In essence, we must trick ourselves.

Take, for example, the Paleo diet, which says we can eat as much as we want, as long as it's the right types of food. Or the book Nudge, which says we can get people to make better choices by making the default choices the ones that are better for them. Or the book The Power of Habit, which says we are caught in almost subconscious habit loops that we can simply reprogram. Or the book What Shamu Taught Me About Life, Love, and Marriage, which recommends training your husband like you would a circus animal.

On the one hand, this conception of the self relieves some self-loathing, since it treats our defects as built-in rather than as failures of will. On the other hand, thinking of ourselves as being unaware of our own self-destructive behavior and having to trick ourselves into breaking these dangerous loops is a fairly grim view of human nature.

Perhaps we can take solace in simply being smart enough to be aware of what we still don't understand. It may not make us skinnier or more productive, but consciousness of the limitations of our current understanding of self gives us some hope that we might someday decipher it.

The Walking Dead

The Walking Dead is a great piece of pop entertainment. I should clarify, though, that I'm referring to the graphic novel, which just passed 100 issues a short while ago. In a series chock full of grim and horrifying developments, issue 100 was one of the most brutal, ending with another in a series of emotional punches to the gut.

Zombie apocalypse stories interest me not for the literal mechanics of surviving against flesh-eating zombies but for their exploration of social institutions. They're an order of magnitude more intellectually fascinating to me than modern vampire stories, the other horror movie archetype that just won't die (pun intended).

[Note I said "modern vampire stories"; older vampire and monster stories are of immense fascination for me. They were, prior to Westerns, one of the earlier genres to explore the tension between individual freedom and social norms. It's just the modern incarnations of vampire stories, co-opted as skeletons for tales of racism or teen romance, that seem intellectually lacking.]

Like Lord of the Flies, zombie stories begin with a scenario that explodes human constructs like society, law, and civilization, and returns us to a more primal, transactional world. The zombies are literal embodiments of humans in their most primal state, lower even than animals, as each zombie is concerned only with eating the flesh of other living creatures, human or otherwise. Unlike viruses, which can mutate and kill themselves off, zombies don't eat one another.

In a clever bit of irony, the only way to kill a zombie is to permanently damage its brain, traditionally the locus of human thought, either by beheading the zombie or severely damaging its skull. This despite the fact that zombies are already brainless. Vampires, on the other hand, must be staked through the heart. This disparity in the location of their vulnerability is not coincidence. Vampires tend to be explorations of human desire, sexuality, and emotion, and the heart has always been the poetic locus of those feelings.

Where The Walking Dead and other zombie stories are most compelling is in those high pressure moments when an encounter with a zombie forces snap decisions on how to treat other humans in the vicinity. My feelings about the AMC TV show ebb and flow, but I'm most interested in the show when it adheres most closely to the graphic novel's relentless pace of these types of do-or-die encounters rather than conjuring absurd soap opera side plots that surpass the acting abilities of the cast.

[Many people have written that all they want to see on the TV show The Walking Dead are more zombies being killed, but what I suspect will set the show apart is not the volume of those encounters, most of which are depicted with subpar computer graphics, but the volume of such encounters that force humans to make snap moral judgements.]

Encountering a zombie typically ends with one of four results: death at the hands of the zombie, successful violence against the zombie, violence among the humans trying to escape the zombie, or some form of cooperation among the humans. The drama lies in whether humans retain the compassion we expect of civilized people or resort to primordial violence against each other, Lord of the Flies style. The horror at the heart of zombie stories is less the literal terror of being chased by flesh-eating humans (the ones in The Walking Dead are typical of the majority of movie zombies in that they stumble around slowly like drunks) than the idea that, with a gentle nudge, the social conventions humanity has built over so many centuries will come toppling down.

What's unique about The Walking Dead (the graphic novel) among zombie stories is its length. Already over 100 issues, The Walking Dead is likely the longest zombie story ever told, and that opens the possibility for it to tell an even more epic story: the rise of society and government. In such a ruthless world, how do humans group together, and what arrangements do they come up with to provide food and security, and then, beyond that, perhaps even higher-order human needs like love and sex? In this way, The Walking Dead might break down society and government only to retrace, by dint of its sheer duration, the rise of those institutions.

Or perhaps its ultimate demise? Perhaps The Walking Dead is an epic depiction of mankind's long journey into extinction, with the zombie disease as a stand-in for any number of apocalyptic scenarios. I can't imagine either the graphic novel or the TV show embracing such a bleak conclusion, but it would be daring, wouldn't it?

One example came at the end of Season 2, when Rick gives a speech establishing himself as the dictator of the group: anyone who isn't comfortable with that can go off on their own. That's a blueprint for any number of stories in human history, including the rise of fascism. The various other bands of people Rick and his crew encounter in Season 3, and presumably beyond, if the show follows the graphic novel even roughly, will show us a variety of models for constructing society.

I Am Legend (the book, not the movie) took a similarly interesting arc over a shorter duration. It is about vampires, not zombies as many who watched the movie believe, but it has had an inordinate influence on the zombie genre. I don't want to spoil the novel with a plot summary (it has a killer of a twist ending), but it is a fascinating social fable, much more so than the movie, to no one's surprise.

Genre stories can be both mass entertainment and intellectually satisfying. My fingers are crossed that the TV show can live up to the thematic ambition of the graphic novel, even as it moves on to its third showrunner.

Audience as affordance: Twitter versus Facebook

Last November Matt Haughey of Metafilter fame wrote a great post at Medium that saw lots of pickup: Why I love Twitter and barely tolerate Facebook.

There’s no memory at Twitter: everything is fleeting. Though that concept may seem daunting to some (archivists, I feel your pain), it also means the content in my feed is an endless stream of new information, either comments on what is happening right now or thoughts about the future. One of the reasons I loved the Internet when I first discovered it in the mid-1990s was that it was a clean slate, a place that welcomed all regardless of your past as you wrote your new life story; where you’d only be judged on your words and your art and your photos going forward.

Facebook is mired in the past. My spouse resisted Facebook for many years and recently I got to watch over her shoulder as she signed up for an account. They asked her about her birth and where she grew up and what schools she attended, who her family might be. By the end of the process, she was asking me how this website figured out her entire social circles in high school and college. It was more than a little creepy, but that’s where her experience began.

I feel the same as the title of Haughey's post, and I agree with much of what he says, but my main reason for sharing his sentiment is different (or at least I think it is; Matt's a lot smarter than I am so it could be that I'm about to lay out a subset of his thesis).

Unlike Matt, I don't feel any pressure on Facebook to conform to any single consistent image of what people think I am or what I have been in the past. Many people who know me are in my Twitter follower graph also, and it's easy enough for anyone to associate my Twitter account with my real identity, so I don't think of it as a clean slate where I can be completely inconsistent with my identity elsewhere on the Internet.

I suspect many of my grade school friends who have just started tracking me again on Facebook after years of not having seen me might be surprised by my odd sense of humor, interests, and career choices, but it hasn't ever felt like a shackle. If anything I feel more pressure on Twitter to live up to the expectations of so many people I don't know who've chosen to follow me without any real-life connection.

That last point gets at what I find to be the primary difference between the two networks. To take a McLuhan-esque view of medium and message, the audience selection on each of those social networks is the primary affordance that shapes the content I create for them.

My Facebook graph is hundreds of people I've met through the years: immediate family and relatives, classmates from grade school through college, coworkers from various companies I've worked at. It's an emergent contact book more than anything else.

What it isn't, however, is an audience I can easily write for. It turns out that an assemblage of people you've met through the years is too diverse and random an assortment of people to treat as a coherent audience. What could I possibly write as a status update that would be interesting to my father, one of my coworkers from my first job out of college, the friend of a friend who met me at a pub crawl and friended me, and someone who followed me because of a blog post I wrote about technology?

This odd assortment of people all friended me on Facebook because they know me, and that doesn't feel like a natural audience for any content except random life updates, like relationship status changes, the birth of children, job changes, the occasional photo so people know what you look like now.

So unlike Haughey, what I struggle with about Facebook is not the constraint to be consistent with a single conception of myself; it's the struggle to target content to match multiple versions of myself. Judith Rich Harris' great book The Nurture Assumption was a revelation to me because it explained much of the childhood tension in my life. Harris' insight was that the influence of parents on their children's mental and emotional development pales in comparison to that of the child's peers.

More than that, though, Harris made explicit something most of us do without ever being conscious of it: we play different versions of ourselves among different groups of people. Early in life, our first split in personality comes between school and home. We play one role with our parents and a different role with our classmates at school. It explains why we were often embarrassed when our friends came over to our house and saw how our parents interacted with us; it exposed a version of our personality we tried to hide from our classmates at school.

Later, in adult life, we have a version of ourselves that we play with our spouse or the person we're dating, another version of ourselves with our coworkers, yet another with our siblings and parents, and a different version of ourselves with our kids. Some people accumulate online peer groups, for example people they play online video games with, and that creates yet another identity.

My followers on Twitter, in contrast, are largely people I don't know. Most people who follow me on Twitter choose to do so only because they find my tweets interesting, or at least that's how I interpret a follow, especially when it's someone I've never heard of (yes, I'm aware this is reflective of some non-trivial level of self-regard, but then again I am a person who still writes a blog; I struggle all the time with the amount of self-absorption inherent in having such an online presence).

Since I interpret each new follower that way, I think of my followers as a set of people who wandered past my stage at the circus and decided to stop and watch. Each additional follower reinforces that what I was writing on Twitter before must be of some interest to them, and so it reinforces my urge to write more of the same.

Facebook, with people from all those groups as the audience, forces us to collapse all our representations into one. It's a reverse network effect as a publishing platform: as your graph grows, the urge to publish diminishes. I see more noise in my news feed now, and it feels more and more futile for me to post anything worthy of the attention of this odd assemblage of people whose sole misfortune was meeting my corporeal self.

As my Twitter follower count grows, I feel more incentive to raise my game and provide a consistent or increasing pace of high quality content. With Facebook, the more friends I add from more walks of life, the more paralyzed I feel about writing or posting anything.

Facebook does provide tools for you to address this issue. You can divide your friends into different groups and post content to those specific subgroupings. In the opposite direction, you can filter out stories of specific types and from specific people. Facebook also works hard to tune its algorithm for choosing which stories to show to whom, to try to keep the signal-to-noise ratio of the news feed high. And let's not forget that the Facebook graph, one that represents so many of the people I've met in real life, has its own value as it grows. It's become a valuable self-healing, self-growing address book.

But as a social interaction space, it feels like a party that's gotten too crowded. Organizing hundreds of people into groups is no fun, and I'm like most people in not even bothering (Twitter has lists, too, and I've never bothered putting my followers into any such lists). Algorithmic efforts to tune my News Feed aren't anywhere close to working, judging by my recent visits. The noise-to-signal ratio is so high now that it feels like more work than it's worth to mute stories or particular people.

If there's one way I do feel something similar to what Haughey feels, it's in feeling more disembodied on Twitter. My existence on Twitter has always felt like it lived inside my head, in the twists and turns of my attention. My content on Twitter is ideas, links to articles of interest, mental debris. I feel more corporeal on Facebook because people post pictures of me and most of the people in my graph there have seen me in real life. Because of that, I feel uncomfortable with the fact that my avatar on Twitter now is an actual picture of myself while my avatar on Facebook is a picture of a Theo Epstein Cubs jersey. It feels reversed.

The medium shapes the message. There's a reason that the photos I post to Instagram are so different from those I post to Flickr (and it's intimately related to why it will be harder than so many people think for Flickr to co-opt the ground that Instagram has claimed). There's a reason I check-in to more places on Foursquare than I do on Facebook, why I suspect Snapchat will carry vastly different content than something like WhatsApp.

It's why, when designing a social product today, it's so important to think through the flow by which new users build their graph. Facebook's suggested-user algorithm constantly finds people it thinks I know in the real world, and so that's how my graph grows. Twitter, by contrast, constantly suggests users it tells me are similar to those I've just followed. Because those suggestions are likely constructed from collaborative filtering across follow patterns, and because follow actions on Twitter tend to be based on the content those people find interesting, my Twitter graph has become a finely tuned content-publishing graph, in both directions.
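I have no inside knowledge of Twitter's actual algorithm, but a minimal sketch of the kind of collaborative filtering I'm imagining looks something like this: score every account by how much its follower set overlaps with the follower set of the account you just followed, and suggest the closest matches. All account names and data below are hypothetical:

```python
def jaccard(a, b):
    """Overlap between two follower sets (0 = disjoint, 1 = identical)."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Hypothetical data: which users follow each account.
followers = {
    "@tennis_stats":     {"u1", "u2", "u3", "u4"},
    "@nba_numbers":      {"u2", "u3", "u4", "u5"},
    "@celebrity_gossip": {"u6", "u7"},
}

def suggest_after_follow(just_followed, k=2):
    """Rank the other accounts by follower overlap with the one just followed."""
    target = followers[just_followed]
    scored = sorted(
        ((jaccard(target, f), name) for name, f in followers.items() if name != just_followed),
        reverse=True,
    )
    return [name for _, name in scored[:k]]

print(suggest_after_follow("@tennis_stats"))  # ['@nba_numbers', '@celebrity_gossip']
```

Because people follow accounts for their content, overlapping follower sets act as a proxy for overlapping content interests, which is why the graph this process builds ends up being a content graph rather than a who-you-know graph.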

I agree with Hunter Walk that it will be extremely difficult for Facebook to be a supergraph that just subsumes all subgraphs by sheer size. When Hunter speculates that "each generation needs a space to call their own," I suspect what he might be homing in on is related not to generational shifts but to several natural inflection points in a person's identity. When you start going to school, your personality splinters in two, between your self with your parents and your self with your classmates. For some, the shift to high school serves as another transition.

The next major transition is leaving home for college. There's a reason so many people become lifelong friends with people they meet in college but lose touch with friends from grade school or high school: our personalities and selves often shift in huge ways up until college, when we find a stable adult self.

After that, there are possible inflection points, but not ones that affect as many people. Jumping into a long-term relationship can be one (fertile cinematic ground for Judd Apatow), marriage is another for some people, and having children is yet another seismic event, though often it's less about personality than responsibility. When their kids leave the nest, couples often have a moment for redefinition, which some seize. And of course, there's that moment when every woman or man morphs into the person who just doesn't give a damn anymore and says whatever they think, becoming a sort of grumpy truthteller (think Maggie Smith's Dowager Countess in Downton Abbey or Judi Dench's M in the James Bond movies).

Hunter Walk notes that a social graph has never lasted 10 years at scale. I think there have been too many factors in play to extrapolate too much from that pattern; a lot of that was just products gone bad, or new products gone better. But chief among challenges for all graphs that are rooted in identity-related content is the difficulty of surviving the leap across these major inflection points in our personalities and selves.

One last thought on this topic: while it may be tempting to use Twitter or Facebook as an authentication system for your new website or service to try to jumpstart the growth of your product's graph, first consider if that audience is the right one for your product. They're the right audience much less often than new services and apps think.

There is no one graph to rule them all because we have so many conceptions of ourselves. The exceptions to this rule of multiple selves tend to be people with asymmetric popularity (celebrities or internet luminaries who have many, many more followers than people they follow), since they tend to build up the same audience on any network they join. Rather than sharing various sides of themselves, they are just reinforcing the same self on every medium to maintain the image that brings them great wealth and/or popularity.

I used to wonder why Superman ever bothered being humble reporter Clark Kent at all. With infinite energy and the ability to fight crime effectively 24/7 on a global basis, Superman should spend all his time flying around the world keeping criminals and natural disasters in check. Any other use of his time is criminal under-leverage of his skills.

Now that I'm older, I suspect he might just be an introvert who needs psychic time away from the spotlight of being Superman to maintain his sanity. On Twitter, Superman has tens of millions of followers to whom he posts photos of his heroic exploits, favorited and retweeted thousands of times, but on Facebook, as Clark Kent, he has just a few hundred friends, and he posts poorly lit photos of himself out on dinner dates with Lois Lane. Each of those photos gets about ten or twelve likes, along with frequent comments from his mother: "Cutest couple ever!!!"

Computer vision

Today I spoke with someone who's an expert in computer vision. I didn't know anything about the state of that field, but one thing he told me was that computers can detect, with a high degree of accuracy, whether a figure they "see" is a woman or a man.

The two primary determinants are the length of the person's hair and the amount of skin exposed. Women have more of both, generally.
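I don't know what model he actually works with, but as a purely illustrative sketch, here's how a classifier might weigh just those two features. Every weight and number below is invented:

```python
import math

def p_classified_female(hair_length_cm, skin_exposed_fraction):
    """Toy logistic model over the two features described above:
    hair length and fraction of visible skin. Weights are made up."""
    score = -4.0 + 0.12 * hair_length_cm + 5.0 * skin_exposed_fraction
    return 1 / (1 + math.exp(-score))

print(round(p_classified_female(10, 0.10), 2))  # short hair, mostly covered -> ~0.09
print(round(p_classified_female(40, 0.50), 2))  # long hair, lots of skin    -> ~0.96
```

Plug in a long-haired, shirtless man and this toy model happily calls him a woman, which is exactly the failure mode below.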

So the types of men who can trip up computer vision on this problem are men with long hair who show lots of flesh. I'm thinking common false positives include Axl Rose, Fabio, and some members of the Hells Angels?