“Platform” risk

Last night, Twitter curtailed Meerkat's access to its graph. I saw lots of discussion on Twitter (I'd say this was ironic, but it's just expected) about why Twitter did it and whether it should just compete on its own merits with its recent acquisition, Periscope.

Some have termed what happened to Meerkat “platform risk,” and it is, but one must be willfully naive to consider ad-monetized social graphs like Facebook and Twitter to be capital P Platforms. I prefer to call them little p “platforms” (I'm drawing air quotes with my fingers in case you aren't watching me live on Meerkat as I write this).

Amazon Web Services (AWS) is a Platform. That is, you can count on it even if you use it to compete with its parent company, Amazon. To name one example, Netflix still runs its tech stack on AWS even as Amazon Instant Video spends over a billion dollars on content to battle Netflix in video streaming, and I've yet to hear of any company of any size getting bounced from AWS because it competed with Amazon. You could even start a retail company and use AWS. It's a utility, like the power company.

The reasons lie in both Amazon's business model and its philosophy. AWS isn't free. This is crucial because Amazon makes money off of its AWS customers regardless of what business they're in. As for AWS's philosophy, you can call it altruistic or just pragmatic or both, but if Amazon wants to compete with a company that uses AWS, Amazon will try to beat them in the marketplace. If it can't, it still gets a bite of that competitor's revenue through AWS fees. It's a win either way, and considering AWS is a fast-growing platform that's a critical piece of the world's technology stack, it's more than a minor one.

Compare this to free tech platforms offered by companies like Facebook and Twitter that make money off of ads targeted at their social graphs. If a company like Meerkat comes along and piggybacks off the Twitter graph to explosive growth and captures a unique graph, in this case around live video-casting, Twitter doesn't make any money. On the contrary, since the network effects of graph-based products tend to lead to “winner takes all” lock-in, Twitter just ends up having armed a formidable competitor that it might have to spend a lot to buy or compete with later. It's a no-win situation.

Facebook has similar ambivalence as a platform. Anyone familiar with the tech space in recent years can name more than one company that rode the Facebook graph and News Feed to explosive growth only to plummet off a cliff when Facebook turned a knob behind the scenes or just cut off access.

None of this should be surprising unless you're some “don't be evil” idealist. Take a more realpolitik view of tech and put yourself in Twitter's and Facebook's shoes. Why do they want developers to build off of their platforms? The ideal developers on their platforms would be apps and services that publish lots of great content into Facebook's News Feed and Twitter's Timeline so that users spend more time in either service seeing ads.

The worst kind of developer would be one that used either the News Feed or Timeline merely as a captive notification stream to build its own competitive social graph. Meerkat is guilty of at least part of that. It leaves random links in Twitter that take users out of Twitter's timeline to some other app to experience the content, and those stale links just sit in Twitter timelines like branding debris or, worse, spam.

For all its press these past few weeks, Meerkat's graph is relatively shallow. However, the potential for being first to get traction as another near real-time medium of note was rising with every live broadcast notification from another tech influencer. As Twitter knows better than anyone, it's not necessarily how many users you convert at the beginning of your journey to create a high-value graph; it's who you convert, and Meerkat had captured the imagination of some real luminaries. Furthermore, Meerkat is actually more real-time than Twitter, which lays claim to being the best publicly available real-time social network.

Notifications are the most valuable communication channel of the modern age given the ubiquity of smartphones, and Facebook and Twitter are among the most valuable information streams to tap into given their large user bases and extensive graphs. Email is no longer the summit of the communication hierarchy, and both Facebook and Twitter want to avoid the spam issue that polluted email's waterfalls.

This conflict of interest is why I refer to Facebook and Twitter as little p platforms. Developer beware. Unless Facebook and Twitter change their business models, any developer trying to build some other graph off of them should have a second strategy in place in case of explosive growth, because access won't persist.

Even before Facebook and Twitter, this type of platform risk from ad-supported businesses lay in wait to trap unsuspecting companies. Google search traffic is one of the better-known examples. Google's PageRank algorithm is, for the most part, a black box, and I've encountered many a company that fell on hard times or went out of business after Google tweaked PageRank behind the scenes and turned off the bulk of their organic traffic overnight. As Google enters more and more businesses, that platform risk only escalates.

Alternative Platforms do exist, even if they're not perfect, and that matters because AWS, as developer friendly as it is, doesn't offer a useful graph for companies looking for viral growth.

The most important such platform to date might be Apple's contact book. It's certainly one of the largest graphs in the world, and Apple doesn't rely on advertising to those users for income. The App Store is not completely open, but it's reasonably so, and once you're approved as an app it's rare that Apple would pull the rug out from underneath you the way Facebook and Twitter have.

Phone numbers were the previous generation's most accessible and widespread key to identity and the social graph, and Apple's iOS and Google's Android operating systems, along with the rise of the smartphone, suddenly opened a gateway to that graph. Many messaging apps bootstrapped alternative or parallel social graphs just that way. I doubt the telcos were looking that many moves ahead on the chessboard, and even if they had been, I'm not sure they would have had much recourse had they wanted to prevent it from happening.

Meerkat is a very specific situation, though, and the reason I still think of Twitter and Facebook as valuable platforms, even if it's with a lowercase p, is that developers and Twitter and Facebook can benefit from lots of other, more symbiotic relationships with each other. These relationships are possible specifically because of the nature of Twitter's and Facebook's primary ad units.

Both companies could do a better job of clarifying just what types of relationships qualify. This would head off developer frustration and keep developers from writing off those two platforms entirely, as many already have. Given how many developers have been burned in the past, distrust is high, but I believe a lack of clear and predictable rules creates more of the platform risk here than is necessary.

More on that in a follow-up post.

The curse of discernment

One of my favorite Schwartzisms is this: If you ever aren't sure if you attended the very best party or bought the very best computer, just settle for "good enough." People who do this are called "satisficers," and they're consistently happier, he's found, than are "maximizers," people who feel that they must choose the very best possible option. Maximizers earn more, Schwartz has found, but they're also less satisfied with their jobs. In fact, they're more likely to be clinically depressed in general.
 
The reason this happens, as Schwartz explained in a paper with his Swarthmore colleague Andrew Ward, is that as life circumstances improve, expectations rise. People begin comparing their experiences to peers who are doing better, or to past experiences they've personally had that were better:
 
As people have contact with items of high quality, they begin to suffer from “the curse of discernment.” The lower quality items that used to be perfectly acceptable are no longer good enough. The hedonic zero point keeps rising, and expectations and aspirations rise with it. As a result, the rising quality of experience is met with rising expectations, and people are just running in place. As long as expectations keep pace with realizations, people may live better, but they won’t feel better about how they live.
 

Olga Khazan in The Atlantic on Barry Schwartz and his recommendations for how to avoid the depression that comes from “the curse of discernment.” Schwartz is the author of The Paradox of Choice: Why More Is Less, one of the most cited consumer psychology books of recent memory.

The internet has intensified this curse: no one can make a purchase decision without reading a bunch of reviews online, or Googling “what is the best [X]” and trying to sift through a bunch of spammy websites to find some authoritative-sounding article. After all, the internet has democratized information and put it at our fingertips, so isn't it our own fault if we don't own the best SLR or printer or kitchen blender, or if we don't go to the best ramen house in Tokyo on our one visit there?

I will try to take this to heart, as maximization can be so contextual as to be meaningless. However, having had so much Korean BBQ in Los Angeles' Koreatown, I'm afraid my hedonic zero point for that cuisine has risen so far since my college days that just about every Korean BBQ restaurant in San Francisco I once loved now seems like a monastic compromise.

Do you see the black and blue?

The dress meme was a bit exasperating for a few days there in its oppressive ubiquity. It was the overexposed photo that launched a thousand think pieces about perception and epistemological humility. If there is an optical illusion hall of fame, the dress can take its place next to the old lady/young lady and the which-line-is-longer illusions.

I thought I was done with that damn dress forever, but this is a clever newsjacking of the meme by the Salvation Army in South Africa, even if the practice of newsjacking, a term I'd never heard of until I saw this piece, sounds a bit shady, as so many content marketing practices do.

Maybe this is how it ends, with lots of early retirements

Today, Yahoo Sports reported that 49ers linebacker Patrick Willis is retiring at the age of 30. It was a surprise to many that the five-time All Pro and seven-time Pro Bowler would retire with seemingly more good years ahead of him.

With the severity of the physical trauma of football becoming clearer and clearer by the day, I wonder if more and more players will choose to retire earlier in their careers, after accumulating some threshold of wear and tear and/or wealth.

Economically, it makes sense. A star player can make many tens of millions of dollars in the span of a short career, and even a regular starter can pocket millions. If you are frugal, you'll end up with a very comfortable nest egg. At that point, is each marginal million dollars worth perhaps several years of your life, or a remaining lifetime of debilitating pain, or worse?

The more we hear about players suffering terribly in their old age, the more the cost side of the economic equation grows. Given that data, more and more players will come to see just how rational Willis' decision is.

UPDATE (16 Mar 2015): Chris Borland of the 49ers announced today that he's retiring after just one season, and a great one, because of concerns over the impact of repetitive head trauma on his health. It's happening even sooner than I anticipated.

"I just honestly want to do what's best for my health," Borland told "Outside the Lines." "From what I've researched and what I've experienced, I don't think it's worth the risk."
 
Borland becomes the most prominent NFL player to leave the game in his prime because of concerns about brain injuries. More than 70 former players have been diagnosed with progressive neurological disease following their deaths, and numerous studies have shown connections between the repetitive head trauma associated with football, brain damage and issues such as depression and memory loss.
 
"I feel largely the same, as sharp as I've ever been. For me, it's wanting to be proactive," Borland said. "I'm concerned that if you wait till you have symptoms, it's too late. ... There are a lot of unknowns. I can't claim that X will happen. I just want to live a long healthy life, and I don't want to have any neurological diseases or die younger than I would otherwise."

I, for one, welcome our new female overlords

The great transformation of the past two centuries—the slow but relentless decline of male supremacy—can be attributed in part to the rise of Enlightenment ideas generally. The liberation of women has advanced alongside the gradual emancipation of serfs, slaves, working people and minorities of every sort. 
 
But the most important factor has been technology, which has made men’s physical strength and martial prowess increasingly obsolete. Male muscle has been replaced to a large extent by machines and robots. Today, women operate fighter jets and attack helicopters, deploying more lethal force than any Roman gladiator or Shogun warrior could dream of. 
 
As women come to hold more power and public authority, will they become just like men? I don’t think so. Show me a male brain, and I will show you a bulging amygdala—the brain’s center of fear and violence—densely dotted with testosterone receptors. Women lack the biological tripwires that lead men to react to small threats with exaggerated violence and to sexual temptation with recklessness.
 

From a good piece on how women are taking more and more leadership positions in the world, and why that's a good thing. The problem with men is testosterone, which manifests as aggression. That may have been useful in an age of constant hand-to-hand combat, but it's counterproductive in modern society. When Ali G asked that feminist, “Which is better, man or woman?” it turns out he was on to something, even as she deflected the question.

I suspect the future ideal for humans to emulate in thought and action is a computer, a machine immune to the types of logical mistakes driven by emotion and human biology. That ideal brings its own flaws, including software bugs and a lack of the compassion humans value so highly, but when it comes to large-scale optimization of measures like happiness, lives saved, and misery avoided, computers are likely far superior to human brains.