Paradox of loss aversion

...but it is interesting that, all of a sudden, baseball teams and football teams become, in general, more strategically correct when they have more on the line.
In the World Series, closer usage is a lot better, when you bring your best guy in in the eighth inning or the seventh inning. You see in the NFL teams will go for two more often in the playoffs, go for it on fourth down more often in the playoffs. Which is a hint that when the stakes are low, culture tends to prevail. When the stakes are high and the outcome of the game is all that matters, then things are different.

An answer from Nate Silver in a conversation with Tyler Cowen, emphasis mine. The context is sports, but this is a paradox that occurs in real life so often, especially in the business world.

[BTW, a conversation between those two should be self-recommending, but if it isn't, consider this a recommendation. Sports and politics (what's the difference, really?), New York real estate, food, travel...they cover all their favorite topics.]

So often, companies will try anything but the right move, just because it's culturally expedient. Pressure from shareholders, investors, employees, reporters, your current users, and the public at large prevents companies from doing the right thing until they're at death's door.

It may be uncomfortable to do something controversial when things are going well (why rock the boat?), but it's far more pleasant to make changes when you're still growing than when you've hit the shoulder of the S-curve and are starting to flatline.

Mathematically, you should never buy insurance in blackjack, even when you're dealt a blackjack yourself, but depending on how you've been running up until then, you might buy it anyway, just to break a losing streak. When things are going badly, you might stop hitting on 16 or 12 even when the probability says you should, sub-optimal as that strategy is.
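The insurance claim is easy to verify with a little expected-value arithmetic. Here's a minimal sketch; the single-deck setup and the `insurance_ev` helper are illustrative assumptions, not anything from a gambling library:

```python
# Expected value of the 2:1 blackjack insurance side bet.
# Assumes a single 52-card deck: the dealer shows an ace, leaving
# 51 unseen cards, 16 of which are ten-valued (10/J/Q/K x 4 suits).

def insurance_ev(tens_remaining=16, cards_remaining=51):
    """EV per unit wagered: win 2 units if the hole card is a ten, else lose 1."""
    p = tens_remaining / cards_remaining
    return 2 * p - (1 - p)

print(f"Fresh deck:     EV = {insurance_ev():+.4f}")        # about -0.06
# Holding a blackjack yourself removes another ten from the deck,
# making insurance (the "even money" offer) slightly worse:
print(f"With blackjack: EV = {insurance_ev(15, 49):+.4f}")  # about -0.08
```

Insurance only breaks even when ten-valued cards make up a full third of the remaining deck, which a fresh deck never approaches; a card counter tracking a ten-rich shoe is the rare exception.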

So many times at Amazon we'd do things that the stock market would punish, but Bezos had enough control of the company and the conviction to ignore the cultural pressure and make the right long-term move. It's not surprising that some of the most successful technology companies today like Facebook, Amazon, and Google are able to shrug off external pressures. The founders have a controlling stake, of course, but also the will to enable their employees to play optimal strategy all the time, even if the risk of failure is higher.

Successful modern technology CEOs remind me a bit of studio executives in the '60s, many of whom, like Robert Evans, enabled directors to take creative risks. Now that most movie studios are public corporations, largely set up as financing vehicles, that type of movie is harder to get off the ground. The economics have changed, too, but the larger point is that leaders who understand that loss aversion is sub-optimal strategy need to create an environment in which the right types of failures are tolerated, even encouraged.

Loss aversion is most damaging, I suspect, when it's your current users who protest the loudest. If you're not willing to do your own customer segmentation, the market does it for you, and the largest segment is the silent majority who choose not to use your product/service.

Especially for networks, analyzing the people who've tried and abandoned your service is just as important as, if not more important than, rejoicing over those who have stayed. The latter are of course critical to understanding your product-market fit, but the former are just as critical for assessing when you're likely to hit the upper shoulder of the S-curve.

The paradox of loss aversion is that so much of tech is winner-take-all, like tournament poker instead of a cash game, so loss aversion is often the strategy that leads directly to total loss.

The network's the thing

Last week Instagram announced it was supporting more than just square aspect ratios for photos and videos. This led of course to a Slate article decrying the move, because Slate is that friend that has to be contrarian on every topic all the time, just to be annoying.

The square confines Instagram users to a small area of maneuver. It forces us to consider what details are essential, and which can be cropped out. It spares us from indulgence of the landscape and the false promise of the panorama.
But Instagram, which is owned by Facebook, is in the business of accommodating its users, not challenging them. One of the problems with the square, the company explained in its announcement, is that “you can’t capture the Golden Gate Bridge from end to end.” This example speaks to the needs of a certain kind of Instagram user who enjoys planting his flag on settled territory. Like an iPhone videographer at a Taylor Swift concert, the guy Instagramming the Golden Gate Bridge is not creating a rare or essential document, only proof that he saw it with his own eyes.
And why did he bother doing that, anyway? Clearly, because photographs cannot really capture the scope of the Golden Gate Bridge, or St. Peter’s Basilica, or the view from your car window as you drive up the Pacific Coast Highway. The impulse to capture these moments on camera is shaded by the knowledge that the moment, in all its immediacy, is too large to fit in a frame of any size.

I don't think my friend who snapped a pic of her daughter this morning or the friend who memorialized the little leaf the barista made in the foam on his latte was contemplating how wonderful it was that they were sparing me from the “indulgence of the landscape and the false promise of the panorama” but what do I know. I'm fairly certain the guy Instagramming the Golden Gate Bridge (I've done that a few times on Instagram) realizes he's not “creating a rare or essential document” but it never hurts to remind him, I'm sure he appreciates being set in his artistic place.

I'm glad Instagram is accommodating the additional aspect ratios, and it's a sign of how much their network has matured. People confuse arbitrary limits on social networks—Twitter's 140 character limit, Instagram's square aspect ratio and limited filters, to take two prominent examples—with their core asset, which is the network itself. Sure, the limits can affect the nature of the content shared, but Instagram is above all else a pure and easy way to share visual content with other people and get their feedback. That they started allowing videos and now differing aspect ratios doesn't change the core value of the network, which is the graph.

In fact, this move by Instagram validates the power of their network. If they were failing, they either wouldn't have survived long enough to make such a move or it would be positioned as some desperate pivot. Instagram is dealing from a position of strength here, expanding the flexibility of its tools to meet the needs of a still growing user base.

In the same way, Twitter should have lifted the 140 character limit on DMs much earlier than they did. The power of Twitter, deep down, is that it's a public messaging protocol. The 140 character limit is not its secret power. The network is.

I'd actually remove the 140 character limit on Tweets as well, though such a move would undoubtedly spawn even more of a public outcry than Instagram's move since so many power users of Twitter are journalists. Yes, a 140 character limit enforces some concision in writing, rewarding the witty among us, but it also alienates a lot of people who hate having to edit a thought multiple times just to fit in the arbitrary limit. Lots of those people abandoned Twitter and publish on Facebook instead. Twitter could always choose to limit how much of a Tweet to display in the timeline, preserving a higher vertical density of Tweets for people who are scanning.

Look at how many users of Twitter have to break long thoughts across multiple Tweets, in Tweetstorms or just long linked series of Tweets. Many of those are power users, yet I still see power users do it incorrectly every day, making it really difficult to follow the entire sequence. Users who want to link Tweets in a Tweetstorm or just to link their own Tweets together in a series should reply to each of their Tweets, removing their own username in the process. This allows readers who click one Tweet to easily see the rest of the Tweets in the series, and removing one's own username adds back some characters for the content and prevents it from seeming as if you're talking to yourself like a crazy person. That many have no idea how to do it is just one of Twitter's usability issues. It's a wonderfully elegant public messaging protocol, but its insistence on staying so low level is crazy. Don't even get me started on putting a period before a username in a Tweet; try explaining that to your mother with a straight face.

Here's another example. Look at how many experienced Twitter users now turn to apps like OneShot to attach screenshots of text to their Tweets as photos, to circumvent the 140 character limit. I happen to really enjoy those screenshorts, as they're sometimes called now, and they demonstrate how Twitter could expand their 140 character limit without overwhelming the timeline: just truncate at some point and add a click to expand function. This is yet another example of users generating useful innovation on top of Twitter when it should be coming from within the company.

Rather than force users to jump through all these hoops to publish longer content, Twitter could just allow users to write more than 140 characters in one Tweet, truncating the whole of it after some limit and posting a Read More button to allow readers to see the rest of the thought. Blasphemy! many will shout. I can already see the pitchforks in the distance. Some good old blasphemy is just what Twitter needs.

Longer character limits would likely increase the ability to follow conversations and dialogues on the service, too. One of the wonderful things about Twitter is that conversations between specific users can be read by other users. That's one of the greatest things about Twitter as a public messaging protocol. But because replies have to fit within 140 characters, often they need to be broken up into multiple Tweets. Many who reply don't realize that unless they hit the reply button on the previous Tweet in the conversation, the dialogue link is broken. Many mistakenly compose a new Tweet to continue the dialogue, not realizing that any reader clicking on that Tweet will not automatically see other Tweets in that conversation. Instead, it will just display by itself, as an orphan.

I run into this multiple times every day on the service, clicking on a Tweet without any easy way to figure out what it was in response to. If a lot of time has passed, it's often impossible to piece the conversation back together. It drives me crazy. I tried explaining how to piece broken conversation threads like this back together to a few people who abandoned Twitter and then realized I sounded like a madman. Why, in this day and age, should they have to learn such low level nonsense? Threaded conversations are, for the most part, a solved UI problem.

I'm not done with the character limits, so hold your disgust. You may wish to bring more than just your pitchforks after I'm done. Every Twitter conversation that involves more than two people devolves into a short series of retorts that eventually dies because each additional username consumes more of the 140 character limit, until there is no more room for actual dialogue.

It's absurd, but it's always been that way. Why usernames count towards the 140 character limit has always befuddled me. Meaningful conversation always has to migrate off of Twitter to some other platform, for no reason other than a stubborn allegiance to an arbitrary limit which made sense in the SMS age but now is a defect. If you're going to keep a character limit (could we at least double it?), let's not have usernames count against the limit. In fact, if I hit reply to someone's Tweet, do we even need to insert that person's username at the front of the Tweet? You can still send the notification to that user that I replied to their Tweet, and suddenly my reply won't seem so oddly formatted to the average reader. There are plenty of ways to indicate who the message is addressed to through contextual formatting, and if I wanted to mention them explicitly I could always write @username in the Tweet. But it's unnecessary to insert it by default.

Vine is perhaps the only network whose chief content-creation limit seems intrinsically part of the network, and that's because video is one type of content which can't be scanned, in which each additional second of content imposes a linear attention cost of one second on the viewer. A six-minute video costs the viewer 60X the attention that a six-second video does, and to even create a six-second video of any interest requires some clever editing to produce a coherent narrative. A Vine video joke has its own distinct pace; it's like a two-line riddle, often a 4.5 second setup with a 1.5 second punchline (at least that's the pacing in most Vines in my home feed).

This six-second limit still constrains the size of Vine's userbase, and they may be okay with that. I think that's fine. I enjoy Vine; it's its own art form. Still, the six-second limit means a lot of people don't turn to it for a lot of their video sharing. It's not easy to come up with a succinct six-second video clip.

Look at how Snapchat has evolved to see another company realizing that its power is not the initial constraint but the network. Snapchat still imposes a 10 second limit on video length. But now you can string many videos together into My Story. This was brilliant on their part; it allows viewers to skip any boring clip with one tap, but it allows the creator to tell longer stories simply by shooting multiple snaps in sequence. They lowered the content generation cost on creators without meaningfully increasing it for viewers.

Furthermore, Snapchat now allows you to download your Stories to your camera roll. Those who claim ephemerality is the key to Snapchat's success might panic at such a change, but all it demonstrates is that they realize they now have users for whom ephemerality isn't the main draw of the service. They haven't confused an arbitrary early limit for being the root of their success, and they understand the underlying power of their platform.

Perhaps more than any other social network, Facebook has long recognized that their chief asset is their graph. They've made all sorts of major changes to their interface, changes that always lead to huge initial outcries from their users, followed by a fade to silence as users continue to access the service in increasing numbers.

That they recognized this and had the courage of their convictions from such an early stage is not to be discounted. Plenty of companies live in fear of their early adopters, who often react negatively to any change. This leaves these companies paralyzed, unable to grow when they hit saturation of their early adopter segment. Because the global market of users has been expanded by the unprecedented reach of connected smartphones, early adopter segments can now number in the tens of millions, confusing companies into thinking that their early adopter segment is actually the mass market.

Twitter, more than any other company, needs to stop listening to its earliest users and recognize that deep down, its core strength is not the 140 character limit per Tweet, nor is it the strict reverse chronological timeline, or many other things its earliest users treat as gospel.

It's not even the ability to follow people, though for its current power users that has proved a useful way to mine some of the most relevant content from the billions of Tweets on the service. If Twitter realizes this, they'll understand that their chief goal should not necessarily be to teach the next several hundred million users how to follow hundreds of people, the way that the early adopters did. To do so is to mistake the next wave of users as being identical in their information consumption preferences and habits to the first 300 million, or whatever the true active count is among that number (I'm going to guess it's in the range of 40 to 80 million truly active daily users, though it's hard to tell without seeing the data).

Twitter's chief strength is that it's an elegant public messaging protocol that allows anyone to write something quickly and easily, and for anyone in the world to see that writing. It's a public marketplace of information. That's an amazing network, and the reason people struggle to describe Twitter is that a platform like that can be used for so many things.

If Twitter realizes that, then they'll realize that making that information marketplace much more efficient is the most critical way to realize the full potential of what is a world-changing concept. How do you match content from people who publish on Twitter with the readers who'd enjoy that content?

A people-follow model is one way, but a topic-based matching algorithm is another. Event-based channels are just a specific version of that. Search is one option, but why isn't there browse? I can think of a dozen other ways to turbocharge that marketplace off the top of my head, and the third party developer community, kicked out of the yard by Twitter so many times like stray dogs, could likely come up with dozens of others if they were allowed back in.

Twitter can leave the reverse chronological timeline in place for grumpy early adopters. It can be Twitter Classic. Most of those early adopters are largely happy with things the way they are, and if Twitter is scared to lose them, leave the current experience in place for them. I honestly don't think they'd abandon the service if Twitter raised the 140 character limit, or allowed for following of topics, or any number of other changes suggested here, because I think the power of the network is the network itself, but if the company has any such trepidations, it's not a big deal to leave Twitter Classic in place. The company has a huge engineering and product team; it's easy to park that experience in maintenance mode.

When social networks come into their own, when they realize their power is not in any one feature but in the network itself, they make changes like this that seem heretical. They aren't. Instead, these are fantastic developmental milestones, indicative of a network achieving self-awareness. A feature is trivial to copy. A network, on the other hand, is like a series of atoms that have bonded into a molecule. Not so easy to split.

It's a post for another day, but one of the defining features of our age is the rise of the network. Software may be eating the world, but I posit that networks are going to eat an outsized share because they capitalize disproportionately on the internet. Journalism, advertising, video, music, publishing, transportation, finance, retail, and more—networks are going to enter those spaces faster than those industries can turn themselves into networks. That some of our first generation online social networks have begun self-actualizing is just the beginning of that movement.

“People need dramatic examples to shake them out of apathy and I can't do that as Bruce Wayne. As a man, I'm flesh and blood, I can be ignored, I can be destroyed; but as a symbol... as a symbol I can be incorruptible, I can be everlasting.” Bruce Wayne, Batman Begins


A subreddit for early Apple Watch owners to find others to share taps, drawings, and heartbeats with. I suspect it's only a matter of time before someone turns this into an app for people to find others to do this without having to share one's iMessage email publicly. The Apple Watch seems like the ideal device for this type of simple interaction to tackle one of the internet's two most popular use cases: you are not alone.

Going undercover on social networks

Results show that men in relationships and with large on-line networks are more likely to look at women they do not know. In contrast, single men with large networks are more likely to look at women they do know.

From this Harvard Business School paper (PDF) which "proposes that networks can act as covers which allow actors to participate in markets while maintaining a plausible excuse that they are not. Such covers are most valuable to actors in long-term relationships, as those who are already employed or in a long-term romantic relationship should not be seen as participating in the market for a new relationship."

I wish, like OkCupid did with OkTrends ("Frequent tweeters have shorter real-life relationships than everyone else, probably via some hack"), social networks like LinkedIn and Facebook revealed more aggregate insight into human behavior culled from their gazillions of users. There's much of interest there, but it's locked away.

I have many theses I'd love to test. For example, the relationship between vain status updates on Facebook and the number of total profile photos you have uploaded, or the relationship between people who cheat at Words with Friends and people who cheat on their taxes. Foursquare checkins or Instagram food photo restaurant sources as a predictor of annual income. I need to finagle access to such data and turn it into a bestselling pop psychology book.

Audience as affordance: Twitter versus Facebook

Last November Matt Haughey of Metafilter fame wrote a great post at Medium that saw lots of pickup: Why I love Twitter and barely tolerate Facebook.

There’s no memory at Twitter: everything is fleeting. Though that concept may seem daunting to some (archivists, I feel your pain), it also means the content in my feed is an endless stream of new information, either comments on what is happening right now or thoughts about the future. One of the reasons I loved the Internet when I first discovered it in the mid-1990s was that it was a clean slate, a place that welcomed all regardless of your past as you wrote your new life story; where you’d only be judged on your words and your art and your photos going forward.

Facebook is mired in the past. My spouse resisted Facebook for many years and recently I got to watch over her shoulder as she signed up for an account. They asked her about her birth and where she grew up and what schools she attended, who her family might be. By the end of the process, she was asking me how this website figured out her entire social circles in high school and college. It was more than a little creepy, but that’s where her experience began.

I feel the same as the title of Haughey's post, and I agree with much of what he says, but my main reason for sharing his sentiment is different (or at least I think it is; Matt's a lot smarter than I am so it could be that I'm about to lay out a subset of his thesis).

Unlike Matt, I don't feel any pressure on Facebook to conform to any single consistent image of what people think I am or what I have been in the past. Many people who know me are in my Twitter follower graph also, and it's easy enough for anyone to associate my Twitter account with my real identity, so I don't think of it as a clean slate where I can be completely inconsistent with my identity elsewhere on the Internet.

I suspect many of my grade school friends who have just started tracking me again on Facebook after years of not having seen me might be surprised by my odd sense of humor, interests, and career choices, but it hasn't ever felt like a shackle. If anything I feel more pressure on Twitter to live up to the expectations of so many people I don't know who've chosen to follow me without any real-life connection.

That last point gets at what I find to be the primary difference between the two networks. To take a McLuhan-esque view of medium and message, the audience selection on each of those social networks is the primary affordance that shapes the content I create for them.

My Facebook graph is hundreds of people I've met through the years: immediate family and relatives, classmates from grade school through college, coworkers from various companies I've worked at. It's an emergent contact book more than anything else.

What it isn't, however, is an audience I can easily write for. It turns out that an assemblage of people you've met through the years is too diverse and random an assortment of people to treat as a coherent audience. What could I possibly write as a status update that would be interesting to my father, one of my coworkers from my first job out of college, the friend of a friend who met me at a pub crawl and friended me, and someone who followed me because of a blog post I wrote about technology?

This odd assortment of people all friended me on Facebook because they know me, and that doesn't feel like a natural audience for any content except random life updates, like relationship status changes, the birth of children, job changes, the occasional photo so people know what you look like now.

So unlike Haughey, what I struggle with about Facebook is not the constraint to be consistent with a single conception of myself; it's the struggle to target content to match multiple versions of myself.

Judith Rich Harris' great book The Nurture Assumption was a revelation to me as it explained much of the childhood tension in my life. Harris' insight was that the influence of parents on their children's mental and emotional development paled in comparison to that of the child's peers.

More than that, though, Harris made explicit something that most of us do without ever being conscious of. That is, we play different versions of ourselves among different groups of people. Early in life, our first split in personality comes between school and home. We play one role with our parents, a different role with our classmates at school. It explains why we're often embarrassed when our friends would come over to our house and see how our parents interacted with us, because we felt it exposed a version of our personality that we tried to hide from our classmates when at school.

Later, in adult life, we have a version of ourselves that we play with our spouse or the person we're dating, another version of ourselves with our coworkers, yet another with our siblings and parents, and a different version of ourselves with our kids. Some people accumulate online peer groups, for example people they play online video games with, and that creates yet another identity.

My followers on Twitter, in contrast, are largely people I don't know. Most people who follow me on Twitter choose to do so only because they find my tweets interesting, or at least that's how I interpret a follow, especially when it's someone I've never heard of (yes, I'm aware this is reflective of some non-trivial level of self-regard, but then again I am a person who still writes a blog; I struggle all the time with the amount of self-absorption inherent in having such an online presence).

Since I interpret each new follower that way, I think of my followers as a set of people who wandered past my stage at the circus and decided to stop and watch. Each additional follower reinforces that what I was writing on Twitter before must be of some interest to them, and so it reinforces my urge to write more of the same.

Facebook, with people from all those groups as the audience, forces us to collapse all our representations into one. It's a reverse network effect as a publishing platform, where as your graph grows, the urge to publish diminishes. I see more noise in my news feed now, and it feels more and more futile for me to post anything worthy of the attention of this odd assemblage of people whose sole misfortune was meeting my corporeal self.

As my Twitter follower count grows, I feel more incentive to raise my game and provide a consistent or increasing pace of high quality content. With Facebook, the more friends I add from more walks of life, the more paralyzed I feel about writing or posting anything.

Facebook does provide tools for you to solve this issue. You can divide your friends into different groups and post content to those specific subgroupings. In the opposite direction, you can filter stories of specific types and from specific people. Facebook also works hard to tune its algorithm for choosing which stories to show to whom to try to keep the signal to noise of the news feed high. And let's not forget that the Facebook graph, one that represents so many of the people I've met in real life, has its own value as it grows. It's become a valuable self-healing, self-growing address book.

But as a social interaction space, it feels like a party that's gotten too crowded. Organizing hundreds of people into groups is no fun, and I'm like most people in not even bothering (Twitter has lists, too, and I've never bothered putting my followers into any such lists). Algorithmic efforts to tune my News Feed aren't anywhere close to working, judging by my recent visits. It feels like more work than it's worth to mute stories or particular people; the noise-to-signal ratio is so high now.

If there's one way I do feel something similar to what Haughey feels, it's in feeling more disembodied on Twitter. My existence on Twitter has always felt like it lived inside my head, in the twists and turns of my attention. My content on Twitter is ideas, links to articles of interest, mental debris. I feel more corporeal on Facebook because people post pictures of me and most of the people in my graph there have seen me in real life. Because of that, I feel uncomfortable with the fact that my avatar on Twitter now is an actual picture of myself while my avatar on Facebook is a picture of a Theo Epstein Cubs jersey. It feels reversed.

The medium shapes the message. There's a reason that the photos I post to Instagram are so different from those I post to Flickr (and it's intimately related to why it will be harder than so many people think for Flickr to co-opt the ground that Instagram has claimed). There's a reason I check-in to more places on Foursquare than I do on Facebook, why I suspect Snapchat will carry vastly different content than something like WhatsApp.

It's why, when designing a social product today, it's so important to think through the flow by which new users build their graph. Facebook's suggested user algorithm constantly finds people it thinks I know in the real world, and so that's how my graph grows. Twitter, by contrast, is constantly suggesting users who they tell me are similar to those I've just followed. Because those suggestions are likely constructed from collaborative filtering across follow patterns, and because follow actions on Twitter tend to be based on the content that those people find interesting, what my Twitter graph has become is a really finely tuned content publishing graph, in both directions.
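That kind of "people who follow X also follow Y" suggestion can be sketched as simple co-follow counting. Everything below (the sample data and the `similar_accounts` helper) is an illustrative assumption about how such collaborative filtering might work, not Twitter's actual algorithm:

```python
from collections import Counter
from itertools import combinations

# Hypothetical follow graph: user -> accounts they follow.
follows = {
    "u1": {"nasa", "espn", "xkcd"},
    "u2": {"nasa", "xkcd"},
    "u3": {"nasa", "espn"},
    "u4": {"espn", "nfl"},
}

# Count how often each pair of accounts shares a follower.
co_follow = Counter()
for followed in follows.values():
    for a, b in combinations(sorted(followed), 2):
        co_follow[(a, b)] += 1
        co_follow[(b, a)] += 1

def similar_accounts(account, k=2):
    """Accounts most often co-followed with `account`, best first."""
    scores = {b: n for (a, b), n in co_follow.items() if a == account}
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Someone who just followed "nasa" might be shown its co-follow neighbors.
print(similar_accounts("nasa"))
```

Real systems work at vastly larger scale with matrix factorization or similar techniques, but the shape of the signal is the same: shared follows, not real-world acquaintance, drive the similarity, which is why the resulting graph tunes itself around content.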

I agree with Hunter Walk that it will be extremely difficult for Facebook to be a supergraph that just subsumes all subgraphs by sheer size. When Hunter speculates that "each generation needs a space to call their own," I suspect that what he might be homing in on is related not to generational shifts but to several natural inflection points in a person's identity. When you start going to school, your personality splinters in two, between your self with parents and your self with your classmates. For some, the shift to high school serves as another transition.

The next major transition is leaving home for college. There's a reason so many people become lifelong friends with people they meet in college but lose touch with friends from grade school or high school: often our personalities and selves shift in huge ways until college, when we find a stable adult self.

After that, there are possible inflection points, but not ones that affect as many people. Jumping into a long-term relationship can be one (fertile cinematic ground for Judd Apatow), marriage is another for some people, and having children is yet another seismic event, though often it's less about personality than responsibility. When their kids leave the nest, couples often have a moment for redefinition, which some seize. And of course, there's that moment when every woman or man morphs into that person who just doesn't give a damn anymore and just says whatever they think, becoming a sort of grumpy truthteller (think Maggie Smith's Dowager Countess in Downton Abbey or Judi Dench's M in the James Bond movies).

Hunter Walk notes that a social graph has never lasted 10 years at scale. I think there have been too many factors in play to extrapolate too much from that pattern; a lot of that was just products gone bad, or new products gone better. But chief among the challenges for all graphs rooted in identity-related content is the difficulty of surviving the leap across these major inflection points in our personalities and selves.

One last thought on this topic: while it may be tempting to use Twitter or Facebook as an authentication system for your new website or service to try to jumpstart the growth of your product's graph, first consider if that audience is the right one for your product. They're the right audience much less often than new services and apps think.

There is no one graph to rule them all because we have so many conceptions of ourselves. The exceptions to this rule of multiple selves tend to be people with asymmetric popularity (celebrities or internet luminaries who have many, many more followers than people they follow), since they tend to build up the same audience on any network they join. Rather than sharing various sides of themselves, they are just reinforcing themselves on every medium to maintain the image which brings them great wealth and/or popularity.

I used to wonder why Superman ever bothered being humble reporter Clark Kent at all. With infinite energy and the ability to fight crime effectively 24/7 on a global basis, Superman should spend all his time flying around the world thwarting criminals and natural disasters. Any other use of his time is criminal under-leverage of his skills.

Now that I'm older, I suspect he might just be an introvert who simply needs psychic time away from the spotlight of being Superman to maintain his sanity. On Twitter, Superman has tens of millions of followers to whom he posts photos of his heroic exploits, favorited and retweeted thousands of times, but on Facebook, as Clark Kent, he has just a few hundred friends, and he posts poorly lit photos of himself out on dinner dates with Lois Lane. Each of those photos gets about ten or twelve likes, along with frequent comments from his mother: "Cutest couple ever!!!"

Participation rate and user backlash

As many have pointed out, the reach of Instagram's TOS isn't significantly different from that of services like YouTube and Twitter. But users don't view all social services as equal (and yes, I treat YouTube as a nascent high-potential social service, perhaps Google's best chance to build an elite social network). I don't think Twitter users ever worry about their tweets being turned into advertisements; the concept seems extremely unlikely. As for YouTube, from its earliest days the value of having a video hosting service that offers global distribution immediately, for free, seemed worth any amount of advertising.

More importantly, though, I hypothesize one reason the outcry over the modifications to Instagram's TOS has been so much louder is that the ratio of content creation to content consumption on Instagram is higher than for those other services, or for just about any other social network I use. Almost everyone I follow on Instagram seems to post photos from time to time, certainly more than tweet regularly on Twitter or post videos to YouTube. We need a name for this metric for content sharing social networks:

Number of users who create content / Number of active users on that content network

I'll just call it participation rate for now. My guess is the higher the participation rate on a content social network, the more the users feel like they're creating the bulk of the value on that network. I don't think that's fair to Instagram as they were nearly perfect in creating the purest of social networks, and they do host and distribute a gazillion photos a day. But no matter, that's how users feel, and how they feel determines how they react when the company imposes a value capture mechanism (or in this case, hints at how they'll do it).
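The ratio above is simple enough to compute from an activity log. Here's a minimal sketch in Python; the event format and field names are hypothetical, just one plausible way a network might log activity:

```python
# Sketch of the "participation rate" metric: the fraction of active
# users in a period who actually created content, rather than only
# consuming it. Event format (user_id, action) is a made-up example.

def participation_rate(events):
    """Return creators / active users for an iterable of
    (user_id, action) events. Any event marks a user as active;
    only "post" events mark them as a creator."""
    active, creators = set(), set()
    for user_id, action in events:
        active.add(user_id)
        if action == "post":
            creators.add(user_id)
    return len(creators) / len(active) if active else 0.0

# Toy example: three active users, two of whom posted.
events = [("alice", "post"), ("bob", "view"),
          ("carol", "post"), ("carol", "like")]
print(round(participation_rate(events), 3))  # 2 of 3 users posted
```

On a network like Instagram, where nearly everyone posts, this ratio would sit near 1; on a viewer-heavy service like YouTube, it would be far lower.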

What exacerbated the issue was that the new TOS had language that implied that Instagram was going to use your photos to earn money and not provide you with any financial compensation. The users already felt like they'd created a huge percentage of the value in the network, and now it sounded like they'd be exploited to make the company's owners, who had already earned enough money to live out the rest of their days like some of their more well-heeled users, even more money.

[Note that for professional photographers and celebrities on the service, this is actually a serious monetary issue. Naysayers kept mocking regular users for thinking their terrible food and sunset photos would be monetizable in any way, but I follow a bunch of professional photographers and celebrities who make a huge percentage of their living off of monetizing their photographs, and the idea that Instagram could just jump in and take that is absolutely a non-starter. I hope they don't all flee, because where else am I going to get my regular fix of pics from the fairy tale life of baddiebey that leaves me in a constant state of capitalist envy?]

Without knowing what the participation rate is for the leading social networks, it's difficult to test this hypothesis, but one other way to test this would be to look at those who grumbled the most. I suspect they came largely from those who actively post photos instead of just consuming them.

If anyone has any data on this, I'd love to hear it. Namely, it would be fascinating to compare Instagram to two services which I'm guessing have much lower participation rates: Flickr and 500px. Both of those services embrace what I suspect is an audience composed largely of viewers and only partially of contributors (those who upload lots of photos). Both have designed their service with that user distribution in mind, choosing to monetize by targeting only the power users, that sliver of their population that actually uploads high-res photos in volume and values things like higher upload capacity or limits.

It doesn't seem like a strategy Instagram can easily borrow given their viewer/contributor distribution. What fraction of their users could they tax, and for what features? If it's true they have a more evenly distributed base of contributors, i.e. a high participation rate, it may be easier for them to just show ads for all their viewers. That seems the most likely path.