The Passion of the Christ: Blooper Reel

Pulling links to old McSweeney's humor pieces, I couldn't help but think of one more classic while rereading the Unused commentary by Howard Zinn and Noam Chomsky for The Fellowship of the Ring, and that is The Passion of the Christ: Blooper Reel.

Christ, shackled to a stone, is being scourged by Roman soldiers. Blood runs down his gory back. His pain is palpable. 
 
Jesus: [writhes in pain, hands shaking]
 
[Cell phone rings.]
 
Jesus: [hands shake furiously]
 
[Cell phone rings. Caviezel looks up, sheepish.]
 
Roman soldier: Jim? That you?
 
Jesus: Yeah.
 
[Cell phone rings.]
 
Soldier: Want me to get it?
 
Jesus: Yeah.
 
[Roman soldier gingerly reaches into Caviezel’s blood-soaked loincloth, pulls out phone and opens it, then holds the phone to Caviezel’s ear.]
 
Off Camera: [laughter]
 
Jesus: Hey, Mom.

 

...

The Last Supper. Jesus is in the upper room with his disciples. Judas (Luca Lionello) is seated nearby.

Jesus: If the world hate you, ye know that it hated me before it hated you. If ye were of the world, the world would love his own: but because ye are not of the world, but I have chosen you out of the world, therefore the world—ah, Christ.

Judas: Hateth you. 

Jesus: Who’s on first, right?

Judas: [laughs]

Jesus: [rolls eyes at camera] John could write gospel, but, you know, could he write dialogue?

Off Camera: [laughter] Cut!

Taylor Swift, a Socratic Dialogue

TAYLOR SWIFT: Tell me, Socrates, must the player always play, play, play?
 
SOCRATES: Well, that depends on what it is to be a player and what it means to play. Could you be more specific?
 
SWIFT: I’m thinking of the dirty, dirty cheats of the world. Those about whom so many get down and out while they could be getting down to sick beats. Alcibiades, for example, abandoned Athens and sought refuge in Sparta, then left Sparta for Persia before finally returning to Athens, leaving an inter-imperial trail of broken hearts.
 
SOCRATES: Yes, I see. Alcibiades is, in fact, a player who will play, play, play.
 
SWIFT: Yes, very much so.
 
SOCRATES: But must he? That is the question at hand.
 
SWIFT: I believe he must for, consider that the hater must hate, hate, hate, and the faker must fake, fake, fake. Why should the player be different?
 

From McSweeney's, an exemplar of what their homepage humor pieces are like: a bit of a highbrow-lowbrow remix. Since I'm here, some of my other favorites from McSweeney's.

Unused commentary by Howard Zinn and Noam Chomsky for The Fellowship of the Ring:

ZINN: Well, power needs to have its proxies. That way the damage is always deniable. As long as the Hobbits have the ring, no one will ever question the plot Gandalf has hatched. So here is the big scary ring, and all that happens when Gandalf moves to touch it is that he sees a big flaming eye. And notice it is a… different kind of eye — not like our eye.
 
CHOMSKY: Almost a cat-like eye.
 
ZINN: It’s on fire. Somehow being an on-fire eye is this terrible thing in the minds of those in Middle Earth. I think this is a way of telling others in Middle Earth to be ashamed of their eyes. And of course you see the Orcs’ eyes are all messed up, too. They’re this terrible color. And what does Gandalf tell Frodo about the ring? “Keep it secret. Keep it safe.”
 
CHOMSKY: “Let’s leave the most powerful object in all of Middle Earth with a weak little Hobbit, a race known for its chattering and intoxication, and tell him to keep it a secret.”
 
ZINN: Right. And here we receive our first glimpse of the supposedly dreadful Mordor, which actually looks like a fairly functioning place.
 
CHOMSKY: This type of city is most likely the best the Orcs can do if all they have are cliffs to grow on. It’s very impressive, in that sense.
 
ZINN: Especially considering the economic sanctions no doubt faced by Mordor. They must be dreadful. We see now that the Black Riders have been released, and they’re going after Frodo. The Black Riders. Of course they’re black. Everything evil is always black. And later Gandalf the Grey becomes Gandalf the White. Have you noticed that?
 
CHOMSKY: The most simplistic color symbolism.
 

It's Decorative Gourd Season, Motherfuckers:

I don’t know about you, but I can’t wait to get my hands on some fucking gourds and arrange them in a horn-shaped basket on my dining room table. That shit is going to look so seasonal. I’m about to head up to the attic right now to find that wicker fucker, dust it off, and jam it with an insanely ornate assortment of shellacked vegetables. When my guests come over it’s gonna be like, BLAMMO! Check out my shellacked decorative vegetables, assholes. Guess what season it is—fucking fall. There’s a nip in the air and my house is full of mutant fucking squash.
 
I may even throw some multi-colored leaves into the mix, all haphazard like a crisp October breeze just blew through and fucked that shit up. Then I’m going to get to work on making a beautiful fucking gourd necklace for myself. People are going to be like, “Aren’t those gourds straining your neck?” And I’m just going to thread another gourd onto my necklace without breaking their gaze and quietly reply, “It’s fall, fuckfaces. You’re either ready to reap this freaky-assed harvest or you’re not.”
 

“Toto's 'Africa'” by Ernest Hemingway:

The plane’s wings were moonlit and reflected the stars. The moonlight had guided him there, toward this salvation. He had stopped an older man along the way, hoping to find some long forgotten words, or perhaps an ancient melody, for such an occasion. The old man had said nothing at first, and instead stared cryptically into the sodden earth. Then he raised his head and turned slowly.
 
“Hurry, boy. It’s waiting there for you,” the old man had said.
 

The one I read every winter during the holidays, as classic as Little Drummer Boy, In Which I Fix My Girlfriend's Grandparents' WiFi and am Hailed as a Conquering Hero:

Some in the kingdom thought the cause of the darkness must be the Router. Little was known of the Router, legend told it had been installed behind the recliner long ago by a shadowy organization known as Comcast. Others in the kingdom believed it was brought by a distant cousin many feasts ago. Concluding the trouble must lie deep within the microchips, the people of 276 Ferndale Street did despair and resign themselves to defeat.
 
But with the dawn of the feast of Christmas did a beacon of hope manifest itself upon the inky horizon. Riding in upon a teal Ford Focus came a great warrior, a suitor of the gentlefolks’ granddaughter. Word had spread through the kingdom that this warrior worked with computers and perhaps even knew the true nature of the Router.
 
The people did beseech the warrior to aid them. They were a simple people, capable only of rewarding him with gratitude and a larger-than-normal serving of Jell-O salad. The warrior considered the possible battles before him. While others may have shirked the duties, forcing the good people of Ferndale Street to prostrate themselves before the tyrants of Comcast, Linksys, and Geek Squad, the warrior could not chill his heart to these depths. He accepted the quest and strode bravely across the beige shag carpet of the living room.
 

Ack, I need to stop. McSweeney's humor pieces are like Doritos or Pringles: you can't eat just one.

Past imperfect

Yet apparently none of this – not the demographic facts, Atticus' denial of being a radical, nor the iconoclast's commentary – prepared readers for the July publication of Go Set a Watchman, which depicts Atticus as a reactionary defender of Jim Crow set against the advance of civil rights. Many readers of Mockingbird never fully accounted for the fact that we receive our understanding of Atticus through the eyes of his eight-year-old daughter, whose failure to make the above inferences is not evidence that Atticus is free of racism, but of her limited understanding. These readers projected their own views of racial equality onto Atticus, willing into existence the anachronism of a southerner born in the nineteenth century having the racial views of, say, a liberal baby boomer. I suspect that some readers only remember the film version in any event, which does not contain Atticus' convincing denial of radicalism. And it may be particularly difficult to imagine the liberal Californian Gregory Peck, who portrayed Atticus (listed by the American Film Institute in 2003 as the greatest hero in American film), as thinking about race as a 1930s white southerner did.
 
While it might then appear that I am in accord with the iconoclastic critics of the Mockingbird Atticus, I am not. I understand the misreading of Atticus in a way that parts company with them. The problem is not, as these critics seem to think, that Mockingbird readers saw Atticus as a hero because they mistakenly thought he was a true racial egalitarian. The problem is that the readers saw Atticus as a radical egalitarian because, for other reasons, he was a hero, and it alleviates cognitive dissonance to believe our heroes are unsullied and uncompromised. Which is to say that, despite my criticisms of the Mockingbird Atticus, he is heroic. Facing considerable risks, he tried to save a man he hardly knew from a false charge of a capital crime.
 

I have not read Go Set a Watchman, but I'm fairly confident I won't read a more intriguing review of it than this one. The early reviews, at least among my peers, were unanimously harsh. Richard McAdams offers a more complex verdict on the book, one well worth reading to the end.

I also enjoyed this bit.

When it comes to preventing or correcting injustice, sometimes the world works this way: the courageous but compromised individual accomplishes more than the principled but timid one. Here, I think of the fact that the Yad Vashem in Israel bestows the honorific “righteous among the nations” on non-Jews for having risked their lives to save Jews during the Holocaust, not for holding the best, most enlightened views of Jews. There was never reason to think Atticus had the most enlightened views of African-Americans, but when the racist mob came for Tom Robinson, he did more than speak out.

Our precious soil

Top cities became hotbeds of innovative activity against which other places could not easily compete. The people clustered together boosted each others’ employment opportunities and potential income. From Bangalore to Austin, Milan to Paris, land became a scarce and precious resource as a result; the economic potential of a hectare of a rural Kentucky county is dramatically lower than that of a hectare in Silicon Valley’s Santa Clara county. And there is only so much of Santa Clara to go around.
 
Yet more Santa Clara could be built, were it not for the second and more distressing factor behind land’s return: the growing constraint imposed by land-use regulation. The Santa Clara town of Mountain View, for instance, is home to some of the world’s leading technology firms. Yet nearly half of the city’s homes are single-family buildings; the population density is just over 2,300 per square kilometre, three times lower than in none-too-densely populated San Francisco.
 
The spread of land-use regulation is not hard to understand. The clustering that adds to local economic vibrancy has costs, too, as the unregulated urban booms of the 19th century made clear. Crowded slums were fertile soil for crime and epidemics; filthy air and water afflicted rich and poor alike. Officials began imposing new rules on those building in cities and, later, on those extending them: limiting heights and building designs; imposing maximum densities and minimum parking requirements; setting aside “green belts” on which development was prohibited. Such regulations have steadily expanded in scope and spread to cities around the world.
 
As metropolitan economies recovered from their mid-20th-century slump, populations began growing again. The numbers of people living in the central parts of London and New York have never been higher. And as demand for quality housing increased, the unintended consequences of the thicket of building regulation that had grown up in most cities became apparent.
 

Great piece in the Economist about how land has become a constrained commodity, a brake on our economic growth. A lot of wealth has been made off of rent capture, and most of it accrues to the already wealthy while the middle class are saddled with mortgage debt and the poor, who rent, are just priced out of desirable regions.

This is a nightmarish scenario for the economy. It's bad any time you have a constraint on growth, but when the scarce commodity is land, it seems particularly difficult to remove because removal has to come through a political system that is in thrall to the moneyed, connected, real-estate-owning class.

The ugliest effect of the return of land, though, may be the brake it applies to the economy as a whole. One of the main ways economies increase worker productivity, and thus grow richer, is through the reallocation of people and resources away from low-productivity segments to more efficient ones. In business this means that bad firms go bust and good ones grow to great size. Something similar should hold for cities. Where workers can be put to use at high levels of productivity labour scarcity will lead to fast growing pay packets. Those pay packets will attract workers from other cities. When they migrate and find new, high-paying work, the whole economy benefits.
 
Mediterranean Avenue to Boardwalk
But that process is now breaking down in many economies. For workers to move to the high wages on offer in San Francisco, they must win an auction for a home that provides access to the local labour market. The bidding in that auction pushes up housing costs until there are just enough workers interested in moving in to fill the available housing space. Salaries that should be sending come-hither signals are ending up with rentiers instead, and the unfairness can trigger protest, as it has in San Francisco. Many workers will take lower-paying jobs elsewhere because the income left over after paying for cheaper housing is more attractive. Labour ends up allocating itself toward low-productivity markets, and the whole economy suffers.
 
Chang-Tai Hsieh, of the University of Chicago Booth School of Business, and Enrico Moretti, of the University of California, Berkeley, have made a tentative stab at calculating the size of such effects. But for the tight limits on construction in California’s Bay Area, they reckon, employment there would be about five times larger than it is. In work that has yet to be published they tot up similar distortions across the whole economy from 1964 on and find that American GDP in 2009 was as much as 13.5% lower than it otherwise could have been. At current levels of output that is a cost of more than $2 trillion a year, or nearly $10,000 per person.
 

First and foremost, let's acknowledge that this is all solvable if we just relax land use regulations, build more housing, and increase the population density of our urban centers. Supply and demand still works in this universe. For a variety of reasons, some clear to me (like NIMBYism) and some not, that seems intractable. Does the new David Simon show explain how hopeless it all is? I need to watch it so I can at least find some entertainment value in my despair.

If that's a road to nowhere, can the germ of a solution come from the private sector? Tech companies tend to be creative about trying to solve problems that constrain their growth because they arise from a culture of ignoring accepted impediments. For now, though, they haven't made a ton of progress on this issue. At most they've turned to providing shuttle services with wi-fi that widen the geographic footprint in which their employees can live and still get to work.

But that doesn't work if real estate is expensive everywhere within that radius. What's next? Perhaps a deeper investment in conferencing or virtual reality technology to amplify the sensation of proximity and intimacy of even the furthest flung workers? It's long been a promise of technology, but it has yet to be realized in full.

What about turning office space into living space at night when it lies idle? It sounds ridiculous, but a company did that with the slack time of cars and seems to raise a billion dollars of capital every other week. I know, it sounds terrible, living at the office, but I'm wary of making paternalistic prescriptions about how a person should spend their free time. Some of my closest friends in life are people I spent a lot of time with at the office at Amazon during my years there.

Maybe tech companies will start their own housing developments? The economic productivity of the average tech worker likely still exceeds their compensation, and tech companies, led by Google in particular, have been aggressive in pushing into that gap with increased salary, benefits, stock options, etc. Housing is just another form of investment, and they need not provide it for free; it could just be subsidized. Of course, that means they'd need to acquire land, and that, paired with a thicket of land-use regulations, still restricts the human density achievable with even the most aggressive development.

Finally, solving the problem for just tech workers doesn't feel like a path towards solving it for the rest of the population. Frankly, no one feels much sympathy for tech workers these days (with the exception of some Amazonians working long hours, though I'm still suspicious; a lot of that sympathy feels like Schadenfreude in disguise). Bay Area complaints about high real estate costs are going to fall on deaf ears, even if they're symptomatic of a dangerous trend for our economy, as noted by the Economist article.

Group all the things I miss the most about NYC, and the root of all of them is the unmatched human density. It's not just the visceral sensation of the people around you (which some dislike), but the diversity of businesses and communities that can sprout and thrive when the potential customer base is so large and tightly packed. The variety of entertainment, like theater and museums, the variety of local cuisine, the ability to find someone to share almost any interest, from curling to revivalist arthouse cinema to hip hop dancing.

I have no answers, only a longing for the Bay Area to experience the liveliness of density in the physical world to match that of the networks and communities they've built in the virtual space, where real estate is cheap and plentiful.

The network's the thing

Last week Instagram announced it was supporting more than just square aspect ratios for photos and videos. This led of course to a Slate article decrying the move, because Slate is that friend that has to be contrarian on every topic all the time, just to be annoying.

The square confines Instagram users to a small area of maneuver. It forces us to consider what details are essential, and which can be cropped out. It spares us from indulgence of the landscape and the false promise of the panorama.
 
But Instagram, which is owned by Facebook, is in the business of accommodating its users, not challenging them. One of the problems with the square, the company explained in its announcement, is that “you can’t capture the Golden Gate Bridge from end to end.” This example speaks to the needs of a certain kind of Instagram user who enjoys planting his flag on settled territory. Like an iPhone videographer at a Taylor Swift concert, the guy Instagramming the Golden Gate Bridge is not creating a rare or essential document, only proof that he saw it with his own eyes.
 
And why did he bother doing that, anyway? Clearly, because photographs cannot really capture the scope of the Golden Gate Bridge, or St. Peter’s Basilica, or the view from your car window as you drive up the Pacific Coast Highway. The impulse to capture these moments on camera is shaded by the knowledge that the moment, in all its immediacy, is too large to fit in a frame of any size.
 

I don't think my friend who snapped a pic of her daughter this morning or the friend who memorialized the little leaf the barista made in the foam on his latte was contemplating how wonderful it was that they were sparing me from the “indulgence of the landscape and the false promise of the panorama” but what do I know. I'm fairly certain the guy Instagramming the Golden Gate Bridge (I've done that a few times on Instagram) realizes he's not “creating a rare or essential document” but it never hurts to remind him, I'm sure he appreciates being set in his artistic place.

I'm glad Instagram is accommodating the additional aspect ratios, and it's a sign of how powerfully their network has matured. People confuse arbitrary limits on social networks—Twitter's 140 character limit, Instagram's square aspect ratio and limited filters, to take two prominent examples—with their core asset, which is the network itself. Sure, the limits can affect the nature of the content shared, but Instagram is above all else a pure and easy way to share visual content with other people and get their feedback. That they started allowing videos and now differing aspect ratios doesn't change the core value of the network, which is the graph.

In fact, this move by Instagram validates the power of their network. If they were failing they either wouldn't have survived long enough to make such a move or it would be positioned as some desperate pivot. Instagram is dealing from a position of strength here, expanding the flexibility of its tools to meet the needs of a still growing user base.

In the same way, Twitter should have lifted the 140 character limit on DMs much earlier than they did. The power of Twitter, deep down, is that it's a public messaging protocol. The 140 character limit is not its secret power. The network is.

I'd actually remove the 140 character limit on Tweets as well, though such a move would undoubtedly spawn even more of a public outcry than Instagram's move since so many power users of Twitter are journalists. Yes, a 140 character limit enforces some concision in writing, rewarding the witty among us, but it also alienates a lot of people who hate having to edit a thought multiple times just to fit in the arbitrary limit. Lots of those people abandoned Twitter and publish on Facebook instead. Twitter could always limit how much of a Tweet to display in the Timeline, allowing a higher vertical density of Tweets when people are scanning.

Look at how many users of Twitter have to break long thoughts across multiple Tweets, in Tweetstorms or just long linked series of Tweets. Many of those are power users, yet I still see power users do it incorrectly every day, making it really difficult to follow the entire sequence. Users who want to link tweets in a Tweetstorm or just to link their own Tweets together in a series should reply to each of their Tweets, removing their own username in the process. This allows readers who click one tweet to easily see the rest of the Tweets in the series, and removing one's own username adds back some characters for the content and prevents it from seeming as if you're talking to yourself like a crazy person. That many have no idea how to do it is just one of Twitter's usability issues. It's a wonderfully elegant public messaging protocol, but its insistence on staying so low level is crazy. Don't even get me started on putting a period before a username in a Tweet, try explaining that to your mother with a straight face.

Here's another example. Look at how many experienced Twitter users now turn to apps like OneShot to attach screenshots of text to their Tweets as photos, to circumvent the 140 character limit. I happen to really enjoy those screenshorts, as they're sometimes called now, and they demonstrate how Twitter could expand their 140 character limit without overwhelming the Timeline: just truncate at some point and add a click to expand function. This is yet another example of users generating useful innovation on top of Twitter when it should be coming from within the company.

Rather than force users to jump through all these hoops to publish longer content, Twitter could just allow users to write more than 140 characters in one Tweet, truncating the whole of it after some limit and posting a Read More button to allow readers to see the rest of the thought. Blasphemy! many will shout. I can already see the pitchforks in the distance. Some good old blasphemy is just what Twitter needs.

Longer character limits would likely increase the ability to follow conversations and dialogues on the service, too. One of the wonderful things about Twitter is that conversations between specific users can be read by other users. That's one of the greatest things about Twitter as a public messaging protocol. But because replies have to fit within 140 characters, often they need to be broken up into multiple Tweets. Many who reply don't realize that unless they hit the reply button on the previous Tweet in the conversation, the dialogue link is broken. Many mistakenly compose a new Tweet to continue the dialogue, not realizing that any reader clicking on that Tweet will not automatically see other Tweets in that conversation. Instead, it will just display by itself, as an orphan.

I run into this multiple times every day on the service, clicking on a Tweet without any easy way to figure out what it was in response to. If a lot of time has passed, it's often impossible to piece the conversation back together. It drives me crazy. I tried explaining how to piece broken conversation threads like this back together to a few people who abandoned Twitter and then realized I sounded like a madman. Why, in this day and age, should they have to learn such low-level nonsense? Threaded conversations are, for the most part, a solved UI problem.

I'm not done with the character limits, so hold your disgust. You may wish to bring more than just your pitchforks after I'm done. Every Twitter conversation that involves more than two people devolves into a short series of retorts that eventually dies because each additional username consumes more of the 140 character limit, until there is no more room for actual dialogue.

It's absurd, but it's always been that way. Why usernames count towards the 140 character limit has always befuddled me. Meaningful conversation always has to migrate off of Twitter to some other platform, for no reason other than a stubborn allegiance to an arbitrary limit which made sense in the SMS age but now is a defect. If you're going to keep a character limit (could we at least double it?), let's not have usernames count against the limit. In fact, if I hit reply to someone's Tweet, do we even need to insert that person's username at the front of the Tweet? You can still send the notification to that user that I replied to their Tweet, and suddenly my reply won't seem so oddly formatted to the average reader. There are plenty of ways to indicate who the message is addressed to through contextual formatting, and if I wanted to mention them explicitly I could always write @username in the Tweet. But it's unnecessary to insert it by default.

Vine is perhaps the only network whose chief content-creation limit seems intrinsically part of the network, and that's because video is one type of content that can't be scanned: each additional second of content imposes a linear attention cost of one second on the viewer. A six-minute video costs the viewer 60X the attention that a 6-second video does, and to even create a 6-second video of any interest requires some clever editing to produce a coherent narrative. A Vine video joke has its own distinct pace; it's like a two-line riddle, often a 4.5-second setup with a 1.5-second punchline (at least that's the pacing in most Vines in my home feed).

This 6-second limit still constrains the size of Vine's userbase, and they may be okay with that. I think that's fine. I enjoy Vine, it's its own art form. Still, the 6 second limit means a lot of people don't turn to it for a lot of their video sharing. It's not easy to come up with a succinct 6 second video clip.

Look at how Snapchat has evolved to see another company realizing that its power is not the initial constraint but the network. Snapchat still imposes a 10 second limit on video length. But now you can string many videos together into My Story. This was brilliant on their part: it lets viewers skip any boring clip with one tap while letting the creator tell longer stories simply by shooting multiple snaps in sequence. They lowered the content generation cost on creators without meaningfully increasing it for viewers.

Furthermore, Snapchat now allows you to download your Stories to your camera roll. Those who claim ephemerality is the key to Snapchat's success might panic at such a change, but all it demonstrates is that they realize they now have users for whom ephemerality isn't the main draw of the service. They haven't confused an arbitrary early limit for being the root of their success, and they understand the underlying power of their platform.

Perhaps more than any other social network, Facebook has long recognized that their chief asset is their graph. They've made all sorts of major changes to their interface, changes that always lead to huge initial outcries from their users, followed by a fade to silence as users continue to access the service in increasing numbers.

That they recognized this and had the courage of their convictions from such an early stage is not to be discounted. Plenty of companies live in fear of their early adopters, who often react negatively to any change. This leaves these companies paralyzed, unable to grow when they hit saturation of their early adopter segment. Because the global market of users has been expanded by the unprecedented reach of connected smartphones, early adopter segments can now number in the tens of millions, confusing companies into thinking that their early adopter segment is actually the mass market.

Twitter, more than any other company, needs to stop listening to its earliest users and recognize that deep down, its core strength is not the 140 character limit per Tweet, nor is it the strict reverse chronological timeline, or many other things its earliest users treat as gospel.

It's not even the ability to follow people, though for its current power users that has proved a useful way to mine some of the most relevant content from the billions of Tweets on the service. If Twitter realizes this, they'll understand that their chief goal should not necessarily be to teach the next several hundred million users how to follow hundreds of people, the way that the early adopters did. To do so is to mistake the next wave of users as being identical in their information consumption preferences and habits as the first 300 million, or whatever the true active count is among that number (I'm going to guess it's in the range of 40 to 80 million truly active daily users, though it's hard to tell without seeing the data).

Twitter's chief strength is that it's an elegant public messaging protocol that allows anyone to write something quickly and easily, and for anyone in the world to see that writing. It's a public marketplace of information. That's an amazing network, and the reason people struggle to describe Twitter is that a platform like that can be used for so many things.

If Twitter realizes that, then they'll realize that making that information marketplace much more efficient is the most critical way to realize the full potential of what is a world-changing concept. How do you match content from people who publish on Twitter with the readers who'd enjoy that content?

A people follow model is one way, but a topic-based matching algorithm is another. Event-based channels are just a specific version of that. Search is one option, but why isn't there browse? I can think of a dozen other ways to turbocharge that marketplace off the top of my head, and the third party developer community, kicked out of the yard by Twitter so many times like stray dogs, could likely come up with dozens of others if they were allowed back in.

Twitter can leave the reverse chronological timeline in place for grumpy early adopters. It can be Twitter Classic. Most of those early adopters are largely happy with things the way they are, and if Twitter is scared to lose them, leave the current experience in place for them. I honestly don't think they'd abandon the service if Twitter raised the 140 character limit, or allowed for following of topics, or any number of other changes suggested here, because I think the power of the network is the network itself; but if the company has any such trepidations, it's not a big deal to leave Twitter Classic in place. The company has a huge engineering and product team; it's easy to park that experience in maintenance mode.

When social networks come into their own, when they realize their power is not in any one feature but in the network itself, they make changes like this that seem heretical. They aren't. Instead, these are fantastic developmental milestones, indicative of a network achieving self-awareness. A feature is trivial to copy. A network, on the other hand, is like a series of atoms that have bonded into a molecule. Not so easy to split.

It's a post for another day, but one of the defining features of our age is the rise of the network. Software may be eating the world, but I posit that networks are going to eat an outsized share because they capitalize disproportionately on the internet. Journalism, advertising, video, music, publishing, transportation, finance, retail, and more—networks are going to enter those spaces faster than those industries can turn themselves into networks. That some of our first generation online social networks have begun self-actualizing is just the beginning of that movement.

“People need dramatic examples to shake them out of apathy and I can't do that as Bruce Wayne. As a man, I'm flesh and blood, I can be ignored, I can be destroyed; but as a symbol... as a symbol I can be incorruptible, I can be everlasting.” Bruce Wayne, Batman Begins