Why the world is getting weirder

It used to be that airliners broke up in the sky because of small cracks in the window frames. So we fixed that. It used to be that aircraft crashed because of outward opening doors. So we fixed that. Aircraft used to fall out of the sky from urine corrosion, so we fixed that with encapsulated plastic lavatories. The list goes on and on. And we fixed them all.

So what are we left with?

As we find more rules to fix more things we are encountering tail events. We fixed all the main reasons aircraft crash a long time ago. Sometimes a long, long time ago. So, we are left with the less and less probable events.

We invented the checklist. That alone probably fixed 80% of fatalities in aircraft. We’ve been hammering away at the remaining 20% for 50 years or so by creating more and more rules.

We’ve reached the end of the useful life of that strategy and have hit severely diminishing returns. As illustration, we created rules to make sure people can’t get into cockpits to kill the pilots and fly the plane into buildings. That looked like a good rule. But it’s created the downside that pilots can now lock out their colleagues and fly the plane into a mountain instead.

From a great piece by Steve Coast on why the world is getting weirder. Follow the Pareto Principle long enough and you fix all the low-hanging fruit with a whole bunch of rules, leaving just the black swans unaccounted for.

Anyone who has worked on any tech product or service long enough, through many cycles, knows you can end up working on just edge cases. Often, when you hit this point, you're listening to a sliver of power users and are at such a point of diminishing returns that accommodating them might be counterproductive as a whole. All you do by adding that random feature they want is add interface overhead and friction for the majority of your users, whose problems you already solved.

At this point, if the user base is large and healthy enough, most smart and ambitious companies move on to launching new products and services with higher marginal returns on their resources.[1] The resource vacuum is often exacerbated by the fact that the most ambitious employees would rather work on the new new thing. So they move on to the latest hot top secret project, leaving the former product or service in a maintenance mode, with minimal oversight.

  [1] Large and successful multi-product companies that reach this point often just kill off the product or service if the user base isn't large enough. Think Google Reader or Apple's Ping. A startup that reaches that point often pivots, sells itself, or folds.

It's usually the right near-term economic thing to do, but it can also leave some widely used products or services with chronic issues or imperfections that puzzle users and outsiders. How, they wonder, can a company with thousands of employees not bother to fix such longstanding and seemingly trivial issues? This is why competition is healthy, even if sometimes it seems like we have too many redundant products/services in tech.

Coast's post also includes some good career advice.

On a personal level we should probably work in areas where there are few rules.

To paraphrase Peter Thiel, new technology is probably so fertile and productive simply because there are so few rules. It’s essentially illegal for you to build anything physical these days from a toothbrush (FDA regulates that) to a skyscraper, but there’s zero restriction on creating a website. Hence, that’s where all the value is today.

If we can measure economic value as a function of transactional volume (the velocity of money for example), which appears reasonable, then fewer rules will mean more volume, which means better economics for everyone.
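For reference, the "velocity of money" framing leans on the quantity-theory identity; this is a loose sketch of the reasoning, not an equation from Coast's post:

```latex
% Quantity theory of money identity:
M V = P Q
% M: money supply, V: velocity (how often each unit of money changes hands),
% P: price level, Q: real output (the volume of goods and services transacted).
% Holding M and P roughly fixed, anything that raises V -- such as fewer rules
% slowing transactions down -- shows up as higher real output Q.
```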

Far and near future sci-fi

Enjoyed this tweetstorm from Noah Smith a short while back. Here's a cleaner version of it, though it doesn't look embeddable. I'll have to do this the hard way, then:

Having tried lots of demos recently, I wonder if VR will inspire a growth spurt in near-future sci-fi movies just because it is more cinematic and compelling on screen than most of today's tech, in which the primary action consists of a programmer typing on a keyboard.

Steven Spielberg is set to direct the movie adaptation of Ready Player One next, and I see it as a natural spiritual successor to Minority Report, which contained a lot of ideas from futurists that Spielberg gathered for a brainstorm session prior to production. Minority Report felt like medium-term sci-fi when it came out, and it's already clear that many of its predictions were off. From a technological point of view, if not a social one, Ready Player One reads like very-near-term sci-fi.

There's a shortage of good near-term sci-fi in the movies, often because the filmmaking cycle (from idea to spec script to script to option to years of sitting cold to finally going into production) is still so much longer than the actual technology industry cycle. Given the momentum of VR now, it's time to mine this fertile ground for more high-concept movies that explore the norms after mass adoption of the technology. Given the incredible price pressure on VFX shops in Hollywood, many of which are closing up shop or suffering margin compression, a spurt of movies featuring a lot of VR scenarios would be a welcome supply of work, too.

I realized the other day that I will watch, in my lifetime, a VR movie about VR technology. I'm excited. No spoilers please.

Facebook hosting doesn't change things, the world already changed

Like any industry, the media loves a bit of navel-gazing (what is the origin of this phrase, because I don't enjoy staring at my own navel; maybe mirror-preening instead?). When Facebook announced they were offering to host content from media sites like The New York Times, the media went into a frenzy of apocalyptic prediction, with Mark Zuckerberg in the role of Mephistopheles.

All this sound and fury, signifying nothing. Whether media sites allow Facebook to host their content or not won't meaningfully change things one way or the other, and much of the FUD being spread would be energy better spent focused on other far larger problems.

Let's just list all the conditions that exist and won't change one bit whether or not you let Facebook host your content:

  • News is getting commodified. The days of being special just for covering a story are over. Beyond the millions of citizen journalists the internet has unleashed, you're competing with software that can do basic reporting. Tech press inadvertently furnished evidence of the commodification of news when, in the past few years, they all played a giant game of musical chairs, with seemingly everyone picking up and moving from one site to the next. Are these sites anything more than a collection of their reporters? If so, did the brands meaningfully change when everyone switched seats? I love and respect many tech reporters, but a lot of others seem interchangeable (though I like some of them, too). Instead of just reporting news, what matters is how you report it: your analysis, the quality of your writing and infographics, the uniqueness of your perspective. The bar is higher to stand out, as it tends to be when...
  • ...distribution is effectively free. Instead of pulp, our words take the form of bits that are distributed across...oh, you know. As the Unfrozen Caveman might say, “Your packets of data frighten and confuse me!” The Internet: seventh wonder of the world. This must be what it feels like to have grown up when electricity first became widespread. Or sewer systems. Okay, maybe not as great as sewer systems, I don't know how people lived before that.
  • Marketing is cheaper. You can use Twitter or Facebook or other social media to make a name for yourself. Big media companies can take advantage of that, too, but the incremental advantage is greater for the indies. Ben Thompson is one of my favorite examples, an independent tech journalist/writer living in Taiwan who built up his brand online to the point that I pay him $10 a month to have him send me email every day, and it's worth every penny. He is smarter about the tech industry than just about every “professional” journalist covering tech, and he's covered a lot of what I'm covering here already. He's just one example of how...
  • ...competition for attention is at an all-time high and getting worse. Facebook already competes with you, whether you let them host your content or not. So does Snapchat, Instagram, Twitter, IM, Yik Yak, television, cable, Netflix, video games, Meerkat/Periscope, movies, concerts, Spotify, podcasts, and soon VR. When it comes to user attention, the one finite resource left in media, most distractions are close substitutes.
  • Facebook will continue to gain audience. Even if Facebook pauses for a rest after having gained over 1 billion users, they also own Instagram, which is growing, and WhatsApp, which will likely hit 1 billion users in the near future, and Oculus, which is one part of the VR market which is one portion of the inception of the Matrix that we will all be living in as flesh batteries for Colonel Sanders in the medium-range future. If you think withholding your content from Facebook will change their audience meaningfully one way or the other, you really may be an unfrozen caveman from journalism's gilded age. The truth is...
  • Facebook and Twitter and other social media drive a huge percentage of content discovery. Media companies can already see this through their referral logs. This isn't unique to the text version of media. Facebook drives a huge share of YouTube video streams, which is why they're building their own video service; why send all that free ad revenue to a competitor when you can move down the stack and own it yourself? Besides, YouTube's ad model is not that great: those poorly targeted banner ads that pop up and cover the video in a blatant show of disrespect for the content, those pre-rolls you have to wait 5 seconds to skip...wait a minute, this sounds a lot like how...
  • ...media ad experiences are awful. I wonder sometimes if folks at media companies ever try clicking their own links from within social media like Twitter or Facebook, just to experience what a damn travesty of a user experience it is. Pop-ups that hide the content and that can't be scrolled in an in-app browser so you effectively can't ever close them to read the article. Hideous banner ads all over the page. Another pop-up trying to get you to sign up for a newsletter for the site when you haven't even read the article to see if you'd even want to get that newsletter (the answer is no, by the way). Forced account creation or login screens, also before you read a word of content. An interstitial ad that tries to load video for a few seconds while you wait patiently for a countdown timer or X button to close it out as quickly as possible. Short articles spread across 3 pages for no reason other than to inflate page views. Articles that take so long to load that you just click away because in-app browsers are already disadvantaged from a speed perspective, and media sites compound the problem by loading a ton of cruft like ad tracking and other crap all over the place, reducing the content to just a fraction of the total payload. It's the reading equivalent of being a beautiful girl at a New York bar, getting hit on by dozens of obnoxious first year investment banking analysts in pinstripe suits and Hermès ties. This is what happens when you treat your reader like a commodified eyeball to monetize and not a living, breathing human whose patronage you appreciate and wish to nurture. And this is why I'm happy when services like Flipboard or Facebook transform content into a more friendly reading experience. Chris Cox of Facebook said that reading on mobile is still a crummy experience, and amen to that. The poor media ad experience is a symptom of the fact that...
  • ...media business models are not great. Monopolies don't have to have great business models, because as Peter Thiel will tell you, being a monopoly is itself a great business model. For the longest time at media sites, and this probably still happens, the reporters sat on a different floor from the ad sales folks. This meant that the way the company made money was divorced from the product people (to use a more techie term). This works great when there isn't a lot of user choice (“No one ever got fired for buying IBM”) and the ad sales people can throw their weight around (before), but not so great when ad buyers suddenly have a whole lot more choice in where to spend their money (now). It turns out that having your best product people separate from your ad team is a dangerous game and leads to a terrible ad experience, which should come as a surprise to no one. Many still defend this practice as a way to preserve journalistic integrity, a separation of church and state that keeps the money from corrupting the writing, but the Internet has other ways to defend against that now. It's great that the New York Times has a public editor in Margaret Sullivan, but today the eyeballs of the world on your content serve as one giant collective public editor, like some human blockchain of integrity. I sympathize with media companies, though, because even if they wanted to improve on this front...
  • ...tech companies have better ad platforms than media companies. Facebook's native ad unit may not be perfect, but it's leaps and bounds better than the god-awful ad experience on almost any media site. It's better not just for readers, but likely for advertisers, too. At Flipboard, we went with full-page ads a la glossy fashion magazines because our philosophy was that when content is on the screen, it deserves your full attention, and the same with ads; never the twain shall meet. The smaller screens of mobile phones and tablets only strengthen that case. Trying to split user attention with banner ads is a bad idea for both readers and advertisers, and most every study on ad recall and effectiveness that I've seen bears this out. Because of tech companies' scale and technology advantage, as noted in the previous bullet, their ad platforms will continue to get better and scale, while those at media companies will not. When I was at Hulu, we shopped around for an ad platform that could meet all our needs and couldn't find one, so we just rolled our own. That's possible if you can hire great developers, but if you're a media company, it's not easy, and that's because...
  • ...tech companies have a tech hiring advantage on non-tech companies. This sounds like it's self-evident, but it's critical and worth emphasizing. It's not just media but other businesses that suffer from this (which is particularly awful for consumers when it comes to information security). At this hour of the third industrial revolution, software is eating the world, but we still have a scarcity of software developers, let alone great ones. The ones that are blessed to live in this age want to work with other great developers at cool technology companies where the lunches are free, the dress codes are flexible, the hours vampiric, and ping pong tables abound. It's like being a free range chicken, but with stock options and before the death and refrigeration. Companies like that include Facebook, Google, Apple, Amazon, and so on, but they don't include most media companies, even though most of those also allow you to dress how you want, I think. Maybe someday the market will overcorrect itself and everyone will know how to program, but by that point we will probably all be living lives of leisure while AI software and robots take care of everything while we just lounge around experiencing a never-ending stream of personalized VR pleasure. If David Foster Wallace were alive to rewrite Infinite Jest, VR would be the infinite jest.
  • Design skill is not equally distributed. In an age when software comes to dominate more of the world, the returns to being great at user interface design are still high and will continue to be for some time. It's no wonder that Apple is the world's largest company now given their skill at integrated software and hardware design. That's become the most valuable user experience in the world to dominate. It's not going to let up, either. Every day I still experience a ton of terrible user experiences, from government to healthcare to education to household appliances to retail to you name it. The number of great product and design people in the world is still much too finite, and it happens that a lot of them work for tech companies. Not for companies in all the other industries I named above. Even in tech, the skills are too finite, which is why enterprise software is being disrupted by companies like Dropbox and Slack and others that simply bring a better user experience than the monstrosities that pass for most enterprise software. And yes, these people tend not to work for media companies.
  • Tech companies are rich. Take all the factors above, add them up, and it comes down to the fact that we're living through another gold rush, and this time most of the wealth is flowing into Silicon Valley. Take a bunch of companies that are extremely wealthy and employ great software developers and designers at a time when software is eating the world, add in a healthy dose of world-changing ambition, and you get companies that keep expanding their footprints, to the point where they are all competing in almost every business. People wonder why Apple might build a car, but I say why not? After all, they are great at building computers, and what is a Tesla other than another portable computer (“The first one is an oversized iPad. The second is a revolutionary transport vehicle. And the third is a portable air conditioner. So, three things: an oversized iPad, a revolutionary transport vehicle, and a portable air conditioner. An iPad, a vehicle, and an air conditioner. An iPad, a vehicle…are you getting it? These are not three separate devices, this is one device, and we are calling it Apple Car.”)? Facebook, Apple, Google, Amazon, et al all continue to compete directly in more and more spaces because at their heart they are all software companies. I suppose they could have all decided not to compete with each other, but companies looking to maximize return in free markets usually don't behave that way, and so we'll see all of them trying to do more and more of the same things, like stream music and video, build smart phones, deliver stuff, etc. That's how a nuclear arms race happens. Your neighbor has the bomb, it's pointed at some part of your business, you get one too, if for no other reason than defensive purposes.
Meanwhile, you also try to do some virgin land grabs, because networked businesses tend to reward first movers, and that's how you end up with tech companies trying to colonize space, build self-driving cars, float balloons around the world to bring the Internet to everyone, and, to bring it full circle, be the new front page for every user.

It's worth repeating: all the things above have been happening, are happening, and will continue to happen whether or not Facebook hosts your content.

By the way, you can still host your content on your own site even if you let Facebook host it, too. Getting set up to host content on Facebook is largely a one-time fixed cost: some time spent providing them with a feed. It was the same at Flipboard, though some companies took longer than expected because they couldn't output an RSS feed of their content from their legacy CMSes. It was shocking to learn that a random blogger on Squarespace or Wordpress or Tumblr could syndicate their content more easily than a famous media company, but that was often the case, and it speaks to the tech deficit in play here.
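To give a sense of how low the bar is, the feed in question is roughly the minimal RSS 2.0 that any stock blogging platform emits for free. A sketch (the site name, URL, and posts here are made up for illustration):

```python
# Build a minimal RSS 2.0 feed of the kind a CMS needs to emit to syndicate
# content. Real feeds also carry GUIDs, pubDates, and full content bodies.
from xml.etree import ElementTree as ET

def build_feed(site_title, site_url, posts):
    """posts is a list of (title, url, summary) tuples."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = site_title
    ET.SubElement(channel, "link").text = site_url
    ET.SubElement(channel, "description").text = site_title
    for title, url, summary in posts:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = title
        ET.SubElement(item, "link").text = url
        ET.SubElement(item, "description").text = summary
    return ET.tostring(rss, encoding="unicode")

feed = build_feed("Example Media Co", "https://example.com",
                  [("First post", "https://example.com/1", "A summary.")])
```

That's the whole fixed cost in miniature; the hard part for legacy CMSes is usually getting clean structured content out of the database, not the XML itself.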

This may all sound grim for media companies, but here's the kicker: it really is that grim. Wait, are kickers supposed to be positive? Maybe I meant kick in the butt.

Okay, I can offer some positives. A media company may not be able to be world class at every layer of the full stack, from distribution and marketing to ad sales and producing great content, but it doesn't have to be. Far better to be really good at one part of that, the one that tech companies are least likely to be good at, and that's producing great, differentiated content.

The fact is, great content is not yet commodified. That may sound like Peter Thiel's advice to be a monopoly. Self-evident, non-trivial, not useful. But much of the best advice is just that, as banal as a fortune cookie prescription but no less true.

Let's take The New Yorker as an example. They don't try to compete on breaking news, though they have beefed up on that front with their online blogs. They hire great writers who go long on topics, and thus they can charge something like $50 a year for a subscription because their content is peerless. I'm subscribed through something like 2020 (so please stop mailing me renewal solicitations, New Yorker, please!?).

Look at Techmeme. They provide value by curating all the tech news out there, using a mix of humans and algorithms to prioritize the tech news stream and produce Silicon Valley's front page at any given moment. Curation is a key part of discovery; you don't have to focus on producing content yourself. A daily visit for me.

Look at HBO. A media company with great content that you can't easily find a substitute for, with a smart content portfolio strategy that minimizes subscriber churn. They surprised me recently by announcing they were going to launch HBO Now, ahead of when I anticipated, at the same price it costs to add it on to a cable package. Kudos to them for not letting innovator's dilemma handcuff them for too long.

Look at Buzzfeed. Ignore the jealous potshots from their elders and marvel at their ability to create content you can't easily find elsewhere. That's right, I said it. Despite being the company that everyone says just rips off other people's content, Buzzfeed actually has more content I can't find substitutes for than most tech news sites. It's not just their original content and reporting, which is good and getting better. Like Vox trying to make the top news stories of the day digestible for more people, Buzzfeed takes fun content and packages it in a really consumable way. It turns out in a world of abundance, most people would prefer just a portion of their media diet from the heavy news food group. More of their daily diet is from the more fun food groups, and Buzzfeed owns a ton of shelf space in that aisle. It's something other sites can do, but many avoid because they're too proud or because it isn't part of their brand. I saw white and gold, BTW.

Look at Grantland. They also hit the fun part of the daily diet by targeting pop culture and sports with great writers and new content daily. People jab at Bill Simmons a lot now that he is in the media penthouse, but he started as a blogger for AOL, and he was the first writer to really channel the fan's voice and point of view. It could've been you, perhaps, but it wasn't.

Look at Gruber, or Ben Thompson, or Marc Maron, or Serious Eats, or The Wirecutter, or Adam Carolla. Hell, even look at Fox News (just don't look too long). It turns out that differentiated content is differentiated. When the world's an all-you-can-eat buffet of information, you want to be the king crab legs, not the lettuce bowl.

Being a generalist reporter, someone who just shows up, asks questions, and transcribes the answers into a summary article, is not that valuable anymore. If you cover an industry, do you actually understand it? Take tech reporters as an example: many of them don't understand the underlying technology they write about. That may have sufficed in a bygone age, but it no longer does, which is good for Gruber's claim chowder business but bad for theirs. Taking the time to become an expert in a domain still has value because it takes hard work, and that is also not a resource that is equally distributed in the world.

Some companies try to tackle more than one part of the stack, with some success. Look at MLBAM. They have managed to hire some strong technologists and build such a powerful platform that other media companies are syndicating it for their own use. Yeah, it's great to have content from a legally sanctioned monopoly to bootstrap your business, but credit to them for embracing the future and leveraging that content goldmine to build a differentiated technology platform.

Is it easy to replicate any of those? No, but your mother should have taught you that lesson long ago. At least what they're doing is clear and understandable to any outside observer.

If you've stuck with me this long, you may still think that hosting your content on Facebook is a Faustian bargain. Maybe Facebook changes their News Feed algorithm and your traffic vanishes overnight, like Zynga. Or maybe Facebook holds you hostage and asks for payment to promote your content more in the News Feed.

It's possible, but that risk exists whether your content is hosted there or not. Maybe hosting minimizes that risk a bit, but Facebook's first priority will always be to keep their users' attention and engagement, because that's how they keep the lights on (and pay for the free lunches). If your content is engaging, it will keep a News Feed roof over its head, and if it doesn't, it won't.

Does that mean you have to write clickbait headlines and package stories up into listicles with animated GIFs? I don't think so, and if that's not your brand, then by all means steer clear. That doesn't mean you shouldn't write a compelling headline. I despise clickbait headlines that coax a click out of content with barely anything of substance, just to gain a cheap page view, but I appreciate a well-written headline over a dull one, too. Jeff Bezos used to caution us against the “tyranny of the or,” or false tradeoffs. This is one example. I also believe Zuckerberg and other Facebook execs when they say they'd like to weed out the more egregious clickbait from the News Feed. I understand if others don't, but my general belief about most tech companies is that they're just semi-evil.

Let's go deeper into the FUD. What if Facebook decides to go into the media business themselves? What if, instead of hosting your content, they produce their own and prefer it in the News Feed?

First of all, if that ever happens, it won't happen anytime soon. When you're in the phase of convincing folks to hop aboard your platform, you have to remove that possibility or no one will join.

Secondly, content production isn't generally a business that tech companies love. The margins aren't great, it's a headache to manage creative types, content production is messy and labor intensive, and tech companies prefer playgrounds where software economics play better.

It's far more likely that tech companies use their ample cash to license content. Remember how I said tech companies are rich? It turns out they are richer than movie studios and TV networks and newspapers and book publishers and music labels, and it turns out that writing a check for exclusive content hurts in the short term but is great in the long run when paired with the right business model, regardless of whether that's subscription or subsidized by ads. If you have the best ad units and platform, the marginal return on user attention is higher for you than for the next competitor, and that means licensing can make sense. You get to meet some celebrities, too, who are beautiful and charming.

Lastly, if Facebook wanted to go into the media business, they could do it now, or they could do it in the future, and your Facebook hosting abstinence wouldn't matter one bit. They already have all the eyeballs they need; it's not a situation like Netflix in its early days, where they had to build a subscriber base before they could consider producing their own original content (thank you, First Sale Doctrine!). Long before Facebook even had a News Feed where your articles were shared, hundreds of millions of people already tuned in to see what that cute guy or girl was up to, or to see their friends' latest selfie, and other forms of ambient intimacy. I could even craft an argument that if all the sites out there stood on the sidelines, it might accelerate Facebook's move into the space.

And if Facebook did, if they decided to compete with The New York Times and Grantland and all the other media companies, or to buy one or more of them, is that so bad? Maybe you could work for them, if you're unique and differentiated. If you are, you'll do just fine, in this world and the next.

Did I mention they have free lunches?

Here's how the live streaming space will play out

Though the sequencing is rough, here's how the live streaming market timeline will unfold. We know this because we've seen this play out before, and history repeats itself. Until it doesn't.

  1. The first few apps to try this idea fail, for a variety of reasons. Timing matters, distribution matters; whatever the reason, they die off. Remember this?
  2. Because of high profile failures, the idea goes into hibernation. It's always darkest before the sunrise, but also right after the first sunset.
  3. The conditions that make the idea possible, though, still exist. Everyone carries phones that can shoot video, all these phones are connected to networks nearly all of the time, apps can be distributed cheaply and easily through app stores.
  4. Someone decides to release an app doing this same thing, again, because damn it, the idea just makes too much sense, right? One-to-many broadcasting has been democratized in every other medium so far, why not video? For some reason, it sparks this time, maybe because it builds off of the Twitter graph, maybe because the right set of influencers jump in. This is Meerkat. It looks like it was designed by an engineer, or maybe a 9-year-old, with cartoony graphics and a bright yellow background. It doesn't matter, though, because it was first at the right time.
  5. A collective recognition dawns on many in the tech community that it's go time for this space, at long last. Among those who recognize this are folks at Twitter, who spend a few days deliberating and then quickly shut off Meerkat's unfettered access to their graph. If you're still naive about platform risk when it comes to social networks, shame on you.
  6. Twitter buys Periscope, an app that is similar to Meerkat, but sleeker in design. People on social media beat up on Twitter for not playing fair, as if the business world were governed by any such ethos.
  7. All the media buzz further boosts exposure of and interest in Meerkat, because all PR is good PR in the early days, and that hubbub furthers interest in live streaming. Think of the possibilities, writes many a pundit, but they are all obvious to most everyone.
  8. Periscope launches, the spin cycle picks up speed, like your washer at the end of a wash.
  9. Then the backlash. Many people get notifications about streams that are over before they can even tune in. When they do get in, they realize the early streams are largely boring. Tech early adopters can be insular (yet another live stream of a tech event!) and uninteresting to the masses. Another Q&A with a VC? Yawn. Early live streams from the masses are the equivalent of early tweets about what people are eating for lunch. A few writers resuscitate previous pieces about the narcissism of this digital age, doing a find-and-replace to substitute live streaming for photo sharing or tweeting or whatever social media they wrote about earlier.
  10. More backlash. Someone gets a massive cell phone bill from eating through too much bandwidth, and that's when there's enough bandwidth at all; most times the cell networks can't support the load, so streams keep cutting out. How did we ever think this would scale? At the events you most want to live stream as a form of humblebrag, like a Taylor Swift concert or an NBA Finals game, the network is the least reliable because so many people are on their phones. It's the digital age's Tragedy of the Commons.
  11. Live streaming services work on solving obvious bugs that come with an MVP type launch. Better scheduling, recording, more reliable streaming through better video and audio compression, better discovery of interesting streams by topic, region, etc.
  12. The first new undiscovered live stream star is born. I don't know who that person will be, but that person comes along for every medium. They raise the bar on creativity. I'm not sure what they'll do to take advantage of the medium, but maybe it's a citizen journalist, maybe it's a snarky commentator, maybe it's someone unusually attractive, maybe that's all just one person. Their live streams attract hundreds, then thousands, then tens of thousands of viewers. A year later, they give a TED talk, and then Kara Swisher interviews them on stage at the Code/Media conference.
  13. The collective of Vine stars starts dabbling in a spinoff focused just on live streaming, working through some of the challenges of shooting in real-time, without cuts.
  14. Buzzfeed Studios announces a new live stream division. Funny or Die releases its first live stream event: Seth Rogen walking around his house naked while smoking pot. Periscope opens up LA and NY studios with high end audio and video equipment which the top live stream stars can use for their broadcasts.
  15. The first misguided celebrity tries to ban live video streaming at their concert, leading to a bunch of articles about how stupid it is, how you can't put the genie back in the bottle, and why would you try to suppress exposure when user attention is the most scarce and valuable resource in the known universe now?
  16. The first smart mega-celebrity launches a Meerkat. She is Taylor Swift. She instantly causes tens of millions of fans around the world (most of them teenage girls and their moms, some embarrassed to admit they're fans of her music, and some, like me, who aren't) to download live streaming apps for the first time, slingshotting those apps to a new plateau of traffic and prominence. In her first live stream, the largest in history to date, we see one of her cats dancing to Shake it Off. Millions of fans comment with emoji, but some spam comments slip in, and so some engineers somewhere start beefing up comment moderation tools. Meanwhile, dozens of unlucky journalists have to write the obligatory story about her live stream, even though everyone already saw it and knew about it, and even though the Buzzfeed post on the live stream will get all the traffic anyway.
  17. The first live video stream is mentioned in a TV show, probably CSI Cyber.
  18. Jimmy Fallon launches a recurring segment around live streams from the stars. The first guest is Matthew McConaughey, who live streams while driving around in a Lincoln, muttering to himself. The video goes viral, predictably, and it pops up in your Facebook news feed and Twitter timeline a lot, and though you were too cool to post a link to it because it was too obvious a play for social media distribution, you take note of the volume of mentions on social media.
  19. The MTV Music Awards is the first to be live streamed on purpose. MTV arms a bunch of guests with a dedicated WiFi network and high end smartphones (let's be honest, it's probably Samsung) and has them live stream throughout the program. Many people watch the MTV Music Awards again for the first time in ages, sitting with four connected devices so they can follow along with multiple live streams at the same time. During a performance by Kendrick Lamar, Taylor Swift live streams herself and her famous, beautiful friends dancing and singing along in the front row, and then the TV cameras catch her dancing and holding her phone to live stream and that plays on your television set at the same moment, and it feels magical, like some form of digital cubism. Kanye's live stream is just continuous pans over to Kim Kardashian's cleavage.
  20. Later, Kanye West briefly live streams himself having sex with Kim Kardashian, but just for a moment, leading millions of people to rush to follow along at the same moment, bringing down the power grid along the Western seaboard.
  21. AT&T and Verizon raise their data fees as they see their network usage rise. They can actually handle the traffic, they just want more money because why the hell not. Ben Thompson links back to his piece on how lots of us didn't really understand what we were bargaining for when we fought for net neutrality.
  22. Thanks to finally having achieved a critical mass of user density, live streaming captures its first journalistic coup. I don't want to speculate on what it will be, but it will be a prolonged and serious event that people have time to hear about and follow live, like the OJ car chase. Some citizen will be there up close for some reason and that person's live stream will be better than anything TV news cameras can capture. Everyone on Twitter and Facebook link to that live stream, and soon even CNN and Fox News and whatever TV news channels are left in the world are just showing that stream live, too. A few reporters rush to write think pieces about how the medium has finally grown up.
  23. Obama live streams from the podium at one of his speeches. As he pans across the faces of the crowd, we see people of all ages and races, and he intones, “The Constitution begins with 'We the people'...” This is the Obama we love, they write.
  24. Netflix signs up its first live streaming show. Directed by David Fincher, it's a show about a powerful VC, played, in a huge coup for the tech industry, by Marc Andreessen. It's released all at once as a continuous 24 hour live stream. During the show, we follow along as Andreessen takes meetings with a variety of people during one packed day in his life. Every so often, he turns his cameraphone on himself to address the audience through the fourth wall in an exaggerated Southern drawl, like Francis Underwood.
  25. Your mom emails you an article from USA Today and asks if you've heard of live streaming. Apparently it's a thing now.
  26. The first live stream ad unit. Until now, the ads have been organic, some live stream stars talking about products and services that pay them for some air time. This, however, is an official ad, from a new live stream ad platform. It runs as a pre-roll before the live stream begins, and the revenue is shared with the content creator. You guessed it, it's an ad for Squarespace, the all-in-one website builder.
  27. Facebook adds a live video streaming button to its app, then shortly after that spins it out into a separate app altogether. They name it Live, and some other company that launched an app called Live that did the same thing a year earlier complains that Facebook stole their name, but no one really pays any attention.
  28. Google launches a wearable VR camera: a 360 degree camera helmet that covers your entire head and makes your head look like a fly's eyeball. Some site publishes a piece on how this is the future of VR, and Gruber excerpts a passage and files it away for claim chowder. One of the first beta testers of the camera gets beaten up while walking around the Mission.
  29. Apple files a patent for a compact VR camera that will fit in an iPhone form factor. 
  30. Years later, bandwidth has improved to the point where people can live stream VR. Sort of. The experience is janky, the video super low-res, but it's possible. The first VR live streaming app launches in the iTunes App Store. It's called Cyclops.
  31. VR cameras are too bulky, the video stitching too slow to process in real time, the bandwidth requirements for streaming too onerous. The only people who own such a camera and even know how to set up such a stream are tech early adopters, and so the first VR live stream content is dull, stationary, uninspiring. You get a demo of the product from your geekiest tech friend, and the only VR live stream you can find at that moment is of a pasty-faced VC unboxing the second generation Apple Watch and walking you through its UI. Despite raising a ton of money in a seed round, Cyclops sees little adoption, burns through a ton of cash, and gets acqui-hired by Google. The app shuts down.
  32. The conditions for live streaming VR still exist, though. Somewhere, a young and tech-savvy adult movie star places an order on Kickstarter for a second generation live streaming VR camera which supposedly solves some of the first generation software user experience issues. Wouldn't it be crazy if we could suddenly be in any 3-D immersive environment we wanted at any time? It would be like teleportation! Seriously, maybe, just maybe, there's something there.

 

The genre fiction revolution

The landscape of realism has narrowed. If you think of the straight literary novels of the past decade—The Marriage Plot, The Interestings, The Art of Fielding, Freedom—they often deal with stories and characters from a very particular economic and social position. Realism, as a literary project, has taken as its principal subject the minute social struggles of people who have graduated mainly from Ivy League schools. The great gift of literary realism has always been its characteristic ability to capture the shifting weather of inner life, but the mechanisms of that inner life and whose inner lives are under discussion have become as generic as any vampire book: These are books about privileged people with relatively small problems.
 
Not that these small problems can't be fascinating. It is exactly the best realist novels of our moment which are the most miniature: In Teju Cole's Open City, a man and his thoughts wander over various cities. In Adelle Waldman's The Love Affairs of Nathaniel P., the action consists of the tiny fluctuations of the inordinate vanity and self-loathing of the main character. Both novels are superb, and both are focused on the most minute of details. They draw larger significances from those details, certainly, but the constraints are ferocious. Any discussions of politics or any broader aspect of the human condition are funneled through the characters' fine judgments.
 
In the wide-open spaces left by the narrowing of realism, genre becomes the place where grand philosophical questions can be worked out on narrative terms.
 

Stephen Marche on how genre fiction has become more important than literary fiction.

Genres that endure in any medium across time—the western, horror stories, gangster films, fairy tales, detective novels, landscape paintings, superhero origins, to name just a few—fascinate me. For a genre to resonate across long periods of time, it usually must speak to some deep-seated human condition or social issue, allowing the artist to make a statement about that condition merely by manipulating the conventional elements of the form. The genre is like a ritual known to both the artist and the audience, allowing an immediate and efficient engagement between the two parties.

Why there are faux Irish pubs everywhere

Ireland, as much of the world knows it, was invented in 1991. That year, the Irish Pub Company formed with a mission to populate the world with authentic Irish bars. Whether you are in Kazakhstan or the Canary Islands, you can now hear the lilt of an Irish brogue over the sound of the Pogues as you wait for your Guinness to settle. A Gaelic road sign may hang above the wooden bar and a fiddle may be lying in a corner. As you gaze around, you might think of the Irish—O, that friendly, hard-drinking, sweater-wearing people!—and smile. Your smile has been carefully calculated.
 
In the last 15 years, Dublin-based IPCo and its competitors have fabricated and installed more than 1,800 watering holes in more than 50 countries. Guinness threw its weight (and that of its global parent Diageo) behind the movement, and an industry was built around the reproduction of "Irishness" on every continent—and even in Ireland itself. IPCo has built 40 ersatz pubs on the Emerald Isle, opening them beside the long-standing establishments on which they were based. 
 
IPCo's designers claim to have "developed ways of re-creating Irish pubs which would be successful, culturally and commercially, anywhere in the world." To wit, they offer five basic styles: The "Country Cottage," with its timber beams and stone floors, is supposed to resemble a rural house that gradually became a commercial establishment. The "Gaelic" design features rough-hewn doors and murals based on Irish folklore. You might, instead, choose the "Traditional Pub Shop," which includes a fake store (like an apothecary), or the "Brewery" style, which includes empty casks and other brewery detritus, or "Victorian Dublin," an upscale stained-glass joint. IPCo will assemble your chosen pub in Ireland. Then they'll bring the whole thing to your space and set it up. All you have to do is some basic prep, and voilà! Ireland arrives in Dubai. (IPCo has built several pubs and a mock village there.)
 

The strange true-life story of how Ireland packaged and exported a version of its culture all over the world, and how it boomeranged back to Ireland in the form of added tourism revenue. As with technology apps these days, it's all about marketing and distribution.

The Jinx

This week I finally caught the finale of The Jinx: The Life and Deaths of Robert Durst. If you haven't seen it yet, avoid the SPOILERS ahead in this post and move on.

The ending, as many have noted, was stunning, like some Michael Haneke movie come to life. Rarely has a still shot of an empty room been so fraught with horror. Just before then, when confronted with handwriting evidence that seemed to implicate him irrefutably, Durst started burping loudly, as if his subconscious were about to regurgitate the truth on camera. And then it did: Durst muttering “Killed them all, of course” into a hot mic while he was in the bathroom alone couldn't be any more of a Shakespearean soliloquy if it came from the pen of the Bard himself.

The hot mic's the thing, wherein I'll catch the conscience of the king.

Like some, however, I take issue with some of Jarecki's choices. The first is his use of reenactments. I yearn for more documentaries that stick to just talking heads, though I can understand the temptation of reenactments. Rather than just having someone talk about something that happened, you can hand the viewer a visual.

In doing so, though, you rob viewers of their imagination, and you bias them in all sorts of unconscious ways. One person might claim something happened. When that moment is actually enacted on screen, the testimony gains corporeal form and feels more real. Or, if the reenactment is lousy, the testimony seems less credible. Either way, the visuals overpower the spoken word, even when the visual is just one filmmaker's fancy.

Richard Brody writes:

Reënactments aren’t what-ifs, they’re as-ifs, replete with approximations and suppositions that definitively detach the image from the event, the vision from the experience. One of Jarecki’s reënactments leaves me strangely obsessed with an insignificant detail that takes on an outsized significance in revealing the inadequacy of his method for the emotional essence of the story. In the second episode, Kathie Durst’s friend Gilberte Najamy tells Jarecki that, before her disappearance, Kathie Durst went to a party at her house, where she told Najamy that she was afraid of Robert Durst, and insisted that, if anything happened to her, Najamy should “check it out.” To signify that there had indeed been a party at Najamy’s house, Jarecki offers a tracking shot of a table laden with platters of food—including a pasta salad with a single pitted black olive sticking up from it. I’m obsessed with that olive. Did Najamy describe to Jarecki the dishes that she served? Did she describe the table itself, the room? Did Jarecki film this scene where Najamy lived at the time, or where she lives now? Or did Jarecki assume that Najamy, or someone like Najamy (whatever he’d mean by that), would at the time have served that kind of pasta salad at a party that might look like that? Najamy’s account is powerful; Jarecki’s image is generic. Najamy is specific, concrete, and detailed. She delivers a crucial piece of her life, whole, to Jarecki—who treats it like a hack’s screenplay and makes a facile illustration of it.
 
Beyond the awe-inspiring (and sometimes awful) recollections of people involved in the past events that are at the center of the drama, Jarecki brings into play actual objects that bear a physical connection to them—which is why the objects of dubious provenance (such as a box of police records relating to Kathie Durst’s disappearance, sealed with red “evidence” tape) are such offenses to the dignity of the film’s subjects. Jarecki shows this box being taken from a shelf; he puts the camera inside the shelf and shows the box being put back there; he shows the box being unsealed and then sealed again. It’s impossible to know whether this is the actual evidence box for the case; whether the handwriting on the box is actually that of a police clerk from the time; whether the files pulled from it were handled by the actual investigators who worked on the case; whether the room where it’s stored is the actual file room or a studio mockup.
 

Jarecki doesn't just shoot conventional reenactments, either. They are highly stylized. In my memory's eye, two shots from the series I can't shake (besides the last one of the series) are one of an actress playing Durst's mother committing suicide and another of an actress playing Susan Berman toppling after being shot in the head. Both are images of female bodies falling, and both are played in slow motion, over and over, like some fetishistic shot from 300.

What's a shame is the series doesn't need them. Some of the reenactments are less stylized, but that just makes them harder to distinguish from live shots from the present. I don't mind a mixture of fiction and non-fiction in documentaries, but some spirit of fair play seems called for, especially when it's documentary as investigative journalism.

Many probably find all of this to be nitpicking and may not have had any problems with the series as filmed. The objection may be easier to understand through a series that many grouped with The Jinx, the podcast Serial. Imagine if, after Sarah heard testimony from a witness like Jay about seeing Hae's body in the trunk of the car at Best Buy, she had put together an audio recreation of those events, hiring voice actors to play Adnan and Jay, recreating the conversation as Jay recalled it, layering in sound effects like a trunk popping open. Regardless of whether listeners felt Adnan was guilty, many would be uncomfortable with the technique.

The last episode steers clear of reenactments, but the cumulative effect of the ones from the first five episodes was such that I wasn't sure whether to buy the shots of Jarecki himself in the finale, speaking about how he feared for his life (this piece at Buzzfeed goes into a more in-depth stylistic breakdown of the narrative manipulation at work). Jarecki clearly doesn't shy from drama, but the use of all these tricks leads one to discount everything on screen, the way one applies a base level of skepticism to stories from a proven drama queen.

Another issue with the series is Jarecki's manipulation of the timeline. In the last episode, it seems as if Robert Durst agrees to sit with Jarecki for another interview (the now infamous one which concludes the series) only after police arrest Durst outside his brother's home. I thought for sure that was the sequence of events because it's shown in that order, and the series includes audio from a phone call from Durst to Jarecki asking for the director's help.

But when Jarecki was asked in the NYTimes about whether he had manipulated this timeline, he suddenly seemed as uncomfortable as Durst had been in the last interview of The Jinx.

When did you discover the piece of audio from the bathroom, in which Mr. Durst seemed to confess?
 
Jarecki: That was at the tail end of a piece of an interview. I don’t know if you’ve ever edited anything — things get loaded into the editing machine but not everything gets loaded. The sound recorder isn’t listening after a guy gets up and says he wants a sandwich. It often doesn’t get marked and get loaded. That didn’t get loaded for quite a while. We hired some new assistants and they were going through some old material. That was quite a bit later. Let me look at my list. It was June 12, 2014.
 
So it was more than two years later. From watching the episode, it seemed as if the 2013 arrest of Robert Durst for violating the order of protection by walking on his brother Douglas’s brownstone steps happened after the second interview.
 
Jarecki (to Smerling): I’m hearing a lot of noise. And if we’re going to talk about the timeline, we should actually sit in front of the timeline. So that’s my suggestion, if that’s the subject you want to talk about.
 
I’m just trying to clarify if the arrest for being on Douglas Durst’s property happened after the second interview.
 
Jarecki: Yeah, I think I’ve got to get back to you with a proper response on that.
 

Someone check the tails of that audio recording of Jarecki's interview, maybe his mic was still hot?

Maybe, as some have put it, we're a bunch of whiny brats, and all that matters is that we caught that murderer and got six hours of lurid, compelling TV to boot. Judging by what critical reception I've seen, The Jinx was a resounding success, and so, perhaps as the underrated movie Nightcrawler depicted, we'll happily go along with a coming wave of vigilante journalism.

Perhaps the filming of The Jinx can be the subject of Serial, Season 2. Vigilante journalism recursion, the snake eating its own tail. Who am I kidding, I wouldn't be able to look away.

The incident of the dog in the night-time

The Department of Justice’s report on the Ferguson Police Department is full of eye-catching numbers that reveal a culture plagued by significant racism. Statistically significant. For instance, nearly ninety per cent of the people who prompted a “use of force” by the F.P.D. were black. Even among such skewed percentages, there are some standouts. Among cases in which a suspect was bitten by an attack dog and the suspect’s race was recorded, what percentage were black?
 
A hundred per cent.
 
There is little nuance in the incidents described in the report; the police simply sicced their dogs on unarmed black males. According to the F.P.D.'s own guidelines, handlers should not release the hounds “if a lower level of force could reasonably be expected to control the suspect or allow for the apprehension.” But the report reveals that the F.P.D. is quick to set loose its trained attack dogs—often on black children.
 

The damning DOJ report on Ferguson is a great example of data as an objective racism detector. This might be an example of dogs revealing the racism of their owners.

A 2011 study published in the journal Animal Cognition found that even expertly trained dogs and the most professional handlers cannot evade what is called the Clever Hans effect. In tests, dogs trained to detect explosives and drugs were sent, with their handlers, into a series of rooms to find non-existent contraband. In one room, there was a decoy that had been scented with sausage; in another, there was an unscented decoy accompanied by a sign telling the handler, falsely, that it smelled of contraband; a control room had no decoys. The investigators found, overall, that “human more than dog influences affected alert locations”: the meat decoy attracted more false alarms than anything in the control room, but the decoy with the sign prompted nearly twice as many false alerts as the one with the tempting scent. In other words, the dogs found their handlers’ unconscious cues significantly more compelling than the sausage. Trained animals, it turns out, are arguably better at reading our cues than we are at suppressing them.
 

Remember, there are no racist dogs, only racist owners.