Opaque intelligence

Alex Tabarrok writes about what he calls opaque intelligence.

It isn’t easy suppressing my judgment in favor of someone else’s judgment even if the other person has better judgment (ask my wife) but once it was explained to me I at least understood why my boss’s judgment made sense. More and more, however, we are being asked to suppress our judgment in favor of that of an artificial intelligence, a theme in Tyler’s Average is Over. As Tyler notes:

…there will be Luddites of a sort. “Here are all these new devices telling me what to do—but screw them; I’m a human being! I’m still going to buy bread every week and throw two-thirds of it out all the time.” It will be alienating in some ways. We won’t feel that comfortable with it. We’ll get a lot of better results, but it won’t feel like utopia.

I put this slightly differently: the problem isn’t artificial intelligence but opaque intelligence. Algorithms have now become so sophisticated that we humans can’t really understand why they are telling us what they are telling us. The WSJ writes about drivers using UPS’s super algorithm, Orion, to plan their delivery routes:

Driver reaction to Orion is mixed. The experience can be frustrating for some who might not want to give up a degree of autonomy, or who might not follow Orion’s logic. For example, some drivers don’t understand why it makes sense to deliver a package in one neighborhood in the morning, and come back to the same area later in the day for another delivery. But Orion often can see a payoff, measured in small amounts of time and money that the average person might not see.

One driver, who declined to speak for attribution, said he has been on Orion since mid-2014 and dislikes it, because it strikes him as illogical.
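
Orion's internals aren't public, so this is only a guess at the flavor of the math, but a toy example shows how something as simple as delivery time windows can make the "illogical" route the cheapest one. Everything below, the stops, the windows, the travel times, is invented for illustration:

    # A toy sketch (not Orion itself) of why an optimizer may send a driver
    # back to the same neighborhood twice: time windows can make the
    # counterintuitive route the cheapest feasible one.
    from itertools import permutations

    # (name, position along a one-dimensional road, earliest, latest)
    # A1 and A2 are neighbors; B is across town. All values hypothetical.
    STOPS = [
        ("A1", 1, 0, 4),   # neighborhood A, morning window
        ("A2", 1, 6, 10),  # neighborhood A, afternoon window
        ("B",  5, 0, 10),  # across town, wide window
    ]

    def route_time(order, depot=0):
        """Total elapsed time for a route, or None if a window is missed."""
        t, pos = 0, depot
        for name, loc, earliest, latest in order:
            t += abs(loc - pos)      # travel time = distance
            t = max(t, earliest)     # wait if the truck arrives early
            if t > latest:
                return None          # infeasible: window missed
            pos = loc
        return t + abs(depot - pos)  # head back to the depot

    feasible = [p for p in permutations(STOPS) if route_time(p) is not None]
    best = min(feasible, key=route_time)
    print([s[0] for s in best], route_time(best))
    # -> ['A1', 'B', 'A2'] 10: visit neighborhood A, leave, come back later.
    # Bundling A1 and A2 together costs 15, because the truck idles waiting
    # for A2's window to open.

In this made-up instance the "weird" route wins by a third, because grouping the neighborhood stops forces the driver to idle until the second window opens, exactly the kind of payoff, "measured in small amounts of time and money," that the average person might not see.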

One of the iconic moments from Hitchhiker's Guide to the Galaxy is when a supercomputer finally finishes computing, after 7.5 million years, the answer to the ultimate question of life, the universe, and everything, and spits out 42. Perhaps that is how far beyond our understanding a super-intelligent AI will be. We may no more understand them than a snail understands humans. Defined that way, opaque intelligence is just artificial intelligence so advanced we don't understand it.

Someday a self-driving car will make a strange decision that will kill someone, and the software will be put on trial, and despite all the black box data recovered, we may have no idea what malfunctioned. Sometimes my iPhone randomly crashes and reboots; I couldn't begin to tell you why.

I'm waiting for the dystopic sci-fi movie that postulates an armageddon scenario much more likely than Skynet in Terminator. That is, rather than waste time building cyborg robots to hunt us down, a truly super-intelligent AI that wanted to kill off humans could just simultaneously order a million self-driving cars to speed headlong into each other, all the planes in the world to plunge into the ground, all our nuclear reactors to melt down, and a dozen other scenarios far more efficient than trying to build humanoids that walk on two legs.

Not as visually exciting as casting Arnold, though. In a way, it's reassuring that for all the supposed intelligence of Skynet, it sends back a Terminator that still speaks terrible Austrian-accented English, as if artificial speech technology were the one technology that failed to keep up despite AI making leaps as complex as gaining consciousness.

Too late

Now that the Mayweather-Pacquiao fight has been set for May 2, it's a good time to link back to my post “The fight we wanted, but not really” as nothing has really changed.

Mayweather-Pacquiao would have been a great fight five years ago, when Pacquiao and Mayweather were both younger and faster. Pacquiao, by virtue of being a southpaw with the endurance to throw an unbelievable volume of punches and the gift to throw fast from unexpected angles, would have been a real challenge to Mayweather's great defense and technical precision. Mayweather would have landed shots on Pacquiao for sure, since Pacman sacrifices defense for offense (and isn't the defensive whiz that Mayweather is anyhow), but on sheer punch volume, Pacquiao might have landed more total punches, making a fight that went to the judges' scorecards a really dicey proposition for Mayweather.

But as is his style, Mayweather is too smart, observant, and cautious, and he knew the magnitude of the threat posed by Pacquiao. As I noted in my previous post, Mayweather rarely fights opponents in their prime, when they'd be the greatest threat to him. He gets them early or he gets them late, on either shoulder of their prime, and in this case, it's Pacquiao on the downslope from his peak.

A perfect record is a valuable asset, and you can't argue with the sheer volume of money Mayweather has made over the years. His fight selection has been near impeccable, and who he fights is his call. I don't think it was fear driving his decision-making, either. Someone of his boxing genius would be a deserving favorite in every fight he's ever taken, and that includes Pacquiao then or now.

Fight fans just prefer a narrative of combat sport that casts its best fighters as fearless warriors, ready to take on any and all challengers out of the sheer need to prove indomitable. When we picture a fighter, we don't think of a calculating tactician, selecting each fight based on deep analysis of the opponent and a better than likely chance of winning.

Pacquiao and his camp also bear fault. Both sides conjured reason after reason the fight wouldn't be made: the size of the purse, how it would be split, drug testing policies, etc. At times it wasn't clear who was resorting to which excuse.

It's not just that a fight closer to their primes would have been a better fight, but it might have been the first in a classic two or three fight series. Instead boxing got a bunch of other fights in the intervening years that meant very little to most boxing fans, assuming there are any left besides the inner circle.

That Mayweather finally accepted the fight should tell you all you need to know about where Pacquiao's skill level is versus five years ago, but you can go to the videotape if you need further proof. I fully expect the line to show Mayweather as a healthy favorite, with perhaps only a large and naive betting public pushing the line closer.

In boxing, it has almost always been true that if there's enough money, a fight will happen. It held true this time as well, only a lot of that money will be nostalgia past its expiration date.

I'll still watch the fight (I've long had a Joyce Carol Oates-like fascination with the sweet science), but I'm not springing for the PPV. I wrote that check so long ago I can't find it anymore.

Where to set the safety threshold?

Since I’ve been involved with designing and marketing play apparatus, fall surfacing, climbing walls and skateparks, the issue of protecting kids from falls and the use of helmets has figured prominently throughout my five decades in this field. My experience leads me to the opinion that helmets and other protective gear should be worn when the player has the intention of testing the limits of their skill or when the environment is unpredictable. For example, helmets when dirt biking or on busy streets are a good idea but may not be necessary when playing in the neighborhood.

What makes us safe is not protective devices but judgment, honed reflexes, and fundamental movement skills. The goal is to reduce the frequency and severity of injury. If you watch a toddler learning to walk, you’ll see several innate behaviors that help achieve this end. When they are about to fall forward, their reaction is to resume their crawling gait and extend their arms in what is called the “protective arm reflex.” When the fall is backwards, they drop to their bottoms. In both cases these instinctual reactions do the job of head protection very well.

The question arises then, what is the impact of using a safety helmet? Child development physiologists I’ve talked with suggest several issues. First, they suspect, although there is little research on this, that such protective gear may disrupt the normal progression of reflex maturation. They are also concerned that the lack of consequences when falling may retard the child’s ability to form proper assessments of their skill, i.e. reduce their judgment. Finally, they speculate that it reinforces a pattern of parenting that is overprotective and ultimately harmful.

From this example we can see that what might appear to be a good idea is fraught with complexity and perhaps unintended consequences.

From this post on playground design, questioning a proposal by the ASTM Playground Surfacing Committee (yes, that is a thing) to engineer more safeguards into public playgrounds.

The motivation appears to be that the current standard, despite its goal of improving playground safety, has not significantly reduced the number of hospital visits.

To my mind this is not unlike the logic of the medieval doctor who, when the patient did not get well after one bloodletting, concluded that more bloodletting was needed.

Parenting seems like a delicate balancing act. You can set the safety threshold too high, leaving your child too brittle for the real world they will someday inhabit without you. The anti-vaxxers seem to fall prey to that miscalculation.

I'm very curious to study the parenting style and childhood peer set of kids who become serial entrepreneurs because those are people who seem to have a better understanding than the average person of the concept of risk/reward and thus a healthier acceptance of failure. An overly cautious personality, maybe someone who has always had good grades in school, may only want to play deterministic games, where the relationship between hard work and success is linear.

Entrepreneurship, especially in tech these days, is a probabilistic game. That's not a comfortable style of game for those who bruise easily. Watch someone who isn't in a probabilistic modality sit at a blackjack table and witness their discomfort with every losing hand. Their safety threshold may be set so high the only acceptable play is to never sit down at the table at all.

[That's not to say even those who think probabilistically think they're going to lose when they sit down at a table, and that goes for entrepreneurs as well. The only way the whole system works is if 10 out of 10 entrepreneurs think they'll succeed even as they know 9 out of 10 will fail. As long as everyone thinks they're that 1 out of 10, we get that 1 out of 10.]
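
To put made-up numbers on that probabilistic framing: a 10% shot at a 30x outcome has twice the expected value of a sure 1.5x, yet pays nothing nine times out of ten. A minimal simulation, with every figure hypothetical:

    # Hypothetical payoffs: a startup returns 30x with probability 0.10,
    # zero otherwise; the "deterministic game" pays a sure 1.5x.
    import random

    random.seed(42)  # reproducible draws
    TRIALS = 100_000

    def venture_payoff():
        return 30.0 if random.random() < 0.10 else 0.0

    avg = sum(venture_payoff() for _ in range(TRIALS)) / TRIALS
    print(f"average venture payoff: {avg:.2f}x vs. sure thing: 1.50x")
    # ~3.00x on average, but roughly 90% of individual draws are 0.00x,
    # which is exactly what bruises the player who only likes linear games.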

The secret technology of The Daily Show

Many have mourned Jon Stewart's announcement that he'll be leaving The Daily Show this year. Count me among those dressed in black; Stewart felt like my cool, whip-smart Jewish uncle for the past 16 years. One can claim that sometimes it's the format of a program that endures, and not the bodies filling the seats—for example, with Saturday Night Live—but with both The Daily Show with Jon Stewart and The Colbert Report, that's just wishful thinking. These two shows, like the late night talk shows, have long had their hosts' very names in the titles, and for good reason; without Stewart and Colbert, the shows will become something different out of both necessity and circumstance.

Emily Nussbaum wrote a wonderful appreciation of Stewart's legacy, and one piece of it caught my eye for pointing out what I consider the show's most undervalued skill.

The truth is that Stewart was often at his most exciting when he got down in the dirt, instead of remaining decent and high-minded, your twinkly-eyed smartest friend. Five years ago, when he confronted CNBC’s financial reporter Jim Cramer over his coverage of Wall Street, Stewart refused to be collegial. He nailed Cramer on his manipulations, airing clip after damning clip, and shouting “Roll 212!” with prosecutorial glee. He was a good interviewer with people he admired, but in some of the show’s most memorable segments he relied on search technology—in particular, his staff’s ability to cull clips and spin them into brutal montages—to expose lies that might have gone unremarked upon. Over time, he became not merely a scourge of phonies but the nation’s fact checker, training others in the craft. You can see that influence not only among hosts who started out on “The Daily Show,” including Colbert, John Oliver, and Larry Wilmore, but everywhere online. Twitter, on its best nights (and they do exist, doubters), can feel like a universe of sparkling mini-Stewarts, cracking wise but also working to mob-solve the latest crisis, and providing access to a far wider array of perspectives than any one comic could.

That kind of digging, of disrespecting authority, was a model for reinventing journalism, not comedy.

The secret technology behind The Daily Show was search.

Any viewer is, by now, familiar with the show's format. The opening third, almost always my favorite, would feature Stewart tackling a variety of the most prominent current events in politics and society and putting either some of the protagonists or the media on trial. Sometimes he'd dissect them himself, like a gifted if somewhat smug trial lawyer, but more often than not, he won by jiu jitsu. He let witnesses hang themselves on a rope of their own words.

I've never read how they do it, but The Daily Show seems to have catalogued every piece of video from every politician and reporter in the history of television. Did a politician claim one thing? Here's a clip of them from another time, contradicting themselves. Did Fox News castigate Obama for his decisiveness on a piece of foreign policy? Here's a clip of their anchors praising Bush for the same quality when it came to a similar situation. Often that opening portion of The Daily Show felt like a Three Stooges clip, with hapless politicians slapping themselves in the face, Stewart and his writing staff pulling the strings.

Do they have banks and banks of cable boxes and DVRs, recording every minute of C-SPAN, Fox News, CNN, and MSNBC, converting all the dialogue to text, labeling every moment with row after row of metadata? How many researchers do they have on staff? How do they retrieve clips so quickly each day, and what is the interface for that system? Can they run searches by simply stringing together words like "Bill O'Reilly" "hypocrisy" "Iraq War"? Or is there a giant dropdown box with a bunch of predefined categories like "old white senators saying racist things"?
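
However they do it, the mechanics needn't be exotic. Purely as speculation, here is a minimal sketch of the kind of inverted index, over transcripts plus metadata, that would answer a free-text query in milliseconds; every clip below is invented:

    # A toy inverted index over (invented) clip transcripts and metadata.
    from collections import defaultdict

    clips = [
        {"id": 1, "speaker": "Anchor X", "network": "Fox News",
         "date": "2003-03-20", "text": "the president is decisive and strong"},
        {"id": 2, "speaker": "Anchor X", "network": "Fox News",
         "date": "2013-09-01", "text": "the president is reckless and rash"},
    ]

    index = defaultdict(set)
    for clip in clips:
        # Index every transcript word plus whole metadata fields, so a
        # query can mix free text ("decisive") with labels ("Fox News").
        for term in clip["text"].split() + [clip["speaker"], clip["network"]]:
            index[term.lower()].add(clip["id"])

    def search(*terms):
        """Return ids of clips matching ALL terms (a simple AND query)."""
        hits = [index.get(t.lower(), set()) for t in terms]
        return set.intersection(*hits) if hits else set()

    print(search("Fox News", "decisive"))  # {1}: the contradicting clip

Pair an index like that with timestamps into the source video and a staff of researchers, and "Roll 212!" becomes a retrieval problem rather than a miracle.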

In turning what seems like the entire history of television news into a deeply catalogued primary source, The Daily Show lifted the journalistic standard of the medium. This isn't a new phenomenon. The internet is, above all else, the greatest information distribution technology in history, and many a writer or journalist has realized too late that it's not their immediate fact checker or editor whose standards matter but those of millions of internet-connected people with lots of time and Google as their default homepage. Eric Raymond, naming Linus's Law after Linus Torvalds, wrote that “given enough eyeballs, all bugs are shallow.” I propose a corollary: “given enough eyeballs and enough metadata, all lies become public.”

In cycling, drug testing authorities keep samples of blood for years after events so they can test samples retroactively as better drug-detection tests are devised. Why, in the age of the internet, people continue to plagiarize is beyond me, but even if one can get away with it for the moment, everything ever written and posted online lives on until that day when the original text is indexed and made searchable and detecting the crime becomes a matter of a trivial exact match query.
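
For text, that exact match query can be as simple as intersecting overlapping word n-grams, or shingles, between documents; any lifted passage longer than the shingle collides. A sketch with made-up sentences:

    # Flag verbatim overlap by intersecting 5-word shingles. Both
    # "documents" here are invented for illustration.
    def shingles(text, n=5):
        words = text.lower().split()
        return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

    original = "the quick brown fox jumps over the lazy dog every morning"
    suspect = "he wrote that the quick brown fox jumps over the lazy dog"

    overlap = shingles(original) & shingles(suspect)
    print(sorted(overlap))
    # Any non-empty result flags a verbatim run worth a human look.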

Video is late to this game, though, because it's much harder to index the spoken dialogue in video. Some companies have solutions; I've seen many a demo at trade shows, and we indexed closed-caption files at Hulu, too. However, it's still not easily available to consumers on a significant percentage of video online. Yet. That's what made what I'm presuming to be The Daily Show's video catalog or index so remarkable.

The third episode of Season 1 of Black Mirror, “The Entire History of You,” postulated a world in which The Daily Show's technology for trapping people with evidence of their own hypocrisy existed in our personal lives. An implant in our brains would record and index every moment of our lives, allowing us to put each other on trial for the rest of our days. It's a common downside scenario for total recall technology, mentioned in almost any article whose author has experimented with prototypes.

That episode of Black Mirror ends badly, as is common in this age of somewhat bleak science fiction. Real world evidence isn't so conclusive yet. Despite the almost nightly prosecution of The Daily Show with Jon Stewart, politicians and media like Fox News don't seem to have changed their behavior much, at least not to any level I can detect. Not even the rich and powerful are above shame, but it's safe to say many of them have a higher than average tolerance.

As for having our personal hypocrisies made shallow, I can't imagine that a greater leniency towards each other wouldn't win out over continual witch hunting. Furthermore, a mutually assured destruction of reputation might naturally result in a bottom-up detente. After all, who among us hasn't said something we later wished to expunge or walk back? Some people point to internet trolling as a counter-example, but I suspect it's largely over-indexing on the loud minority over the reasonable silent majority, as our human brains love to do.

Even if such technology were widespread and forced us all to be more considered before we wrote or spoke, is that so bad? Taken to an extreme, that's a terrifying Orwellian scenario, but when Nussbaum writes that “[Stewart's] brand was decency,” she understands that much of the show's appeal was his own reasonable nature. Stewart often seemed exasperated at the rigid rhetorical stances in American politics, but it's difficult to believe he would have lasted 16 years at the desk if he didn't believe, deep down, that if we just hold up a mirror to ourselves, not a black mirror, nor one ringed with flattering warm lights, but just the clearest one available, we'd grow the hell up.

90 yrs of The New Yorker > 40 yrs of SNL

This weekend, my social media streams were teeming with activity surrounding the Saturday Night Live 40th anniversary special that aired Sunday night. I grew up watching SNL, and my brothers and I can still fall into character from old skits like former members of some vaudeville troupe. I watched all of SNL 40 live, and it felt like comfort food seeing all those old familiar faces reunited. For those easily star-struck, that after-party sounded like the most fun assemblage of comedians, movie stars, athletes, and musicians ever. Take any one segment of those subgroups and it wouldn't be nearly as appealing a gathering, but the intermingling of the four is something magical that only SNL has pulled off on a consistent basis.

Being on air for 40 years is a genuine accomplishment. Nothing else on TV has been with me for as long; SNL has spanned my entire life. The Simpsons is the only other show from that era that comes close, but it has fallen so far off its peak that fans speculate the past 20 years of the show have just been figments of a comatose Homer Simpson's imagination. The Simpsons is also an animated show, while SNL has had to survive continuous turnover of real flesh and blood talent, something that adds to the degree of difficulty.

All that said, even as an unabashed SNL fan, the most powerful emotion I felt watching SNL 40 was nostalgia, and that's a feeling pointing the wrong direction. For much of the show, I wasn't laughing at anything on screen as much as I was chuckling at the recollection of funnier moments retrieved from memory. Some of the montages were cut so fine that only an SNL die-hard would know what some of the punchlines referred back to; it felt special, flattering even, to laugh at those remembered jokes considering the lineup of famous people we were sharing the laugh with. If only Chris Farley were alive, they could have run back an entire half hour of The Chris Farley Show, having him interview all the cast members there. “Remember that time when you were like...and then she was like...and then...? That was awesome.” Yes, we remember, and yes, it was.

Taken on pure comedic value, much of SNL 40 wasn't all that hilarious, and this season hasn't been the show's strongest either. The entertainment context has changed, and it's not a surprise that more and more of the funniest SNL bits each week are pre-recorded. Whereas in my childhood Saturday night was the only time all week one could watch comedy sketches, now they can be found around the clock online. Even on TV, shows like Inside Amy Schumer and Key & Peele and even, to some extent, Broad City have spread edgier and more viral sketches across the weekly calendar (and walk back the calendar a few more years, of course, and you'll find In Living Color, MADtv, and Chappelle's Show). Jimmy Kimmel and SNL alum Jimmy Fallon now produce comic skits on late night TV, something Stephen Colbert, Jon Stewart, John Oliver, and the rest of the Comedy Central late night posse have been doing for years. Lonely Island brought digital shorts to SNL at the perfect moment given the rise of fat viral pipes like YouTube, but everyone has put that memetic infrastructure to good use. If I were to name the top 10 funniest videos I've seen the past few years, I'm not sure SNL places one on that list.

When Louis CK came on stage during SNL 40 and pointed out that the pre-recorded material was often better than the stuff performed live, it was funny for being true. Yet the live performance is the one thing that continues to set the show apart. Andy Samberg and Adam Sandler's digital short on SNL 40 poked fun at all the times SNL performers cracked up during live performances, something Lorne Michaels is said to have hated in the beginning, but that's become an endearing tic that reminds viewers of the loose, improvisational nature of the program. Even the live studio audience is a bit of an anachronism, but a charming one.

Of course, I don't watch SNL live anymore, but in my childhood, and even after our family bought a VCR, I often did. It felt like a real treat to stay up Saturday night to catch the program, and watching live felt like watching with millions of households in the country, all tuned in at once. SNL 40 drew 23.1 million overall viewers during the 8-11:30 time slot, reminding us of that age of TV when millions would watch something at the same time that wasn't a sporting event. Now it's easier to watch SNL the next morning on Hulu or off your DVR, giving social media overnight to identify the sketches worth watching.

As long as the inimitable Lorne Michaels has the energy to guide SNL, I have no doubt it can stay on air. Saturday night is a bit of a graveyard for television anyhow, so I don't see anything else rising up to seize that slot of the weekly calendar from SNL. Capturing one night of the week isn't what it used to be, though.

SNL's 40th anniversary happened to occur the same week that The New Yorker put out its 90th anniversary issue. Surviving 40 years in a row on TV is a great accomplishment, but maintaining cultural relevance as a magazine for 90 years might be an even more astonishing achievement. I've been a New Yorker subscriber since I was in high school, and it's the only magazine or newspaper I've read continuously that whole time. For all the troubles befalling the publishing industry, The New Yorker seems to be going as strong as ever, having built its brand not on something ephemeral, like a local monopoly on distribution or a niche perspective on a narrow interest, but on deep, world-class reporting on what matters in politics, science, medicine, technology, arts, and culture.

As with SNL, the stable of New Yorker writers and reporters has turned over many times over the decades, and while one might quibble with a few of them, the assemblage of talent that has graced the pages of that magazine over the years is even more impressive than the gathering of performers on stage at the end of SNL 40. I can easily name dozens of New Yorker writers whom most people I know have never heard of, yet who rank among the greatest journalists I've ever read.

Take, for example, Wolcott Gibbs. Read “Backward Ran Sentences” for a sampling of his brilliance. Like many of The New Yorker's best writers, he was so smart and such a gifted writer that he could cover just about anything. And he did. He wrote fiction and non-fiction. He covered theater, but later he reviewed books and movies. He could profile the famous one week and capture the most notable details of an everyday moment from his own life for The Talk of the Town the next week. Much like Phil Hartman or Will Ferrell, he was just another versatile genius you wanted to see in action no matter what he applied himself to.

Of all the magazine's qualities, perhaps none elicit more of my professional jealousy than their famous house style. I have yet to find a comprehensive guide that outlines it in detail, but read enough New Yorker pieces and you know it. Tom Wolfe once described it as such: “The New Yorker style was one of leisurely meandering understatement, droll when in the humorous mode, tautological and litotical when in the serious mode, constantly amplified, qualified, adumbrated upon, nuanced and renuanced, until the magazine’s pale-gray pages became High Baroque triumphs of the relative clause and appository modifier.”

It's notable that their house style was not for everyone. Nothing precise ever is. The magazine never published any of David Foster Wallace's non-fiction pieces, to pick one example. As John Jeremiah Sullivan (himself a great essayist) wrote in a review of DFW's The Pale King:

It's worth noting, in that regard, that The New Yorker, which published some of his best fiction, never did any of his nonfiction. No shame to Wallace or The New Yorker, it's simply a technically interesting fact: He couldn't have changed his voice to suit the magazine's famous house style. The "plain style" is about erasing yourself as a writer and laying claim to a kind of invisible narrative authority, the idea being that the writer's mind and personality are manifest in every line, without the vulgarity of having to tell the reader it's happening. But Wallace's relentlessly first-person strategies didn't proceed from narcissism, far from it—they were signs of philosophical stubbornness. (His father, a professional philosopher, studied with Wittgenstein's last assistant; Wallace himself as an undergraduate made an actual intervening contribution—recently published as Fate, Time, and Language—to the debate over free will.) He looked at the plain style and saw that the impetus of it, in the end, is to sell the reader something. Not in a crass sense, but in a rhetorical sense. The well-tempered magazine feature, for all its pleasures, is a kind of fascist wedge that seeks to make you forget its problems, half-truths, and arbitrary decisions, and swallow its nonexistent imprimatur. Wallace could never exempt himself or his reporting from the range of things that would be subject to scrutiny.

I can understand Wallace's refusal to bend to New Yorker house style. Plain style can smack of a false omniscience or objectivity when I disagree with the author. For example, I believe a lot of East coast magazines and newspapers write with some bias about the tech industry. Some of it may be jealousy over West coast institutions like Amazon, Apple, Google, Facebook, and Twitter rising up to challenge the cultural centrality of the East coast intellectual elite (hip hop and rap are not the only cultural battleground pitting the two American coasts against each other). Some of it may just be a lack of total understanding of the technology itself. In such pieces, the plain style can feel like wallpaper over faulty construction.

That quibble aside, most of the time, it is a wonder. Clean, clear, elegant. I consider The New Yorker's plain style to be a variant of what Steven Pinker calls the classic style. I can never think of what to say when people ask me which three people in history I'd most want to have dinner with, but I can say unequivocally that if I could choose one editor to edit my prose for the rest of my life, it would be longtime New Yorker editor Eleanor Gould. Upon her death, David Remnick said, “If it's true The New Yorker is known for the clarity of its prose, then Miss Gould had as much to do with establishing that as its more famous editors and writers.” If you need further proof, E.B. White thanked Gould in the credits of that bible of usage, The Elements of Style: “The co-author, E. B. White, is most grateful to Eleanor Gould Packard for her assistance in preparation of this second edition.”

Someday I hope The New Yorker sees fit to publish a house style guide as a public service, to improve prose everywhere. Until then, we'll have to live off the occasional scrap like this Wolcott Gibbs memo. It includes gems such as:

1. Writers always use too damn many adverbs. On one page recently I found eleven modifying the verb ‘said’. ‘He said morosely, violently, eloquently, so on.’ Editorial theory should probably be that the writer who can’t make his context indicate the way his character is talking ought to be in another line of work. Anyway, it is impossible for a character to go through all these emotional states one after the other. Lon Chaney might be able to do it, but he is dead.

2. Word ‘said’ is O.K. Efforts to avoid repetition by inserting ‘grunted’, ‘snorted’, etc., are waste motion and offend the pure in heart.

10. To quote Mr Ross again, ‘Nobody gives a damn about a writer or his problems except another writer.’ Pieces about authors, reporters, poets, etc. are to be discouraged in principle. Whenever possible the protagonist should be arbitrarily transplanted to another line of business. When the reference is incidental and unnecessary, it should come out.

11. This magazine is on the whole liberal about expletives. The only test I know of is whether or not they are really essential to the author’s effect. ‘Son of a bitch’, ‘bastard’, and many others can be used whenever it is the editor’s judgement that that is the only possible remark under the circumstances. When they are gratuitous, when the writer is just trying to sound tough to no special purpose, they come out.

13. Mr Weekes said the other night, in a moment of desperation, that he didn’t believe he could stand any more triple adjectives. ‘A tall, florid and overbearing man called Jaeckel.’ Sometimes they’re necessary, but when every noun has three adjectives connected with it, Mr Weekes suffers and quite rightly.

15. Mr Weekes has got a long list of banned words beginning with ‘gadget’. Ask him. It’s not actually a ban, there being circumstances when they’re necessary, but good words to avoid.

20. The more ‘As a matter of facts’, ‘howevers’, ‘for instances’, etc. etc. you can cut out, the nearer you are to the Kingdom of Heaven.

23. For some reason our writers (especially Mr Leonard Q. Ross) have a tendency to distrust even moderately long quotes and break them up arbitrarily and on the whole idiotically with editorial interpolations. ‘Mr Kaplan felt that he and the cosmos were coterminous’ or some such will frequently appear in the middle of a conversation for no other reason than that the author is afraid the reader’s mind is wandering. Sometimes this is necessary, most often it isn’t.

24. Writers also have an affection for the tricky or vaguely cosmic last line. ‘Suddenly Mr Holtzmann felt tired’ has appeared on far too many pieces in the last ten years. It is always a good idea to consider whether the last sentence of a piece is legitimate and necessary, or whether it is just an author showing off.

25. On the whole we are hostile to puns.

28. It has been one of Mr Ross’s long struggles to raise the tone of our contributors’ surroundings, at least on paper. References to the gay Bohemian life in Greenwich Village and other low surroundings should be cut whenever possible. Nor should writers be permitted to boast about having their telephones cut off, or not being able to pay their bills or getting their meals at the delicatessen, or any of the things which strike many writers as quaint and lovable.

31. Try to preserve an author’s style if he is an author and has a style. Try to make dialogue sound like talk, not writing.

How much of anything lasts 90 years anymore, let alone remains relevant in the modern world? To endure for that long, it's enough to be stubborn, but to remain fresh and thrive for that long speaks to some evolutionary fitness. I'm not sure SNL will outlive me, but The New Yorker most likely will.

Happiness is a skill

2014 was the year I got serious about happiness.

It was a strange thing to look at my life and realize how rarely I was happy. I'm making a good living as a writer, which has always been my dream. I have a wonderful family, and we all have our health. It felt like I had hit all the necessary milestones to feel both very adult and very content, but my brain rarely rewarded me with the sort of happiness I craved.

I've often heard that happiness is a skill, not a feeling, and I realized how little time I was spending working on the skill of happiness, while waiting passively for the feeling to reach me. It also seemed like my love of gaming and pop culture was hindering this journey, not helping.

From Steam sales to streaming content, there was always so much to do, so many piles of shame, that even free time began to feel overwhelming and stressful as I tried to get through everything I wanted to do in the rare time I had for my "fun" pursuits after the children went to sleep. When Netflix, the Kindle app, a gaming laptop, and gaming consoles both new and classic offer nearly endless choices, it's easy to become overwhelmed without ever playing or consuming anything you used to find enjoyable.

This is how I deal with these feelings; it's a combination of many small things that has led me to be much more content and less skittish about not only gaming in particular, but life in general. You're free to take or reject any bit of this advice; everyone is different, and you may already be perfectly content with life. But if even one of these things I've learned helps you, that's a win. Here we go!

Odd to find a random nugget of wisdom on a gaming website, but I find myself nodding along at the notion that happiness is a skill, not a feeling. I suspect many of our emotions are actually human constructs, and not, as we are often led to believe, some intrinsic neurological wiring. The importance of believing happiness is a skill is that it puts control of your own happiness in your own hands.

Facebook and Plato's cave

Plato is a great philosopher of information without the word being there. When it comes to the classic image of the myth of the cave, you can reinterpret the whole thing today in terms of the channel of communication and information theory: who gets access to which information. The people chained in front of the wall are effectively watching television, or glued to some social media. You can read it that way without doing any violence to the text. That shows two things. First, why it is a classic. A classic can be read and re-read, and re-interpreted. It never gets old, it just gets richer in consequences. It’s like old wine, it gets better with time. You can also see what I mean when I say we’ve been doing the philosophy of information since day one, because really the whole discussion of the cave is just a specific chapter in the philosophy of information. The point I try to glean from that particular feature in the great architecture of the Republic is the following: some people have their attention captured constantly by social media – it could be by cats on Facebook. They are chained to that particular social media – television yesterday, digital technology today. Some of these people can actually unchain themselves and acquire a better sense of what reality is, what the world really is about. What is the responsibility of those who have, as it were, unchained themselves from the constant flow, the constant grab of attention of everyday media, and are able to step back, literally step out of the cave? Are they supposed to go back and violently force the people inside to get away, as the text says? Updated that would mean, for example, implementing legislation. We would have to ban social media, we could forbid people from having mobile phones, we’d put some kind of back doors into social media because we want control. Or do we have to exercise toleration? If so, it would be a matter of education. We’d have to go back and talk to them. In essence here Plato, by addressing these questions, is giving us a lesson in the philosophy of information.

From an interview with Luciano Floridi, Oxford Professor of Philosophy and Ethics of Information and a member of Google's advisory council on the “right to be forgotten” court case in Europe.

People addicted to social media as the people chained in front of the wall in Plato's cave allegory. I wish I'd thought of that.

Robots taking all the jobs, cont.

By studying the brains of drivers when they were negotiating a race-track, the scientists were intrigued to find that during the most complex tasks, the experts used less brain power. They appeared to be acting on instinct and muscle memory rather than using judgement as a computer programme would. 

“It looks as if the skilled race car drivers are able to control their cars with very little cognitive load,” said Prof Gerdes. 

Mr Vodden agreed, saying that in difficult manoeuvres experience kicked in. "If you're thinking, you're going too slow."

You'd think from that excerpt that the human driver remains superior, but it turns out the driverless car beat the track champion by 0.4 seconds on a track in Northern California.

One race track, the world's greatest driver (whoever is the Michael Schumacher of the moment) versus the best computer driver. I don't enjoy watching auto racing on TV, but I'd watch one that pits man and machine against machine and machine.

One more wrinkle for AI to learn: how and when to cheat.

In the race between Shelley and Mr Vodden, the racing driver left the track at a sharp corner, rejoining the race ahead of the robot car. 

“What we’re doing as humans we’re weighting a number of different things,” added Prof Gerdes. 

“We’re not driving within the lines, we’re balancing our desire to follow the law with other things such as our desire for mobility and safety. 

“If we really want to get to the point where we can have a car that will drive as well as the very best drivers with the car control skills and also the judgment it seems to me that we really need to have a societal discussion about what are the different priorities we place on mobility and safety on judgement and following the law.”