In a fast-paced modern world, no room for slow talkers, slow walkers

Slowness rage is not confined to the sidewalk, of course. Slow drivers, slow Internet, slow grocery lines—they all drive us crazy. Even the opening of this article may be going on a little too long for you. So I’ll get to the point. Slow things drive us crazy because the fast pace of society has warped our sense of timing. Things that our great-great-grandparents would have found miraculously efficient now drive us around the bend. Patience is a virtue that’s been vanquished in the Twitter age.
 
Once upon a time, cognitive scientists tell us, patience and impatience had an evolutionary purpose. They constituted a yin and yang balance, a finely tuned internal timer that tells us when we’ve waited too long for something and should move on. When that timer went buzz, it was time to stop foraging at an unproductive patch or abandon a failing hunt.
 
...
 
But that good thing is gone. The fast pace of society has thrown our internal timer out of balance. It creates expectations that can’t be rewarded fast enough—or rewarded at all. When things move more slowly than we expect, our internal timer even plays tricks on us, stretching out the wait, summoning anger out of proportion to the delay.
 
...
 
Make no mistake: Society continues to pick up speed like a racer on Bonneville Speedway. In his book, Social Acceleration: A New Theory of Modernity, Hartmut Rosa informs us that the speed of human movement from pre-modern times to now has increased by a factor of 100. The speed of communications has skyrocketed by a factor of 10 million in the 20th century, and data transmission has soared by a factor of around 10 billion.
 

From Nautilus.

It shouldn't be any surprise that technology has amplified our impatience. Tech companies were among the first to measure and quantify the monetary value of time. Early on at Amazon, we did an A/B test and realized that every additional millisecond of load time in search meant some users would just abandon their shopping session and bounce. The same applied to web pages. I've read that Google learned the same when testing their search results.
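For a sense of what measuring the cost of latency looks like, here's a minimal sketch of a latency A/B analysis in Python: two arms, made-up abandonment probabilities, and a plain two-proportion z-test. The session counts and rates are invented for illustration; this is not a description of Amazon's or Google's actual experimentation pipelines.

```python
import math
import random

# Hypothetical example: estimate how an added 100 ms of search latency
# affects session abandonment, using a simple two-group A/B comparison.
# The abandonment probabilities below are made up for illustration.

random.seed(0)

def simulate_sessions(n, abandon_prob):
    """Return the number of sessions (out of n) that were abandoned."""
    return sum(random.random() < abandon_prob for _ in range(n))

n_per_arm = 100_000
abandoned_control = simulate_sessions(n_per_arm, 0.050)   # baseline latency
abandoned_treated = simulate_sessions(n_per_arm, 0.053)   # +100 ms latency

p_c = abandoned_control / n_per_arm
p_t = abandoned_treated / n_per_arm

# Two-proportion z-test for the difference in abandonment rates.
p_pool = (abandoned_control + abandoned_treated) / (2 * n_per_arm)
se = math.sqrt(p_pool * (1 - p_pool) * (2 / n_per_arm))
z = (p_t - p_c) / se
p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided

print(f"control abandonment: {p_c:.3%}")
print(f"treated abandonment: {p_t:.3%}")
print(f"lift: {p_t - p_c:+.3%}, z = {z:.2f}, p = {p_value:.4f}")
```

At this sample size even a 0.3-percentage-point lift in abandonment shows up as a clearly significant z-score, which is the sense in which milliseconds become money at scale.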

I've been listening to more and more podcasts, and to try to finish more of them in my limited free time, I've been dialing up the playback speed. At first I was at 1.25X, then 1.5X, and now I can almost listen at 2X and still understand what folks are saying (contrary to popular belief, all decent podcast apps can do this playback without altering the pitch; when I play back at 2X, Marc Maron doesn't sound like a chipmunk). The unpleasant side effect, though, is that everyone in the real world seems to speak way......too......slowly. I sometimes find myself losing concentration mid-sentence.
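The trick behind chipmunk-free speed-up is time-stretching, typically a phase vocoder that changes duration without shifting pitch. Here's a minimal sketch using librosa; the file name is hypothetical, and this illustrates the general technique rather than how any particular podcast app implements it.

```python
import librosa
import soundfile as sf

# Load an episode (path is hypothetical) and time-stretch it to 2x speed.
# A phase vocoder changes duration without shifting pitch, which is why
# a sped-up voice doesn't turn into a chipmunk.
y, sr = librosa.load("episode.mp3", sr=None)          # keep original sample rate
y_fast = librosa.effects.time_stretch(y, rate=2.0)    # 2x playback speed

sf.write("episode_2x.wav", y_fast, sr)
print(f"original: {len(y)/sr:.1f}s, stretched: {len(y_fast)/sr:.1f}s")
```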

I'm in Hawaii for a wedding, and the wifi on the United flight over was the slowest I've ever used on a plane; I could feel my blood pressure rising as I hit the refresh and reload buttons like a rat in a Skinner box. Yes, I've seen the Louis CK clip about this, and yes, this is a problem of entitlement of the highest order, but it isn't going away; it's getting worse.

At its worst, impatience can lead to violence. Few sights make me as pessimistic about humanity as a driver exploding in near-apoplectic road rage over a delay of a few seconds. I've seen grown adults slam on their brakes, get out of their cars, and engage in fistfights with another driver over the slightest of inconveniences. Forget the zombie apocalypse; simply failing to start moving the moment the light turns green can reduce civilization to Lord of the Flies in an instant.

The article notes that our internal timers may be overclocked, leading to an acceleration of the vicious cycle of impatience --> rage --> impulsive behavior.

Recent research points to a possibility that could make this cycle worse. As my slow-walking friend and I strolled at a snail’s pace down the street, I started to fear that we would be so late for our reservation that we would miss it. But when we got to the restaurant, we were no more than a couple minutes behind. My sense of time had warped.
 
Why? Rage may sabotage our internal timer. Our experience of time is subjective—it can fly by in a flash, or it can drag out seemingly forever. And strong emotions affect our sense of time most of all, explains Claudia Hammond in her 2012 book Time Warped: Unlocking the Mysteries of Time Perception. “Just as Einstein’s theory of relativity tells us that there is no such thing as absolute time, neither is there an absolute mechanism for measuring time in the brain,” she writes.
 
Time stretches when we are frightened or anxious, Hammond explains. An arachnophobe overestimates the time spent in a room with a spider; a fearful novice skydiver, the time spent hurtling to Earth. People in car accidents report watching events unfold in slow motion. But it’s not because our brains speed up in those situations. Time warps because our experiences are so intense. Every moment when we are under threat seems new and vivid. That physiological survival mechanism amplifies our awareness and packs more memories than usual into a short time interval. Our brains are tricked into thinking more time has passed.
 
On top of that, our brains—in particular, the insular cortex, linked to motor skills and perception—may measure the passing of time in part by integrating many different signals from our bodies, like our heartbeats, the tickle of a breeze on our skin, and the burning heat of rage. In this model, the brain judges time by counting the number of signals it is getting from the body. So if the signals come faster, over a given interval the brain will count more signals, and so it will seem that the interval has taken longer than it actually has.
 

[As an aside, I do love the metaphor for how our bodies measure time, using heartbeats, breaths, and other natural cues of cadence.]
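To make the counting model concrete, here's a toy simulation under the assumption that the brain tallies bodily pulses and converts the tally to seconds using the resting rate it is calibrated to; the rates and numbers are invented purely for illustration.

```python
import random

random.seed(1)

# Toy version of the "count the body's signals" model of time perception:
# the brain tallies internal pulses (heartbeats, breaths, etc.) and converts
# the tally to seconds using the rate it expects at rest.
CALIBRATED_RATE_HZ = 1.2   # assumed resting pulse rate the brain is tuned to

def perceived_seconds(actual_seconds, pulse_rate_hz):
    """Estimate subjective duration from the number of pulses counted."""
    pulses = sum(1 for _ in range(int(actual_seconds * 1000))
                 if random.random() < pulse_rate_hz / 1000)
    return pulses / CALIBRATED_RATE_HZ

wait = 60  # an actual one-minute wait
print(f"calm    (1.2 Hz): feels like {perceived_seconds(wait, 1.2):.0f}s")
print(f"enraged (2.0 Hz): feels like {perceived_seconds(wait, 2.0):.0f}s")
```

Run it and the same sixty-second wait feels like roughly a minute when calm and closer to a hundred seconds when the pulse rate jumps, which is the article's point about rage stretching the wait.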

With devices like mobile phones and tablets and ubiquitous network connectivity filling every empty moment in our lives, however brief, like water seeping into cracks in the ground, we may be past the point of no return on our ability to live in the moment. Often today I find myself magnetized by people who can sit and chat with me without glancing at their phone every few minutes; they are an increasingly rare species.

This next generation of kids will grow up with such devices from the moment they can handle one; will any of them escape addiction to the screens around them, or will they regard long face-to-face conversation as an antiquated tradition? Perhaps those who can eschew their phone for long stretches and give their companions their full attention will be seen as superhuman. I've read so many times that Bill Clinton's most amazing gift is his ability to make anyone, no matter how humble, feel like the absolute center of his attention. 

Still, I'm not optimistic. What happens to our impulse control when technologies like virtual reality tempt us every moment with the possibility of going anywhere, experiencing anything, instantaneously? Technology is just beginning to expose the shadow prices of living in the real world, and the profit motive and Moore's Law are working in favor of just one side.

Now if you'll excuse me, I'm going to go meditate. Using an app on my iPhone, of course.

Theory of moral standing of virtual life forms

I have killed three dogs in Minecraft. The way to get a dog is to find a wolf, and then feed bones to the wolf until red Valentine’s hearts blossom forth from the wolf, and then it is your dog. It will do its best to follow you wherever you go, and (like a real dog) it will invariably get in your way when you are trying to build something. Apart from that, they are just fun to have around, and they will even help you fight monsters. If they become too much of a nuisance, you can click on them and they will sit and wait patiently forever until you click on them again.
 
I drowned my first two dogs. The first time, I was building a bridge over a lake, but a bridge that left no space between it and the water. The dog did its best to follow me around, but it soon found itself trapped beneath the water’s surface by my bridge. Not being smart enough to swim out from under the bridge, it let out a single plaintive yelp before dying and sinking. Exactly the same thing happened to my second dog; it was this second episode that made this particular feature of dogs clear to me. I know now to make dogs sit if I’m building bridges. I’m not sure what happened to the third dog, but I think it fell into some lava. There was, again, the single yelp, followed by a sizzle. No more dog.
 
I felt bad each time, while of course fully realizing that only virtual entities were being killed. 
 

From an excerpt of Charlie Huenemann's How You Play the Game: A Philosopher Plays Minecraft. It is true: we humans are wired to form attachments, even to inanimate objects or virtual life forms that have no feelings. We visit the places of our youth and experience almost nauseating waves of nostalgia. At an objective level, these feelings seem irrational, even as we recognize the tendency in ourselves. What is their value? Is there an evolutionary purpose? Are they a form of rehearsal for our attachments to living things, or maybe just a reflex we can't turn off selectively?

The point is that we form attachments to things that may have no feelings or rights whatsoever, but by forming attachments to them, they gain some moral standing. If you really care about something, then I have at least some initial reason to be mindful of your concern. (Yes, lots of complications can come in here – “What if I really care for the fire that is now engulfing your home?” – but the basic point stands: there is some initial reason, though not necessarily a final or decisive one.) I had some attachment to my Minecraft dogs, which is why I felt sorry when they died. Had you come along in a multiplayer setting and chopped them to death for the sheer malicious pleasure of doing so, I could rightly claim that you did something wrong.
 
Moreover, we can also speak of attachments – even to virtual objects – that we should form, just as part of being good people. Imagine if I were to gain a Minecraft dog that accompanied me on many adventures. I even offer it rotten zombie flesh to eat on several occasions. But then one day I tire of it and chop it into nonexistence. I think most of us would be surprised: “Why did you do that? You had it a long time, and even took care of it. Didn’t you feel attached to it?” Suppose I say, “No, no attachment at all”. “Well, you should have”, we would mumble. It just doesn’t seem right not to have felt some attachment, even if it was overcome by some other concern. “Yes, I was attached to it, but it was getting in the way too much”, would have been at least more acceptable as a reply. (“Still, you didn’t have to kill it. You could have just clicked on it to sit forever….”)

That moment when we played with syntax

In That Way We're All Writing Now, Clive Thompson investigates the rise of a form of writing I'll refer to as the stand-alone subordinate clause, often accompanied by an image, a sequential grid of images, an animated GIF, or a Vine.

One of the most popular forms is the “when you” subordinate clause:

Popular subgenres include “when your” and “when your ex,” but let your imagination roam and you will find yourself deep down many a rabbit hole.

Another popular form is “that moment when.” Here's one that also functions as a humblebrag, a more sophisticated instance of the form.

Searching Twitter for “that awkward moment when” is the new, lo-fi form of America's Funniest Home Videos, which is ironic since the form defies any highbrow incarnations.

Thompson offers a couple of explanations for the rise of this type of expression online.

1) It creates a little puzzle.
Grammatically speaking, what’s going on here is the rise of the “subordinate clause.” A subordinate clause isn’t a sentence on its own. As the name implies, it requires another sentence fragment to complete it, as with this example that McCulloch and I looked at on Yik Yak:
Usually you can quickly deduce what the missing part would be. Maybe it’s something like You, sadly, always know what to do when she’s holding a dog on her Tinder and you’re like, “cute dog.” Or maybe the full sentence that emerges in your head is more convoluted, like Nothing is more bittersweet than reflecting on the challenges of dating someone who is superficially attractive but owns a pomeranian and thus, you worry, has all sorts of dog/partner priority issues, which you can instantly intuit when you’re using a dating app and see someone when she’s holding a dog on her Tinder and you’re like, “cute dog.”
 
The point is, it’s up to you to imagine the rest of the utterance. It’s like the author is handing you a little puzzle. Subordinate-clause tweets and Yik-Yak postings seduce us into filling out that missing info, McCulloch says. “Our brain has to work a little bit harder to figure out what it’s referring to, and so making that connection is very satisfying. It’s like getting a joke. You have to draw that connection for yourself a little bit — but because you can do it, it works really well.”
 
A historic parallel? The crazy, long chapter headings in 19th-century novels, which often were also dependent clauses, inviting the reader to imagine the rest of the baroque narrative. “In Which Our Protagonist Meets A Dashing Stranger,” McCulloch jokes. “The ‘in which’ is doing a very similar thing.”
 

The ordering of the best of these subordinate clauses is critical; the punch line needs to come at the end. As with a joke, the setup comes in the first part of the clause so the closing can knock down the pins with maximum effect.

I'm more partial to Thompson's final explanation, which isn't a reason as much as it is an observation of how much we tinker with syntax online.

What’s happening now is different. Now we’re messing around with syntax — the structure of sentences, the order in which the various parts go and how they relate to one another. This stuff people are doing with the subordinate clause, it’s pretty sophisticated, and oddly deep. We’re not just inventing catchy new words. We’re mucking around with what makes a sentence a sentence.
 
“Playing with syntax seems to be the broad meta trend behind a whole bunch of stuff that’s going on these days,” McCulloch tells me. And it goes beyond this subordinate-clause trend. Many of the biggest recent language memes were about syntax experimentation, such as the “i’ve lost the ability to can” gambit (which I wrote about a few months ago), or the gnarly elocution of doge, or the “because” meme. (Indeed, Zimmer points out, the American Dialect Society proclaimed “Because” the Word of the Year for 2013, largely because it had been revitalized by this syntax play.)
 
Why would we be suddenly messing around with syntax? It’s not clear. McCulloch thinks it may be related to a larger trend she’s identified, which she calls “stylized verbal incoherence mirroring emotional incoherence”. Most of these syntax-morphing memes consist of us trying to find clever new ways to express our feelings.
 

I consider these to be a formal type of internet expression, just as haikus and sonnets are forms of poetry. Just as genres of movies are constraints within which artists can focus their creativity, this form of social network post has its own formal conventions within which everyone can exercise their wit. As soon as the reader's eye spots the opening “that awkward moment” or sees a grid of images with giant text overlaid, their mind is primed for the punch line.

Opaque intelligence

Alex Tabarrok writes about what he calls opaque intelligence.

It isn’t easy suppressing my judgment in favor of someone else’s judgment even if the other person has better judgment (ask my wife), but once it was explained to me I at least understood why my boss’s judgment made sense. More and more, however, we are being asked to suppress our judgment in favor of that of an artificial intelligence, a theme in Tyler’s Average is Over. As Tyler notes:

…there will be Luddites of a sort. “Here are all these new devices telling me what to do—but screw them; I’m a human being! I’m still going to buy bread every week and throw two-thirds of it out all the time.” It will be alienating in some ways. We won’t feel that comfortable with it. We’ll get a lot better results, but it won’t feel like utopia.

I put this slightly differently: the problem isn’t artificial intelligence but opaque intelligence. Algorithms have now become so sophisticated that we humans can’t really understand why they are telling us what they are telling us. The WSJ writes about drivers using UPS’s super algorithm, Orion, to plan their delivery routes:

Driver reaction to Orion is mixed. The experience can be frustrating for some who might not want to give up a degree of autonomy, or who might not follow Orion’s logic. For example, some drivers don’t understand why it makes sense to deliver a package in one neighborhood in the morning, and come back to the same area later in the day for another delivery. But Orion often can see a payoff, measured in small amounts of time and money that the average person might not see.

One driver, who declined to speak for attribution, said he has been on Orion since mid-2014 and dislikes it, because it strikes him as illogical.
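To see why an optimal route can read as illogical from the driver's seat, here's a toy brute-force routing example with delivery time windows. The stops, windows, and travel times are all invented; this is only the shape of the trade-off, not anything like Orion's real model.

```python
from itertools import permutations

# Toy delivery-routing example showing why a route that comes back to the
# same neighborhood can beat the "obvious" one. All stops, time windows,
# and travel times are invented.

# Stops: name -> (neighborhood, earliest delivery time, latest delivery time)
STOPS = {
    "A1": ("Maple St",  0, 100),
    "A2": ("Maple St", 60, 100),   # customer not home until t=60
    "B1": ("Oak Ave",   0, 100),
}

def travel(a, b):
    """Minutes between locations (symmetric, invented)."""
    hood = {"DEPOT": "depot", **{k: v[0] for k, v in STOPS.items()}}
    if hood[a] == hood[b]:
        return 2                     # same neighborhood
    if "depot" in (hood[a], hood[b]):
        return 10 if "Maple St" in (hood[a], hood[b]) else 25
    return 20                        # Maple St <-> Oak Ave

def finish_time(order):
    """Time to complete all deliveries in the given order, or None if a
    delivery window is missed. Waiting is allowed if we arrive early."""
    t, here = 0, "DEPOT"
    for stop in order:
        t += travel(here, stop)
        _, earliest, latest = STOPS[stop]
        t = max(t, earliest)         # wait for the window to open
        if t > latest:
            return None
        here = stop
    return t

best = min(permutations(STOPS), key=lambda o: finish_time(o) or float("inf"))
naive = ("A1", "A2", "B1")           # "do all of Maple St first"
print("optimal:", " -> ".join(best), f"finishes at t={finish_time(best)}")
print("naive:  ", " -> ".join(naive), f"finishes at t={finish_time(naive)}")
```

The brute-force winner leaves Maple St, serves Oak Ave, and doubles back for the customer who isn't home until later, finishing twenty minutes ahead of the "do the whole neighborhood at once" route; the payoff is real, but invisible from the cab.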

One of the iconic moments from Hitchhiker's Guide to the Galaxy is when a supercomputer finally finishes computing, after 7.5 million years, the answer to the ultimate question of life, the universe, and everything, and spits out 42. Perhaps that is how far beyond our understanding a super-intelligent AI will be. We may no more understand them than a snail understands humans. Defined that way, opaque intelligence is just artificial intelligence so advanced we don't understand it.

Someday a self-driving car will make a strange decision that kills someone, the software will be put on trial, and despite all the black-box data recovered we may have no idea what malfunctioned. Sometimes my iPhone randomly crashes and reboots; I couldn't begin to tell you why.

I'm waiting for the dystopian sci-fi movie that postulates an armageddon scenario much more likely than Skynet in Terminator. Rather than waste time building cyborg robots to hunt us down, a truly super-intelligent AI that wanted to kill off humans could simultaneously order a million self-driving cars to speed headlong into each other, all the planes in the world to plunge into the ground, and all our nuclear reactors to melt down, among a dozen other scenarios far more efficient than building humanoids that walk on two legs.

Not as visually exciting as casting Arnold, though. In a way, it's reassuring that for all the supposed intelligence of Skynet, it sends back a Terminator that still speaks English with a terrible Austrian accent, as if artificial speech technology was the one technology that failed to keep up even as AI made leaps as complex as gaining consciousness.

Too late

Now that the Mayweather-Pacquiao fight has been set for May 2, it's a good time to link back to my post “The fight we wanted, but not really” as nothing has really changed.

Mayweather-Pacquiao would have been a great fight five years ago, when Pacquiao and Mayweather were both younger and faster. Pacquiao, by virtue of being a southpaw with the endurance to throw an unbelievable volume of punches and the gift of throwing fast from unexpected angles, would have been a real challenge to Mayweather's great defense and technical precision. Mayweather would have landed shots on Pacquiao for sure, since Pacman sacrifices defense for offense (and isn't the defensive whiz that Mayweather is anyhow), but on sheer punch volume, Pacquiao might have landed more total punches, making a fight that went to the judges' scorecards a really dicey proposition for Mayweather.

But as is his style, Mayweather is too smart, observant, and cautious, and he knew the magnitude of threat posed by Pacquiao. As I noted in my previous post, Mayweather rarely fights opponents in their prime, when they'd be the greatest threat to him. He gets them early or he gets them late, on either shoulder of their prime, and in this case, it's Pacquiao on the downslope from his peak.

A perfect record is a valuable asset, and you can't argue with the sheer volume of money Mayweather has made over the years. His fight selection has been near impeccable, and who he fights is his call. I don't think it was fear driving his decision-making, either. Someone of his boxing genius would be a deserving favorite in every fight he's ever taken, and that includes Pacquiao then or now.

Fight fans just prefer a narrative of combat sport that casts its best fighters as fearless warriors, ready to take on any and all challengers out of the sheer need to prove indomitable. When we picture a fighter, we don't think of a calculating tactician, selecting each fight based on deep analysis of the opponent and a better than likely chance of winning.

Pacquiao and his camp also bear fault. Both sides conjured reason after reason why the fight couldn't be made: the size of the purse, how it would be split, drug-testing policies, etc. At times it wasn't clear who was resorting to which excuse.

It's not just that a fight closer to their primes would have been a better fight, but it might have been the first in a classic two or three fight series. Instead boxing got a bunch of other fights in the intervening years that meant very little to most boxing fans, assuming there are any left besides the inner circle.

That Mayweather finally accepted the fight should tell you all you need to know about where Pacquiao's skill level is versus five years ago, but you can go to the videotape if you need further proof. I fully expect the line to show Mayweather as a healthy favorite, with only perhaps a large and more naive betting public pushing the line closer.

In boxing, it has almost always been true that if there's enough money, a fight will happen. It held true this time as well, only a lot of that money will be nostalgia past its expiration date.

I'll still watch the fight; I've long had a Joyce Carol Oates-like fascination with the sweet science. But I'm not springing for the PPV. I wrote that check so long ago I can't find it anymore.