The game theory of the toilet seat problem

By toilet seat problem I refer to the problem of a couple living together, one man and one woman, sharing one toilet. To be more mathematically specific:

For Marsha the seat position transfer cost is 0 since all operations are performed with the seat in the down position. For John the cost is greater than 0 since seat position transfers must be performed.
 
Let p be the probability that John performs a #1 operation rather than a #2 operation. Assume that John optimizes his seat position transfer cost (see remark 3 below). Then it is easy to determine that John's average cost of seat position transfer per toilet operation is
 
B = 2p(1-p)C
 
where B is the bachelor cost of toilet seat position transfers per toilet operation.
 
Now let us consider the scenario where John and Marsha cohabit and both use the same toilet. In our analysis we shall assume that John and Marsha perform toilet operations with the same frequency (see remark 4 below) and that the order in which they perform them is random. They discover to their mutual displeasure that their cohabitation adversely alters the toilet seat position transfer cost function for each of them. What is more there is an inherent conflict of interest.
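The bachelor formula quoted above is easy to sanity-check with a quick simulation. Under John's optimal solo strategy, he leaves the seat wherever his last operation required it, so he pays the transfer cost C exactly when consecutive operations need different positions, which happens with probability 2p(1-p). A minimal sketch (the function name and parameters are mine, not the paper's):

```python
import random

def bachelor_cost(p, C=1.0, n=100_000, seed=42):
    """Average seat-transfer cost per operation for John living alone.

    John leaves the seat wherever his last operation required it,
    so he pays C only when the current operation needs the other
    position.
    """
    rng = random.Random(seed)
    seat_up = False               # assume the seat starts down
    total = 0.0
    for _ in range(n):
        needs_up = rng.random() < p   # #1 needs the seat up, #2 down
        if needs_up != seat_up:
            total += C                # one seat-position transfer
            seat_up = needs_up
    return total / n

# The simulated average approaches B = 2p(1-p)C.
for p in (0.25, 0.5, 0.75):
    print(p, bachelor_cost(p), 2 * p * (1 - p))
```

Note the symmetry: the cost peaks at p = 0.5 and vanishes as p approaches 0 or 1, since a bachelor who only ever performs one kind of operation never has to move the seat.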
 

This is one of the more rigorous game theory considerations of the toilet seat problem I've read. The solution proposed at the end seems sensible enough.

Let's not allow our current technological constraints and limited imagination to confine our solution set, however. I propose a different, even more ideal solution.

We develop a toilet seat that is in communication with the Apple Watch worn by both the man and the woman. When the woman walks into the bathroom, her Apple Watch authenticates itself to the toilet seat, which then automatically lowers itself. Meanwhile, when the man walks in, the toilet seat remains in whatever position it's in, per the widely accepted bachelor toilet seat strategy. One could try to further optimize for the man by learning, Nest-style, the general pattern of #1 and #2 operations and caching the last 24 to 48 hours' worth of such operations, but the added complexity may only capture a slight marginal decrease in cost to him.

There is yet another solution, brought to mind by episode 4 of season 4 of Curb Your Enthusiasm, in which Larry David admits to peeing sitting down. Optimal for her, and, David claims, good for him as well.

“If I pee twenty times in a day I can get through the whole New York Times, for god's sake!”

That's two posts today that mention bathroom operations. My mind is really in the toilet.

David Chase breaks down the last scene of The Sopranos

It's almost a Norman Rockwell scene with a group of Cub Scouts, young lovers, football hero murals, and locals enjoying the warm and homey atmosphere. Chase says time itself is the raw material of the scene as the suspense builds with pinpoint editing while Journey's "Don't Stop Believin'" propels the action to its climax—a heart-stopping cut to black.
 
Chase was after the dreamy, chilling feeling he admired at the end of Stanley Kubrick's 2001: A Space Odyssey in which time expands and contracts as life and death merge into one. And there, as in the concluding instant of The Sopranos, who knows what really happens. "When it's over," Chase offers, "I think you're probably always blindsided by it. That's all I can say."
 
It was my decision to direct the episode such that whenever Tony arrives someplace, he would see himself. He would get to the place and he would look and see where he was going. He had a conversation with his sister that went like this. And then he later had a conversation with Junior that went like this. I had him walk into his own POV every time. So the order of the shots would be Tony close-up, Tony POV, hold on the POV, and then Tony walks into the POV. And I shortened the POV every time. So that by the time he got to Holsten's, he wasn't even walking toward it anymore. He came in, he saw himself sitting at the table, and the next thing you knew he was at the table.
 

David Chase breaks down all the shots from that famous last scene of The Sopranos. Brilliant. If you watched the show, it's a must read.

One of my longstanding issues with those who claim TV has surpassed movies is that the brutal deadlines of TV often lead to the most mechanical of camera framing and shot sequencing, or very cookie cutter episode structure. It can feel, at times, like a somewhat brutish and blunt art, even if the consistent and timely output of TV inspires its own awe. 1

  1. I suspect a lot of the shift in support from movies to TV is about the increased quality and convenience of the TV viewing experience and less about what's actually shown. With the proliferation of cable channels and streaming services and connected boxes, we have a greater supply of TV than at any time in history. With high definition television sets and a surge in high definition content, the quality is better than ever. With DVRs and streaming services and mobile devices, the convenience is greater than at any time in history. Meanwhile, movies still require you to go to a theater at a specific time, find parking, fight others for good seats, and deal with all the other patrons. But all that said, if you judge movies against TV just based on the content itself, movies still reach greater peaks. It's not a fair comparison because an hour-and-a-half movie has way more budget and time allotted for its creation than even many hours of TV, but that's just the nature of the art forms.

What The Sopranos brought to TV was a higher level of craft. Movies and TV shows that are constructed with real artistic intent provide a larger surface area for analysis, and they work on you in ways both conscious and subconscious. Even more than that, they reward repeat viewing in a way that most television does not.

Chase's love of music was always reflected in very exacting editing. The rhythm of the shots in the show had a lyrical feel. Many TV shows have a very consistent shot length and sequence of shot sizes from scene to scene; watch your basic sitcom or medical/legal procedural with a stopwatch and verify for yourself. Shows like The Sopranos, or more recently Breaking Bad, don't follow strict templates. In their more varied cinematography they resemble movies. It helps, of course, that season lengths for shows like that are much shorter than for most network TV shows; the shorter seasons allowed more time to craft each episode, and it shows.

I love the timing of the lyric when Carmela enters: 'Just a small town girl livin' in a lonely world, she took the midnight train goin' anywhere.' Then it talks about Tony: 'Just a city boy,' and we had to dim down the music so you didn't hear the line, 'born and raised in South Detroit.' The music cuts out a little bit there, and they're speaking over it. 'He took the midnight train goin' anywhere.' And that to me was [everything]. I felt that those two characters had taken the midnight train a long time ago. That is their life. It means that these people are looking for something inevitable. Something they couldn't find. I mean, they didn't become missionaries in Africa or go to college together or do anything like that. They took the midnight train going anywhere. And the midnight train, you know, is the dark train.
 

Chase doesn't say whether Tony dies or not at the end. My opinion is he did die, but when you read this piece and hear Chase discuss the ending, it's clear that question doesn't really matter. Whether it's a narrative death for Tony, or just the death of the show, the greater point of the cut to black was about endings in general.

I thought the ending would be somewhat jarring, sure. But not to the extent it was, and not a subject of such discussion. I really had no idea about that. I never considered the black a shot. I just thought what we see is black. The ceiling I was going for at that point, the biggest feeling I was going for, honestly, was don't stop believing. It was very simple and much more on the nose than people think. That's what I wanted people to believe. That life ends and death comes, but don't stop believing. There are attachments we make in life, even though it's all going to come to an end, that are worth so much, and we're so lucky to have been able to experience them. Life is short. Either it ends here for Tony or some other time. But in spite of that, it's really worth it. So don't stop believing.

Second Star Wars: The Force Awakens teaser

The second teaser for this Christmas' release of Star Wars: The Force Awakens has dropped.

Get your first look at the new Star Wars: The Force Awakens teaser #2! Lucasfilm and visionary director J.J. Abrams join forces to take you back again to a galaxy far, far away as "Star Wars" returns to the big screen with "Star Wars: The Force Awakens."

If we're being honest, this teaser and its predecessor aren't really all that remarkable. They're largely just montages of random characters, spliced together in no particular order, like the famously opaque teasers for the next episode of Mad Men. 1

  1. The studios forced showrunner Matthew Weiner to have bumpers teasing the next episode, against his wishes, so he countered by assembling them in such a non-linear, random fashion, almost like some William Burroughs narrative deconstruction, that essentially they give away nothing.

In fairness, teasers often contain the most minimal of narrative coherence and are merely meant to whet one's appetite with any seemingly finished footage. In essence, teasers are meant to, umm, tease.

But if I'm being honest, this trailer excited me more than the last one, largely because it includes more familiar callbacks than the previous one. Star Wars is a shockingly enduring franchise. Even my nephews who haven't seen a single Star Wars movie love the franchise and know most of its characters. Through cartoons, toys, books, and other forms of merchandise, the story has become one of the defining mythologies of modern entertainment. It has both a grandeur of scale and yet an intimacy that characterize some of history's most operatic epics. Not even three cinematic duds in a row were enough to kill off the franchise; it is that indestructible.

Given our long and deep collective history with the saga, anyone cutting a trailer can tap that mother lode of nostalgia with the gentlest of tugs. Just hearing the first cue of John Williams' elegiac Star Wars score and seeing a desert landscape are enough to summon an image of a young Luke Skywalker, standing in the sand, staring out toward the horizon of Tatooine, wondering if there's anything more to his life than working his uncle's farm.

Christmas can't come soon enough.

Design theater

You've heard of security theater. As Bruce Schneier, who coined the term, defines it:

Security theater refers to security measures that make people feel more secure without doing anything to actually improve their security. An example: the photo ID checks that have sprung up in office buildings. No-one has ever explained why verifying that someone has a photo ID provides any actual security, but it looks like security to have a uniformed guard-for-hire looking at ID cards. Airport-security examples include the National Guard troops stationed at US airports in the months after 9/11 -- their guns had no bullets. The US colour-coded system of threat levels, the pervasive harassment of photographers, and the metal detectors that are increasingly common in hotels and office buildings since the Mumbai terrorist attacks, are additional examples.
 

Karen Levy and Tim Hwang argue we're going to see the rise of design theater.

Here’s a speculation of science fiction that is rapidly manifesting into a real nuts-and-bolts design debate with wide-ranging implications: should self-driving cars have steering wheels?
 
The corporate battle lines are already being drawn on this particular issue. Google announced its autonomous car prototype last year, drawing much attention for its complete absence of a steering wheel. The reason for this radical departure? The car simply “didn’t need them.”
 
...
 
Take a step back: a steering wheel implies a need to steer, something that the autonomous car is designed specifically to eliminate. In a near future of safe autonomous driving technologies, the purpose of the steering wheel is largely talismanic. More than actually serving any practical function, the steering wheel seems bound to become a mere comfort blanket to assuage the fears of the driver.
 
This is a classic problem. Consumers refuse to adopt a new technology if it visibly disempowers them or departs radically from trusted patterns of practice. This is the case even when the system is better at a task than a human operator — as in the case of the self-driving car, which is safer than a human driver.
 

I'm not so sure a steering wheel is superfluous given what I know of self-driving cars today. Many situations can't be handled by those cars now, and may not be for years to come, so I suspect most self-driving cars will need a steering wheel to allow manual takeover in such situations.

That aside, the piece is a fantastic read. I loved this link to an article about a car proposal from 1899 that included a giant wooden horse head stuck on the front of the vehicle. Remember, if you ask users what they want, they'll say they want a faster vehicle with a wooden horse head in front.

The first hood ornament: a giant horse head? Maybe in The Godfather someone was just trying to tell Jack Woltz they stole his car?

Not all design theater is nefarious. The authors elaborate:

How should we think about the ethics of design theater? Our initial reaction might be that misleading consumers about the nature of a technology is always wrong. In lots of areas, we enforce the idea that people have a right to know what they’re buying (consider rules about honest packaging and labeling, from knowing what ingredients are in our food to being informed about the possible health consequences of exposure to certain substances). But just as humans’ front stage performances are necessary for social life to function, it’s important for technologies to integrate into social life in ways that make them usable and understandable. Though some designers find skeuomorphism ugly or aesthetically inauthentic, it’s tough to find a serious ethical problem with a design feature that’s genuinely intended to guide usability.
 
There also doesn’t seem to be a tremendous ethical problem with theaters designed for certain laudable social purposes, like safety and protection. Nothing makes this clearer than artificial engine noise. Because modern electric cars are so much quieter than their internal-combustion predecessors, it’s much harder for pedestrians to hear them approaching. Since we’re used to listening for engine noise as a safety cue, a silent vehicle can more readily “sneak up” on us and cause accidents. Over time, if all vehicles become silent, many of us would no doubt lose this subconscious reliance — but the consequences of losing the cue altogether can be very dangerous in the shorter term, especially for pedestrians with visual impairments.

The end of routine work

In recessions of the 1960s and 1970s, routine jobs would fall during the recession but quickly snap back. But after the recession in 1990, something changed. Routine jobs fell and, as a share of the population, never recovered. In the recessions in 2001 and in 2007-09 they fell even further. The snapback never occurred, suggesting that many firms began coping with recessions by scrapping tasks that could be automated or more easily outsourced.
 
For his part, Mr. Siu thinks jobs have been taken away by automation, more than by outsourcing. While some manufacturing jobs have clearly gone overseas, “it’s hard to offshore a secretary.” These tasks more likely became unnecessary due to improving technology, he said.
 
In the late 1980s, routine cognitive jobs were held by about 17% of the population and routine manual jobs by about 16%. Today, that’s declined to about 13.5% and 12%. (The figures are not seasonally adjusted and so are displayed in the chart as 12-month moving averages, to remove seasonal fluctuations).

...

But they are not among the labor market’s pessimists who fear that robots will render humans obsolete. Their work shows the economy has continued to generate jobs, but with a focus on nonroutine work, especially cognitive. Since the late 1980s, such occupations have added more than 22 million workers.
 

By Josh Zumbrun. The idea that U.S. unemployment has jumped to a higher plateau because of jobs moving overseas or because they're replaced by technology is not a new one, but it's useful to see data and charts to support the claim.

Brad DeLong comments:

Note that these jobs are “routine” only in the sense that they involve using the human brain as a cybernetic control processor in a manner that was outside the capability of automatic physical machinery or software until a generation ago. In the words of Adam Smith (who probably garbled the story):
 
In the first fire-engines, a boy was constantly employed to open and shut alternately the communication between the boiler and the cylinder, according as the piston either ascended or descended. One of those boys, who loved to play with his companions, observed that, by tying a string from the handle of the valve which opened this communication to another part of the machine, the valve would open and shut without his assistance, and leave him at liberty to divert himself with his playfellows. One of the greatest improvements that has been made upon this machine, since it was first invented, was in this manner the discovery of a boy who wanted to save his own labour…
 
And Siu and Jaimovich seem to have gotten the classification wrong: A home-appliance repair technician is not doing a routine job–those jobs are disappearing precisely because they are not routine, require considerable expertise, are hence expensive, and so simply swapping out the defective appliance for a new one is becoming more and more attractive.
 

As any economist would prescribe, when thinking about one's career and education, it's become more critical to focus on humans' comparative advantage over computers. But I also recommend people focus on their own unique comparative advantage: any intelligent person can do many things, but what can you do better than almost anyone else? In winner-take-all markets, more common in this third industrial revolution, it's ideal to give yourself the best chance to be one of those winners. Consider it a bonus that those areas often overlap with one's personal passions (whatever you think of the 10,000-hour rule, most would agree it's easier to sustain through so many hours if one is emotionally invested).

It's also best to accept that one will have to learn new skills many times in one lifetime. It used to be that once you finished college, the education phase of life was considered over. This is already obsolete for many. Even most programmers, supposedly the workers most insulated from technological job obsolescence, have to learn new programming languages or technologies on the job every few years now.

In the future, education will generally be accepted to mean a lifelong process. Continuing education will be the default. An undergraduate degree will simply be a first milestone in signaling one's skills and sociability to potential employers. More time will have to be set aside to level up, and resources and services to support this lifelong education continue to proliferate. I view the need for lifelong learning as a positive. Who was it that said that you're young at heart to the degree that what you want to learn exceeds what you already know?

Seriously, who said that? I don't know. Add that to my list of things to learn.