Under the Skin

Two movies from the Toronto Film Festival last year really stuck in my memory. One was Gravity, for its almost minimalist, allegorical story structure. Everyone leaving the theater after the premiere knew it would be a huge hit; it was accessible, and it was a reminder of what movies on the big screen can do that no other medium can match. At the time, I thought it had a very good chance to win Best Picture (that honor went to another movie that played the festival, 12 Years a Slave, which I couldn't get a ticket for).

However, while I enjoyed Gravity, I was haunted by Under the Skin. It was the final movie I saw at the festival, and it's the type of movie that stuns you at a film festival because a catalog description can't really do it justice; so much of its effect comes not from plot but from the cinematography, score, editing, and other elements unique to film. The hook that got me to pick the movie out of the catalog was not Scarlett Johansson as the lead, though I enjoyed her work in Ghost World and Lost in Translation, but the director, Jonathan Glazer.

As a film director, Glazer isn't prolific. Most of his output has been music videos (e.g., Radiohead's Karma Police) and commercials. Until Under the Skin, he had directed just two narrative films, Sexy Beast and Birth, and the latter came out ten years ago. Birth, in particular, featured a sequence with such a beautiful, seamless melding of music and onscreen action that I analyzed it as a dance.

I said the movie couldn't be understood from its plot description, but some exposition is helpful for understanding why the movie's formal elements are so successful.

[MINOR SPOILERS AHEAD. Though the plot of the movie can't begin to describe the sensation of watching it (imagine describing the plot of 2001), if you plan to see it, I recommend reading as little about it as possible.]

Johansson plays an alien wearing the body of a human. In the most fascinating performance of her career, Johansson deadens all evidence of recognizable inner life, both in her facial expressions and in her physical movements. As the movie progresses, she lets a few recognizable glimmers of humanity emerge, clueing us in to the alien's growing sympathy for its human subjects.

Part of the movie's horror comes from the candid footage secretly captured and mixed into it. To capture scenes of the alien seducing human men, Glazer put Johansson in a wig and behind the wheel of a truck and had her drive around Glasgow picking up strange men off the street. Glazer hid in the back of the truck, listening to the audio captured by a hidden mic, while concealed cameras recorded the interactions.

One could argue that if aliens came to Earth in the form of Scarlett Johansson in bright red lipstick, we'd likely be done for (well, at least the straight men in the world), but Glazer and cinematographer Daniel Landin manage to render the seduction scenes terrifying. The moments in the truck were shot from concealed cameras that couldn't be moved, and the odd and fixed camera angles are different enough from the conventional Hollywood camera angles used to film conversations in automobiles that we feel we're watching found documentary footage. The subsequent seductions take place in a seemingly black void, choreographed like some avant garde dance performance. If it's possible to make Scarlett Johansson disrobing seem nonerotic, Glazer and team come as close as one could imagine.

Much of the movie's cinematography sits in a strange space between a video or documentary look and something more cinematic. Most of the lighting is natural, and many scenes are dark, even murky. To be able to shoot undetected, and with maximum freedom of movement for the actors, the cinematographer worked with a London studio to develop a custom camera that was used to shoot some of the movie's scenes. The resulting look is familiar yet strange, contributing to the movie's otherworldly feel.

The score by Mica Levi is a masterpiece. You can hear snippets of it in the trailer. It's the perfect music for the next time you plan to host an Eyes Wide Shut inspired orgy. When the score isn't playing, the movie is largely dialogue-free, almost like a silent movie or a voiceover-free documentary like Leviathan. At times the score evokes Ligeti, but like the other elements of the movie, it sounds like nothing else I've heard before.

Under the Skin is a masterpiece, but plenty walked out of the Toronto premiere befuddled or disturbed. It's not a conventional narrative; it unfolds at its own pace and cares little about evoking familiar human emotions, so seriously does it take its mission of showing us our world through alien eyes. On that point it succeeds, too well for some of the people I spoke to afterwards. More than any movie I can recall, it clinically observes and presents so many of the oddities of humanity—desire, compassion, lust, and death—and its success is in making us see them for their bizarre and wondrous qualities.

Fitts's Law, the Tesla Model S, and touchscreen car interfaces

Fitts’s Law can accurately predict the time it will take a person to move their pointer, be it a glowing arrow on a screen or a fingertip attached to their hand, from its current position to the target they have chosen to hit.
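
For concreteness, the law is usually written (in its Shannon formulation) as MT = a + b * log2(D/W + 1), where D is the distance to the target, W is the target's width, and a and b are constants fitted empirically for a given device and user. A minimal sketch in Python, with made-up constants purely for illustration:

    from math import log2

    def fitts_movement_time(distance, width, a=0.1, b=0.15):
        """Predicted time in seconds to acquire a target, per the Shannon
        formulation of Fitts's Law: MT = a + b * log2(D/W + 1).
        The defaults for a and b are placeholder values, not measured ones."""
        index_of_difficulty = log2(distance / width + 1)  # in bits
        return a + b * index_of_difficulty

    # Doubling a touch target's width lowers its index of difficulty,
    # so the predicted acquisition time drops.
    print(fitts_movement_time(distance=300, width=20))  # small on-screen button
    print(fitts_movement_time(distance=300, width=40))  # larger, more forgiving target

The relevant takeaway for what follows: the bigger and more forgiving the target, the faster and more reliably it can be hit.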
 

Much more about Fitts's Law here from Tog. This bit was instructive:

Paul Fitts was not a computer guy. He was working on military cockpit design when he discovered his famous Law. Paul Fitts never had to deal with the issue of stability because stuff inside aircraft cockpits is inherently stable. The few things that do move only do so because the pilot moved them, as when he or she pushes a control stick to the side or advances the throttle forward. The rest of the targets the pilot must acquire—the pressure adjustment on the altitude indicator, the Gatling gun arm switch, the frequency dial on the radio, and fuel pump kill switch—stay exactly where they were originally installed. Everything in that cockpit is a predictable target, either always in the same place or, in the case of things like the throttle, within a fixed area and exactly where you left it. Once you become familiar with the cockpit and settle into flying the same plane hour after hour after hour, you hardly look at your intended targets at all. Your motor memory carries your hand right to the target, with touch zeroing you in.
 

I had heard of Fitts's Law but didn't know its history, and it came to mind as I was driving my Tesla Model S recently.

In almost every respect, I really love the car. I took ownership of my Model S in December 2012 after having put down a deposit over 3.5 years earlier, and I long ago stopped thinking of it as anything other than a car, perhaps the most critical leading indicator as to whether it can cross the chasm as a technology. I've forgotten what it's like to stop and pump gas (what are gas prices these days anyway?), it's roomy enough I can throw my snowboard, road bike, and other things in the back with room to spare, and I still haven't tired of occasionally flooring it and getting compressed back into my seat like I'm being propelled by a giant rubber band that has been released after being stretched to its limit. Most of all, it's still a thrill when I fire the car up to find a new software update ready to install, almost as if the Model S were a driveable iPad.

It's the ability to update the interface via software that gives me hope that a few things in the interface might be adjusted.* In a Model S, most of the controls are accessible via a giant touchscreen in the center of the console. There aren't many buttons or switches except on the steering wheel, which handles some of the more common actions, like changing the thermostat, adjusting the volume on the sound system, skipping ahead a track, and making phone calls.

When the car first came out, one of the early complaints was the lack of physical controls. I was concerned as well. Physical controls are useful because, without taking my eyes off the road, I can run my fingers across a bank of them to locate the one I want without activating the wrong ones by mistake as I search. With a touchscreen, there is no physical contour differentiating controls; you have to actually look at the screen to hit the appropriate one, taking your eyes off the road.

[I also confess to some nostalgia for physical controls for their aesthetics: controls give physical manifestation to a car's functionality. The more controls a car has, the more it appeals to geeks who love functionality, and physical controls also give car designers the opportunity to show off their skills. I find many old-school car dashboards quite sexy with all their knobs and switches and levers. Touchscreens tend to hide all of that, in favor of a minimalist look that may be more modern.]

In practice, I have not missed them as much as I thought I would, because so much can be handled with the physical controls on the steering wheel.

However, one task a touchscreen makes genuinely difficult is hitting a button while the car is in motion. It turns out that road vibration makes it very hard to keep your arm and hand steady and to hit touch targets with precision. That's why I rely so much on the steering wheel controls in the Model S to do things like adjust the volume or change the temperature. Not only are those controls accessible without having to move my hands or look at the touchscreen, but the steering wheel acts as an anchor for my hand, taking road vibration out of the equation.

Maybe there is an analogue to Fitts's Law for touchscreen interfaces in cars, or anywhere else your body is being jostled or in motion. What you'd like in such cases is maximum forgiveness in the UI, because it's hard to accurately hit a specific spot on the screen.

Matthaeus Krenn recently published a proposal for touchscreen car interfaces that takes this idea to its logical extreme. You can read about it and watch a video demo as well. Essentially, Krenn transforms the entire Tesla touchscreen into one single control with maximum forgiveness for fingers being jostled horizontally, since only the vertical movement of your hand matters. By using the entire screen and spreading the input across a larger vertical distance, you get a much larger margin of error when making the desired change. Krenn also tracks the number of fingers on the screen to give access to different settings.
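
To make that interaction model concrete, here is a minimal sketch in Python. To be clear, this is my own illustration, not Krenn's implementation; the setting names, the scale, and the handle_drag function are hypothetical.

    # Sketch of a Krenn-style control: the whole screen is one input surface,
    # the number of fingers selects a setting, and only vertical travel
    # changes the value. Horizontal jitter from road vibration is ignored.
    SETTINGS_BY_FINGER_COUNT = {
        1: "audio_volume",        # hypothetical setting names for illustration
        2: "cabin_temperature",
        3: "fan_speed",
    }

    def handle_drag(finger_count, start_y, end_y, screen_height, value_span):
        """Convert a vertical drag into (setting, delta), or (None, 0.0)."""
        setting = SETTINGS_BY_FINGER_COUNT.get(finger_count)
        if setting is None:
            return None, 0.0
        # Signed fraction of the screen travelled (dragging up increases the value).
        travel = (start_y - end_y) / screen_height
        return setting, travel * value_span

    # A two-finger drag up a third of the screen, on a 0-100 scale:
    print(handle_drag(2, start_y=900, end_y=500, screen_height=1200, value_span=100))

Because the adjustment is spread across the full height of the screen, a finger that lands or drifts an inch off target changes the result only slightly, which is exactly the forgiveness a vibrating cabin demands.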

This is an interesting proposal, but for some of the most accessed functions of the car, controls on the steering wheel are still superior. The left scrollwheel on the Model S steering wheel is more convenient for changing the volume of the stereo and toggling between play and stop (you can press the scrollwheel) than the touchscreen. The right scrollwheel is more convenient for changing the car temperature and turning the climate control on and off than the touchscreen. Both scrollwheels allow you to keep both hands on the steering wheel rather than having to take the right hand off to access the touchscreen.

Actually, the ideal solution to almost all of these problems is a combination of the steering wheel controls and another interface that already exists in the car: voice. The ideal car interface from a safety standpoint would allow you to keep your eyes on the road and both hands on the steering wheel at all times. The scrollwheels, the steering wheel buttons, and voice commands all satisfy both conditions.

In the Model S, to issue a voice command, you press and hold the upper right button on the steering wheel and speak, after which there is a delay while the car processes the command.

Unfortunately, for now, the set of voice commands available in the Tesla Model S is quite limited (the entire grammar is small enough to sketch in a few lines of code after the list):

  • Navigation — you can say "navigate to" or "drive to" or "where is" followed by an address or destination
  • Audio — you can say "play" or "listen to" followed by an artist name, or a song title and artist, and it will try to set up the right playlist or find the specific track using Slacker Radio (one of the bundled audio services for Model S cars sold in the U.S.)
  • Phone — if you connect a phone via Bluetooth, you can say "call" or "dial" followed by the name of a contact in your phone contact book
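
That's essentially the whole grammar. As a toy illustration (and only that; this is not how the car actually parses speech, and the intent labels are my own), it fits in a few lines:

    import re

    # Toy parser for the command grammar described above. Purely illustrative;
    # Tesla does not expose its recognizer, and these intent names are hypothetical.
    COMMAND_PATTERNS = [
        (r"^(navigate to|drive to|where is)\s+(?P<destination>.+)$", "navigation"),
        (r"^(play|listen to)\s+(?P<query>.+)$", "audio"),
        (r"^(call|dial)\s+(?P<contact>.+)$", "phone"),
    ]

    def parse_command(utterance):
        """Return (intent, arguments) for a recognized command, else (None, {})."""
        text = utterance.strip().lower()
        for pattern, intent in COMMAND_PATTERNS:
            match = re.match(pattern, text)
            if match:
                return intent, match.groupdict()
        return None, {}

    print(parse_command("Navigate to 1 Ferry Building, San Francisco"))
    print(parse_command("Play Radiohead"))
    print(parse_command("Open the sunroof"))  # unrecognized -> (None, {})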

I'm not sure why the command list is so limited. When I first got the car I tried saying things like "Open the sunroof" or "Turn on the air conditioning" to no avail.

Perhaps the hardware and software for voice processing in the car aren't powerful enough to handle more sophisticated commands? Maybe, though it seems like voice commands are sent to the cloud for processing, which should enable more sophisticated handling whenever you have cellular connectivity. Or perhaps the car could offload voice processing to certain cell phones with more onboard computing power.

In time, I hope more and more controls are accessible by voice. I'd love to have voice controls passed through to my phone via Bluetooth, too. For example, I'd love to ask my phone to play my voicemails through the car's audio system, or read my latest text message. For safety reasons, it's better not to fiddle with any controls while driving, analog or touchscreen-based.

Perhaps this is a problem with a closing window given the possibility of self-driving cars in the future, but that is still a technology whose arrival date is uncertain. In the meantime, with more and more companies like Apple and Google moving into the car operating system space, I hope voice controls are given greater emphasis as a primary mode of interaction between driver and car.

* One other thing I'd love to see in a refresh of the software would be less 3D in the digital instrument cluster above the steering wheel. I have some usability concerns with the flat interfaces currently in vogue in mobile phone UIs, but the digital instrument cluster in a car is not meant to be touched, and the strange lighting, reflection, and shadow effects Tesla uses there feel oddly old-fashioned. It's one interface where flat design seems more befitting of such a modern marvel.

Problem with local discovery

[Via Hunter Walk]

Pat Kinsel with a smart post on why local discovery is a tough, tough business.

Why is local search and discovery a flawed concept? Simple: query volume.

No matter how compelling or timely a service’s results are, people just don’t ask themselves “Where should I go right now?” that often. People do rely on directions and place lookup as utilities to get themselves to an already intended destination, but they just don’t walk around town desperate to discover where they should visit next. Look at Foursquare. In its early days, it was a game and drove usage accordingly. As they’ve abandoned the game elements and emphasized their search and discovery capabilities, usage has declined precipitously. People just don’t need to ‘discover nearby’ that often.

Search businesses require massive query volumes to succeed. How often do you ask yourself, “what should I do this afternoon?” or “where should I visit right now?” Once a month? Every week? With these usage patterns, a service would need hundreds of millions of users to have a meaningful, monetizable search audience.
 

More interesting thoughts within, including why startups focused on that space make for tough acquisitions.

It seems likely the frontrunners to succeed in this space are players that wrap local discovery inside a local directory service, like Yelp or Google Maps/Waze, though I'm glad a startup like Foursquare exists to try and crack this nut.

Nobody knows if HFT is good or bad

The publication of Michael Lewis' new book Flash Boys: A Wall Street Revolt has pushed discussion of high frequency trading (HFT) to the fore.

Noah Smith opines that nobody knows if HFT is really good or bad.

Do market-makers increase or decrease liquidity? Do front-runners increase or decrease it? What about informational efficiency of prices? What about volatility and other forms of risk, at various time scales? What about total trading costs? Good luck answering any of these questions. Actually, Stony Brook people are working on some of these, as are researchers at a number of other universities, but they are huge questions, and our data sets are incredibly limited (data is expensive, and a lot of stuff, like identities of traders, just isn't recorded). And keep in mind, even if we did know how each of these strategies affected various market outcomes, that wouldn't necessarily tell us how the whole ecosystem of those strategies affects markets - after all, they interact with each other, and these interactions may change as the strategies themselves evolve, or as the number and wealth of the people using each strategy changes. 

Confused yet? OK, it gets worse. Because even if we did know how HFT affects markets, we don't really know if it's good or bad on balance. For example, HFT defenders often say HFT provides "liquidity". Is liquidity good for markets? How much is liquidity worth, are there different kinds of liquidity, and does it matter when the liquidity comes? If I have a bunch of totally random trading, that certainly makes markets liquid, but is that a good thing? Actually, maybe yes! In lots of models of markets, you need random, money-losing "liquidity traders" in order to overcome the adverse selection problem, thus inducing informed traders to trade, and getting them to reveal their information. But HFTs don't lose money, they make money - is their liquidity provision worth the cost?

To know that, even if we knew the impact of HFTs on informational quality of prices, we'd have to know the economic value of informational efficiency. Suppose the true worth of GE stock, according to the best information humanity has available, is $100. Suppose the price is $100.20. How bad is that? How much is it worth, in economic terms, to push the price from $100.20 to $100.00? Is it worth $0.20 per share? It depends on how GE's stock price affects the company's investment decisions. To know that, we need an economic model of corporate decision-making. We have many of these, but we don't have one over-arching one that we know works in all circumstances. Corporations are way more complicated than what you read in your intro corporate finance textbook!

(And this is all without thinking about weird things like behavioral effects of the humans who interact with HFTs...)
 

I don't know much about HFT other than a few articles I've read here or there, so count me among those who have no idea if it's positive or negative.

Procrastination

A really wonderful two-part series on procrastination over at Wait But Why:

  1. Why Procrastinators Procrastinate
  2. How to Beat Procrastination

I hate patents, but the term Dark Playground perhaps deserves a trademark.

The Dark Playground is a place every procrastinator knows well. It’s a place where leisure activities happen at times when leisure activities are not supposed to be happening. The fun you have in the Dark Playground isn’t actually fun because it’s completely unearned and the air is filled with guilt, anxiety, self-hatred, and dread. Sometimes the Rational Decision-Maker puts his foot down and refuses to let you waste time doing normal leisure things, and since the Instant Gratification Monkey sure as hell isn’t gonna let you work, you find yourself in a bizarre purgatory of weird activities where everyone loses.