The end of routine work

In recessions of the 1960s and 1970s, routine jobs would fall during the recession but quickly snap back. But after the recession of 1990, something changed. Routine jobs fell and, as a share of the population, never recovered. In the recessions of 2001 and 2007-09, they fell even further. The snapback never occurred, suggesting that many firms began coping with recessions by scrapping tasks that could be automated or more easily outsourced.
 
For his part, Mr. Siu thinks jobs have been taken away by automation, more than by outsourcing. While some manufacturing jobs have clearly gone overseas, “it’s hard to offshore a secretary.” These tasks more likely became unnecessary due to improving technology, he said.
 
In the late 1980s, routine cognitive jobs were held by about 17% of the population and routine manual jobs by about 16%. Today, those shares have declined to about 13.5% and 12%. (The figures are not seasonally adjusted and so are displayed in the chart as 12-month moving averages to remove seasonal fluctuations.)

...

But they are not among the labor market’s pessimists who fear that robots will render humans obsolete. Their work shows the economy has continued to generate jobs, but with a focus on nonroutine work, especially cognitive. Since the late 1980s, such occupations have added more than 22 million workers.
 

By Josh Zumbrun. The idea that U.S. unemployment has jumped to a higher plateau because jobs have moved overseas or been replaced by technology is not a new one, but it's useful to see data and charts that support the claim.
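
As an aside, the 12-month moving average behind those chart lines is simple to reproduce. Here is a minimal Python sketch, using invented stand-in numbers rather than the actual series:

def moving_average(series, window=12):
    # Trailing moving average: the first window-1 months lack a
    # full window of history, so they are skipped.
    return [sum(series[i - window + 1 : i + 1]) / window
            for i in range(window - 1, len(series))]

# Hypothetical stand-in data: two years of monthly employment shares (%).
monthly_share = [17.0, 16.8, 17.2, 16.9, 17.1, 16.7] * 4
print(moving_average(monthly_share)[0])  # average of the first 12 months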

Brad DeLong comments:

Note that these jobs are “routine” only in the sense that they involve using the human brain as a cybernetic control processor in a manner that was outside the capability of automatic physical machinery or software until a generation ago. In the words of Adam Smith (who probably garbled the story):
 
In the first fire-engines, a boy was constantly employed to open and shut alternately the communication between the boiler and the cylinder, according as the piston either ascended or descended. One of those boys, who loved to play with his companions, observed that, by tying a string from the handle of the valve which opened this communication to another part of the machine, the valve would open and shut without his assistance, and leave him at liberty to divert himself with his playfellows. One of the greatest improvements that has been made upon this machine, since it was first invented, was in this manner the discovery of a boy who wanted to save his own labour…
 
And Siu and Jaimovich seem to have gotten the classification wrong: a home-appliance repair technician is not doing a routine job. Those jobs are disappearing precisely because they are not routine, require considerable expertise, and are hence expensive, and so simply swapping out the defective appliance for a new one is becoming more and more attractive.
 

As any economist would prescribe, it's become more critical when thinking about one's career and education to focus on humans' comparative advantage versus computers. But I also recommend people focus on their own unique comparative advantage: any intelligent person can do many things, but what can you do better than almost anyone else? In winner-take-all markets, more common in this third industrial revolution, it's ideal to give yourself the best chance to be one of those winners. Consider it a bonus that those areas often overlap with one's personal passions; whatever you think of the 10,000-hour rule, most would agree it's easier to sustain through so many hours if one is emotionally invested.

It's also best to accept that one will have to learn new skills many times in one lifetime. It used to be that once you finished college, the education phase of life was considered over. That's already obsolete for many. Even programmers, supposedly the workers most insulated from technological job obsolescence, have to learn new programming languages or technologies on the job every few years now.

In the future, education will generally be understood as a lifelong process. Continuing education will be the default. An undergraduate degree will simply be a first milestone in signaling one's skills and sociability to potential employers. More time will have to be set aside to level up, and resources and services to support this lifelong education continue to proliferate. I view the need for lifelong learning as a positive. Who was it that said you're young at heart to the degree that what you want to learn exceeds what you already know?

Seriously, who said that? I don't know. Add that to my list of things to learn.

Payments as social network

Generally speaking, Venmo peaks around 7pm on the east coast of the US and stays fairly strong until about 3am, which is midnight on the west coast. The timing—and corresponding emoji—suggests the preponderance of Venmo transactions are people paying each other back for dinner and drinks.

Emoji use is markedly different at quieter times of the day. For example, the house emoji ranks highly from 7am to 3pm EST. And the car and taxi emoji crack the top five between 5am and 8am EST.
 
...
 
Some of these data are skewed by messages containing a lot of emoji. The “pile of poop” emoji only holds the top spot in the midnight hour because a single payment used it 1,116 times.
 

On the usage patterns for the payment service Venmo, as interpreted by the emoji used to tag public transactions.

I like to think I'm young at heart, but the fact that Venmo is also a social network makes me feel old. Payment transactions are one form of community interaction, sure, but looking through my Venmo feed feels like peering through a fog of data pollution. Perhaps the next generation of kids really does live with their default life privacy settings toggled to public.

Speaking of the pile of poop emoji, it seems only a matter of time until someone releases an app that allows you to broadcast when you are taking a poop. It should be a mobile app just called Poop. I leave it to the design geniuses at Apple to figure out what type of haptic feedback a poop notification should emit on the Apple Watch.

Beware of pickpockets

The Financial Times published a masterclass with pickpocket James Freedman.

Freedman's five tips for avoiding pickpockets:

1. Carry your Oyster card or travel ticket separately to avoid flashing your wallet or purse unnecessarily.
 
2. Don’t stand near the doors on a bus or train. These are prime spots for pickpockets.
 
3. Pickpockets often hang around near “Beware of Pickpockets” signs and watch people instinctively tap their pockets, pinpointing where the valuables are.
 
4. Don’t use the same PIN for all your bank cards and your phone.
 
5. Don’t keep your driving licence with your credit cards. Losing your cards is bad enough without giving the thief your address, full name, and date of birth too.
 

More on pickpocketing: this excellent profile of Apollo Robbins in The New Yorker, and a few links that are still relevant.

A cookbook from IBM's Watson

Robots taking all the jobs, cooking edition:

Steve Abrams, the director of IBM’s Watson Life research program, told Quartz that Watson scanned publicly available data sources to build up a vast library of information on recipes, the chemical compounds in food, and common pairings. (For any budding gastronomers out there, Abrams said Wikia was a surprisingly useful source.) Knowledge that might’ve taken a lifetime for a Michelin-starred chef to attain can now be accessed instantly from your tablet.
 
What separates Watson from the average computer (or chef) is its ability to find patterns in vast amounts of data. It’s essentially able to figure out, through sheer repetition, what combinations of compounds and cuisines work together. This leads to unusual pairings, like Watson’s apple kebab dish, which has some odd ingredients: “Strawberries and mushrooms share a lot of flavor compounds,” Abrams said. “It turns out they go quite well together.”
 

The researchers are publishing a cookbook with recipe ideas from Watson, and it releases this Tuesday: Cognitive Cooking with Chef Watson: Recipes for Innovation from IBM & the Institute of Culinary Education. I have not read the book, but some of the recipes sound intriguing (“Belgian bacon pudding, a dessert containing dried porcini mushrooms”) while others sound, at best, like clever wordplay (“the shrimp cocktail, which is a beverage with actual shrimp in it”). Regardless, I'm purchasing a copy out of sheer curiosity. Let's hope they eventually turn this resource into an app or service; I blame Watson's vanity for wanting it in the outdated format of a book.

To the extent that standout recipes and flavor pairings are a matter of pattern recognition, there's no reason a computer, with its infinitely more scalable hardware and software for that purpose, couldn't match or exceed a human. And so, a variant of the infinite monkey theorem: given enough time, a computer will write the French Laundry cookbook (and win a third Michelin star).
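
To make that concrete, here is a toy sketch of the kind of shared-compound scoring the Quartz piece hints at. The ingredient-to-compound table is invented for illustration; Watson's actual data and methods are, of course, far richer:

from itertools import combinations

# Toy ingredient-to-compound table (invented for illustration;
# not Watson's data).
compounds = {
    "strawberry": {"furaneol", "linalool", "gamma-decalactone"},
    "mushroom": {"furaneol", "linalool", "1-octen-3-ol"},
    "apple": {"hexyl acetate", "linalool"},
    "shrimp": {"pyrazine", "1-octen-3-ol"},
}

# Score every pair of ingredients by how many compounds they share,
# then rank: strawberry + mushroom comes out on top here.
scores = sorted(
    ((len(compounds[a] & compounds[b]), a, b)
     for a, b in combinations(compounds, 2)),
    reverse=True)

for shared, a, b in scores:
    print(f"{a} + {b}: {shared} shared compounds")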

To be clear, I'm okay with this. I just want to eat tasty food; I'm fine with employing computers to come up with more amazing things to feed me.

For now, however, the computer still requires a human to actually prepare the recipe. In a true demonstration of how far artificial intelligence has progressed, no sufficiently advanced computer wants the drudgery of life as a line chef. Better profits in cookbooks than restaurants anyway.

A new cooking show concept already comes to mind: Top Freestyle Chef. As in freestyle chess, competitors in freestyle cooking would be either a human alone or a human consulting with a computer. I'm ready to program this into my DVR, as long as they don't replace Padma Lakshmi with a robot host. I'm as big a fan of artificial intelligence and robots as the next guy, but I think we're a long way from replacing her.

Trim the fat on the NBA schedule

Tonight, the Boston Celtics crushed the Cleveland Cavaliers. Of course, the Cavs played without LeBron James, Kyrie Irving, Kevin Love, and J.R. Smith. Why? Because the Cavs had clinched the second seed in the East, and so the games really don't matter to them anymore.

Years ago, David Stern fined the Spurs for not even bringing Tim Duncan, Tony Parker, Manu Ginobili, or Danny Green to Miami for the last game of a road trip. Granted, that was earlier in the season, and Stern said the fine was because they didn't inform the media and the league far enough in advance that those players wouldn't be there, but let's be honest, the strategy in each case was the same: rest your best players for the playoffs, when it really matters.

Gregg Popovich is by far the best coach in the NBA, and he's just doing something more teams should do. If your ultimate goal is to win in the postseason, resting your players during the regular season makes all sorts of sense, especially if home-court advantage is diminishing. LeBron took a self-imposed sabbatical of a couple of weeks mid-season this year, mostly because he was tired. This might become an annual tradition for him, and why not? He came back noticeably fresher, and the Cavs have been on fire ever since. As he ages, he should preserve the best of his remaining minutes for the highest-leverage moments. The regular season falls below that cut line.

Of course, if you're a fan who paid several hundred dollars or more for your seats, only to find one or both teams resting their best players that night, you might have a different idea of just how wonderful a strategy that is. One reason I've stopped attending many NBA regular season games is that even when both teams are at full strength, the intensity is often noticeably throttled down. Given the steep price of a half-decent seat these days, a regular season NBA game often isn't a great entertainment value. I'd rather spend a lot to see one playoff game than spend the same amount to see three or four middling regular season games.

A better solution would be to reduce the number of games in the regular season. Everyone knows it's too long, even if they won't admit it publicly. As always with professional sports, it's doubtful the owners, league, and players would be willing to forgo the additional revenue from all those superfluous games. But if more and more players like LeBron just choose to watch games from the sidelines in their three-piece suits, as if they were taking “personal days” in the business world, perhaps the league will try to save face and pare back the schedule. During LeBron's sabbatical this season, he wasn't even at all the games he missed; he spent some of that time vacationing in Miami. Why was he even on the bench tonight? What if he were posting photos to Instagram from Drake's set at Coachella instead?

If the NBA won't shorten its regular season (let's be honest, they won't), perhaps on days when the teams choose to just sit their stars, the league should give some of the ticket revenue back to fans in the form of a credit towards concessions and the gift shop, or towards a future game. Or perhaps even offer a partial refund.

Can you imagine purchasing a ticket ahead of time to see Furious 7, only to arrive at the theater to be told that Vin Diesel is taking that night's showing off because he is feeling beaten up from all the movie's stunt work?

“The role of Dom Toretto will be played tonight by Mr. Diesel's understudy, former American Idol contestant Chris Daughtry.”

Skipping the web

Why India's Flipkart abandoned its mobile website:

Now, if you tack on a gigantic population with miserable internet connection speeds, the prospect of scaling up your website operations and back end to deal with not only the overload on it, but also the abysmal experience on the consumer end, whether it is mobile or desktop, is even more bleak. An app allows a user to stay logged in while updates and other information are efficiently and constantly downloaded, ready for consumption almost instantly. It is, in fact, perfect for low-bandwidth situations.
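
The mechanism described, prefetching in the background and serving reads instantly from a local cache, is simple enough to sketch. Here is a minimal Python illustration, with a hypothetical endpoint and file path rather than anything resembling Flipkart's actual architecture:

import json
import os
import urllib.request

FEED_URL = "https://example.com/api/feed"  # hypothetical stand-in endpoint
CACHE_PATH = "feed_cache.json"             # hypothetical local store

def prefetch():
    # Run in the background whenever connectivity allows: download
    # the latest content and persist it locally.
    with urllib.request.urlopen(FEED_URL, timeout=10) as resp:
        data = json.load(resp)
    with open(CACHE_PATH, "w") as f:
        json.dump(data, f)

def read_feed():
    # Never block the user on the network: serve whatever is cached.
    if os.path.exists(CACHE_PATH):
        with open(CACHE_PATH) as f:
            return json.load(f)
    return []  # nothing cached yet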
 

People often write of places like India or Africa bypassing landlines or PCs and skipping ahead to technologies like wireless or smartphones, but I haven't heard of countries treating the web as one of those intermediate technologies to be hopped over.

Having spent lots of time working out of China, I see the sense in it. Internet connection speeds are really slow there, and loading the web can be painful. When I worked out of Hulu's Beijing office, even with an upgraded pipe into the building, I found myself browsing the web a lot less, simply out of impatience.

Having grown up in the U.S., I count the web as one of my first, and still longest-running, touchpoints to the internet. My first was newsgroups in college, and the web came about towards the end of my undergrad days. I can understand why so many in the U.S. are nostalgic for and defensive of the web as a medium. Seeing so much content and online interaction move behind the walls of social networks seems like an epic tragedy to many, and I empathize.

Many people in India, China, and other parts of the world, where bandwidth is low and slow, and where mobile phones are their one and only computer, have no room for such sentimentality. They may never have experienced the same heyday of the web, so they feel no analogous nostalgia for it as a medium. Path dependence matters here, as it does in lots of areas of tech, and one of the best ways to detect it is to widen your geographic scope of study outside the U.S. Asia is a wonderful comparison group, especially for me because I have so many friends and relatives there and because I still interact with them online at a decent frequency.

In the U.S., many tech companies were lauded as pioneers for going mobile first, while in Asia companies are already going mobile only. In some ways, Asia feels like it lives in the past compared to the U.S., especially when one sees so many fast followers of successful U.S. technology companies, but in a surprisingly large number of ways, Asia lives in our near future.

Fiction's willful ignorance of tech

“And I just don’t know how to write a novel in which the characters can get in touch with all the other characters at any moment. I don’t know how to write a novel in the world of cellphones. I don’t know how to write a novel in the world of Google, in which all factual information is available to all characters. So I have to stand on my head to contrive a plot in which the characters lose their cellphone and are separated from technology.”
 

Ann Patchett on one of the chief problems of novel-writing today. The internet has disrupted lots of things, and one of those is dramatic information asymmetry.

Steve Himmer on one popular solution for fiction writers:

In literary fiction, the more popular solution seems to be relying on settings close to the present, but far enough back to avoid such inconvenience. Granted, the popularity of the 1970s, 1980s, and early-1990s as settings also owes plenty to generational shifts in literary production as people write about formative periods and the years they remember. But it also avoids any number of narrative problems and allows writers to go on telling stories in the way they are used to, rather than incorporating the present in ways that are difficult and disruptive. When I recently wondered on Twitter — one of those very disruptions — if we’ve reached the point of needing a term for this kind of setting, author Jared Yates Sexton suggested “the nostalgic present.” And while it’s easy enough to incorporate mention of that into this essay, where might a tweet fit into a novel? As dialogue, formatted like any other character’s utterance? Or embedded with timestamp and retweet count and all? What happens when our characters spend half their novel on Twitter, as so many of us spend our workdays? It’s a hard question, but not one that gets answered when writers aspire to be more like Andras Schiff than Lukas Kmit.
 

Himmer goes on to urge more writers to embrace the technology of today and incorporate it into their fiction, and I'm with him. It strikes me as lazy screenwriting when a film trots out one of those old-school answering machines just so the audience can hear the message being left; I haven't seen one of those machines in ages. If it's meant to be a missed message, show us the person leaving it so we can hear it. If we're meant to watch the recipient react, give us the audio as they listen to the voicemail.

Younger fiction writers, in particular, have a great opportunity here. Embrace the massive role that all this technology plays in our lives and teach readers about how it is affecting us and how we might cope. Novels are supposed to, among other things, illuminate the human condition. No one using all this technology should feel any less human, but the absence of technology in art leaves an odd temporal void in which we can only travel to the past or the future or, as Sexton suggests in the passage above, the nostalgic present.

One way to create demand

An ad from the early 1900s, with the opening paragraphs excerpted below:

Milly caught the bride's bouquet but everybody present knew that nothing would come of it...that she wouldn't be the next to marry by a long ways...and they knew the reason why, too.
 
People with halitosis (unpleasant breath) simply don't get by. It is the unforgivable social fault.
 
You never know when you have it—that's the insidious thing about it. Moreover, you are quite likely to have it, say dental authorities. Conditions present even in normal mouths constantly produce objectionable odors.
 
Don't take a chance
 
The one way to make sure that your breath does not offend others is to rinse the mouth with Listerine.
 

From How “Clean” Was Sold to America with Fake Science in Gizmodo. Worth a read if only to see some of the photos of unbelievable ads about body odor. Even the use of the term “halitosis” to give bad breath a scientific name, as if it were some serious malady, is fiendishly clever. It's also shocking how the ads pictured are all targeted at women, reflecting the sexism of the times.

The tech industry has a lot to learn from consumer packaged goods companies about how to manufacture demand and consumer desire. The ads in this piece are negative and scaremongering, but companies like Procter & Gamble are just as good at generating desire with positive emotional triggers. It still feels like most tech brand advertising leans on viral stunts. One notable exception, of course, is Apple, most of whose ads now have more in common with fashion ads than technology ones.