When introverts should drink coffee

In his book Me, Myself, and Us: The Science of Personality and the Art of Well-Being, psychologist Brian Little argues that introverts shouldn't drink coffee before an important meeting or any similarly stimulating situation.

Why does coffee seem to have this effect on introverts?
This isn't my own research, but it's based on the theory of extraversion by Hans Eysenck and research by William Revelle of Northwestern University. It's the idea that introverts and extraverts differ in the level of neocortical arousal in the brain — in other words, how alert or responsive you are to your environment. According to this theory, introverts are over the optimal level — that is, more easily stimulated — and extraverts under the optimal level. 

It's more complex than that, but this is a useful model because it allows us to make some predictions. This suggests that performance will be compromised for introverts if they are exposed to stimulating situations, or if they ingest a stimulant (such as caffeine), which pushes them even further away from the optimal level.

So when should introverts have their coffee, then?
Later in the day would be better; at any rate, they should try not to have caffeine right before something like an important meeting, as I say in the book.
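The argument turns on an inverted-U relationship between arousal and performance. Here's a purely illustrative toy sketch of that idea; to be clear, none of this comes from Little's book or Eysenck's research, and the curve shape, the "optimal" arousal value, and the caffeine "boost" are made-up numbers chosen only to show why the same stimulant would be predicted to help someone below the optimum and hurt someone already above it.

```python
# Toy inverted-U ("optimal arousal") sketch. Purely illustrative:
# the curve shape and all numbers are invented for this example.
import math

OPTIMAL = 0.5   # hypothetical optimal arousal level (arbitrary units)
WIDTH = 0.25    # how quickly performance falls off away from the optimum


def performance(arousal: float) -> float:
    """Performance peaks at OPTIMAL and falls off on either side (inverted U)."""
    return math.exp(-((arousal - OPTIMAL) ** 2) / (2 * WIDTH ** 2))


def with_caffeine(baseline: float, boost: float = 0.2) -> float:
    """Model caffeine as a simple upward shift in arousal."""
    return baseline + boost


for label, baseline in [("introvert, already above the optimum", 0.65),
                        ("extravert, below the optimum", 0.30)]:
    before = performance(baseline)
    after = performance(with_caffeine(baseline))
    print(f"{label}: predicted performance {before:.2f} -> {after:.2f} after caffeine")
```

Run it and the hypothetical introvert's predicted performance drops after the caffeine bump while the hypothetical extravert's rises, which is the whole intuition behind skipping the coffee right before the big meeting.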

Being cold as a weight-loss technique?

Is the fact that Americans can be warm year-round one reason they are so obese?

That's the theory of Ray Cronise, former NASA materials scientist.

Cronise’s latest ideas are laid out in a 2014 article he co-authored with Andrew Bremer, who was then at Vanderbilt University (he is now at the National Institutes of Health), and the Harvard geneticist David Sinclair, who is well known for his recent work on resveratrol (the “anti-aging” antioxidant found in red wine) and sirtuins—enzymes that help control metabolism. Sirtuins are active during times of stress, including when a person is hungry, and are thought to be related to the known life-prolonging effects of very-low-calorie diets.

Cronise, Bremer, and Sinclair propose what they call the “Metabolic Winter” hypothesis: that obesity is only in small part due to lack of exercise, and mostly due to a combination of chronic overnutrition and chronic warmth. Seven million years of human evolution were dominated by two challenges: food scarcity and cold. “In the last 0.9 inches of our evolutionary mile,” they write, pointing to the fundamental lifestyle changes brought about by refrigeration and modern transportation, “we solved them both.” Other species don’t exhibit nearly as much obesity and chronic disease as we warm, overfed humans and our pets do. “Maybe our problem,” they continue, “is that winter never comes.”

Their article joins a growing body of research on the metabolic effects of cold exposure, some of which I’ve reported on previously. Earlier last year, in the journal Cell Metabolism, researchers from the National Institutes of Health likened these effects to those of exercise, arguing that a better understanding of endocrine responses to cold could be useful in preventing obesity. The lead researcher in that study, Francesco Celi, published more research in June, finding that when people cool their bedrooms from 75 degrees to 66 degrees, they gain brown fat, the metabolically active fat that burns calories to generate heat. (Having brown fat is considered a good thing; white fat, by contrast, stores calories.) Another 2014 study found that, even after controlling for diet, lifestyle, and other factors, people who live in warmer parts of Spain are more likely to be obese than people who live in the cooler parts.

If you want to try this out for yourself, the article mentions a device called the Cold Shoulder, a vest that holds ice packs, which you can wear around the house to try to burn more calories. It's available on Amazon for $149.

This still all sounds speculative, but I do sleep much better in really cold rooms. When I'm in a hotel and need a good night's sleep, I always crank up the A/C and bury myself under the covers.

I came home from vacation in Del Mar to a 63-degree apartment in San Francisco today, and I'm not going to turn on the heat. Damn, it is cold.

Elf on the Shelf

Not having any kids of my own, I had not heard of the whole Elf on the Shelf tradition until this Christmas break when I spent a lot of time with my nephews and nieces. Every morning my niece Averie would wake up and do a search for her elf, whose name I've forgotten already. I had no idea this had been all the rage with young kids since 2005.

Some researchers have studied the cultural phenomenon and concluded it may be a troubling way of acclimating children to life in a surveillance state.

Through play, children become aware of others’ perspectives: in other words, they cultivate understandings about social relationships. The Elf on the Shelf essentially teaches the child to accept an external form of non-familial surveillance in the home when the elf becomes the source of power and judgment, based on a set of rules attributable to Santa Claus. Children potentially cater to The Elf on the Shelf as the “other,” rather than engaging in and honing understandings of social relationships with peers, parents, teachers and “real life” others.

What is troubling is what The Elf on the Shelf represents and normalizes: anecdotal evidence reveals that children perform an identity that is not only for caretakers, but for an external authority (The Elf on the Shelf), similar to the dynamic between citizen and authority in the context of the surveillance state. Further to this, The Elf on the Shelf website offers teacher resources, integrating into both home and school not only the brand but also tacit acceptance of being monitored and always being on one’s best behaviour--without question.

By inviting The Elf on the Shelf simultaneously into their play-world and real lives, children are taught to accept or even seek out external observation of their actions outside of their caregivers and familial structures. Broadly speaking, The Elf on the Shelf serves functions that are aligned to the official functions of the panopticon. In doing so, it contributes to the shaping of children as governable subjects.

Parents who have enthusiastically embraced Elf on the Shelf are likely rolling their eyes right now. I'm no parent, and I stay far away from teaching folks how to raise their kids, but linking Elf on the Shelf to the Benthamite Panopticon was too much to resist.

I am intellectually curious about the impact of childhood culture and mythology on children's personalities, however. For example, should we teach our kids to believe in Santa Claus?

Will Wilkinson plans to.

Well, we’re atheists. I don’t intend to proselytize atheism to my kid, because I’m not interested in getting him to believe anything in particular. What I’m interested in is teaching him how to reason in a way that maximizes his chances of hitting on the truth. Now, one of the most interesting truths about the empirical world is that there are all these powerful systems of myth that are kept afloat by a sort of mass conspiracy, and humans seem disposed to pick one from the ambient culture and take it very seriously. But it can be hard to get your head around the way it all works unless you participate in it. Santa is a perfect and relatively harmless way to introduce your child to the socio-psychology of a collective delusion about the supernatural. The disillusionment that comes from the exposure to the truth about Santa breeds a general skepticism about similarly ill-founded popular beliefs in physics-defying creatures.

Tyler Cowen has contemplated this issue as well:

I say why not leave them guessing, hovering in a state of Bayesian Santa doubt?  My parents never told me Santa “was real,” but they didn’t tell me he “wasn’t real” either, so I slid rather gracefully into my Santa non-belief.  I don’t recall ever feeling disillusioned by a sense of loss and in fact those presents kept on coming.  I even had a clearer sense of the appropriate channel for making gift requests, what’s not to like about that?

This all seems rather harmless, and I do think in many cases, as with fairy tales, fiction offers a smoother psychic transition to some of the harsher truths of the world. For example, if you're getting a bitter divorce and have young kids, I doubt the best way to break the news to them is to explain that the institution of marriage is an unnatural and brittle one, or that their mother or father had an affair with someone they picked up at the corner pub. It takes time to build up that armor.

However, I am suspicious of the behavioral enforcement effectiveness of the Santa Claus and Elf on the Shelf mythologies. I've yet to see any evidence that the whole idea of a naughty-or-nice list, or of the elf observing you and reporting back to the North Pole, encourages kids to behave any better. Perhaps that's not the point; I don't know any parents who've ever actually withheld gifts from their kids. But if no one enforces it, why keep the whole naughty-list, panopticon, surveillance-state portion of the mythology at all? Why not have the story be about unconditional love?

Personally, I think it would be just as miraculous to teach kids about Jeff Bezos instead of Santa Claus, and about how Amazon delivers a gazillion packages worldwide through a vast coordinated interconnected system of computers, people, and vehicles. Sometimes the truth is magical.

(h/t Clive Thompson)

Can we move past pure outrage?

Eric Meyer's Inadvertent Algorithmic Cruelty got a lot of traction over the past week. He discusses how Facebook's Year in Review app confronted him with a photo of his daughter who died this year.

Yes, my year looked like that.  True enough.  My year looked like the now-absent face of my little girl.  It was still unkind to remind me so forcefully.

And I know, of course, that this is not a deliberate assault.  This inadvertent algorithmic cruelty is the result of code that works in the overwhelming majority of cases, reminding people of the awesomeness of their years, showing them selfies at a party or whale spouts from sailing boats or the marina outside their vacation house.

But for those of us who lived through the death of loved ones, or spent extended time in the hospital, or were hit by divorce or losing a job or any one of a hundred crises, we might not want another look at this past year.

To show me Rebecca’s face and say “Here’s what your year looked like!” is jarring.  It feels wrong, and coming from an actual person, it would be wrong.  Coming from code, it’s just unfortunate.  These are hard, hard problems.  It isn’t easy to programmatically figure out if a picture has a ton of Likes because it’s hilarious, astounding, or heartbreaking.

Jeffrey Zeldman wrote a follow-up titled Unexamined Privilege Is The Real Source of Cruelty in Facebook's “Year In Review”.

UNEXAMINED PRIVILEGE is the real source of cruelty in Facebook’s “Your Year in Review”—a feature conceived and designed by a group to whom nothing terrible has happened yet. A brilliant upper-middle-class student at an elite university conceived Facebook, and college students, as everyone knows, were its founding user group. The company hires recent graduates of expensive and exclusive design programs and pays them several times the going rate to brainstorm and execute exciting new features. 

...

But when you put together teams of largely homogenous people of the same class and background, and pay them a lot of money, and when most of those people are under 30, it stands to reason that when someone in the room says, “Let’s do ‘your year in review, and front-load it with visuals,’” most folks in the room will imagine photos of skiing trips, parties, and awards shows—not photos of dead spouses, parents, and children.

At least in my Twitter timeline, I saw plenty of people jump on Zeldman's post to excoriate Facebook engineers for being privileged and heartless, and I suspect plenty of other posts took the same tack.

I was really happy to see Meyer do a follow-up where he urged folks to put away the pitchforks and cage the outrage monster the internet loves to summon.

Yes, their design failed to handle situations like mine, but in that, they’re hardly alone.  This happens all the time, all over the web, in every imaginable context.  Taking worst-case scenarios into account is something that web design does poorly, and usually not at all.  I was using Facebook’s Year in Review as one example, a timely and relevant foundation to talk about a much wider issue.

...

What surprised and dismayed me were the…let’s call them uncharitable assumptions made about the people who worked on Year in Review.  “What do you expect from a bunch of privileged early-20s hipster Silicon Valley brogrammers who’ve never known pain or even want?” seemed to be the general tenor of those responses.

No.  Just no.  This is not something you can blame on Those Meddling Kids and Their Mangy Stock Options.

First off, by what right do we assume that young programmers have never known hurt, fear, or pain?  How many of them grew up abused, at home or school or church or all three?  How many of them suffered through death, divorce, heartbreak, betrayal?  Do you know what they’ve been through?  No, you do not.  So maybe dial back your condescension toward their lived experiences.

Second, failure to consider worst-case scenarios is not a special disease of young, inexperienced programmers.  It is everywhere.

A voice of sanity online? Elsa is working her magic in Hell.

I don't mean to imply there isn't privilege of all forms, or that more diversity isn't a good thing. I believe strongly in both. But if there's one thing that exhausted me about the online community in 2014 it was the never-ending cycle of outrage. And if there's one thing I wish we'd see more of in 2015 it's more reasoned, nuanced debate.

That begins with not misreading Zeldman's post, either. The paragraph that came between the two I quoted above was this:

I’m not saying that these brilliant young designers are heartless, or that individuals among them haven’t personally experienced tragedy—that would be mathematically impossible. I have taught some of these designers, and worked with others. Those I’ve known are wonderful people who want to make a difference in the world. And in theory (and sometimes in practice) a platform like Facebook lets them do that.

Both of Meyer's pieces are worth reading in their entirety, as is Zeldman's piece, whose title and opening paragraph are far more extreme than the rest of the piece.

It feels good to create the “other.” Humans love ingroup-outgroup dynamics. It's so psychologically comforting to belong to one side and channel one's rage to demonize a common enemy. I'm not asking for artificial harmony but a shift towards more rational, unemotional debate.

How? I'm not sure, but the next time you feel the urge to unleash the rage monster, step back from the keyboard and try to pass Bryan Caplan's ideological Turing test. It's not perfect, but I don't have any better ideas. It takes two to have a reasoned debate. You count as one. Empathy creates the other.

Echoes of the fall of the studio system in journalism

Most people today think of movie studios as largely interchangeable. Who cares if the opening credits of a movie bring up the Warner Bros or Paramount or Twentieth Century Fox title card? It's a meaningless signal as to what you're going to see next. And what does it even mean, to show their logo before the movie? Ask most moviegoers and the best they can offer is that perhaps the studio put up the money (which isn't far from the truth), but most won't really have any idea.

It wasn't always this way. In the heyday of the studio system, from the late 1920's through the early 1960's, if Twentieth Century Fox or MGM came up before a movie, it told you a lot about the theme, subject, tone, and style of the movie to follow. Since talent (directors, screenwriters, actors, etc.) worked for studios back then, knowing which studio was behind a movie meant you could often anticipate who would star in it, what genre it would be, and who might direct and shoot it. What's more, studios owned the theater chains themselves, so they could guarantee distribution of their films.

As if controlling both the means of production (talent) and distribution weren't enough, studios also faced little competition at the time from foreign movies or other forms of entertainment. The vast power they wielded wasn't all bad; the studios put out some amazing movies. They also produced a lot of dreck.

When I read about the problems facing professional journalism today, I hear echoes of the changes that led to the fall of the studio system. In 1948, in United States v. Paramount Pictures, Inc., the Supreme Court ruled that the major U.S. movie studios had to sell off their ownership in movie theater chains. This was a boon to other studios and independent artists who didn't own their own theaters. No longer were the major studios the gatekeepers to what could get shown.

The internet, of course, had the same effect on the distribution of content in journalism. No longer do major metro newspapers own de facto monopolies on what people can consume as part of their journalistic diet. In the 1960's, foreign cinema finally made inroads in the U.S. and grabbed mindshare, covering topics and themes that American studios shied away from, either by choice or because of reflexive adherence to the Hays Code, whose grip had already been loosened by Joseph Burstyn, Inc. v. Wilson in 1952, when the Supreme Court extended First Amendment protection to film.

[By the way, if you've never read The Hays Code, it's worth scanning. Strands of its rigid morality still weave themselves through much of Hollywood's output today.]

Just as the major studios ceded the lead on diversity of subject matter to foreign movies and independent studios, blogs and new media sites have arisen to cover all sorts of subjects that newspapers never had either the bandwidth or the desire to cover. For any odd subject matter, I just assume there's a subreddit for it.

It's not just subject matter but form and style where old media is chasing the new. It wasn't the major newspapers like The New York Times or The Washington Post who leapt first into publishing blogs and listicles online as regular features. Long before old media put their reporters on Twitter and other social media, independent bloggers were ensuring their voices could be heard on every network online. This is hard to remember now that most of the old guard have made the leap, but I remember when you couldn't find the vast majority of old media journalists on Twitter, Facebook, Instagram, or anywhere else.

Another common lament of journalists today is the rise in competition for mindshare. People have too many options on their phones competing with the news, and more often than not, this generation chooses Instagram, Facebook, Twitter, Snapchat, et al. over the news.

The studios faced just as significant a competitor in the rise of television, which started to gain traction in the 1950's and 1960's (when color television arrived and went mainstream). Suddenly, people had a much broader choice of entertainment formats. No longer was the 90-to-120-minute narrative film the primary video form factor.

It's not just the overall package, the choice of newspaper itself, that has been pushed into a bigger pool of competition, but everything inside it. When I was in elementary school, the Chicago Tribune was my only source for much of my information: movie listings, comic strips, automobile reviews, stock prices. Today, specialty websites have given newspapers massive competition along every subject vertical. During WWII, many Americans got news of American troops from newsreels played before movies. Monopolies make for strange bedfellows. After television rose to prominence, newsreels, and the video news function in general, moved to television.

Today, I don't get movie showtimes, restaurant reviews, or stock quotes from my local newspaper. In fact, I get almost nothing from local newspapers. I'm still a big Chicago sports fan, and for many years, though I was living in other cities, I still turned to the Chicago Tribune for local sports coverage. Then the website was overrun with ads of all types, then came a metered paywall, and finally it became nearly unusable. It turns out their local coverage wasn't all that unique anyhow. Now I rely on niche blogs like Bleacher Nation to keep up with my Cubs.

The package of content in a newspaper was always somewhat arbitrary and non-personalized. What if you didn't like automobiles or business news or reviews of obscure history tomes? Too bad; you were paying for them as part of your daily newspaper (many of the subject choices in the bundle were advertising-driven, of course, as grim world and local news is not desirable subject matter for advertisers).

This is both good—users can pick and choose their media diet!—and bad—users can pick and choose their media diet! It turns out that when people can choose what to read each day rather than have it assigned to them as one lump, they tend to lean away from a steady diet of grim world news. In fact, they want “serious news” in a much, much lower proportion of their diet than traditional newspapers dole it out. Any parent knows that if they let their kids choose what to eat, they'll over-index on sugar and under-index on vegetables. Buzzfeed does some really great long-form journalism, but compare their front page to that of The New York Times and you'll find a substantial difference in subject matter weighting.

Movie studios are facing a similar economic conundrum on this front. The movie studio equivalent of “serious news” is the mid-tier adult drama, many of which appeal to an increasingly narrow population of cinephiles. These types of movies tend not to travel well internationally so they can't capitalize on foreign box office revenue to help recoup their budgets. They're too expensive to make relative to their revenue; soon we may see directors of such serious fare starring in short infomercials, urging donations to protect what is in essence an endangered species.

This is exactly the brutal reality facing much of the serious, long-term global reporting that institutions like The New York Times have long championed. I'm a huge fan of many mid-tier adult dramas, and I believe in the importance of that type of long-term reporting to society, but I'm also a fan of the free market, and both have benefitted from cross-subsidization and de facto monopolies that have fallen away. Lamentation is not a business strategy.

You know what does travel well internationally? Superhero movies. You know what types of articles travel well through today's social networks, attracting viewers by tapping into some primal wiring in our brains? Clickbait, like listicles and photo galleries of celebrities coming out of clubs and restaurants and gyms. I'm painting with broad sweeps of my arms, but only because the trends are unmistakable.

Another similarity between the fall of the studio system and the changes sweeping across journalism is the rise in the power of talent. Recall that studios once locked talent into multi-movie, multi-year contracts. Imagine if Tom Cruise could only make movies for Warner Bros, only with their directors, only in movies by their screenwriters. That was once how Hollywood worked, and often the star didn't even have a choice of whether to take a role; it was assigned to them.

In time, some movie stars sued to get out of their onerous contracts, and today we accept that most movie talent are free agents who can pick and choose their projects. Sure, some studios still offer offices and money to producers for first-look deals, but most significant talent aren't locked up. Journalism has seen a similar rise in the power of journalists with their own followings, with Bill Simmons and Grantland being the most prominent example. Grantland operates under the auspices of ESPN, and occasionally Simmons gets banned from Twitter for a few days, but that doesn't detract from his stunning rise in power from the blogger writing as the Sports Guy for AOL to Bill Simmons; it's no coincidence that more people refer to him by his real name now than by his Sports Guy moniker.

Not every talent has the brand to pursue this strategy, but the economics for such stars are almost always better if they test themselves in free agency. Here's another thought experiment: imagine I tell you you're going to see a Sony Pictures movie. Or read an ESPN article. What comes to mind? Anything distinctive or specific?

Now imagine I tell you you're going to see a Paul Thomas Anderson movie. Or read a Bill Simmons article. What you anticipate is likely far more specific and vivid. Some movie studios retain that distinctiveness; Pixar is always held up as the poster child for a studio whose brand still carries weight as an artistic signal. But examples like that, or The New Yorker, media enterprises with a distinctive house style, are the exception. It's no surprise. The role an ESPN assumes in supporting Grantland, or that DreamWorks occupies with a Steven Spielberg movie, is more financial than anything else.

It's not just previously established stars who have been empowered. Anyone with an internet connection can publish and distribute now, adding even more competition for old media. The cost of writing is much lower, of course, than the cost of making a movie, but in short-form video entertainment in particular, on YouTube or Vine, the need for studio support for talent has lessened.

Great movies were produced in both the heyday of the studio system and today, just as great journalism was done in the age of the local newspaper monopoly and today. I'm neither overly nostalgic for the past nor universally enthusiastic about all modern progress, and I don't get caught up in debates about the merits of the past versus the future. However, from a business perspective, nostalgia is a dangerous emotion. There are at least two ways to rage against the dying of the light, and only one of them is productive.