Alan published in NEJM

My brother Alan had an article published in the February 14th issue of The New England Journal of Medicine. We're all proud of him.

The article title: Selumetinib-Enhanced Radioiodine Uptake in Advanced Thyroid Cancer. As with all brilliant ideas, the conclusion of the article seems self-evident upon further reflection. I mean, clearly you'd anticipate selumetinib producing clinically meaningful increases in iodine uptake and retention in a subgroup of patients with thyroid cancer that is refractory to radioiodine. It's amazing we never believed this before, ahhh, I have no idea what I'm talking about, why am I not smart enough, my life has no meaning.

A dangerous market feedback loop

Moskowitz’s path to mastering the bliss point began in earnest not at Harvard but a few months after graduation, 16 miles from Cambridge, in the town of Natick, where the U.S. Army hired him to work in its research labs. The military has long been in a peculiar bind when it comes to food: how to get soldiers to eat more rations when they are in the field. They knew that over time, soldiers would gradually find their meals-ready-to-eat so boring that they would toss them away, half-eaten, and not get all the calories they needed. But what was causing this M.R.E.-fatigue was a mystery. “So I started asking soldiers how frequently they would like to eat this or that, trying to figure out which products they would find boring,” Moskowitz said. The answers he got were inconsistent. “They liked flavorful foods like turkey tetrazzini, but only at first; they quickly grew tired of them. On the other hand, mundane foods like white bread would never get them too excited, but they could eat lots and lots of it without feeling they’d had enough.”

This contradiction is known as “sensory-specific satiety.” In lay terms, it is the tendency for big, distinct flavors to overwhelm the brain, which responds by depressing your desire to have more. Sensory-specific satiety also became a guiding principle for the processed-food industry. The biggest hits — be they Coca-Cola or Doritos — owe their success to complex formulas that pique the taste buds enough to be alluring but don’t have a distinct, overriding single flavor that tells the brain to stop eating.

From tomorrow's NYTimes Magazine cover story, The Extraordinary Science of Junk Food. It's both fascinating and terrifying.

Poring over data one day in his home office, trying to understand just who was consuming all the snack food, Riskey realized that he and his colleagues had been misreading things all along. They had been measuring the snacking habits of different age groups and were seeing what they expected to see, that older consumers ate less than those in their 20s. But what they weren’t measuring, Riskey realized, is how those snacking habits of the boomers compared to themselves when they were in their 20s. When he called up a new set of sales data and performed what’s called a cohort study, following a single group over time, a far more encouraging picture — for Frito-Lay, anyway — emerged. The baby boomers were not eating fewer salty snacks as they aged. “In fact, as those people aged, their consumption of all those segments — the cookies, the crackers, the candy, the chips — was going up,” Riskey said. “They were not only eating what they ate when they were younger, they were eating more of it.” In fact, everyone in the country, on average, was eating more salty snacks than they used to. The rate of consumption was edging up about one-third of a pound every year, with the average intake of snacks like chips and cheese crackers pushing past 12 pounds a year.

Riskey had a theory about what caused this surge: Eating real meals had become a thing of the past. Baby boomers, especially, seemed to have greatly cut down on regular meals. They were skipping breakfast when they had early-morning meetings. They skipped lunch when they then needed to catch up on work because of those meetings. They skipped dinner when their kids stayed out late or grew up and moved out of the house. And when they skipped these meals, they replaced them with snacks. “We looked at this behavior, and said, ‘Oh, my gosh, people were skipping meals right and left,’ ” Riskey told me. “It was amazing.” This led to the next realization, that baby boomers did not represent “a category that is mature, with no growth. This is a category that has huge growth potential.”

The article includes wonderful tidbits like "people like a chip that snaps with about four pounds of pressure per square inch" and explains why Cheetos are one of the most perfect snacks ever constructed.

The foodie movement looks to high-end restaurants for culinary innovation, but the truth is that much more of it happens in the mass-market industrial food production machine. Many high-end restaurant techniques are actually borrowed from industrial food production laboratories.

This all speaks to one of the defects of our free market economy: dangerous feedback loops get set up in which we are given exactly what we want but don't need. The most insidious type of killing might be the one that happens under our very noses, so slowly we don't notice it, a caper in which we are given cheap and ready access to a slow-acting poison and readily gorge on it until it's too late.

Reading about some of these brilliant food scientists, concocting new snacks to steal our market share, I couldn't help but think of Walter White, with his blue meth. In the tech industry, it's fashionable to talk about marketing and distribution as necessary companions to product development. Few industries embody the unity of those disciplines more perfectly than the food industry.

An ode to online writing

Where is it coming from? Some of it comes from professional journalists, writing for the websites of established publications or on their own blogs. But much of it – the great new addition to our writing and reading culture – comes from professionals in other fields who find the time, the motivation and the opportunity to write for anyone who cares to read. I am sorry that the internet gifted this practice with such an ugly name, “blogging”, but it is too late to change that now.

As a gross generalisation, academics make excellent bloggers, within and beyond their specialist fields. So, too, do aid workers, lawyers, musicians, doctors, economists, poets, financiers, engineers, publishers and computer scientists. They blog for pleasure; they blog for visibility within their field; they blog to raise their value and build their markets as authors and public speakers; they blog because their peers do.

Businessmen and politicians make the worst bloggers because they do not like to tell what they know, and telling what you know is the essence of blogging well. They also fear to be wrong; and, as Felix Salmon, Reuters’ finance blogger, insists and sometimes demonstrates: “If you are never wrong, you are never interesting”.

Very meta, this great piece of online writing about great pieces of online writing from Robert Cottrell, editor of The Browser, from which I've found many wonderful pieces of writing on the web. Some more:

My second contention as a professional reader is one that may seem self-evident in the world of blogging but also holds good across the whole universe of online writing and publishing: the writer is everything. The corollary of this also holds good: the publisher (with a few exceptions) is nothing.

A great read throughout, though I wonder if Cottrell will feature it in tomorrow's email newsletter from the Browser. I suspect he won't.

Preserving information entropy

[SPOILER ALERT: I excerpt an article which discusses the season 3 finale of Downton Abbey, so if you haven't watched that episode yet and don't want it spoiled, close this tab and be gone with ye.]

Vulture summarizes a bunch of reasons why Downton Abbey will probably not air in the U.S. at the same time it does in the U.K. Most of them are sensible, but the one I take issue with is the first:

Spoilerphobes may have been mad, but they still watched.
Back in December, the Internet flooded with tears after online headlines spoiled Matthew’s demise. But it’s impossible to quantify the actual percentage of Downton viewers who had the story ruined for them, and Hoppe says the fan feedback hasn’t been that bad. What’s more, viewers don’t appear to have abandoned the show as a result. “There is a little bit of negative buzz around the spoilers, but it’s pretty minimal from what we’re hearing,” Hoppe says, pointing to the ratings, which are ginormous. PBS says the third season of Downton is averaging more than 11 million viewers per episode (when you factor in the premiere plus seven days of DVR viewing). That’s 420 percent above the public broadcaster’s average prime-time rating and double the average viewership of the show’s second season. By far, it is the most-watched program in PBS history. “That kind of success is hard to argue with,” she says. It’s also worth noting that Sybil’s episode-five death didn’t become headline news in the U.S. the way Matthew’s did, proving that some secrets make it to the U.S. intact.

I heard this used as justification for airing last year's Summer Olympics on tape delay, too: the ratings were great, and people tuned in to watch primetime events even when everyone knew who had won the 100 meter dash, or whether Phelps had won a particular event.

Aaron Cohen wrote a post when he was guest blogger at Kottke last summer citing a research paper that said tests showed that subjects preferred movies and books that were spoiled for them, even in genres like mysteries where a surprise ending would seem to be most important.

I am intrigued by contrarian ideas as much as anyone, but I can't buy into this line of thinking. It's not just that it may confuse correlation and causation, though that's definitely a big part of it. If you take the findings of that paper to their logical extreme, we should actively start telling people the endings of movies and books just to drive more sales. Why stop there? Why not tape-delay all sporting contests and release the results before the events are actually televised?

I remember the moment I first saw the ending of The Sixth Sense in a theater, without any knowledge of what was coming. When I realized what had happened, my entire brain nearly exploded, the neurons were firing so hot. Then I imagine someone telling me ahead of time how that movie ends, and what I would have felt watching it (that's essentially how I felt watching most of the events at the Summer Olympics on TV last year, since everything was spoiled for me ahead of time).

If it's something great, I'll still enjoy a spoiled experience. However, I love an unexpected surprise. I loved when my sisters flew to Seattle and surprised me for my 30th birthday. I loved the time my manager told me to pack my snowboard for work so we could slip away for an afternoon ski trip in Seattle, and then it turned out we were actually flying to Sundance for the weekend. I loved the ending of The Others (the rest was good, too). I loved that moment in Infernal Affairs when the elevator doors opened and...well, I won't ruin it for you.

Genuine surprise is a pleasure the modern world is robbing us of bit by bit. We live in an instantly connected world where information flows more easily than at any time in history, and increasingly our only foolproof defense against spoilage is to lead a monk-like existence of solitude from all other humans and devices. I understand that giant media entities like NBC and PBS are unlikely to shift the TV schedules for the other side of the world, but let's not pretend it's not suboptimal.

In one of my favorite books of 2012, The Most Human Human, Brian Christian argues convincingly that information entropy is of huge value to the experience of being human. 

[Information entropy can be quantified. If I tell you to guess a random eight-letter word, you'll have a very low chance of being right, but if I present you with the first seven letters and ask you to guess the eighth, for example Faceboo_, your chances of guessing that eighth letter are quantifiably higher.]
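That "quantifiably higher" claim can be made concrete with a few lines of Python (my own illustration, not from Christian's book; the 95% probability I assign to 'k' after seeing "Faceboo_" is an assumed figure). Shannon entropy measures, in bits, how uncertain a guess is, and it collapses as context accumulates:

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.
    Zero-probability outcomes contribute nothing by convention."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Guessing one letter with no context at all: 26 equally likely outcomes.
uniform = [1 / 26] * 26
print(entropy_bits(uniform))  # log2(26), about 4.70 bits of uncertainty

# After seeing "Faceboo_": suppose 'k' now has probability 0.95 and the
# remaining 5% is spread evenly over the other 25 letters (assumed numbers).
skewed = [0.95] + [0.05 / 25] * 25
print(entropy_bits(skewed))  # about 0.52 bits -- far less uncertainty
```

The drop from roughly 4.7 bits to roughly half a bit is exactly the sense in which the seven known letters make the eighth letter easier to guess.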

Here's one relevant passage from the book:

Information, defined intuitively and informally, might be something like 'uncertainty's antidote.' This turns out also to be the formal definition—the amount of information comes from the amount by which something reduces uncertainty...The higher the [information] entropy, the more information there is. It turns out to be a value capable of measuring a startling array of things—from the flip of a coin to a telephone call, to a Joyce novel, to a first date, to last words, to a Turing test...Entropy suggests that we gain the most insight on a question when we take it to the friend, colleague, or mentor of whose reaction and response we're least certain. And it suggests, perhaps, reversing the equation, that if we want to gain the most insight into a person, we should ask the question of whose answer we're least certain... Pleasantries are low entropy, biased so far that they stop being an earnest inquiry and become ritual. Ritual has its virtues, of course, and I don't quibble with them in the slightest. But if we really want to start fathoming someone, we need to get them speaking in sentences we can't finish.

This is all a long way of saying that if someone out there wants to organize some secret Game for me without tipping me off, that would likely be the greatest thing ever.