10 browser tabs

1. Love in the Time of Robots

“Is it difficult to play with her?” the father asks. His daughter looks to him, then back at the android. Its mouth begins to open and close slightly, like a dying fish. He laughs. “Is she eating something?”
 
The girl does not respond. She is patient and obedient and listens closely. But something inside is telling her to resist. 
 
“Do you feel strange?” her father asks. Even he must admit that the robot is not entirely believable.
 
Eventually, after a few long minutes, the girl’s breathing grows heavier, and she announces, “I am so tired.” Then she bursts into tears.
 
That night, in a house in the suburbs, her father uploads the footage to his laptop for posterity. His name is Hiroshi Ishiguro, and he believes this is the first record of a modern-day android.
 

Reads like the treatment for a science fiction film, some mashup of Frankenstein, Pygmalion, and Narcissus. It's one incredible moment after another; I'll grab just a few excerpts, but the whole thing is worth reading.

But he now wants something more. Twice he has witnessed others have the opportunity, however confusing, to encounter their robot self, and he covets that experience. Besides, his daughter was too young, and the newscaster, though an adult, was, in his words, merely an “ordinary” person: Neither was able to analyze their android encounter like a trained scientist. A true researcher should have his own double. Flashing back to his previous life as a painter, Ishiguro thinks: This will be another form of self-portrait. He gives the project his initials: Geminoid HI. His mechanical twin.
 

Warren Ellis, in a recent commencement speech delivered at the University of Essex, said:

Nobody predicted how weird it’s gotten out here.  And I’m a science fiction writer telling you that.  And the other science fiction writers feel the same.  I know some people who specialized in near-future science fiction who’ve just thrown their hands up and gone off to write stories about dragons because nobody can keep up with how quickly everything’s going insane.  It’s always going to feel like being thrown in the deep end, but it’s not always this deep, and I’m sorry for that.
 

The thing is, far-future sci-fi is likely to be even more off base now, given how humans are evolving in lockstep with the technology around them. So we need more near-future sci-fi, of a variety smarter than Black Mirror, to grapple with the implications.

Soon his students begin comparing him to the Geminoid—“Oh, professor, you are getting old,” they tease—and Ishiguro finds little humor in it. A few years later, at 46, he has another cast of his face made, to reflect his aging, producing a second version of HI. But to repeat this process every few years would be costly and hard on his vanity. Instead, Ishiguro embraces the logical alternative: to alter his human form to match that of his copy. He opts for a range of cosmetic procedures—laser treatments and the injection of his own blood cells into his face. He also begins watching his diet and lifting weights; he loses about 20 pounds. “I decided not to get old anymore,” says Ishiguro, whose English is excellent but syntactically imperfect. “Always I am getting younger.”
 
Remaining twinned with his creation has become a compulsion. “Android has my identity,” he says. “I need to be identical with my android, otherwise I’m going to lose my identity.” I think back to another photo of his first double’s construction: Its robot skull, exposed, is a sickly yellow plastic shell with openings for glassy teeth and eyeballs. When I ask what he was thinking as he watched this replica of his own head being assembled, Ishiguro says, perhaps only half-joking, “I thought I might have this kind of skull if I removed my face.”
 
Now he points at me. “Why are you coming here? Because I have created my copy. The work is important; android is important. But you are not interested in myself.”
 

This should be a science fiction film, only I'm not sure who our great science fiction director is. The best candidates may be too old to look upon such a story as anything other than grotesque and horrific.

2. Something is wrong on the internet by James Bridle

Of course, some of what's on the internet really is grotesque and horrific. 

Someone or something or some combination of people and things is using YouTube to systematically frighten, traumatise, and abuse children, automatically and at scale, and it forces me to question my own beliefs about the internet, at every level. 
 

Given how much my nieces love watching product unwrapping and Peppa Pig videos on YouTube, this story induced a sense of dread I haven't felt since the last good horror film I watched, which I can't remember anymore since the world has run a DDoS on my emotions.

We often think of a market operating at peak efficiency as sending information back and forth between supply and demand, allowing the creation of goods that satisfy both parties. In the tech industry, the wink-wink version of that is saying that pornography leads the market for any new technology, solving, as it does, the two problems the internet is said to solve better, at scale, than any medium before it: loneliness and boredom.

Bridle's piece, however, finds the dark cul-de-sacs and infected runaway processes which have branched out from the massive marketplace that is YouTube. I decided to follow a Peppa Pig video on the service and started tapping on Related Videos, like I imagine one of my nieces doing, and quickly wandered into a dark alleyway where I saw videos I would not want any of them watching. As Bridle did, I won't link to what I found; suffice to say it won't take you long to stumble on some of it if you want, or perhaps even if you don't.

What's particularly disturbing is the somewhat bizarre, inexplicably grotesque nature of some of these video remixes. David Cronenberg is known for his body horror films; these YouTube videos are like some perverse variant of that, playing with popular children's iconography.

Facebook and now Twitter are taking heat for disseminating fake news, and that is certainly a problem worth debating, but with that problem we're talking about adults. Children don't have the capacity to comprehend what they're seeing, and given my belief in the greater effect of sight, sound, and motion, I am even more disturbed by this phenomenon.

A system where hosting videos for a global audience is free, where this type of trademark infringement weaponizes brand signifiers with seeming impunity, and where content production and remixing can be scaled up with technology, allows for a type of problem we haven't seen before.

The internet has enabled all types of wonderful things at scale; we should not be surprised that it would foster the opposite. But we can, and should, be shocked.

3. FDA approves first blood sugar monitor without finger pricks

This is exciting. One view that seems to be common wisdom these days is that it's easier to lose weight and improve your health through diet than through exercise. But one problem with the feedback loop in dieting (and exercise, actually) is how slow it is. You sneak a few snacks here and there walking by the company cafeteria every day, and a month later you hop on the scale and emit a bloodcurdling scream as you realize you've gained 8 pounds.

A friend of mine had gestational diabetes during one of her pregnancies and got a home blood glucose monitor. You had to prick your finger and draw blood to get a reading, but, curious, I tried it before and after a BBQ.

To see what various foods did to my blood sugar in near real time was a real eye-opener. Imagine a future in which you could see what a few french fries and gummy bears did to your blood sugar, or in which the reading could be built into something like an Apple Watch, without having to draw blood each time. I don't mind the sight of blood, but I'd prefer not to turn my fingertips into war zones.
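To make that feedback loop concrete, here's a minimal sketch of the kind of calculation a continuous monitor could surface after a meal. The readings and timestamps are invented for illustration; this isn't any real device's data or API.

```python
# Hypothetical glucose readings (time, mg/dL) around a meal -- invented
# numbers, purely to illustrate the kind of feedback a CGM could give.
readings = [
    ("12:00", 92), ("12:30", 95),    # baseline before eating
    ("13:00", 148), ("13:30", 161),  # fries and gummy bears at 12:45
    ("14:00", 120), ("14:30", 101),  # coming back down
]

baseline = readings[0][1]
peak_time, peak = max(readings, key=lambda r: r[1])
print(f"Baseline {baseline} mg/dL; peaked at {peak} mg/dL around {peak_time} "
      f"(+{peak - baseline} mg/dL after the meal)")
```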

Faster feedback might transform dieting into something more akin to deliberate practice. Given that another popular theory of obesity is that it's an insulin phenomenon, tools like this, built for diabetes, might have a much broader mass-market impact.

4. Ingestible ketones

Ingestible ketones have become something of a holy grail for endurance athletes, and now HVMN is bringing one to market. Ketogenic diets are all the rage right now, but for an endurance athlete, getting the body to fuel itself on ketones has always sounded like a long and miserable ordeal.

The body generates ketones from fat when low on carbs or when fasting. The theory is that endurance athletes using ketones rather than glycogen from carbs require less oxygen and thus can work out longer.

I first heard about the possibility of exogenous ketones for athletes from Peter Attia. As he said then, perhaps the hardest thing about ingesting exogenous ketones is the horrible taste, which caused him to gag and nearly vomit in his kitchen. It doesn't sound like the taste problem has been solved.

Until we get the pill that renders exercise obsolete, however, I'm curious to give this a try. If you decide to pre-order, you can use my referral code to get $15 off.

5. We Are Nowhere Close to the Limits of Athletic Performance

By comparison, the potential improvements achievable by doping effort are relatively modest. In weightlifting, for example, Mike Israetel, a professor of exercise science at Temple University, has estimated that doping increases weightlifting scores by about 5 to 10 percent. Compare that to the progression in world record bench press weights: 361 pounds in 1898, 363 pounds in 1916, 500 pounds in 1953, 600 pounds in 1967, 667 pounds in 1984, and 730 pounds in 2015. Doping is enough to win any given competition, but it does not stand up against the long-term trend of improving performance that is driven, in part, by genetic outliers. As the population base of weightlifting competitors has increased, outliers further and further out on the tail of the distribution have appeared, driving up world records.
 
Similarly, Lance Armstrong’s drug-fuelled victory of the 1999 Tour de France gave him a margin of victory over second-place finisher Alex Zulle of 7 minutes, 37 seconds, or about 0.1 percent. That pales in comparison to the dramatic secular increase in speeds the Tour has seen over the past half century: Eddy Merckx won the 1971 tour, which was about the same distance as the 1999 tour, in a time 5 percent worse than Zulle’s. Certainly, some of this improvement is due to training methods and better equipment. But much of it is simply due to the sport’s ability to find competitors of ever more exceptional natural ability, further and further out along the tail of what’s possible.
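To put those margins in perspective, here's a quick back-of-the-envelope calculation. The 7:37 margin and the 5 percent figure come from the excerpt above; the roughly 91-hour winning time for the 1999 Tour is my own assumption, included just to make the arithmetic run.

```python
# Rough arithmetic on the margins quoted above. The ~91-hour winning time
# for the 1999 Tour is an assumption for illustration, not from the article.
winning_time_s = 91 * 3600        # ~91 hours, in seconds
margin_s = 7 * 60 + 37            # Armstrong's 7:37 margin over Zulle

margin_pct = margin_s / winning_time_s * 100
print(f"Margin of victory: ~{margin_pct:.2f}% of total race time")   # ~0.14%

# The secular trend (Merckx's 1971 time ~5% slower than Zulle's 1999 time
# over a comparable distance) dwarfs the doping-sized winning margin:
print(f"Long-term improvement vs. winning margin: ~{5 / margin_pct:.0f}x larger")
```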
 

In the Olympics, to take the most celebrated athletic competition, victors are feted with videos showing them swimming laps, tossing logs in the Siberian tundra, running through a Kenyan desert. We celebrate the work, the training. Good genes are given narrative short shrift. Perhaps we should show a picture of their DNA, too, just to give credit where much credit is due?

If I live a normal human lifespan, I expect to see special sports leagues and divisions created for athletes who've undergone genetic modification. It will be the return of the circus freak show, but this time for real. I've sat courtside and seen people like LeBron James, Giannis Antetokounmpo, Kevin Durant, and Joel Embiid walk by me. They are freaks, but genetic engineering might produce someone who stretches our definition of outlier.

In other words, it is highly unlikely that we have come anywhere close to maximum performance among all the 100 billion humans who have ever lived. (A completely random search process might require the production of something like a googol different individuals!)
 
But we should be able to accelerate this search greatly through engineering. After all, the agricultural breeding of animals like chickens and cows, which is a kind of directed selection, has easily produced animals that would have been one in a billion among the wild population. Selective breeding of corn plants for oil content of kernels has moved the population by 30 standard deviations in roughly just 100 generations. That feat is comparable to finding a maximal human type for a specific athletic event. But direct editing techniques like CRISPR could get us there even faster, producing Bolts beyond Bolt and Shaqs beyond Shaq.
 

6. Let's set half a percent as the standard for statistical significance

My many-times-over coauthor Dan Benjamin is the lead author on a very interesting short paper "Redefine Statistical Significance." He gathered luminaries from many disciplines to jointly advocate a tightening of the standards for using the words "statistically significant" to results that have less than a half a percent probability of occurring by chance when nothing is really there, rather than all results that—on their face—have less than a 5% probability of occurring by chance. Results with more than a 1/2% probability of occurring by chance could only be called "statistically suggestive" at most. 
 
In my view, this is a marvelous idea. It could (a) help enormously and (b) can really happen. It can really happen because it is at heart a linguistic rule. Even if rigorously enforced, it just means that editors would force people in papers to say "statistically suggestive" for a p of a little less than .05, and only allow the phrase "statistically significant" in a paper if the p value is .005 or less. As a well-defined policy, it is nothing more than that. Everything else is general equilibrium effects.
 

Given the replication crisis has me doubting almost every piece of conventional wisdom I've inherited in my life, I'm okay with this.
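To see how small a change the proposal actually is, here's a minimal sketch of the linguistic rule as an editor might apply it. This is my own illustration of the thresholds described above, not code from the paper.

```python
def label_result(p_value: float) -> str:
    """Classify a result under the proposed 0.005 threshold.

    Illustration only: under the proposal, p <= 0.005 earns "statistically
    significant," while the old 0.005 < p <= 0.05 band is merely "suggestive."
    """
    if p_value <= 0.005:
        return "statistically significant"
    if p_value <= 0.05:
        return "statistically suggestive"
    return "not significant"

# A p = 0.03 finding, publishable as "significant" under the old 0.05 rule,
# would only rate "suggestive" under the new standard.
for p in (0.001, 0.03, 0.2):
    print(f"p = {p}: {label_result(p)}")
```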

7. We're surprisingly unaware of when our own beliefs change

If you read an article about a controversial issue, do you think you’d realise if it had changed your beliefs? No one knows your own mind like you do – it seems obvious that you would know if your beliefs had shifted. And yet a new paper in The Quarterly Journal of Experimental Psychology suggests that we actually have very poor “metacognitive awareness” of our own belief change, meaning that we will tend to underestimate how much we’ve been swayed by a convincing article.
 
The researchers Michael Wolfe and Todd Williams at Grand Valley State University said their findings could have implications for the public communication of science. “People may be less willing to meaningfully consider belief inconsistent material if they feel that their beliefs are unlikely to change as a consequence,” they wrote.
 

Beyond being an interesting result, I link to this as an example of a human-readable summary of a research paper. This is how the article summarizes the research study and its results:

The researchers recruited over two hundred undergrads across two studies and focused on their beliefs about whether the spanking/smacking of kids is an effective form of discipline. The researchers chose this topic deliberately in the hope the students would be mostly unaware of the relevant research literature, and that they would express a varied range of relatively uncommitted initial beliefs.
 
The students reported their initial beliefs about whether spanking is an effective way to discipline a child on a scale from “1” completely disbelieve to “9” completely believe. Several weeks later they were given one of two research-based texts to read: each was several pages long and either presented the arguments and data in favour of spanking or against spanking. After this, the students answered some questions to test their comprehension and memory of the text (these measures varied across the two studies). Then the students again scored their belief in whether spanking is effective or not (using the same 9-point scale as before). Finally, the researchers asked them to recall what their belief had been at the start of the study.
 
The students’ belief about spanking changed when they read a text that argued against their own initial position. Crucially, their memory of their initial belief was shifted in the direction of their new belief – in fact, their memory was closer to their current belief than their original belief. The more their belief had changed, the larger this memory bias tended to be, suggesting the students were relying on their current belief to deduce their initial belief. The memory bias was unrelated to the measures of how well they’d understood or recalled the text, suggesting these factors didn’t play a role in memory of initial belief or awareness of belief change.
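Before comparing this summary with the paper's own abstract, here's a toy version of the key measurement, on the study's 1-to-9 belief scale. The numbers are invented for illustration; they are not the researchers' data.

```python
# Toy illustration of the memory bias described above: does a subject's
# recollection of their initial belief drift toward their current belief?
# Invented numbers on the study's 1-9 scale, not the actual data.
subjects = [
    # (initial belief, belief after reading, recalled initial belief)
    (7, 3, 4),   # large belief change; recollection drifts toward new belief
    (6, 4, 5),   # smaller change, smaller drift
    (5, 5, 5),   # no change; accurate recollection
]

for initial, current, recalled in subjects:
    change = current - initial
    recall_error = recalled - initial
    drifted = change != 0 and recall_error * change > 0
    print(f"belief change {change:+d}, recall error {recall_error:+d}, "
          f"drifted toward current belief: {drifted}")
```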
 

Compare that summary above to the abstract of the paper itself:

When people change beliefs as a result of reading a text, are they aware of these changes? This question was examined for beliefs about spanking as an effective means of discipline. In two experiments, subjects reported beliefs about spanking effectiveness during a prescreening session. In a subsequent experimental session, subjects read a one-sided text that advocated a belief consistent or inconsistent position on the topic. After reading, subjects reported their current beliefs and attempted to recollect their initial beliefs. Subjects reading a belief inconsistent text were more likely to change their beliefs than those who read a belief consistent text. Recollections of initial beliefs tended to be biased in the direction of subjects’ current beliefs. In addition, the relationship between the belief consistency of the text read and accuracy of belief recollections was mediated by belief change. This belief memory bias was independent of on-line text processing and comprehension measures, and indicates poor metacognitive awareness of belief change.
 

That's actually one of the better research abstracts you'll read, and still it reflects the general opacity of the average research abstract. I'd argue that some of the most important knowledge in the world is locked behind abstruse abstracts.

Why do researchers write this way? Most tell me that researchers write for other researchers, and incomprehensible prose like this impresses their peers. What a tragedy. As my longtime readers know, I'm a firm believer in the power of the form of a message. We continue to underrate that in all aspects of life, from the corporate world to our personal lives, and here, in academia.

Then again, such poor writing keeps people like Malcolm Gladwell busy transforming such insight into breezy reads in The New Yorker and his bestselling books.

8. Social disappointment explains chimpanzees' behaviour in the inequity aversion task

As an example of the above phenomenon, this paper contains an interesting conclusion, but try to parse this abstract:

Chimpanzees’ refusal of less-preferred food when an experimenter has previously provided preferred food to a conspecific has been taken as evidence for a sense of fairness. Here, we present a novel hypothesis—the social disappointment hypothesis—according to which food refusals express chimpanzees' disappointment in the human experimenter for not rewarding them as well as they could have. We tested this hypothesis using a two-by-two design in which food was either distributed by an experimenter or a machine and with a partner present or absent. We found that chimpanzees were more likely to reject food when it was distributed by an experimenter rather than by a machine and that they were not more likely to do so when a partner was present. These results suggest that chimpanzees’ refusal of less-preferred food stems from social disappointment in the experimenter and not from a sense of fairness.
 

Your average grade school English teacher would slap a failing grade on this butchery of the English language.

9. Metacompetition: Competing Over the Game to be Played

When CDMA-based technologies took off in the US, companies like Qualcomm that work on that standard prospered; metacompetitions between standards decide the fates of the firms that adopt (or reject) those standards.

When an oil spill raises concerns about the environment, consumers favor businesses with good environmental records; metacompetitions between beliefs determine the criteria we use to evaluate whether a firm is “good.”

If a particular organic foods certification becomes important to consumers, companies with that certification are favored; metacompetitions between certifications determine how the quality of firms is measured.
 
In all these examples, you could be the very best at what you do, but lose in the metacompetition over what criteria will matter. On the other hand, you may win due to a metacompetition that protects you from fierce rivals who play a different game.
 
Great leaders pay attention to metacompetition. They advocate the game they play well, promoting criteria on which they measure up. By contrast, many failed leaders work hard at being the best at what they do, only to throw up their hands in dismay when they are not even allowed to compete. These losers cannot understand why they lost, but they have neglected a fundamental responsibility of leadership. It is not enough to play your game well. In every market in every country, alternative “logics” vie for prominence. Before you can win in competition, you must first win the metacompetition over the game being played.
 

In sports negotiations between owners and players, the owners almost always win the metacompetition game. In the Hollywood writers' strike of 2007, the writers' guild didn't realize they were losing the metacompetition and thus ended up worse off than before. Amazon surpassed eBay by winning the retail metacompetition (most consumers prefer paying a fixed price for a good of some predefined quality to dealing with the multiple axes of complexity of an auction) after first failing to take on eBay directly on its home turf of auctions.

Winning the metacompetition means first being aware of what it is. It's not so easy in a space like, say, social networking, where even some of the winners don't understand what game they're playing.

10. How to be a Stoic

Much of Epictetus’ advice is about not getting angry at slaves. At first, I thought I could skip those parts. But I soon realized that I had the same self-recriminatory and illogical thoughts in my interactions with small-business owners and service professionals. When a cabdriver lied about a route, or a shopkeeper shortchanged me, I felt that it was my fault, for speaking Turkish with an accent, or for being part of an élite. And, if I pretended not to notice these slights, wasn’t I proving that I really was a disengaged, privileged oppressor? Epictetus shook me from these thoughts with this simple exercise: “Starting with things of little value—a bit of spilled oil, a little stolen wine—repeat to yourself: ‘For such a small price, I buy tranquillity.’ ”
 
Born nearly two thousand years before Darwin and Freud, Epictetus seems to have anticipated a way out of their prisons. The sense of doom and delight that is programmed into the human body? It can be overridden by the mind. The eternal war between subconscious desires and the demands of civilization? It can be won. In the nineteen-fifties, the American psychotherapist Albert Ellis came up with an early form of cognitive-behavioral therapy, based largely on Epictetus’ claim that “it is not events that disturb people, it is their judgments concerning them.” If you practice Stoic philosophy long enough, Epictetus says, you stop being mistaken about what’s good even in your dreams.
 

The trendiness of stoicism has been around for quite some time now. I found this tab left over from 2016, and I'm sure Tim Ferriss was espousing it long before then, not to mention the enduring trend that is Buddhism. That meditation and stoicism are so popular in Silicon Valley may be a measure of the complacency of the region; these seem like direct antidotes to the most first-world of problems. People everywhere complain of the stresses on their minds from the deluge of information they receive for free from apps on smartphones with processing power that would put previous supercomputers to shame.

Still, given that stoicism was in vogue in Roman times, it seems to have stood the test of time. Since social media seems to have increased the surface area of our social fabric and our exposure to said fabric, perhaps we could all use a bit more stoicism in our lives. I suspect one reason Curb Your Enthusiasm curdles in the mouth more than before is not just that Larry David's rich white man's complaints seem particularly ill-timed in the current environment but that he is out of touch with the real nature of most people's psychological stressors now. A guy of his age and wealth probably doesn't spend much time on social media, but if he did, he might realize his grievances no longer match those of the average person in either pettiness or peculiarity.

The Kipsang Number

In this discussion between Malcolm Gladwell and Nicholas Thompson about the World Track and Field Championships, Gladwell brought up a concept called the Kipsang Number.

I watched the marathon and was struck (as I always am watching marathons) by the same dumb, obvious point: they are fast. It’s worth dwelling on this a moment. Back when Wilson Kipsang set the world record (which was then promptly broken), my running friends and I came up with the “Kipsang number,” which represented how long could you keep up with Wilson Kipsang while he was running twenty-six miles. I am a devoted runner and my Kipsang number is less than a mile. If I’m lucky, fourteen-hundred metres. You are a really good runner, and I’m guessing your Kipsang number is two miles. The average, healthy, athletic, American, twenty-two-year old varsity athlete in a sport other than track probably has a Kipsang number of between 400 and 800 metres. To recap: you could keep up with him for a quarter of a mile, then you would collapse in exhaustion. He would keep running at the same pace for another twenty-six miles.
 

My cycling friends and I often ponder a similar number when out on group rides: on a flat road, how long can you bike at the speed that professional cyclists ride at in the flats in the peloton? Or how long can you hold the average speed of a professional climbing expert like Nairo Quintana on a cycling climb?
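For a sense of what keeping up with Kipsang actually demands, here's the simple arithmetic behind a Kipsang number, using his then world record of 2:03:23 from Berlin 2013; the sample distances are just examples.

```python
# How long would you last at Wilson Kipsang's then world-record marathon
# pace (2:03:23, Berlin 2013)? Simple arithmetic behind a "Kipsang number."
record_s = 2 * 3600 + 3 * 60 + 23   # 7,403 seconds for the full marathon
marathon_m = 42_195                  # marathon distance in meters

pace_s_per_m = record_s / marathon_m  # ~0.175 s per meter, i.e. ~2:55 per km

for meters in (400, 800, 1_609, 3_218):   # 400m, 800m, ~1 mile, ~2 miles
    seconds = meters * pace_s_per_m
    print(f"{meters:>5} m at Kipsang's pace: {seconds:5.0f} s ({seconds/60:.1f} min)")

# Even a Kipsang number of 400 m means holding roughly 4:42-per-mile pace
# for about 70 seconds -- a pace he sustained for 26.2 miles.
```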

This would be a fun charity event, either on the track or on the bike: invite average Joes to try to keep up with Kipsang or a strong pro cyclist like Fabio Aru for as long as possible, and as soon as you fell behind, you'd be eliminated. Since that is logistically too complex to set up for all but a few people, perhaps fitness apps like Strava could add virtual challenges like this.

Why are apes skinny and humans fat?

Scientists studied dead humans and bonobos in an effort to understand why humans became the fat primate. What happened after chimps and humans diverged? It's not clear, but the results millions of years later are.

...humans got fat. Chimps and bonobos are 13 percent skin, and we're only 6 percent skin, but we compensate for that by being up to 36 percent body fat on the high end of average, while bonobos average 4 percent. That's a wildly disproportional fatness differential.
 

From an interview with one of the authors of the paper:

So what happened on the path from common ancestor to Homo sapiens?
One of the things is, you've gotta shift the body around and change the muscle from the forelimbs if you're a quadrupedal ape. Our ancestors—and most apes—can venture into open areas, but they live in forests. They're really tied to having tree cover available, because they get hot.
 
So we developed fat so we could get away from forests?
Compared to the apes, we have less muscle, which is an energy savings, because it's such an expensive tissue. Two important things about the way we store fat: We store it around our buttocks and thighs, but you want to make sure that you're storing fat so it doesn't interfere with locomotion. You don't want it on your feet, for instance. So you concentrate it around the center of gravity. And you also don't want it to interfere with being able to get rid of heat.
 
What was the benefit of having fat down low and weak arms?
If you're moving away from the forest and tree cover, you want to be able to exploit food in a more mosaic habitat that has areas of bush and a few forests around rivers. You want to be able to move into a lot of different areas. So you've gotta get rid of your hair, and really ramp up those sweat glands. Our skin has really been reorganized for a lot of different functions.
 
Do chimps and bonobos not have sweat glands? 
They have sweat glands. They're not really functioning. All primates have eccrine sweat glands in their hands and feet. Monkeys have them on their chests. [But] they're not stimulated by heat.

“Distraction is a kind of obesity of the mind”

Matthew Crawford has written a new book, The World Beyond Your Head: On Becoming an Individual in an Age of Distraction. In an interview with the Guardian, he discussed the heightened competition for what I've often called the only finite resource in tech: user attention.

“I realised how pervasive this has become, these little appropriations of attention,” he says. “Figuring out ways to capture and hold people’s attention is the centre of contemporary capitalism. There is this invisible and ubiquitous grabbing at something that’s the most intimate thing you have, because it determines what’s present to your consciousness. It makes it impossible to think or rehearse a remembered conversation, and you can’t chat with a stranger because we all try to close ourselves off from this grating condition of being addressed all the time.”
 
He points out that the only quiet, distraction-free place in the airport is the business-class lounge, where all you hear is “the occasional tinkling of a spoon against china”. Silence has become a luxury good. “The people in there value their silence very highly. If you’re in that lounge you can use the time to think creative, playful thoughts; you could come up with some brilliant marketing scheme that you would then use to determine the character of the peon section. You can think of it as a transfer of wealth. Attention is a resource, convertible into actual money. ”
 
...
 
“We increasingly encounter the world through these representations that are addressed to us, often with manipulative intent: video games, pornography, gambling apps on your phone,” he says. “These experiences are so exquisitely attuned to our appetites that they can swamp your ordinary way of being in the world. Just as food engineers have figured out how to make food hyper-palatable by manipulating fat, salt and sugar, similarly the media has become expert at making irresistible mental stimuli.” Distraction is a kind of obesity of the mind, in other words, with results that could be just as hazardous for our health.
 

I've certainly felt, in recent times, like some sort of information addict, with my smartphone playing the part of a drug dealer sitting over my shoulder, offering a free and never-ending supply of, well, whatever I want. It would seem to follow that the returns to self-control have increased as well, and Crawford notes that the rich can hire “professional naggers” – tutors for their children, personal trainers – in effect outsourcing their own self-control.

In Renaissance times, obesity was a signal of wealth and thus seen as attractive. In an age of cheap and plentiful calories, a signal of wealth is being fit, which may indicate good genetics and self-discipline but may also mean you can afford a personal trainer and home chef.

In tech we're constantly chasing the holy grail of personalization and more precise targeting of content, advertising, services, etc. Crawford sees the glass as half empty in this scenario: a rise of self-obsession that transfers power and wealth to companies.

It is tempting to see the advent of this crisis as technological, but for Crawford it’s more that the technology has created the perfect vehicles for our self-obsession. Individual choice has been fetishised to the point where we have thrown away many of the structures – family, church, community – that helped us to make good decisions, and handed more and more power to corporations.
 

I can cite many examples of how the internet has improved the world; I'm hardly a technology alarmist. Still, I find it more necessary these days to hone my self-control and find ways to cocoon my mind in quiet from time to time.

I was in Taiwan for 6 days last week, and I bought a 300MB data package from AT&T for the week [the AT&T Passport, as they call these all-in-one international bundles, is one of the few customer-friendly things they've added for travelers; no more trying to cobble together a suitable international bundle yourself across text, data, and voice]. While there, I got in the habit of turning on cellular data for short sips at moments when I really needed it—to get directions to my next destination via Google Maps, look up restaurants, coordinate meetups with friends, post a photo to Instagram—and the rationing kept me off the phone most of the day. Most of my phone's apps, besides the camera, aren't all that enticing without network access. The enforced data hibernation felt like meditation.

My last morning in Taiwan, I had a taxi driver take me out to Jiufen, a small coastal town and supposedly an inspiration for many of the settings in Miyazaki's wonderful Spirited Away. Cutting through the center of town was what they call an “old street,” a sort of pedestrian alley lined with shops and decorated in a deliberately retro style. Many towns in Taiwan have one, as they are honey traps for tourists.

Because my flight back to the U.S. was that afternoon, I arrived at the Jiufen Old Street really early, before any tourists had appeared. I strolled down the winding alley while shop owners were just beginning to open their doors and prepare for the tourist hordes to come. Some proprietors were rolling out dough to make taro balls, the popular local delicacy, while others were grilling meat or heating up tofu stew. It was very quiet, and many times I turned a corner to find myself the only person in that segment of the street. I was almost out of my data allotment, so I kept the phone off as I ambled to and fro.

Near the end of the street, I stopped to purchase a bowl of taro balls in ice, and I ate from it as I walked the last segment of the street. Every tourist guidebook probably tells you to purchase the taro balls while there, and you know what? They're right. Those things are damn great.

No one was on that last segment of street except an old man carrying a large bag of rice over his shoulder up the steps to a house. In that moment, I felt a peace that comes from a clear mind and a complete absence of any want. It was a serenity I hadn't felt in so long that I can distinctly remember the moment it washed over me, the sensation of my mind exhaling and going still. Had my phone been on and connected to the internet, I'm not sure I could have achieved it.

I thought back to this moment when reading this lovely piece on The Death of Awe in the Age of Awesome:

Travel writers like me spend a lot of time contemplating why people venture abroad. Not just the obvious enticements — relaxation, winter sun, cheap pilsner — but the emotional, soul-stirring stuff: the sustenance of the new. The awe. It has, I think, become one of the main incentives of our travelling lives. As spirituality wanes experience is the new faith, and we are refugees from the mundane.
 
But behind this quest for the big, beautiful and baffling is a disconcerting sense that wonder in the age of the bucket-list is under attack. From technology, from information overload, from the anti-spiritual cynicism of the post-hippy world. In an era where a child has only to hold a five-inch screen in front of their face to gorge themselves on the apparent miracle of a one-inch Dora the Explorer hatching from a two-tone chocolate shell, awe has started to feel increasingly elusive.
 
It doesn’t take a bona fide philosopher to understand that this diminution of the human experience is an inevitable price of social progress. Awe, after all, used to be much easier to come by. Imagine you’re a Stone-age hunter witnessing a solar eclipse (not like last month’s anticlimactic, cloud-snuffed eclipse. A proper one.). Suddenly, the sun is extinguished. You don’t know it’s a temporary phenomenon, an orbital idiosyncrasy. So you tremble, piss your mammoth-skin pants, invent Gods! That’s a great big uppercut of awe.
 

At the end of Jiufen Old Street, I stopped to look out at houses perched along the side of the mountain which sloped down to the sea.

I heard them before I saw them, the first tour groups to catch up to me. Throngs of mainland Chinese and Japanese tour groups, mostly middle-aged to senior travelers, led by young guides holding flags and trying, with moderate success, to herd their flock. They jockeyed for position with each other, filling the alley two or three people wide. I fought my way back through them as best I could. That moment of peace was gone. Even in the real world, it's hard to stay away for long.

How spin becomes gospel

How and why do so many health myths transform into everyday wisdom? Perhaps because academic press releases are exaggerated or misrepresented when they're written up in news stories.

The goal of a press release around a scientific study is to draw attention from the media, and that attention is supposed to be good for the university, and for the scientists who did the work. Ideally the endpoint of that press release would be the simple spread of seeds of knowledge and wisdom; but it's about attention and prestige and, thereby, money. Major universities employ publicists who work full time to make scientific studies sound engaging and amazing. Those publicists email the press releases to people like me, asking me to cover the story because "my readers" will "love it." And I want to write about health research and help people experience "love" for things. I do!  

Across 668 news stories about health science, the Cardiff researchers compared the original academic papers to their news reports. They counted exaggeration and distortion as any instance of implying causation when there was only correlation, implying meaning to humans when the study was only in animals, or giving direct advice about health behavior that was not present in the study. They found evidence of exaggeration in 58 to 86 percent of stories when the press release contained similar exaggeration. When the press release was staid and made no such errors, the rates of exaggeration in the news stories dropped to between 10 and 18 percent.

Even the degree of exaggeration between press releases and news stories was broadly similar.

One golden rule: never believe a press release.

Of course, since most research papers I encounter online are behind a paywall, it's difficult to compare them to the news stories representing them. Why are so many research papers paywalled? I can't imagine the revenue amounts to more than a trifle, so why not let more eyeballs at them and amplify one's fame instead? Perhaps someone with more knowledge of that world can explain the economics.