My archives

This section collects all my previous posts from my blog Remains of the Day from the years 2001 through 2011, when I was writing regularly. Most of these are left over from the blog I started on Blogger and then shifted to Movable Type.

In porting them over to Squarespace here, a lot of the formatting is off, and I have not had time to go back in and clean things up. The expiration of my hosting subscription on another service crept up on me, and I had to rush to move everything over, so apologies to those of you who came here and saw random drafts of half-written, half-edited posts still hanging out like so much unfolded laundry strewn across the sofa. I need to tidy them up, then figure out how to add search to my site.

Going back through my old posts is often embarrassing, like hearing a recording of my own voice or seeing a photo of how I used to dress or comb my hair. But mostly it's like reading letters from past versions of myself. It amuses me. I leave them here in the hopes they do the same for you.

Days of Yore

From this Jennifer Egan interview at Days of Yore:



A: It’s a very trusting environment, but also a very rigorous environment. Because you want to know that everyone is on your side, but if they just tell you it’s great, they’re not doing you any favors. That part about everyone being on your side is really critical too. There’s nothing worse than not knowing whether their criticism is motivated by some sort of internal or external wish to undermine or whether it is valid.


Q: But it can be hard in, say, a writing workshop, to shut out the choir of voices and hear your own voice.


A: Yes. But what I lose by not listening is much greater than what I lose by listening to bad advice. Because I think I can sort of sort through with my gut what is useful and what is not useful. Whereas if I hear nothing, I know vividly what results. I am never going to let that happen again.


I think people feel somehow that they can be hurt by hearing the wrong thing. I am not convinced that is true. We might get our feelings hurt, but I don’t think there is any actual damage done. What’s bad falls away.


One thing I often say to students is, “I am not interested in hearing solutions…”

The true cost of higher education

Malcolm Harris writes at n+1 about the next economic bubble in the U.S. after the housing bubble: the student debt bubble.



Since 1978, the price of tuition at US colleges has increased over 900 percent, 650 points above inflation. To put that number in perspective, housing prices, the bubble that nearly burst the US economy, then the global one, increased only fifty points above the Consumer Price Index during those years. But while college applicants’ faith in the value of higher education has only increased, employers’ has declined. According to Richard Rothstein at The Economic Policy Institute, wages for college-educated workers outside of the inflated finance industry have stagnated or diminished. Unemployment has hit recent graduates especially hard, nearly doubling in the post-2007 recession. The result is that the most indebted generation in history is without the dependable jobs it needs to escape debt.


What kind of incentives motivate lenders to continue awarding six-figure sums to teenagers facing both the worst youth unemployment rate in decades and an increasingly competitive global workforce?



Harris goes into all the reasons why the cost of education has exploded and why it's not likely to end anytime soon, which leads him to a less-than-sanguine conclusion:



If tuition has increased astronomically and the portion of money spent on instruction and student services has fallen, if the (at very least comparative) market value of a degree has dipped and most students can no longer afford to enjoy college as a period of intellectual adventure, then at least one more thing is clear: higher education, for-profit or not, has increasingly become a scam.



To me, the real tragedy of so many higher education programs is that they saddle students with so much debt that they have no choice but to treat salary as a first-order criterion in choosing a profession, when what an institution of higher education should really aspire to is teaching us how to live and engage with the world, how to be both children and adults at once. The best of my college days blended lessons in adult responsibility (how to take care of yourself without your parents as a backstop) with encouragement that the best way to pursue intellectual adventure was to follow your curiosity wherever it led.


Free market solutions to obesity

Matt Ridley discusses free-market solutions to obesity. One section caught my attention:



In due course, the obesity problem will be solved, I suspect. The ultra-rich have already solved it. Most of them are very thin these days, quite unlike in ancient times. That's because they can afford the solutions that work for them, from low-carb diets to personal trainers.


If economic growth continues to spread, as it has over the past two centuries, most people will be ultra-rich by today's standards within two generations, and slim figures will also spread. Still, it would be nice to find a way for people to lose weight without having to wait for them to get rich first.



The signifiers of wealth evolve over time. In some countries a tan is a signal of wealth because it means you can afford to vacation where there's sun. But in Asia a dark complexion was often a signal that you belonged to a lower socioeconomic class and had to work outdoors in the sun all day, in farming or construction, for example, so a pale complexion was socially desirable.


In places of food scarcity, obesity can be a sign of wealth. However, in the U.S. today, it's the reverse: the wealthy can afford to hack the diet and lifestyle issues that lead to obesity in poorer socioeconomic groups.


DFW's The Pale King

John Jeremiah Sullivan's review of David Foster Wallace's The Pale King is a fantastic read. And in GQ, no less. I loved this paragraph:

It's worth noting, in that regard, that The New Yorker, which published some of his best fiction, never did any of his nonfiction. No shame to Wallace or The New Yorker, it's simply a technically interesting fact: He couldn't have changed his voice to suit the magazine's famous house style. The "plain style" is about erasing yourself as a writer and laying claim to a kind of invisible narrative authority, the idea being that the writer's mind and personality are manifest in every line, without the vulgarity of having to tell the reader it's happening. But Wallace's relentlessly first-person strategies didn't proceed from narcissism, far from it—they were signs of philosophical stubbornness. (His father, a professional philosopher, studied with Wittgenstein's last assistant; Wallace himself as an undergraduate made an actual intervening contribution—recently published as Fate, Time, and Language—to the debate over free will.) He looked at the plain style and saw that the impetus of it, in the end, is to sell the reader something. Not in a crass sense, but in a rhetorical sense. The well-tempered magazine feature, for all its pleasures, is a kind of fascist wedge that seeks to make you forget its problems, half-truths, and arbitrary decisions, and swallow its nonexistent imprimatur. Wallace could never exempt himself or his reporting from the range of things that would be subject to scrutiny.

I am among the many for whom David Foster Wallace, or DFW as he's commonly known, is the most influential writer of this generation, and Sullivan's article articulates much of why that is. I'd add that beyond his prodigious talent, Wallace's writing evinced a curious but tolerant worldview that seems all too rare in this more cynical age, with its tendency toward snap, shallow judgments.

I enjoyed the reference to the New Yorker house style, which Tom Wolfe (according to a citation on Wikipedia) once described thus (emphasis mine): "The New Yorker style was one of leisurely meandering understatement, droll when in the humorous mode, tautological and litotical when in the serious mode, constantly amplified, qualified, adumbrated upon, nuanced and renuanced, until the magazine’s pale-gray pages became High Baroque triumphs of the relative clause and appository modifier."


The alpha and the loyal sidekick

Paul Allen's autobiography is about to hit, and so it's omnipresent on the web. A juicy excerpt appears in Vanity Fair. As with Keith Richards slagging Mick Jagger in his entertaining autobiography, Allen doesn't hold back about his relationship with Bill Gates, both the positive and the negative.



Bill craved closure, and he would hammer away until he got there; on principle, I refused to yield if I didn’t agree. And so we’d go at it for hours at a stretch, until I became nearly as loud and wound up as Bill. I hated that feeling. While I wouldn’t give in unless convinced on the merits, I sometimes had to stop from sheer fatigue. I remember one heated debate that lasted forever, until I said, “Bill, this isn’t going anywhere. I’m going home.”

The hardest climb in cycling?

A profile, complete with ominous photos, of The Koppenberg, a contender for the title of "toughest climb in cycling."



The Koppenberg's stats are, objectively speaking, nothing to go home crying about. I know, sacrilege, you howl. Hear me out. There are climbs that are steeper, longer, even harder. It's only 600 meters long, it averages just under 12%, and its steepest section is 22%. I'm not saying these are paltry figures, but climbs like the Zoncolan and Angliru manage figures like that for ten thousand meters. They're debilitatingly hard, but so is the Koppenberg. There are a few things that are very, very different about the Koppenberg that set it well apart from the horror climbs of the Zoncolan and the Angliru, namely: cobbles, cobbles, cobbles, and the Ronde van Vlaanderen.



I haven't done a lot of riding over cobblestones. It's not pleasant, though it gives one a sense of joining some gladiatorial fraternity. But watching professionals suffer over them is good fun.


Behind every autotuned 13-year-old girl...

It makes sense, doesn't it, that Gawker would do the profile of the man behind Rebecca Black's "Friday"? Meet Ark Music Factory CEO Patrice Wilson.



Where did Wilson get the inspiration for such lyrics as "Yesterday was Thursday/Today is Friday?" "I wrote the lyrics on a Thursday night going into a Friday," he said. "I was writing different songs all night and was like, 'Wow, I've been up a long time and it's Friday.' And I was like, wow, it is Friday!"



The immense spike of interest in this otherwise unremarkable video begs for an explanation. Needless to say, I enjoy reading the attempts to explain the phenomenon more than the video itself. From one example:



Rob: I like the song too, but I don’t find that embarrassing. It feels like a confirmation of the suspicion that the best pop music must aspire to a formal purity that comes at the expense of content. The best pop songs are the emptiest. At that point, pop music has nothing to do with subjectivity or identity construction: You don’t become empty when you hear it; instead you have your own fullness confirmed.



And later:



To shift the terminology, I think we’ve been in post-Fordist relations of pop-culture production for some time now, with consumers driving the innovations in meaning that culture-industry firms then harvest and exploit. They increasingly supply the playground itself rather than the specific jungle gyms. No one owns the malleable, mutable meanings of pop culture, but the process and the medium for those transmutations is definitely owned. This is the essence of what Jodi Dean calls communicative capitalism.



Next: 1,000 words on the great existential crisis of our time: which seat can I take?


What happened to liberal ed?

Why are fewer and fewer students interested in liberal arts and more focused on professional degrees? Dan Edelstein's theory:



The only problem with this logic is that universities in fact bear a considerable responsibility for the brain drain away from the humanities. By raising the cost of education to stratospheric levels, we oblige students to seek a higher return on their investment. It is this sort of economic calculation, I suggest, and not some alleged generational change, that is driving students in droves towards preprofessional degrees.



He forecasts a dire future for the humanities.



Until the tuition imbalance stabilizes – and eventually Congress may well intervene to ensure that it does – humanities departments need to act more aggressively to ensure their survival. Increasing the turnout of majors may be beyond our reach, but we perhaps need to rethink the relationship between research and teaching. Do highly specialized courses offered by individual departments provide the best kind of background in the humanities for students headed for careers in law, engineering, finance, or science? Or do we need to offer more cross-disciplinary courses, ideally team-taught by faculty from different departments, on core questions and topics in the humanities? The bulk of our teaching is geared toward majors and graduate students. If we do not want to be the victims of the next recession (or, if it lasts long enough, the current one), we also need to target those students who feel they no longer have the luxury of specializing in a humanistic subject.



If I were to choose one subject to lead this movement toward more practical instruction, it would be writing. Non-English majors are often forced to take some eclectic literature course when a more focused class on writing well would prove far more practical over the course of their lives. I'm amazed at how few of the people I encounter in the professional world can write with clarity and command.


If we look at the Internet space, the world seems ready for a new and more focused educational paradigm for the modern age, or at least an alternative to the traditional educational roadmap in the U.S.


RELATED: It's been clear for some time now that the tech sector is in the midst of a huge software engineer shortage. This weekend the NYTimes wrote about the wide variety of perks companies are dangling not only to hire new employees but to keep the ones they have from bolting.


In a market like this, it wouldn't surprise me at all if a company like Google started its own alternative educational institution to train software engineers from an even earlier age (high school, perhaps), in exchange for a first look at candidates or simply to expand the pool of candidates in general. Just as in sports, the cheapest candidates are those fresh out of school, before their salaries correct over time to match their contributions. Given how much of their work can be leveraged across a global audience now, the return on investment from a software engineer is massive.


Supercharging your memory

Joshua Foer's NYTimes account of becoming a world-class memory athlete seems like perfect fodder for a short and focused documentary, like Spellbound or Word Wars. It's not surprising that Foer received a huge advance to turn this into a book, Moonwalking with Einstein: The Art and Science of Remembering Everything.


Two ideas about memory really struck me. One is that memorizing is an act of creation (emphasis below is mine).



What distinguishes a great mnemonist, I learned, is the ability to create lavish images on the fly, to paint in the mind a scene so unlike any other it cannot be forgotten. And to do it quickly. Many competitive mnemonists argue that their skills are less a feat of memory than of creativity. The point of memory techniques is to take the kinds of memories our brains aren’t that good at holding onto and transform them into the kinds of memories our brains were built for. For example, one of the most popular techniques used to memorize playing cards involves associating every card with an image of a celebrity performing some sort of a ludicrous — and therefore memorable — action on a mundane object. When it comes time to remember the order of a series of cards, those memorized images are shuffled and recombined to form new and unforgettable scenes in the mind’s eye. Using this technique, Ed Cooke showed me how an entire deck can be quickly transformed into a comically surreal, and unforgettable, memory palace.
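
The card technique Foer describes, often called a person-action-object (PAO) system, is easy to see in miniature. Here is a toy sketch in Python; the card codes and the celebrity triples are invented for illustration, not any competitor's actual pegs:

```python
# Toy person-action-object (PAO) table. These triples are invented
# examples; a real system assigns one triple to each of the 52 cards.
PAO = {
    "AS": ("Einstein", "moonwalking across", "a blackboard"),
    "KH": ("Elvis", "strumming", "a surfboard"),
    "7D": ("Serena Williams", "smashing", "a teapot"),
}

def encode(cards):
    """Fold each run of three cards into one surreal scene: the person
    from the first card performs the action of the second on the object
    of the third. A 52-card deck compresses to about 17 vivid scenes
    instead of 52 dry symbols (leftover cards would get single images)."""
    scenes = []
    for i in range(0, len(cards) - 2, 3):
        person = PAO[cards[i]][0]
        action = PAO[cards[i + 1]][1]
        thing = PAO[cards[i + 2]][2]
        scenes.append(f"{person} {action} {thing}")
    return scenes

print(encode(["KH", "AS", "7D"]))  # ['Elvis moonwalking across a teapot']
```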



The second is the idea of memory as not just an act of creation but as one of creating a physical space, like being an architect.



Memory palaces don’t have to be palatial — or even actual buildings. They can be routes through a town or signs of the zodiac or even mythical creatures. They can be big or small, indoors or outdoors, real or imaginary, so long as they are intimately familiar. The four-time U.S. memory champion Scott Hagwood uses luxury homes featured in Architectural Digest to store his memories. Dr. Yip Swee Chooi, the effervescent Malaysian memory champ, used his own body parts to help him memorize the entire 57,000-word Oxford English-Chinese dictionary. In the 15th century, an Italian jurist named Peter of Ravenna is said to have used thousands of memory palaces to store quotations on every important subject, classified alphabetically. When he wished to expound on a given topic, he simply reached into the relevant chamber and pulled out the source.
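
Stripped down to its data structure, a memory palace is an ordered walk through familiar locations with one vivid image parked at each stop; recall is just re-walking the route. A minimal sketch, with an invented route and shopping-list items:

```python
# An invented route through a familiar space; each locus holds one item.
route = ["front door", "hallway mirror", "kitchen table", "back porch"]

def store(route, items):
    """Park each item at the next locus along the walk."""
    return list(zip(route, items))

def recall(palace):
    """Recall is re-walking the route in order and reading off the images."""
    return [item for locus, item in palace]

palace = store(route, ["milk", "stamps", "passport"])
print(recall(palace))  # ['milk', 'stamps', 'passport']
```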



One concept with applications beyond just training your memory is the idea of O.K. plateaus.



Psychologists used to think that O.K. plateaus marked the upper bounds of innate ability. In his 1869 book “Hereditary Genius”…

Kasparov on Watson

[sorry for these untimely posts...I'm trying to clear out a few half-completed drafts]


The Atlantic pinged Garry Kasparov for his thoughts on the Jeopardy! victory by IBM's Watson.



A convincing victory under strict parameters, and if we stay within those limits Watson can be seen as an incremental advance in how well machines understand human language. But if you put the questions from the show into Google, you also get good answers, even better ones if you simplify the questions. To me, this means Watson is doing a good job of breaking language down into points of data it can mine very quickly, and that it does it slightly better than Google does against the entire Internet.



The analogy to a human using Google is a useful one. If you had infinite lifelines on Who Wants to Be a Millionaire and could call a friend who could Google for answers for you, would you always win a million dollars? Maybe not always, but fairly close. So the challenge for IBM was to figure out how to parse the Jeopardy clues into the right parameters to generate the proper query. Then Watson, like Google, had to use some algorithms for ranking the relevancy of various results.



My concern about its utility, and I read they would like it to answer medical questions, is that Watson's performance reminded me of chess computers. They play fantastically well in maybe 90% of positions, but there is a selection of positions they do not understand at all. Worse, by definition they do not understand what they do not understand and so cannot avoid them. A strong human Jeopardy! player, or a human doctor, may get the answer wrong, but he is unlikely to make a huge blunder or category error—at least not without being aware of his own doubts. We are also good at judging our own level of certainty. A computer can simulate this by an artificial confidence measurement, but I would not like to be the patient who discovers the medical equivalent of answering "Toronto" in the "US Cities" category, as Watson did.



Kasparov gives humans credit for knowing what they don't know, but in plenty of cases people are just as overconfident and blinded in their opinions. In fact, Watson had low confidence in its answer of Toronto in the US Cities category on Final Jeopardy, so it did have an "awareness" of its own uncertainty. I'm almost certain that the programmers ensured that Watson would make a guess in Final Jeopardy regardless of its confidence since there's nothing to lose at that point. In Single or Double Jeopardy, though, Watson wouldn't buzz in unless its confidence in an answer exceeded a threshold.
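
The buzz-in behavior described above amounts to a simple decision rule. Here's a minimal sketch; the threshold value and function name are my own invention, not IBM's actual implementation, which surely weighed confidence against game state and the scores:

```python
# Hypothetical reconstruction of the buzz/answer rule; the 0.5 threshold
# and the function name are invented for illustration.
BUZZ_THRESHOLD = 0.5

def should_answer(confidence, final_jeopardy=False):
    """In regular rounds, stay silent unless confidence clears the bar,
    since a wrong buzz costs money. In Final Jeopardy every player must
    answer anyway, so even a low-confidence guess beats silence."""
    if final_jeopardy:
        return True
    return confidence >= BUZZ_THRESHOLD

print(should_answer(0.14, final_jeopardy=True))   # True: guess anyway
print(should_answer(0.14))                        # False: don't buzz in
```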


Humorous aside on Watson: how the computers could have beaten Watson on Jeopardy.


Beijing

China is well known for its blatant knockoffs, and Baidu's online maps are a dead ringer for Google Maps.


But Baidu did manage to put a personal touch on its equivalent of "street view" for Beijing...


[image: Baidu Maps' pixel-art street view of Beijing]

Since it's pixel art, you can zoom in really far.


[image: zoomed-in detail of Baidu's pixel-art street view]


All that's missing from Google Street View is some random pixel people caught in action.


It's like Street View deconstructed by China using one of its unique assets, namely a ton of excess labor.

Scientology

As if writing an exposé on Scientology weren't hard enough, imagine the fact-checking. Lawrence Wright told NPR that "The New Yorker assigned five fact checkers to the story and sent the Church of Scientology 971 fact-checking queries before publication."


The following segment from the NPR story sounded familiar:



Wright says that one of the most interesting parts of the meeting came when he asked Davis about L. Ron Hubbard's medical records. Hubbard, the founder of Scientology, had maintained that he was blind and a 'hopeless cripple' at the end of World War II — and that he had healed himself through measures that later became the basis of Dianetics, the 1950 book that became the basis for Scientology.


"I had found evidence that Hubbard was never actually injured during the war. ... And so we pressed [Tommy Davis] for evidence that there had been such injuries and [Hubbard] had been the war hero that he described," says Wright. "Eventually, Davis sent us what is called a notice of separation — essentially discharge papers from World War II — along with some photographs of all of these medals that [Hubbard] had won. ... At the same time, we finally gained access to Hubbard's entire World War II records [through a request to the military archives] and there was no evidence that he had ever been wounded in battle or distinguished himself in any way during the war. We also found another notice of separation which was strikingly different than the one that the church had provided."


Furthermore, says Wright, the notice of separation that the church provided was signed by a man who never existed. And two of the medals that Hubbard supposedly had won weren't commissioned until after Hubbard left active service.



L. Ron Hubbard as Don Draper?


Related: Paul Thomas Anderson's on-again, off-again movie about Scientology is now on again. Maybe. Thanks to Larry Ellison's daughter?


Miscellany

Yeah, um, this is why I don't drive a Ferrari.


***


Garry Kasparov is known mostly for being a great chess player, but I'm impressed with his writing ability. I don't know enough about chess to characterize his playing style, but there is a precise, clinical objectivity to his writing that feels like it might arise from a mind optimized for a game like chess.


This review of Endgame: Bobby Fischer's Remarkable Rise and Fall - from America's Brightest Prodigy to the Edge of Madness in The New York Review of Books is a case in point.



In his play, Fischer was amazingly objective, long before computers stripped away so many of the dogmas and assumptions humans have used to navigate the game for centuries. Positions that had been long considered inferior were revitalized by Fischer’s ability to look at everything afresh. His concrete methods challenged basic precepts, such as the one that the stronger side should keep attacking the forces on the board. Fischer showed that simplification—the reduction of forces through exchanges—was often the strongest path as long as activity was maintained. The great Cuban José Capablanca had played this way half a century earlier, but Fischer’s modern interpretation of “victory through clarity”…