From this Jennifer Egan interview at Days of Yore:
A: It’s a very trusting environment, but also a very rigorous environment. Because you want to know that everyone is on your side, but if they just tell you it’s great, they’re not doing you any favors. That part about everyone being on your side is really critical too. There’s nothing worse than not knowing whether their criticism is motivated by some sort of internal or external wish to undermine or whether it is valid.
Q: But it can be hard in, say, a writing workshop, to shut out the choir of voices and hear your own voice.
A: Yes. But what I lose by not listening is much greater than what I lose by listening to bad advice. Because I think I can sort of sort through with my gut what is useful and what is not useful. Whereas if I hear nothing, I know vividly what results. I am never going to let that happen again.
I think people feel somehow that they can be hurt by hearing the wrong thing. I am not convinced that is true. We might get our feelings hurt, but I don’t think there is any actual damage done. What’s bad falls away.
One thing I often say to students is, “I am not interested in hearing solutions.
It's worth noting, in that regard, that The New Yorker, which published some of his best fiction, never did any of his nonfiction. No shame to Wallace or The New Yorker, it's simply a technically interesting fact: He couldn't have changed his voice to suit the magazine's famous house style. The "plain style" is about erasing yourself as a writer and laying claim to a kind of invisible narrative authority, the idea being that the writer's mind and personality are manifest in every line, without the vulgarity of having to tell the reader it's happening. But Wallace's relentlessly first-person strategies didn't proceed from narcissism, far from it—they were signs of philosophical stubbornness. (His father, a professional philosopher, studied with Wittgenstein's last assistant; Wallace himself as an undergraduate made an actual intervening contribution—recently published as Fate, Time, and Language—to the debate over free will.) He looked at the plain style and saw that the impetus of it, in the end, is to sell the reader something. Not in a crass sense, but in a rhetorical sense. The well-tempered magazine feature, for all its pleasures, is a kind of fascist wedge that seeks to make you forget its problems, half-truths, and arbitrary decisions, and swallow its nonexistent imprimatur. Wallace could never exempt himself or his reporting from the range of things that would be subject to scrutiny.
I am among the many for whom David Foster Wallace, or DFW as he's commonly known, is the most influential writer of this generation, and Sullivan's article articulates much of why that is. I'd add that beyond his prodigious talent, Wallace's writing evinced a curious but tolerant worldview that seems all too rare in this more cynical age, with its tendency toward snap and shallow judgment.
I enjoyed the reference to the New Yorker house style, which Tom Wolfe (according to a citation on Wikipedia) once described as follows (emphasis mine): "The New Yorker style was one of leisurely meandering understatement, droll when in the humorous mode, tautological and litotical when in the serious mode, constantly amplified, qualified, adumbrated upon, nuanced and renuanced, until the magazine’s pale-gray pages became High Baroque triumphs of the relative clause and appository modifier."
Google’s strength, he continues, was to recognise back in 2001 that “we would be handling massive amounts of data, and would need to develop tools for that.
Joshua Foer's account of becoming a world-class memory athlete in the NYTimes seems like the perfect fodder for a short and focused documentary, like Spellbound or Word Wars. It's not surprising that Foer received a huge advance to turn this into a book, Moonwalking with Einstein: The Art and Science of Remembering Everything.
Two ideas about memory really struck me. One is that memorizing is an act of creation (emphasis below is mine).
What distinguishes a great mnemonist, I learned, is the ability to create lavish images on the fly, to paint in the mind a scene so unlike any other it cannot be forgotten. And to do it quickly. Many competitive mnemonists argue that their skills are less a feat of memory than of creativity. The point of memory techniques is to take the kinds of memories our brains aren’t that good at holding onto and transform them into the kinds of memories our brains were built for. For example, one of the most popular techniques used to memorize playing cards involves associating every card with an image of a celebrity performing some sort of a ludicrous — and therefore memorable — action on a mundane object. When it comes time to remember the order of a series of cards, those memorized images are shuffled and recombined to form new and unforgettable scenes in the mind’s eye. Using this technique, Ed Cooke showed me how an entire deck can be quickly transformed into a comically surreal, and unforgettable, memory palace.
The second is the idea of memory as not just an act of creation but as one of creating a physical space, like being an architect.
Memory palaces don’t have to be palatial — or even actual buildings. They can be routes through a town or signs of the zodiac or even mythical creatures. They can be big or small, indoors or outdoors, real or imaginary, so long as they are intimately familiar. The four-time U.S. memory champion Scott Hagwood uses luxury homes featured in Architectural Digest to store his memories. Dr. Yip Swee Chooi, the effervescent Malaysian memory champ, used his own body parts to help him memorize the entire 57,000-word Oxford English-Chinese dictionary. In the 15th century, an Italian jurist named Peter of Ravenna is said to have used thousands of memory palaces to store quotations on every important subject, classified alphabetically. When he wished to expound on a given topic, he simply reached into the relevant chamber and pulled out the source.
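The card-memorization technique described in the excerpt above amounts to a one-to-one mapping between cards and vivid scenes. Here's a toy sketch of that idea in Python; the celebrity/action pairings are invented for illustration, and real mnemonists build their own highly personal tables:

```python
# Toy illustration of the image-association technique for cards.
# Each card maps to a unique, hard-to-forget scene.
CARD_IMAGES = {
    "King of Hearts": "Elvis strumming a banjo on a toaster",
    "Queen of Spades": "Madonna juggling teacups on a lawnmower",
    "Seven of Clubs": "Einstein moonwalking across a pool table",
    # ... one entry per card in a full 52-card table
}

def memorize(cards):
    """Turn an ordered list of cards into an ordered list of scenes.

    Because each image is unique to one card, recalling the sequence
    of scenes recovers the sequence of cards.
    """
    return [CARD_IMAGES[card] for card in cards]

def recall(scenes):
    """Invert the mapping: from remembered scenes back to cards."""
    reverse = {image: card for card, image in CARD_IMAGES.items()}
    return [reverse[scene] for scene in scenes]

deck_sample = ["Seven of Clubs", "King of Hearts"]
scenes = memorize(deck_sample)
assert recall(scenes) == deck_sample
```

The design point is the round trip: the hard-to-remember sequence (arbitrary cards) is carried by an easy-to-remember one (surreal scenes), exactly the transformation Foer describes.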
One concept which has applications beyond just training your memory is the idea of O.K. plateaus.
Psychologists used to think that O.K. plateaus marked the upper bounds of innate ability. In his 1869 book “Hereditary Genius,
Yeah, um, this is why I don't drive a Ferrari.
Garry Kasparov is known mostly for being a great chess player, but I'm impressed with his writing ability. I don't know enough about chess to characterize his playing style, but there is a precise and clinical objectivity to his writing that feels like it might arise from a mind optimized for a game like chess.
This review of Endgame: Bobby Fischer's Remarkable Rise and Fall - from America's Brightest Prodigy to the Edge of Madness in The New York Review of Books is a case in point.
In his play, Fischer was amazingly objective, long before computers stripped away so many of the dogmas and assumptions humans have used to navigate the game for centuries. Positions that had been long considered inferior were revitalized by Fischer’s ability to look at everything afresh. His concrete methods challenged basic precepts, such as the one that the stronger side should keep attacking the forces on the board. Fischer showed that simplification—the reduction of forces through exchanges—was often the strongest path as long as activity was maintained. The great Cuban José Capablanca had played this way half a century earlier, but Fischer’s modern interpretation of “victory through clarity
Pauline Kael is getting canonized by the Library of Congress. A lot of her work is out of print, though I've tried to track down as much of it as possible. I'm curious to see what the contents of this collection will be and how much of it is rescued from the obscurity of "out of print." It's something to tide us over until someday when we can just buy the rights to her complete writings.
In a digital age, the idea of some writing being "out of print" sounds like an analog artifact.
Global warming is the environmental movement's pinup (though it faces perhaps its own crisis of coverage fatigue and noise that is a critical drag on momentum*). One ever growing and related crisis that may be underestimated by the public is the global food crisis. The issues are related because whether it's wind farms or a roast chicken, it's all about energy: "The core of the climate problem lies in the reality that the world doesn’t have the energy options it needs for a smooth ride toward roughly 9 billion people by mid-century, all seeking decent lives."
Tonight, there will be 219,000 additional mouths to feed at the dinner table, and many of them will be greeted with empty plates. Another 219,000 will join us tomorrow night. At some point, this relentless growth begins to tax both the skills of farmers and the limits of the earth's land and water resources.
Beyond population growth, there are now some 3 billion people moving up the food chain, eating greater quantities of grain-intensive livestock and poultry products. The rise in meat, milk, and egg consumption in fast-growing developing countries has no precedent. Total meat consumption in China today is already nearly double that in the United States.
The third major source of demand growth is the use of crops to produce fuel for cars. In the United States, which harvested 416 million tons of grain in 2009, 119 million tons went to ethanol distilleries to produce fuel for cars. That's enough to feed 350 million people for a year. The massive U.S. investment in ethanol distilleries sets the stage for direct competition between cars and people for the world grain harvest. In Europe, where much of the auto fleet runs on diesel fuel, there is growing demand for plant-based diesel oil, principally from rapeseed and palm oil. This demand for oil-bearing crops is not only reducing the land available to produce food crops in Europe, it is also driving the clearing of rainforests in Indonesia and Malaysia for palm oil plantations.
The combined effect of these three growing demands is stunning: a doubling in the annual growth in world grain consumption from an average of 21 million tons per year in 1990-2005 to 41 million tons per year in 2005-2010. Most of this huge jump is attributable to the orgy of investment in ethanol distilleries in the United States in 2006-2008.
The underexposure of this crisis is one reason I was so excited that the most acclaimed science fiction novel of last year centered on a food shortage. I've just started Paolo Bacigalupi's The Windup Girl, 2010's Hugo Award winner for Best Novel (tied with China Miéville's The City & the City), and so far it's as original as advertised.
A socially valuable role for art is to present critical issues to people in the most emotionally compelling manner, and that's one reason the success of this novel is exciting. While I'm skeptical anytime Hollywood tries to adapt one of my favorite books, this might be an exception. We may need to push the food crisis through every available medium to mobilize the world so we can avoid a soylent green future.
*FOOTNOTE: In the NYTimes article "Climate News Snooze," Randy Olson is quoted:
Perhaps it’s just my bias as a communicator, but I think this is THE most important variable of the future. Things can be hot, flat, and crowded, yet still civil if there is effective leadership AND the people are able to hear the voices who know how to lead. But try starting a food fight in a crowded, NOISY lunch room and see what happens. Pretty hard to impose order.
I have a feeling that Al Gore would agree with this speculation. He tried to lead, but got shouted down by an already-noisy society. There is a coming “dark ages
I read Gordon Grice's The Red Hourglass over holiday break, and one question it left lodged in my head was what insect was the "cricket-beast" Grice wrote about in the mantid chapter. Grice placed this giant bug, which he found in his driveway, into a jar with a mantid (praying mantis) and the cricket-beast devoured the mantid with ease.
I owned a pet praying mantis in high school. While I was touring Japan with a youth symphony, the young son of the host family I stayed with gave me a praying mantis he grabbed out of a tree while we were sightseeing. No doubt in violation of some U.S. tourism laws, I brought the mantis back to the U.S., keeping it in my shirt pocket the whole flight home.
I kept it in a jar, and each day I'd catch some different insects to toss into the jar as an experiment. I didn't know what praying mantises ate, but given the famous configuration of their two barbed front legs, I was fairly certain they weren't vegetarians.
What I discovered was that the mantis had a wide and worldly palate. Crickets, moths, wolf spiders, grasshoppers, flies--the mantis ate all of them whole. In just two weeks, my mantis had eaten enough that it molted and emerged even larger than before. I was enthralled by the glorious violence of its feeding.
So when Grice wrote about a cricket-like beast that caused the mantis to retreat in fear and that mauled said mantis with a casual efficiency, I was intrigued.
Sometimes it seems like the internet was invented for such questions. On his blog, Gordon Grice reveals the identity of the cricket-beast.
Incidentally, doesn't it feel like a huge miss that this annotation can't be added to the Kindle version of the book for future readers?
Steven Berlin Johnson adapts a portion of his upcoming book Where Good Ideas Come From for this essay in the WSJ.
The premise that innovation prospers when ideas can serendipitously connect and recombine with other ideas may seem logical enough, but the strange fact is that a great deal of the past two centuries of legal and folk wisdom about innovation has pursued the exact opposite argument, building walls between ideas. Ironically, those walls have been erected with the explicit aim of encouraging innovation. They go by many names: intellectual property, trade secrets, proprietary technology, top-secret R&D labs. But they share a founding assumption: that in the long run, innovation will increase if you put restrictions on the spread of new ideas, because those restrictions will allow the creators to collect large financial rewards from their inventions. And those rewards will then attract other innovators to follow in their path.
The problem with these closed environments is that they make it more difficult to explore the adjacent possible, because they reduce the overall network of minds that can potentially engage with a problem, and they reduce the unplanned collisions between ideas originating in different fields.
At one time in the history of the language “glamour
The recent Spotify update added some sweet-looking social features. I say they look impressive because Spotify has yet to release its service in the U.S. With at least four major music labels to negotiate with to get a critical mass of tracks, the woods are thorny indeed, but if they manage to clear that significant hurdle and roll out the following feature set, I'd be ready, at long last, to switch to a subscription service over the model of buying and owning my own music:
Sasha Frere-Jones wrote about the shift of online music to the cloud in a recent issue of The New Yorker. He mentions the usual players (Pandora, MOG, Spotify) and concludes that the age of the computer DJ is upon us.
Similarly, the anonymous programmers who write the algorithms that control the series of songs in these streaming services may end up having a huge effect on the way that people think of musical narrative—what follows what, and who sounds best with whom. Sometimes we will be the d.j.s, and sometimes the machines will be, and we may be surprised by which we prefer.
I think he's partially right. DJ HAL is doing a good job (you can throw Apple's Genius in with those other services), but I still suspect that what Spotify, and the iTunes cloud-based subscription service I'm sure is coming, will facilitate is the sharing of playlists and discovery among humans. I enjoy MP3 blogs, but I'd much rather follow the lead of musical tastemakers directly through the same applications I use to listen to music than read their blogs, track down the music they reference, and then spin it into playlists in iTunes to transfer to my iPod.
Current bandwidths for WiFi and 3G are sufficient to stream music to my iPhone. I'm ready for a cloud-based music subscription service that adds a follower-based social layer (where you can find good tastemakers and choose to follow them even if they don't care to follow you). Such a service is dynamic and ideally improves and changes every time you visit it.
I'm ready for this same revolution to occur in books, too, and with Amazon's latest Kindle app, we're just starting to see the first pebbles of the avalanche skipping by our ankles.
Recently I read David Lipsky's Although Of Course You End Up Becoming Yourself: A Road Trip with David Foster Wallace on my iPad through the Amazon Kindle app. As I was reading, I noticed some passages had been underlined already. When I clicked on the underlined passage, a box would pop up noting "57 other readers have highlighted this passage".
What was frustrating about the battle between Amazon and publishers over digital book pricing was that no one was talking about how to enhance the value of the digital book by capitalizing on what a digital, internet-connected book delivery service could provide, and that is a social reading experience. Publishers were demanding that Amazon charge higher prices for Kindle editions of books, but not once did I read anyone saying how they might justify that price hike by creating something more valuable for the reader.
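A popular-highlights feature like the one described above is, at bottom, a simple aggregation over readers' annotations. Here's a minimal sketch of how it might work; the data model (highlights as character ranges) and the threshold are my assumptions, not Amazon's actual implementation:

```python
from collections import Counter

def popular_highlights(all_highlights, min_readers=3):
    """Return passages highlighted by at least `min_readers` readers.

    `all_highlights` maps a reader id to a list of (start, end)
    character ranges in the book. Identical ranges across readers
    are counted together.
    """
    counts = Counter()
    for ranges in all_highlights.values():
        # A reader counts once per passage, even if they re-highlight it.
        for passage in set(ranges):
            counts[passage] += 1
    return {p: n for p, n in counts.items() if n >= min_readers}

highlights = {
    "reader_a": [(100, 180), (900, 950)],
    "reader_b": [(100, 180)],
    "reader_c": [(100, 180), (400, 420)],
}
print(popular_highlights(highlights))  # {(100, 180): 3}
```

The interesting product questions live above this function: whether highlights are anonymous or attributed, and whether you can scope the aggregation to people you follow rather than all readers.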
In college, I hated buying used copies of textbooks, despite the significant price savings, because a book that was marked up and highlighted violated some aesthetic sensibility, especially if the previous owner had highlighted passages I didn't consider important.
But with the Kindle, you can enable highlights and notes to be turned on selectively. To pivot off of David Foster Wallace for a moment, recently the University of Texas acquired the David Foster Wallace archive. DFW was a voracious reader, and besides drafts of his writing the archive contains actual books from his personal collection.
There are also some two hundred books from Wallace’s own library. “Virtually all of the books are annotated, many are heavily annotated,
Atul Gawande gave the commencement speech at Stanford Med School this year. Long-time readers know I am programmed to read everything he writes (The New Yorker really has a murderer's row of regular contributors). His talk hit on many topics he's written or spoken about recently, including health care costs and the complexity of the health profession. The latter was the focus of his latest book, The Checklist Manifesto, which I read earlier this year. Its thesis: using a simple checklist is one of the most effective ways of coping with the complexity of so many modern challenges.
It sounds almost too mundane a topic for a book, even as slim as it is, but when the costs of a misstep are as high as they are in medicine, it seems negligent to ignore the possibilities. From his commencement speech...
Having great components is not enough. We’ve been obsessed in medicine with having the best drugs, the best devices, the best specialists—but we’ve paid little attention to how to make them fit together well. Don Berwick, of the Institute for Healthcare Improvement, has noted how wrongheaded this is. “Anyone who understands systems will know immediately that optimizing parts is not a good route to system excellence,
I would apologize for my lack of posts recently, but I'm not entirely sure who expects my writing here to rank higher on the stack than work and other personal obligations. Maybe I'm apologizing mostly to myself. I promise I have a pile of half-finished drafts, so the good intentions are there.
My posting infrequency is related to this post in that I do tend to rewrite longer form posts that land here. That's in contrast to the less filtered copy that flows through to my Twitter account, though even there I am sensitive to flooding my followers with too much personal ephemera.
Rewriting is an underrated skill in this new age of instant publishing. Most of us have been our own copy editors for years, but with the web disseminating writing further afield, any laziness on that front pollutes a far wider readership.
That's one reason I find this photo so heartwarming. If you don't recognize it, this is Obama's speech on health care reform to a Joint Session of Congress. The photo is even more fascinating blown up large so you can read the individual edits that I presume Obama sent to speechwriter Jon Favreau. It's a revealing insight into Obama's communications strategy when you see him replacing "compassion" with "concern and regard for the plight of others" or replacing "character of this country" with "American character." He has a knack for verbal pacing and poetic turns of phrase, as when he flips "This has always been our history" to "This has always been the history of our progress."
If there was any lingering doubt about Obama's writing skills despite the two polished books to his credit, this should serve as an adequate response, though I still hear silly teleprompter chatter from the peanut gallery. Any real writer will tell you how much of writing is actually rewriting, and how much of growing as a writer is a willingness to abandon, at times, entire days' worth of work once you've cut the emotional umbilical cord and can regard the work with the sage objectivity of a copy editor.
In contrast, I offer you Sarah Palin's Twitter stream. Some of this is the medium and 140 character limit, to be sure, but the prose style is that of a teenage girl. It's not as if others aren't working under the same constraints.
Robert Lane Greene lays the "correct" usage of "begging the question" to rest, albeit reluctantly.
Rather than concede that the correct usage is no longer useful, I only concede that the incorrect usage has now become an acceptable alternative. The original meaning still has value, as Greene notes, in pointing out a fundamental logical fallacy.
The Atlantic interviewed Paul Theroux about "Fiction in the Age of E-Books." I liked the concluding exchange.
TA: The inevitable question: What’s your advice for a young person who wants to grow up to become a fiction writer?
PT: Notice how many of the Olympic athletes effusively thanked their mothers for their success? “She drove me to my practice at four in the morning,
Another older post I've just left hanging out there forever...
The joy of having your first novel reviewed by the New York Times Book Review quickly turns to horror when it turns out to be a succinct dismissal. Ronlyn Domingue writes about what that feels like.
Although the advice to have a thick skin was well-meant, it is emotionally dishonest. Sharing one’s writing is a naked act not intended for the meek. Harsh words can—and sometimes do—undermine the most confident, successful writers. It’s human. It’s okay. It will pass. Now, my guidance to myself, and others, is to have a permeable skin, one that doesn’t resist or trap the good or the bad. Reviews, critiques, comments come in, then move on. Then there’s space, inside and out, for something new.
Every artist experiences the little deaths that come with work in a creative field. In fiction writing seminars in college, every story you wrote would be read out loud, and then the others in the class would take turns offering their critiques. In film school, the same was done for our scripts, rough cuts, fine edits, final works.
Professors always counsel everyone to be civil with their criticism, to keep it about the work and not the person, but I suspect it's impossible to ever accept even the most even-tempered of criticism of one's work without suffering the smallest of deaths (the French use la petite mort in another sense, of course, but it's always felt more accurate here).
But even if your classmates and peers are respectful and professional, and for the most part I'd say my creative writing and film school peers were very much so, at some point if you're to succeed in your field you'll have to put your work out there for an audience that isn't in the same room with you, that isn't operating under the potential collateral damage of your potential subsequent feedback on their work. Then the gloves come off.
The internet has only accelerated that. It's given everyone a megaphone, and even if they're shouting into the wind (2 followers, one his mother, the other is Candy327, 5 tweets), Google or Twitter is saving their shouts for you to summon with a few mouse clicks. Before the internet came along, the cliche that "everybody's a critic" may have been true, but for the first time we can hear them all at once, all the time, one massive and stern Greek chorus of disapproval.
But whereas the chorus in a Greek tragedy at least spoke in meter, with a certain poetic eloquence, the anonymity of the web has reduced us to our most savage and bitter. We are all cavemen, all id. Civil debate and discourse isn't the norm in any large and open community online. 4Chan bullies prowl the hallways of the web like the high school thugs every awkward teenager dreads running into.
As a creator, you have to balance receptivity to criticism with the conviction of your creative choices. It's not easy withstanding the constant, withering glare of a million critics, but just in taking those steps to cross over from the darkness of the peanut gallery to the bright lights of center stage you've set yourself apart.
As for the millions of judges out there, I urge you, the next time you go to murder someone's book with your poison pen, try to write a book yourself. The next time you leave a movie theater ready to dismiss what you watched for two hours, try to direct your own short movie. What the world needs is not more judges. As the old saying goes, everybody's a critic.
It's on!? But is it an adaptation of the book or the movie? They are quite different in key ways. Most people I know who love it have only seen the movie. I saw the movie first but read the book later during my backpacking trip through South America in 2003. The book is much more graphic than the movie; I imagine the musical will be even more sanitized than the movie was but will still draw adoring crowds of bankers who fail to see the satire and clamor to look into the mirror it holds up to their lifestyle.
It has been too long since I've read the book, so I don't recall which scenes from the movie were lifted straight from the book, but I can't help but picture one of the musical numbers being a trio sung by Patrick Bateman and the two prostitutes he's paid to participate in a 3-way.
Last night's opening segment of The Daily Show with Jon Stewart cracked me up and puts the revelations from Game Change in perspective, though I'm still going to read the crap out of it. It's difficult to tell how readers are receiving it as the reviews for the book on Amazon are skewed by dozens of 1-star reviews from users who haven't read the book but are angry that a Kindle version wasn't issued. Amazon does show when a user was a verified purchaser of a book; it would be useful someday if they could allow you to see only the average rating and reviews from that subset of readers.
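The verified-purchaser filter suggested above is a trivial computation once you have the review data. Here's a sketch over a hypothetical list of review records (Amazon exposes no such API or feature; the field names are my invention):

```python
def verified_average(reviews):
    """Average star rating counting only verified purchasers.

    Each review is a dict with a `stars` rating and a `verified`
    flag. Returns None if no verified reviews exist.
    """
    verified = [r["stars"] for r in reviews if r["verified"]]
    if not verified:
        return None
    return sum(verified) / len(verified)

reviews = [
    {"stars": 1, "verified": False},  # Kindle-pricing protest, never read it
    {"stars": 1, "verified": False},
    {"stars": 4, "verified": True},
    {"stars": 5, "verified": True},
]
print(verified_average(reviews))  # 4.5
```

With data like this, the unverified 1-star protest votes drag the overall average down to 2.75 while the verified average sits at 4.5, which is exactly the distortion the filter would correct.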
Also, The Daily Show and The Colbert Report are up in widescreen on Hulu now. We had to work through that workflow with the Comedy Central folks, but we were able to retain captions in the widescreen files which was important for us.