Damning with painterly praise

That’s the kind of widget The Man From U.N.C.L.E. is: so good it’s practically defective.
 
This wasn’t something I wanted to see. The posters promise Armie Hammer and Henry Cavill, two actors who are like day-old bread. You practically have to give them away. They look like they’ve been attacked by a stylist from the fall issue of any men’s magazine. Down at the bottom of the poster, standing in front of an Aston Martin and looking like a flight attendant vacationing in a Paul Bowles novel, is Alicia Vikander, a Swede who’ll be shoved in our faces until we love her. Also, and not for nothing: This is a remake of a spy show that ran for four seasons on NBC near the height of the Cold War, a film version of which has been failing to launch for decades. Exactly no one was asking for this.
 
So it’s a surprise to discover that the bar for this movie is low enough to conga under. Ritchie has gotten everyone to agree not to take any of this seriously, including the person responsible for keeping an eye on Hammer’s Russian accent.
 

Wesley Morris on The Man from U.N.C.L.E. Here he is, in the same article, on a movie I'd never heard of, Cop Car.

From the car emerges Kevin Bacon, with a graying, rusty mustache and nary a line of dialogue, looking every bit the hypothetical adult outcome of Sam Elliott’s decision to do in vitro with himself.
 

When Grantland first arrived on the scene, I could read every article published on the site. Today I can barely keep up with a fraction of what's there, but it's still a wellspring of great writing.

Unfortunately, I have a sinking feeling that despite its popularity, new media economics put Grantland in no man's land: not targeted enough to be a one-man niche, not large enough to collect enough tax revenue to survive as an independent country. In barbell economics, the one in the middle is left holding a lot of weight.

I take solace in the fact that most of the talent there will always find work.

The paradox of writing

Rebecca Mead with a beautiful piece on the movie The End of the Tour, based on the book Although Of Course You End Up Becoming Yourself, about a road trip writer David Lipsky took with David Foster Wallace when Lipsky was working on a profile for Rolling Stone. Emphasis mine; that last sentence is so gorgeous I can't stop reading it over and over.

The movie ends before the article can appear in Rolling Stone, so the relationship between Wallace and Lipsky that it represents is all preamble, no aftermath. And, in fact, the proposed article didn’t ever appear in Rolling Stone: according to Lipsky, in the afterword of his book, Wenner changed his mind about wanting it before it was even written. It was not until after Wallace’s suicide, in 2008, that Lipsky wrote up his notes into a long, award-winning article about the author; his book, which consists mostly of transcripts of their conversations, followed. (A meta-narrative of betrayal has, nonetheless, unfolded: David Foster Wallace’s widow and his estate have strenuously objected to the film, insisting that Wallace would never have wished the magazine interviews to be used this way.)
 
In “Although Of Course You End Up Becoming Yourself,” Lipsky writes that he was relieved by Wenner’s fiat that he shouldn’t write the piece, rather than experiencing it as a loss somewhere on the scale between devastating and irritating—the usual range of feelings available to a journalist upon having a piece killed. “I tried to write it, and kept imagining David reading it, and seeing through it, through me, and spotting some questionable stuff on the X-ray,” he writes. Lipsky was too lingeringly attached to the period of intimacy—of having momentarily befriended Wallace—to attain the necessary detachment to reshape that experience into a story. Given that, it’s probably just as well he didn’t have to write it; it wouldn’t have been a success. Any reporter may fleetingly fall in love with his or her subject during the process of researching a magazine profile—the singular dance chronicled by “The End of the Tour.” But for the work to be any good, the writer’s greatest libidinal pleasure must be discovered afterward: when the back-and-forth is over, and the recorder has stopped recording, and one is alone at the keyboard at last.
 

It's such a delicate balance. You need a very real and true interest in the subject to do it justice, and yet when you finally go to write the piece, you need the professionalism to retreat from your biases, desires, ego, your very self, and still do the subject justice.

It's such a tricky dance, and it's a balance I failed to find both in the NYTimes piece on Amazon this past Saturday and in many of the responses from current and former employees. I was all ready to contribute my thoughts on the controversy here; I have hundreds of words in draft form. But I decided to put them on ice for a few days, to see if I might achieve some zen-like distance from which to edit myself.

I've recently taken a few baby steps into meditation, and on a flight today I reread Marcus Aurelius' Meditations. It's a coincidence that both share the word meditation, but both have much to offer in finding a path to that productive and clear-headed place from which to write well. It's love that starts you in the right direction, but it doesn't get you all the way there.

Tablets are mostly for consumption

While there is nothing inherently wrong with a long upgrade cycle, as seen with the Mac, which continues to report solid sales momentum, the reasoning behind holding on to tablets for years is much more troubling. There are currently approximately 3 million units of the original iPad still in use, or 20% of the devices Apple sold. For the iPad 2, it is possible that close to 60% of the units Apple sold are still being used. These two devices are not superior tablets. The initial iPad lacks a camera, while the iPad 2 has a mediocre camera. When compared to the latest iPads, these first two iPads are simply inferior tablets with slow processors, heavy form factors, and inferior screens. But none of that matters with owners. This is problematic and quite concerning, suggesting that many of these tablets are just being used for basic consumption tasks like video and web surfing and not for the productivity and content creation tools that Apple has been marketing. 
 
There are signs that Apple believes there may be some kind of iPad revival around the corner. Since the average iPad upgrade cycle is three years and counting, does this mean that Apple may benefit from some sort of upgrade cycle? I'm skeptical.  Why would someone upgrade an iPad that is just being used to watch video?
 

From Neil Cybart at Above Avalon. iPad defenders used to push back anytime anyone said that the device was just for consumption, highlighting people who used iPads to write music, sketch, or edit. I was always skeptical that such use was common since I was using the iPad primarily to read email and books, browse the web, skim social network feeds, and watch video. Lacking a physical keyboard, it was cumbersome to type on, and typing is my primary creative activity. It was too large to carry with me everywhere, and as the iPhone grew larger, it offered most of what I needed from my iPad.

It turns out that's mostly what most people do on their iPads. I dig my iPad, don't get me wrong, but it is more of a “niche” product than the iPhone or even the MacBooks and MacBook Pros (I put niche in quotes because Apple sold 84 million of them in the first two years of its existence; that was a good-sized niche that Apple filled quickly). For me, the iPad is a bit of a luxury: it's lighter than my laptop and has superior battery life, and it has a larger screen than my iPhone but can be held with one hand most of the time. So it turns out to be a great device for using while lying in bed reading books or watching video. If I had to give up one of my three devices, though, no doubt the iPad would be the low device on that totem pole.

I agree with Cybart that the rumored iPad Pro at least offers a possibility of differentiation. The larger screen size alone may open up some use cases. I just don't know what those might be and whether they'll be compelling. Perhaps if it's called Pro, it's intended from the start for a niche audience, offering them a reason to upgrade. Cybart suggests a haptic keyboard could open up typing as a more popular mode of creation on iPads, but as I've never tried a haptic keyboard, I'm skeptical I'd enjoy it as much as a physical one. I'd love to be proven wrong, but I have a whole life's worth of typing on physical keyboards bearing witness against the defendant.

Another rumor has a Force Touch-compatible stylus shipping with the iPad Pro. That, along with the larger screen size, could open up some more illustration use cases, stealing share from Wacom tablets. Rumored split screen options open up multi-app interaction as a use case. Still, none of those sounds like a mass market, especially given what will likely be a higher price point for the Pro.

Ultimately, I'm not sure it matters if the iPad is a mass market device. If the new MacBook continues to evolve and steal share from iPads on one end while the iPhone steals share on the smaller screen size quadrant, Apple still captures the sale and the customer. I'm sure they wouldn't love to lose share to cheap Android tablets, but if that's where some of the market goes, I'm not sure Apple would care to follow with any sense of urgency.

Cuisine and Empire

Great episode of the podcast EconTalk featuring guest Rachel Laudan, author of Cuisine and Empire, an instant purchase for me. Host Russ Roberts has a fascinating diversity of guests on his show, but he always takes an interesting angle into the conversation, one driven largely, though not entirely, by economics.

Here are some of my favorite moments from the episode. First, on the history of the potato; think about how many ideas are packed into this short exchange.

Russ: And, just to stick with basics for a minute: At one point, quite surprising to me, quite late in the book, you mention the potato. I think of the potato as a very basic foodstuff. But you point out that the potato is a relatively late invention. Talk about its cultural significance and a little bit about its history. 
 
Guest: Well, the potato is one of a series of roots--roots in a culinary sense, that is, underground bits of plants that can be cooked into edible foods. They have--the roots have always been of less interest to civilized societies because they are so wet and heavy you cannot provision them [?] fit to use with roots. Now, the one exception or partial exception to this is the high Andes mountains where they did grow potatoes and use them from early on. But they developed an incredibly elaborate way of freeze drying them to make them light enough and storable enough to go into cities as well as combining them with maize, which by then was down there. So when the potato comes into Europe, it's an enormous cultural effort to integrate the potato into the European food system, because for anyone who lives in a settled society with cities, root-eating is a sign of basically being more like animals. Roots were animal food in Europe. And so basically the poor of Europe had to be bludgeoned into adopting the potato in the 17th and 18th century. 
 
Russ: It's a little hard to understand because I really love French fries, and it's hard to imagine how someone could resist this. But they didn't have French fries. Talk about what they had. 

Guest: Well, basically, fat is very expensive for most people. So French fries, until the 1960s, 1970s, well they weren't invented until the middle of the 19th century, late 19th century. But until the invention of frozen French fries in the 1960s and 1970s, French fries were for the elite. Only the richest people could afford the potatoes that were cooked in that much fat. And double-cooked in that fat--which is what you have to do for French fries. What you find in the 19th century, as fats become more available for a large bulk of the population is that potatoes become more acceptable. Because you can put butter on your boiled potatoes; you can layer potatoes with milk and cheese and make a gratin; you can bake them and add butter. And that fat makes them much, much more palatable. 

Russ: But the point you make in the book is that the potato that was first introduced--I think in the early 18th century-- 

Guest: Right. 

Russ: was bitter, and nothing like the Idaho baked potato that we might envision at a potato bar. 

Guest: No. I've been concentrating in talking to you on the cooking and processing side, but there was also this agricultural trick they had to pull off to turn a plant that lived 8,000, 10,000 feet in the Andes, where seasons are reversed from Northern Europe, into a plant that would grow successfully and be palatable in Europe and the United States. And that took 100 plus years. 

Russ: And that's true of a lot of the things that we eat, I assume. I assume that if we went back to the 15, 16, 1700s and looked at what they called a 'blank'--whatever blank is, we would find it almost unrecognizable and very unattractive. Is that fair? Or am I being too harsh? 

Guest: Yes. Very few fruits--there are a few: dates, grapes--are palatable [?] without breeding. But most fruits have been systematically bred over the centuries. Animals have been bred. Probably the only things that we regularly eat but taste as they would have done hundreds of years ago are fish of various kinds. But everything else is the result of human breeding. 

Russ: Yeah, the goal of fruit has been to make fruit more like an M&M, and it's working evidently. 

Guest: Exactly.
 

Many parallels to the invention and then diffusion of technology. First, it's available only to the aristocrats, the wealthy, and the elite. The cost of production comes down with scale, and then it's brought to the masses. Finally, the wealthy go in search of some other way to signal their status, to differentiate. It's one reason behind the rise of extravagant $250 prix fixe menus in which guests photograph each dish as if it were their newborn child.

Food has replaced music at the heart of the cultural conversation for so many, and I wonder if it's because food and dining still offer true scarcity, whereas music is so freely available everywhere that it's become a poor signaling mechanism for status and taste. If you've eaten at Noma, you've had an experience a very tiny fraction of the world will be lucky enough to have, whereas if you name any musical artist, I can likely find their music and be listening to it within a few mouse clicks. Legally, too, which removes even more of the cachet that came with illicit downloading, the thrill of being a digital bootlegger.

Once, it felt like watching music videos on MTV was a form of rebellion in plain sight. Nowadays, the channel doesn't play any music videos. Instead, we have dozens of food and cooking shows, even entire channels like The Food Network dedicated to the topic. Chefs have been elevated to the status of master craftsmen, with names that have risen above those of their restaurants, and diners revere someone like Jiro of Jiro Dreams of Sushi fame the way a previous generation worshipped the guitar sound of a rock god like Jimi Hendrix.

The food scene today offers a seemingly never-ending supply of scarce experiences, ingredients, and dishes. Cronuts you have to wait in line for a few hours to get your hands on. Pop-up restaurants that serve only a few nights a week for a few weeks, then disappear forever. Restaurants that you have to sacrifice a goat just to get a reservation, and then they'll actually take that goat you killed and prepare your entire dinner from it, nose to tail. A white truffle add-on that tacks $80 onto a single piece of cured hamachi, and oh, the truffle is only available four weeks a year and came over on a gondola from Alba, Italy, and the hamachi is one of the last of three members of its species, so you know, you should probably try it before...oops, sorry, the chef says someone just ordered the last of it. Yep, it's that couple at the corner table, and that's the last plate that she's Instagramming right now.

It's not just the scarcity of the actual food that offers such signaling opportunities. You can generate your own scarcity just by having a broad palate. When it comes to dining, many people still have narrow bands of taste, so if you're from the Jonathan Gold school of adventurous dining, you can easily set yourself apart by ingesting something exotic, like tripe stew, or some part of an animal that most people didn't even know was edible and certainly wouldn't dream of consuming.

In more recent history, the tech world has spawned yet another branch of food religion with the invention of Soylent, representing the polar opposite of the foodie religion, with its reverence for organic ingredients, elaborate preparation, and theatrical plating. Soylent is the food for people who find cooking and eating to be a waste of time, a complex job in need of simplification. It is dining function over form, with Soylent promising to deliver an exact and efficient dose of the nutrition we need as humans. I love food and dining with others too much to ever be an acolyte of this school, but that it exists is proof enough of how broad and diverse the world of food has become.

In contrast, punk rock and other formerly edgy genres of music have been assimilated by the mainstream to such an extent that it's harder and harder to fly your freak flag through your musical tastes. Ironically loving Taylor Swift or pop music has somehow become more iconoclastic than listening to some indie band. That Apple has so dominated the act of musical consumption (through ubiquitous iPhones and iPods and white earbuds, TV commercials for said devices and the music services accessible through them, and the massively popular bands that play at their keynotes) has mainstreamed the very idea of listening to music.

Back to the Laudan episode of EconTalk. Here she is on how French high cuisine became the world's standard of fine dining.

Russ: You are what you eat, I guess, is an appealing idea, and to some extent true. But maybe not to the extent they used to believe. So, we have the British having a big influence on world cuisine at the end of the 19th century. Somehow, French cuisine becomes the standard of sophistication and high dining. How did that happen? And it still persists, to some extent. It's lost some of its cachet, I'd say in the last 50 years. But it still remains a standard of high dining. How did that come about and why was it important? 
 
Guest: I think it's first important to say it's French high cuisine, because the high cuisine of France that became the international standard was something that most French people had never seen and never ate. It did not come, swell up from the peasantry. There's a slightly complicated story about what happened around 1650 when you get a rapid political change and the establishment of, after the Peace of Westphalia, a series of nations in Europe, on supposedly equal terms, combined with a shift of the scientific revolution and the Protestant revolution. And in complicated ways these would act together to produce a new cuisine that the world had never seen before. It's a really striking example of radical and rapid culinary change. The old cuisines of spiced food that--ultimately stemming from Persia but that had really influenced China, dominated in the high cuisine of India, right across to Southern Europe, were displaced by this new Northern European cuisine. And the people who developed it in its most elaborate form, because they had the greatest resources--the richest courts--were the French. And they developed it really terribly rapidly between 1650 and 1700. And that's the point where diplomacy is become important because of this national state system. And the national state system needs something to use for diplomatic dinners, to demonstrate modernity, Europeanness against the Persian-type cuisines that existed before. And so French high cuisine becomes the cuisine of European diplomacy in the 18th century, and then of international diplomacy and the international elite in the 19th century. So that by 1880 you could go to Tokyo, you could go to Santiago de Chile, you could go to Sydney, you could go to San Francisco and the thing to be eating was, if you were really rich or you were really high in politics was high French cuisine.
 

I think French high cuisine is on the decline from its perch atop the world's dining hierarchy, at least among the most passionate U.S.-based food lovers. Japanese cuisine is a strong competitor, and the overfishing of the world's oceans has added an element of barbarism but also scarcity to eating sushi that gives it an extra thrill. That it is perceived as healthier than French high cuisine, with its butter-based sauces and rich, fatty cuts of meat, makes Japanese cuisine better suited to carry the banner forward in this decade in which a healthy diet has become a high-status concern. For a variety of reasons, at its high end sushi can also justify the nosebleed prices that people expect from high-status symbols and pastimes.

In the first world, access to enough food to survive is no longer an issue, and so cooking has ascended into the realm of art. Some meals I've had in the past few years are as much performance and theater as they are a way of refueling. With our insatiable desire for narrative, we've enlisted a meal out as a story we first consume and then tell about ourselves.

Given America's relative youth as a nation, our national dining habits have always been a cultural battleground. The U.S. came about long after the age when food was a scarce commodity, so most of our wrestling with its meaning has been first around its symbolic value, and more recently about how to optimize our relative consumption of different types of food like carbohydrates, fat, and protein.

If any food symbolizes American dining today, it's the hamburger. Laudan can take such a humble food item and connect it forward and back through our nation's culinary history.

Guest: Well, if I may I'd like to back up a tiny bit about presidents serving French dinners, because the American presidency has had a terrible time deciding what to do at diplomatic dinners from the get-go. There were those, like Jefferson, who said we've got to be part of international culture as well as the economy, and we should go with high French cuisine. But there is also this extraordinarily strong republican--with a small 'r'--tradition in America that's part of what the Revolution is about. And the republican strain in American thought said very emphatically that, 'No, we do not want high French cuisine. We do not want aristocratic dining. That is not appropriate. And they looked back to the Roman republic and to the Dutch republic and to other republican movements in Europe and said, 'What we need is a decent cuisine for all citizens.' And that is very much the origin of Thanksgiving, which is not a fancy French dinner for diplomats but a dinner that essentially all Americans can afford and can cook, of American ingredients. It's a kind of striking symbol of the republican tradition exemplified in an American custom, and was deliberately designed to be so. But what happened--I mean the hamburger is just sort of amazing. People say, 'Well, the British had fish and chips.' Well, fish and chips don't cut it, because fish and chips are not this beef, bread, French fry phenomenon. 
And what Americans managed to do beginning with White Tower but pulled off triumphantly by McDonald's is to make the food of aspiration worldwide something that in America everybody can afford, and in much of the rest of the world the middle class can afford, namely a kind of ersatz piece of roast beef or steak that is a beef hamburger on a piece of white bread with a bit of fresh vegetable out of season, even in the winter, with a sauce which is part of high cuisine, with French fries, which, you know, are popular--which become really widespread with McDonald's and the frozen French fry, which Simplot perfects--until then the French had said it was the apex of French civilized food--and washed down either with a sparkling cold drink or with a milkshake, sweet and rich and cold and foamy. That is just--it makes the food of aspiration accessible to all, and you have it in this brightly lit dining room that is clean, that you have access to. I think only if we understand how McDonald's taps into all these competing traditions that go back so deep in our culture can we understand why it became such a kind of fire point for and against modern American food.
 

If McDonald's has been the 600 lb. gorilla of American dining in the past several decades, perhaps Chipotle is a more suitable totem of this current age of our culinary anxieties and obsessions, if one even exists anymore given our increasingly diverse dining habits. Chipotle serves an ethnic food, derived from one of our largest immigrant populations, but transformed into something palatable for the masses, claiming to be sourced with only GMO-free, organic ingredients, served in franchises that are clean if somewhat generic, available in the places we inhabit, from cities to suburbs to highway stops. It's not food prepared by ourselves; we eat our burrito bowls at our desks, alone, a fork in one hand, our smartphones in the other, scrolling one of our many feeds while we feed our stomachs. Chipotle represents something America does better than any other country: the assimilation of the world's people and ideas and then a subsequent radiation of that back out to the world in a form more agreeable to the masses. Harvey Weinstein used to do the same to niche independent films.

Laudan's most controversial opinion, though one that is likely quite widespread among economists, is that our reverence for natural food and distrust of industrialized, processed food is the reverse of what it should be. Her piece In Praise of Fast Food was published years ago but is as relevant as ever given current attitudes.

As a historian I cannot accept the account of the past implied by this movement: the sunny, rural days of yore contrasted with the gray industrial present. It gains credence not from scholarship but from evocative dichotomies: fresh and natural versus processed and preserved; local versus global; slow versus fast; artisanal and traditional versus urban and industrial; healthful versus contaminated. History shows, I believe, that the Luddites have things back to front.
 
That food should be fresh and natural has become an article of faith. It comes as something of a shock to realize that this is a latter-day creed.

...
 

Eating fresh, natural food was regarded with suspicion verging on horror; only the uncivilized, the poor, and the starving resorted to it. When the ancient Greeks took it as a sign of bad times if people were driven to eat greens and root vegetables, they were rehearsing common wisdom. Happiness was not a verdant Garden of Eden abounding in fresh fruits, but a securely locked storehouse jammed with preserved, processed foods.
 

As for slow food, it is easy to wax nostalgic about a time when families and friends met to relax over delicious food, and to forget that, far from being an invention of the late 20th century, fast food has been a mainstay of every society. Hunters tracking their prey, shepherds tending their flocks, soldiers on campaign, and farmers rushing to get in the harvest all needed food that could be eaten quickly and away from home. The Greeks roasted barley and ground it into a meal to eat straight or mixed with water, milk, or butter (as Tibetans still do), while the Aztecs ground roasted maize and mixed it with water (as Mexicans still do).
 

What about the idea that the best food was country food, handmade by artisans? That food came from the country goes without saying. The presumed corollary—that country people ate better than city dwellers—does not. Few who worked the land were independent peasants baking their own bread and salting down their own pig. Most were burdened with heavy taxes and rents paid in kind (that is, food); or worse, they were indentured, serfs, or slaves. They subsisted on what was left over, getting by on thin gruels and gritty flatbreads.
 

The dishes we call ethnic and assume to be of peasant origin were invented for the urban, or at least urbane, aristocrats who collected the surplus. This is as true of the lasagna of northern Italy as it is of the chicken korma of Mughal Delhi, the moo shu pork of imperial China, and the pilafs, stuffed vegetables, and baklava of the great Ottoman palace in Istanbul. Cities have always enjoyed the best food and have invariably been the focal points of culinary innovation.

I've excerpted Laudan heavily, but it's only a fraction of her output, both on the podcast episode and in writing. All of what I've read thus far is fascinating.

This tendency to romanticize the past, to imagine it as a pastoral paradise of harmony between people and nature and each other, is an odd human trait. Dissatisfied with the present, we look to the past for an answer, as far back as our caveman days when it comes to things like the paleo diet, even if we hardly realize just how much harder and more treacherous and brutal life was back then. Your pastoral fantasy? Here's how it ends, with you stepping in cow shit, contracting cholera, and dying after several feverish nights in an unheated bedroom, at the age of 20.

As for the future? Well, most of our popular visions of the distant future are dystopic, either tales of stragglers trying to survive in a post-apocalyptic wasteland, or prophecies of human enslavement by AI run amok. When our society does survive into the next century in these stories, it has morphed into a nightmarish surveillance state, where all human diversity in thought and being has been stamped out. You are both funny and headstrong? You are...divergent. Still one of the silliest ideas for a book and movie in recent memory.

All this despite a steady rise in the quality of life throughout human history, with increasing tolerance and leisure and life expectancy. All evidence suggests that the arc of the moral universe is long, but it bends toward justice. From scarcity to abundance. And taller, healthier people. I myself am very happy I wasn't born in an age when I'd have to be a farmer just to feed myself.

Watch enough movies, though, and you'd think that the arc of human life bends into a black hole, an apocalypse, from which we have to start over again, with the seeds of the rebirth of civilization being resown by a lone hero, most often a white male, and the beautiful woman who grew to love him sometime during the second of the script's three acts.

Beware your nostalgia for an age you never lived in. It was probably worse then than it is now. Given our increasing resolution of detail in recorded history, I wonder if future generations will be more immune to nostalgia. I'm somewhat hopeful. I told my nephew recently that when I was his age, I had no iPhone or iPad. I hadn't even seen a desktop computer yet, let alone the web.

“Whaaaaaaaat?” he said. “That sounds terrible.” Then he went back to playing a game on his mom's iPhone.

Age of abundance: oil painting edition

What is technology's greatest achievement if not putting what was once accessible only to the rich into the hands of the masses? Whereas you once had to be a nobleman to commission an oil painting self-portrait, Noblified now puts that within your reach, with prices starting at $99.

If you're curious how one might look, here's a recent one Noblified produced for Snoop Dogg for promotional purposes. Snoop as Cesare Borgia. (h/t @kenwuesq)

I'd like to hear Snoop find some rhymes for Medici.

BauBax travel jacket

The BauBax travel jacket is the most funded clothing project in Kickstarter history. It comes in four models (blazer, windbreaker, bomber, sweatshirt) with features like a built-in neck pillow, an eye mask, gloves, earphone holders, a drink pocket, and a zipper pull that doubles as a pen or stylus. I'm going to venture it's the nerdiest jacket ever created.

The zipper pen is described thus:

Your zipper is now smart, useful and social. It's a 1 inch pen that extends to 4 inches - great way for making new friends.

“Looks like you could use a pen to fill out your customs declaration form. And to write down my Snapchat username.”

It's rare that form and function cohere in an elegant way. If I had to name two places they do off the top of my head, I'd point to the iPhone and some of the modern sneakers worn not for running but for general comfort.

The decline of the phone call

The distaste for telephony is especially acute among Millennials, who have come of age in a world of AIM and texting, then gchat and iMessage, but it’s hardly limited to young people. When asked, people with a distaste for phone calls argue that they are presumptuous and intrusive, especially given alternative methods of contact that don’t make unbidden demands for someone’s undivided attention. In response, some have diagnosed a kind of telephoniphobia among this set. When even initiating phone calls is a problem—and even innocuous ones, like phoning the local Thai place to order takeout—then anxiety rather than habit may be to blame: When asynchronous, textual media like email or WhatsApp allow you to intricately craft every exchange, the improvisational nature of ordinary, live conversation can feel like an unfamiliar burden. Those in power sometimes think that this unease is a defect in need of remediation, while those supposedly afflicted by it say they are actually just fine, thanks very much.
 
But when it comes to taking phone calls and not making them, nobody seems to have admitted that using the telephone today is a different material experience than it was 20 or 30 (or 50) years ago, not just a different social experience. That’s not just because our phones have also become fancy two-way pagers with keyboards, but also because they’ve become much crappier phones. It’s no wonder that a bad version of telephony would be far less desirable than a good one. And the telephone used to be truly great, partly because of the situation of its use, and partly because of the nature of the apparatus we used to refer to as the “telephone”—especially the handset.
 
...
 
But now that more than half of American adults under 35 use mobile phones as their only phones, the intrinsic unreliability of the cellular network has become internalized as a property of telephony. Even if you might have a landline on your office desk, the cellular infrastructure has conditioned us to think of phone calls as fundamentally unpredictable affairs. Of course, why single out phones? IP-based communications like IM and iMessage are subject to the same signal and routing issues as voice, after all. But because those services are asynchronous, a slow or failed message feels like less of a failure—you can just regroup and try again. When you combine the seemingly haphazard reliability of a voice call with the sense of urgency or gravity that would recommend a phone call instead of a Slack DM or an email, the risk of failure amplifies the anxiety of unfamiliarity. Telephone calls now exude untrustworthiness from their very infrastructure.
 

Great piece by Ian Bogost on how the decline of the phone call is not just a result of the rise of alternative forms of communication but also because phones today are not that great for making phone calls. I saw Aziz Ansari do a great bit on what it would be like to travel back in time with an iPhone and show it to someone who'd never seen a mobile phone before. I can't find the sketch online but it came after his bit on Grindr. I'll paraphrase:

“Whoa, what is that?”
 
“It's an iPhone!”
 
“That looks amazing! So, is it really great at making phone calls?”
 
“Actually, no, it's terrible for that. But if you want to know [something really dirty related to Grindr, you can fill in the blank], this will do the trick perfectly!”
 

Bogost notes that the mobile part of the modern phone is part of the problem.

When the PSTN was first made digital, home and office phones were used in predictable environments: a bedroom, a kitchen, an office. In these circumstances, telephony became a private affair cut off from the rest of the environment. You’d close the door or move into the hallway to conduct a phone call, not only for the quiet but also for the privacy. Even in public, phones were situated out-of-the-way, whether in enclosed phone booths or tucked away onto walls in the back of a diner or bar, where noise could be minimized.
 
Today, of course, we can and do carry our phones with us everywhere. And when we try to use them, we’re far more likely to be situated in an environment that is not compatible with the voice band—coffee shops, restaurants, city streets, and so forth. Background noise tends to be low-frequency, and, when it’s present, the higher frequencies that Monson showed are more important than we thought in any circumstance become particularly important. But because digital sampling makes those frequencies unavailable, we tend not to be able to hear clearly. Add digital signal loss from low or wavering wireless signals, and the situation gets even worse. Not only are phone calls unstable, but even when they connect and stay connected in a technical sense, you still can’t hear well enough to feel connected in a social one. By their very nature, mobile phones make telephony seem unreliable.
 

How Bogost writes about old telephone handsets, the way they fit into the crook of your neck, the way they warmed up the longer your conversation went, brought me back to those days of my youth when an hour-long phone call with a friend or someone you were crushing on was like an audio version of one of Robert Browning's love letters to Elizabeth Barrett.

Trump

For example, when Trump says he is worth $10 billion, which causes his critics to say he is worth far less (but still billions) he is making all of us “think past the sale.” The sale he wants to make is “Remember that Donald Trump is a successful business person managing a vast empire mostly of his own making.” The exact amount of his wealth is irrelevant. 
 
When a car salesperson trained in persuasion asks if you prefer the red Honda Civic or the Blue one, that is a trick called making you “think past the sale” and the idea is to make you engage on the question of color as if you have already decided to buy the car. That is Persuasion 101 and I have seen no one in the media point it out when Trump does it.
 
The $10 billion estimate Trump uses for his own net worth is also an “anchor” in your mind. That’s another classic negotiation/persuasion method. I remember the $10 billion estimate because it is big and round and a bit outrageous. And he keeps repeating it because repetition is persuasion too. 
 
I don’t remember the smaller estimates of Trump’s wealth that critics provided. But I certainly remember the $10 billion estimate from Trump himself. Thanks to this disparity in my memory, my mind automatically floats toward Trump’s anchor of $10 billion being my reality. That is classic persuasion. And I would be amazed if any of this is an accident. Remember, Trump literally wrote the book on this stuff.
 

From Scott Adams (yes, of Dilbert fame) on the clown genius Donald Trump and how he's quite cleverly using verbal jiu-jitsu to turn his critics' attacks to his advantage. A Republican ticket of Trump and Mark Cuban would be like something out of a satirical novel and would cause the media to swallow itself. Adams thinks it would win the election.

James Surowiecki on Trump and why he's won over working-class Republican voters:

Working-class voters face stagnant wages and diminished job prospects, and a 2014 poll found that seventy-four per cent of them think “the U.S. economic system generally favors the wealthy.” Why on earth would they support a billionaire?
 
Part of the answer is Trump’s nativist and populist rhetoric. But his wealth is giving him a boost, too. The Democratic pollster Stanley Greenberg, who’s published reams of work on white working-class attitudes, told me, “There is no bigger problem for these voters than the corruption of the political system. They think big companies are buying influence, while average people are blocked out.” Trump’s riches allow him to portray himself as someone who can’t be bought, and his competitors as slaves to their donors. (Ross Perot pioneered this tactic during the 1992 campaign.) “I don’t give a shit about lobbyists,” Trump proclaimed at an event in May. And his willingness to talk about issues that other candidates are shying away from, like immigration and trade, reinforces the message that money makes him free.
 
Trump has also succeeded in presenting himself as a self-made man, who has flourished thanks to deal-making savvy. In fact, Trump was born into money, and his first great real-estate success—the transformation of New York’s Commodore Hotel into the Grand Hyatt—was enabled by a tax abatement worth hundreds of millions of dollars. Yet many voters see Trump as someone who embodies the American dream of making your own fortune. And that dream remains surprisingly potent: in a 2011 Pew survey, hard work and personal drive (not luck or family connections) were the factors respondents cited most frequently to explain why people got ahead. Even Trump’s unabashed revelling in his wealth works to his benefit, since it makes him seem like an ordinary guy who can’t get over how cool it is to be rich.