Flat tire

Team Sky cyclist Richie Porte got a flat tire near the end of stage 10 of the Giro d'Italia, and a fellow Aussie from another team, Simon Clarke, stopped and gave Porte his wheel in a gesture of sportsmanship.

The moment was captured on social media, and when race officials saw the proof of the exchange, they penalized Porte 2 minutes and fined him 200 Swiss Francs for violating a rule forbidding members of one team from helping another (Clarke, not a contender, received the same fine). It's such an obscure rule that Porte and Clarke never thought twice about the exchange in the heat of the moment.

Cycling gets beaten up far worse than other sports for enforcing strict drug testing. More cyclists are caught, leading the public to view the sport as tainted, but compared to cycling, the drug testing rules in most other sports (e.g. the NBA, the NFL, soccer, tennis) are a joke, so I'll defend cycling for putting its testing where its mouth is.

However, this is one time they should have forgiven Porte and Clarke. What could have been a great moment for the sport, a gesture of the kind of sportsmanship we should encourage, instead became a moment where the letter of the law took precedence over the spirit of the law, and over the spirit of sport itself, which is fair play. Porte had already lost time on the race leaders because of the unfortunate flat, and what could have been a more exciting Giro lost one of its leading contenders.

For the same reason, race leader Alberto Contador shouldn't be fined for removing his helmet during the race, even though it is against the rules, and even though, to no one's surprise, Twitter users are in an uproar over the subjective application of the rulebook.

Meh

From Adam Gurri's The Arc of the Universe Bends Towards Meh:

I think that McCloskey’s Bourgeois Dignity does a good job demolishing the notion that our prosperity rests on the continued poverty of others, and it is not alone in that. But if there’s one solid thing I took away from The Great Stagnation, and especially the debates that it sparked, it is that it is incredibly hard to think about progress, decline, and well-being, period.
 
Cultural conservatives often claim we’re in decline from the point of view of deteriorating values. Roy Baumeister’s book covering his work on willpower makes the claim that the Victorians took self-control much more seriously than we do, and we are the worse for it. More extreme claims of moral decline are, of course, quite common. MacIntyre’s After Virtue tells you where he stands on the matter in his title. Mencius Moldbug, the pseudonymous prophet of the Internet neoreactionaries, resurrects Carlyle among others to make the claim that we have fallen into complete lawlessness and immorality. He thinks the Victorians eradicated violent crime, but liberal ideology led us to abandon the very values that made such eradication possible.
 
Consider Baltimore. For those looking through the lens of decline, the writing is on the wall—the barbarians have stormed the gates, they are on the inside, they’re just waiting for the right moment to deliver the final blow to our crumbling civilization. For those looking through the lens of progress, the riots in Baltimore are unfortunate but the peaceful protests, and increased media scrutiny of cop violence there, in New York, and in Ferguson, Missouri all hint at a possibility of important reform. A chance to take a next step in the journey that began with the abolition of slavery, continued through the civil rights movement, and continues to this day, if we who inherited it make ourselves good caretakers.
 

I still can't escape the feeling that to seem smart, it's better to have strong opinions, loosely held, with about 50% of them being contrarian. Whether you're right or not matters very little in too many situations.

Good luck being born tomorrow

97% of people born tomorrow will be in a country that is authoritarian, communist, doesn’t support same sex marriage, does not allow abortion, supports capital punishment or has seen over ten thousand deaths in recent armed conflicts. Good luck!
 

From part 1 of 2 of Good Luck Being Born Tomorrow. The statistics within are eye-opening.

The meat is in part 2.

300 million years ago, a supercontinent called Pangaea was formed, that later broke apart into continents that we inhabit today. Modern technology has turned the world back into Pangaea – a world where everything is connected. You can have a live video call with someone across the world in seconds (you’re welcome!) or you can find yourself on the next continent in 10 hours if the need be. Yet we’ve built these imaginary borders around us that limit human potential. These borders are a direct result of historic military conflict. And allowing your fate to be determined by things that took place before your birth feels like accepting defeat before you even get started.
 
This all brings me to the third option of what to do when the environment is not favorable – you can change it! As weird as it sounds, one of the means to cause change is actually also to migrate (for those who already have that freedom). As opposed to a slow democratic process of giving your marginal vote every four years in the hope of changing something you care about, you can vote with your feet already today. You have a choice between expressing your needs at a popularity contest twice a decade or putting constant pressure on places.
 
Not only will you find yourself in a place where your problem is already fixed (remember – that’s why you moved!), you’re also putting real budget pressure on the old place by taking your taxes elsewhere (will hurt every month). With enough people doing that, the competition for taxes forces incumbent states to fix their environments.
 
In positive political theory, this is described as the Tiebout hypothesis.
 

Living in the Silicon Valley media bubble, with its insatiable need to produce some minimum volume of news coverage every day, can lead to a surplus of technology naysaying in one's diet. I believe it's a useful corrective to have sites like Gawker and Valleywag and that ilk of gadfly to point out when the emperor has no clothes, even if the level of trolling is on the high side.

Still, climb to a higher vantage point and it's hard to disagree that technology is perhaps the greatest hope for lifting the standard of living for the greatest number of people in the world, whether directly or indirectly.

Yes, the tech industry has its problems, and it has its share of ridiculous douchebags, some of them with absurd amounts of wealth. And yes, perhaps some of our technology is too addictive, and maybe it is transforming some of us into intolerable social-network-preening narcissists.

Visit other parts of the world, though, and see what a life-changing event it is to get a cell phone and internet access. Observe people making a living selling goods on social networks, or watch people coordinate protests against authoritarian governments, and on and on. I'll continue to take the bad with the good for the net gain to society.

It’s not hard to imagine the invention of blockchain (the core of Bitcoin) having remarkable implications in the developing world through enabling micro-transactions and possibly helping eliminate corruption through blockchain based electronic voting. There’s stuff coming that we haven’t even thought of yet.
 
Technological innovation is finally making it possible to meet the assumptions of the Tiebout model (mobile consumers, complete information, abundant choices, telecommuting etc.). Bringing transparency into the world of basic freedoms, taxes, government services, public goods and reducing the cost/pain associated with moving will be the way to give us a future, where every nation state will have to compete for every citizen. Can you imagine that world?
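The Tiebout dynamic the author is pointing at, residents sorting themselves into the jurisdiction that best matches their preferences while jurisdictions adjust to keep taxpayers, can be sketched as a toy simulation. Everything below (the jurisdiction names, the preference numbers, the adjustment rule) is invented purely for illustration, not drawn from the linked piece:

```python
# Toy sketch of Tiebout-style sorting, under invented assumptions:
# each resident is a point on a one-dimensional policy spectrum,
# relocates to the jurisdiction whose policy is closest, and each
# jurisdiction then nudges its policy toward its current residents.

from statistics import mean

def tiebout_round(policies, residents):
    """Assign each resident (a preferred policy value) to the
    jurisdiction whose policy is closest to their preference."""
    assignment = {j: [] for j in policies}
    for pref in residents:
        best = min(policies, key=lambda j: abs(policies[j] - pref))
        assignment[best].append(pref)
    return assignment

def adjust_policies(policies, assignment, step=0.5):
    """Each jurisdiction moves partway toward the mean preference of
    its current residents (competing for the taxpayers it has);
    an empty jurisdiction stays put."""
    new = {}
    for j, pop in assignment.items():
        new[j] = policies[j] + step * (mean(pop) - policies[j]) if pop else policies[j]
    return new

policies = {"A": 0.2, "B": 0.8}          # e.g. low-tax vs. high-service
residents = [0.1, 0.15, 0.3, 0.7, 0.9]   # each resident's preferred point

for _ in range(10):
    assignment = tiebout_round(policies, residents)
    policies = adjust_policies(policies, assignment)
```

After a few rounds the population splits into two stable clusters and each jurisdiction's policy settles near the mean preference of the residents it attracted, which is the "competition for taxes forces incumbent states to fix their environments" intuition in miniature.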

Competing against robots

Some scholars are trying to discern what kinds of learning have survived technological replacement better than others. Richard J. Murnane and Frank Levy in their book “The New Division of Labor” (Princeton, 2004) studied occupations that expanded during the information revolution of the recent past. They included jobs like service manager at an auto dealership, as opposed to jobs that have declined, like telephone operator.
 
The successful occupations, by this measure, shared certain characteristics: People who practiced them needed complex communication skills and expert knowledge. Such skills included an ability to convey “not just information but a particular interpretation of information.” They said that expert knowledge was broad, deep and practical, allowing the solution of “uncharted problems.”
 
These attributes may not be as beneficial in the future. But the study certainly suggests that a college education needs to be broad and general, and not defined primarily by the traditional structure of separate departments staffed by professors who want, most of all, to be at the forefront of their own narrow disciplines. But this old departmental structure is still fundamental at universities, and it is hard to change.
 

Full article here from Robert Shiller.

A few random thoughts. Disciplines that are purely about knowledge accumulation are risky if the knowledge acquired is the kind computers can accumulate in a fraction of the time. Lots of Ph.D.s seem unlikely to be economically worthwhile considering the cost of higher education.

Watch the virtual assistant on your phone. Siri or Google Now are good benchmarks for what skills are becoming obsolete, and which are still of great value.

Most humans still prefer a bit of entropy and warmth from those they interact with, especially in the service sector, and indexing high on that still commands a premium.

Universal sign language

“Decide” is what is known as a telic verb—that is, it represents an action with a definite end. By contrast, atelic verbs such as “negotiate” or “think” denote actions of indefinite duration. The distinction is an important one for philosophers and linguists. The divide between event and process, between the actual and the potential, harks back to the kinesis and energeia of Aristotle’s metaphysics.
 
One question is whether the ability to distinguish them is hard-wired into the human brain. Academics such as Noam Chomsky, a linguist at the Massachusetts Institute of Technology, believe that humans are born with a linguistic framework onto which a mother tongue is built. Elizabeth Spelke, a psychologist up the road at Harvard, has gone further, arguing that humans inherently have a broader “core knowledge” made up of various cognitive and computational capabilities. 
 
...
 
In 2003 Ronnie Wilbur, of Purdue University, in Indiana, noticed that the signs for telic verbs in American Sign Language tended to employ sharp decelerations or changes in hand shape at some invisible boundary, while signs for atelic words often involved repetitive motions and an absence of such a boundary. Dr Wilbur believes that sign languages make grammatical that which is available from the physics and geometry of the world. “Those are your resources to make a language,” she says. As such, she went on to suggest that the pattern could probably be found in other sign languages as well.
 
Work by Brent Strickland, of the Jean Nicod Institute, in France, and his colleagues, just published in the Proceedings of the National Academy of Sciences, now suggests that it is. Dr Strickland has gone some way to showing that signs arise from a kind of universal visual grammar that signers are working to.
 

Fascinating. Humans associate language with intelligence so strongly that I predict the critical moment in animal rights will come when a chimp or other primate takes the stand in an animal testing court case and uses sign language to testify on its own behalf.

Reading the test methodology employed in the piece, I wonder if any designers out there have done any similar studies with gestures or icons. I'm not arguing a Chomskyist position here; I doubt humans are born with some basic touchscreen gestures or base icon key in their brain's config file. This is more about second-order or learned intuition.

Or perhaps we'll achieve great voice or 3D gesture interfaces (e.g. Microsoft Kinect) before we ever settle on standards for gestures on flat touchscreens. If you believe, like Chomsky, that humans have some language skills (both verbal and gestural) hard-wired in the brain at birth, the most human (humane? humanist?) of interfaces would be one that doesn't involve any abstractions on touchscreens but instead relies on the software we're born with.