Light

The smartphone is now the dominant camera in the world, combining best-in-class portability with good-enough quality. I still own an SLR, though, because in certain special situations the superior lens selection and larger sensor (among other qualities) are worth the bulkier form factor.

Light is a company with a novel approach to bringing smartphone cameras up to SLR quality. A traditional camera pairs a single lens with a single sensor; Light does away with that assumption.

Rather than hewing to this one-to-one ratio, Light aims to put a bunch of small lenses, each paired with its own image sensor, into smartphones and other gadgets. They’ll fire simultaneously when you take a photo, and software will automatically combine the images. This way, Light believes, it can fit the quality and zoom of a bulky, expensive DSLR camera into much smaller, cheaper packages—even phones.
 
Light is still in the early stages, as it doesn’t yet have a prototype of a full product completed. For now it just has camera modules whose pictures can be combined with its software. But the startup says it expects the first Light cameras, with 52-megapixel resolution, to appear in smartphones in 2016.
 

Artist's rendering of what a Light camera array on a smartphone might look like.

I've been curious to see how smartphones continue to improve camera quality. This seems like one credible vector.
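Light hasn't published how its fusion software works, but the basic reason combining many small captures helps at all is statistical: independent sensor noise averages out. A toy sketch of the simplest possible fusion, averaging aligned noisy frames (scene, noise level, and frame count all invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "scene": ground-truth pixel intensities in [0, 1].
scene = rng.uniform(0.0, 1.0, size=(64, 64))

def capture(scene, noise_sigma=0.1):
    """Simulate one small-sensor exposure: the true scene plus sensor noise."""
    return scene + rng.normal(0.0, noise_sigma, size=scene.shape)

def rmse(img):
    """Root-mean-square error of an image against the ground-truth scene."""
    return float(np.sqrt(np.mean((img - scene) ** 2)))

# One frame from a single small module vs. sixteen frames fused by averaging.
single = capture(scene)
fused = np.mean([capture(scene) for _ in range(16)], axis=0)

print(rmse(single))  # around 0.1
print(rmse(fused))   # around 0.025: noise falls roughly as 1/sqrt(N)
```

Real multi-aperture fusion also has to align frames and handle parallax between lenses, which is far harder than this averaging step; the sketch only shows why more small sensors can substitute for one big one.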

Be the side(s) in your multi-sided market

Anyone who's tried to start a multi-sided market business (the most common being a two-sided marketplace, like eBay, with buyers and sellers) knows one of the first major challenges is the chicken-and-egg problem. In order to attract sellers, you need buyers, but in order to attract buyers, you need sellers. On a dating service, the two sides are obvious. And so on.

How to get out of this infinite loop of emptiness? Kickstart things by being one or more of the sides in your multi-sided market. In other words, fake it 'til you make it.

Seeding and Weeding
 
Dating services simulate initial traction by creating fake profiles and conversations. Users who come to the site see some activity and are incentivized to stay on. Marketplace sites may also show fake activity to attract buyers and sellers.
 
Seeding Demand
 
In the book The PayPal Wars, Eric M. Jackson talks about how PayPal grew a base of sellers who accepted PayPal by creating demand for the service among buyers. When PayPal figured out that eBay was their key distribution platform, they came up with an ingenious plan to simulate demand. They created a bot that bought goods on eBay and then insisted on paying for them using PayPal. Not only did sellers come to know about the service, they rushed onto it as it already seemed to be getting popular. The fact that it was way better than every other payment mechanism on eBay only helped repeated usage.
 

Other great war stories and tips within.

The end of routine work

In recessions of the 1960s and 1970s, routine jobs would fall during the recession but quickly snap back. But after the recession in 1990, something changed. Routine jobs fell and, as a share of the population, never recovered. In the recessions in 2001 and in 2007-09 they fell even further. The snapback never occurred, suggesting that many firms began coping with recessions by scrapping tasks that could be automated or more easily outsourced.
 
For his part, Mr. Siu thinks jobs have been taken away by automation, more than by outsourcing. While some manufacturing jobs have clearly gone overseas, “it’s hard to offshore a secretary.” These tasks more likely became unnecessary due to improving technology, he said.
 
In the late 1980s, routine cognitive jobs were held by about 17% of the population and routine manual jobs by about 16%. Today, that’s declined to about 13.5% and 12%. (The figures are not seasonally adjusted and so are displayed in the chart as 12-month moving averages, to remove seasonal fluctuations).

...

But they are not among the labor market’s pessimists who fear that robots will render humans obsolete. Their work shows the economy has continued to generate jobs, but with a focus on nonroutine work, especially cognitive. Since the late 1980s, such occupations have added more than 22 million workers.
 

By Josh Zumbrun. The idea that U.S. unemployment has jumped to a higher plateau because jobs have moved overseas or been replaced by technology is not a new one, but it's useful to see data and charts that support the claim.
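The 12-month moving average mentioned in the quoted passage is a simple de-seasonalizing trick: each point becomes the average of the trailing twelve months, so any pattern that repeats every year washes out. A minimal sketch with made-up monthly figures (not the article's data):

```python
def moving_average(series, window=12):
    """Trailing moving average: each point averages the last `window` values."""
    return [
        sum(series[i - window + 1 : i + 1]) / window
        for i in range(window - 1, len(series))
    ]

# Made-up monthly employment shares with a +1 seasonal bump every summer.
raw = [15 + (1 if m % 12 in (5, 6, 7) else 0) for m in range(36)]

smoothed = moving_average(raw)
# Every 12-month window contains exactly three bump months, so the
# seasonal spikes flatten into a steady 15 + 3/12 = 15.25.
print(smoothed[:3])
```

Because each window spans exactly one full seasonal cycle, the smoothed series is perfectly flat here; with real data it would instead show the underlying trend with the seasonal wiggle removed.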

Brad DeLong comments:

Note that these jobs are “routine” only in the sense that they involve using the human brain as a cybernetic control processor in a manner that was outside the capability of automatic physical machinery or software until a generation ago. In the words of Adam Smith (who probably garbled the story):
 
In the first fire-engines, a boy was constantly employed to open and shut alternately the communication between the boiler and the cylinder, according as the piston either ascended or descended. One of those boys, who loved to play with his companions, observed that, by tying a string from the handle of the valve which opened this communication to another part of the machine, the valve would open and shut without his assistance, and leave him at liberty to divert himself with his playfellows. One of the greatest improvements that has been made upon this machine, since it was first invented, was in this manner the discovery of a boy who wanted to save his own labour…
 
And Siu and Jaimovich seem to have gotten the classification wrong: A home-appliance repair technician is not doing a routine job–those jobs are disappearing precisely because they are not routine, require considerable expertise, are hence expensive, and so simply swapping out the defective appliance for a new one is becoming more and more attractive.
 

As any economist would prescribe, it's become more critical, when thinking about one's career and education, to focus on humans' comparative advantage versus computers. But I also recommend people focus on their own unique comparative advantage: any intelligent person can do many things, but what can you do better than almost anyone else? In winner-take-all markets, more common in this third industrial revolution, it's ideal to give yourself the best chance to be one of those winners. Consider it a bonus that those areas often overlap with one's personal passions; whatever you think of the 10,000-hour rule, most would agree it's easier to sustain through so many hours if one is more emotionally invested.
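Comparative advantage is worth spelling out with numbers, because it's counterintuitive: even if a computer is absolutely better at every task, the human still "wins" wherever their relative disadvantage is smallest. A toy calculation (all figures invented):

```python
# Hours needed per unit of output (smaller is better). The computer is
# absolutely faster at both tasks in this made-up example.
hours = {
    "computer": {"data_entry": 0.1, "essay": 1.0},
    "human":    {"data_entry": 2.0, "essay": 4.0},
}

def opportunity_cost(agent, task, other_task):
    """Units of `other_task` given up to produce one unit of `task`."""
    return hours[agent][task] / hours[agent][other_task]

# Essays cost the human 2 units of forgone data entry, but cost the
# computer 10. So essays are the human's comparative advantage, even
# though the human is slower at both tasks in absolute terms.
print(opportunity_cost("human", "essay", "data_entry"))     # 2.0
print(opportunity_cost("computer", "essay", "data_entry"))  # 10.0
```

The career advice falls out of the arithmetic: specialize where your opportunity cost is lowest relative to the machine's, not where you happen to be absolutely fastest.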

It's also best to accept that one will have to learn new skills many times in one lifetime. It used to be that once you finished college, the education phase of life was considered over. This is already obsolete for many. Even most programmers, supposedly the most insulated workers from technological job obsolescence, have to learn new programming languages or technologies on the job every few years now.

In the future, education will generally be accepted to mean a lifelong process. Continuing education will be the default. An undergraduate degree will simply be a first milestone in signaling one's skills and sociability to potential employers. More time will have to be set aside to level up, and resources and services to support this lifelong education continue to proliferate. I view the need for lifelong learning as a positive. Who was it that said that you're young at heart to the degree that what you want to learn exceeds what you already know?

Seriously, who said that? I don't know. Add that to my list of things to learn.

Payments as social network

Generally speaking, Venmo peaks around 7 pm on the east coast of the US and stays fairly strong until about 3am, which is midnight on the west coast. The timing—and corresponding emoji—suggests the preponderance of Venmo transactions are people paying each other back for dinner and drinks.
 
Emoji use is markedly different at quieter times of the day. For example, the house emoji ranks highly from 7 am to 3pm EST. And the car and taxi emoji crack the top five between 5am and 8am EST.
 
...
 
Some of these data are skewed by messages containing a lot of emoji. The “pile of poop” emoji only holds the top spot in the midnight hour because a single payment used it 1,116 times.
 

On the usage patterns for the payment service Venmo, as interpreted by the emoji used to tag public transactions.
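The analysis behind those findings amounts to bucketing public transaction notes by hour and tallying emoji per bucket. A toy version of that pipeline, with the transaction data invented for illustration:

```python
from collections import Counter, defaultdict

# Invented sample of (hour_of_day_EST, note) public transactions.
transactions = [
    (19, "🍕🍺"), (20, "🍺"), (22, "🍕"), (23, "🍺🍺"),
    (8, "🏠"), (9, "🏠"), (7, "🚕"), (13, "🏠"),
]

by_hour = defaultdict(Counter)
for hour, note in transactions:
    # Count every emoji character in the note toward that hour's tally,
    # so a note repeating one emoji many times weights it heavily
    # (exactly the skew the article mentions).
    for ch in note:
        by_hour[hour][ch] += 1

# Top emoji for the evening hours vs. the daytime hours.
evening = sum((by_hour[h] for h in range(19, 24)), Counter())
morning = sum((by_hour[h] for h in range(7, 15)), Counter())
print(evening.most_common(1))  # [('🍺', 4)]
print(morning.most_common(1))  # [('🏠', 3)]
```

Counting every occurrence rather than one per transaction is what lets a single 1,116-emoji payment capture a top spot, as with the midnight pile of poop.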

I like to think I'm young at heart, but the fact that Venmo is also a social network makes me feel old. Payment transactions are one form of community interaction, sure, but looking through my Venmo feed feels like peering through a fog of data pollution. Perhaps the next generation of kids really does live with their default life privacy settings toggled to public.

Speaking of the pile of poop emoji, it seems only a matter of time until someone releases an app that allows you to broadcast when you are taking a poop. It should be a mobile app just called Poop. I leave it to the design geniuses at Apple to figure out what type of haptic feedback a poop notification should emit on the Apple Watch.

Skipping the web

Why India's Flipkart abandoned its mobile website:

Now, if you tack on a gigantic population with miserable internet connection speeds, the prospect of scaling up your website operations and back end to deal with not only the overload on it, but also the abysmal experience on the consumer end, whether it is mobile or desktop, is even more bleak. An app allows a user to stay logged in while updates and other information are efficiently and constantly downloaded, ready for consumption almost instantly. It is, in fact, perfect for low-bandwidth situations.
 

People often write of places like India or Africa bypassing landlines or PCs to skip ahead to technologies like wireless or smartphones, but I haven't heard of countries treating the web as one of those intermediate technologies to be hopped over.

Having spent lots of time working out of China, I see the sense in it. Internet connection speeds are really slow there, and loading the web can be painful. Even with an upgraded pipe into the building, when I worked out of Hulu's Beijing office, I found myself browsing the web a lot less simply out of impatience.

For those of us who grew up in the U.S., the web was one of the first touchpoints to the internet, and it remains the longest-running one. My first was using newsgroups in college, and the web came about towards the end of my undergrad days. I can understand why so many in the U.S. are nostalgic about and defensive of the web as a medium. Seeing so much content and online interaction move behind the walls of social networks seems like an epic tragedy to many, and I empathize.

Many people in India, China, and other parts of the world, where bandwidth is low and slow, and where mobile phones are their one and only computer, have no room for such sentimentality. They may never have experienced the same heyday of the web, so they feel no analogous nostalgia for it as a medium. Path dependence matters here, as it does in lots of areas of tech, and one of the best ways to detect it is to widen your geographic scope of study outside the U.S. Asia is a wonderful comparison group, especially for me because I have so many friends and relatives there and because I still interact with them online at a decent frequency.

In the U.S., many tech companies are lauded as pioneers for going mobile first, while in Asia companies are already going mobile only. In some ways Asia feels like it lives in the past compared to the U.S., especially when one sees so many fast followers of successful U.S. technology companies, but in a surprisingly large number of ways, Asia lives in our near future.

Fiction's willful ignorance of tech

“And I just don’t know how to write a novel in which the characters can get in touch with all the other characters at any moment. I don’t know how to write a novel in the world of cellphones. I don’t know how to write a novel in the world of Google, in which all factual information is available to all characters. So I have to stand on my head to contrive a plot in which the characters lose their cellphone and are separated from technology.”
 

Ann Patchett on one of the chief problems of novel-writing today. The internet has disrupted lots of things, and one of them is dramatic information asymmetry.

Steve Himmer on one popular solution for fiction writers:

In literary fiction, the more popular solution seems to be relying on settings close to the present, but far enough back to avoid such inconvenience. Granted, the popularity of the 1970s, 1980s, and early-1990s as settings also owes plenty to generational shifts in literary production as people write about formative periods and the years they remember. But it also avoids any number of narrative problems and allows writers to go on telling stories in the way they are used to, rather than incorporating the present in ways that are difficult and disruptive. When I recently wondered on Twitter — one of those very disruptions — if we’ve reached the point of needing a term for this kind of setting, author Jared Yates Sexton suggested “the nostalgic present.” And while it’s easy enough to incorporate mention of that into this essay, where might a tweet fit into a novel? As dialogue, formatted like any other character’s utterance? Or embedded with timestamp and retweet count and all? What happens when our characters spend half their novel on Twitter, as so many of us spend our workdays? It’s a hard question, but not one that gets answered when writers aspire to be more like Andras Schiff than Lukas Kmit.
 

Himmer goes on to urge more writers to embrace the technology of today and incorporate it into their fiction, and I'm with him. It strikes me as lazy screenwriting when a film trots out one of those old-school answering machines just so the audience can hear what message is being left; I haven't seen one of those machines in ages. If it's meant to be a missed message, show us the person as they leave it so we can hear it. If we're meant to watch the recipient react, give us the audio as they listen to the voicemail.

Younger fiction writers, in particular, have a great opportunity here. Embrace the massive role that all this technology plays in our lives and teach readers about how it is affecting us and how we might cope. Novels are supposed to, among other things, illuminate the human condition. No one using all this technology should feel any less human, but the absence of technology in art leaves an odd temporal void in which we can only travel to the past or the future or, as Sexton suggests in the passage above, the nostalgic present.

One way to create demand

An ad from the early 1900s, with the opening paragraphs excerpted below:

Milly caught the bride's bouquet but everybody present knew that nothing would come of it...that she wouldn't be the next to marry by a long ways...and they knew the reason why, too.
 
People with halitosis (unpleasant breath) simply don't get by. It is the unforgivable social fault.
 
You never know when you have it—that's the insidious thing about it. Moreover, you are quite likely to have it, say dental authorities. Conditions present even in normal mouths constantly produce objectionable odors.
 
Don't take a chance
 
The one way to make sure that your breath does not offend others is to rinse the mouth with Listerine.
 

From How "Clean" Was Sold to America with Fake Science in Gizmodo. Worth a read if for no other reason than to see some of the photos of unbelievable ads about body odor. Even the use of the term “halitosis” to give bad breath a scientific-sounding name, as if it were some serious malady, is fiendishly clever. It's also shocking how the pictured ads are all targeted at women, reflecting the sexism of their times.

The tech industry has a lot to learn from consumer packaged goods companies about how to manufacture demand or consumer desire. The ads in this piece are negative and scare mongering, but companies like Procter & Gamble are just as good at generating desire with positive emotional triggers. It still feels like most tech brand advertising derives from viral stunts. One notable exception, of course, is Apple, most of whose ads now have more in common with fashion ads than technology ones.

The rise of the intangible corporation

Justin Fox quotes Oxford business professor Colin Mayer riffing on the seven ages of man from Shakespeare's As You Like It:

At first the merchant trading company established by royal charter to undertake voyages of discovery and promote commerce around the world. 
 
Then the public corporation created by Acts of Parliament to engage in major public works and the building of canals and railways. 
 
Then with the freedom of incorporation in the 19th century, the private corporation -- the seedbed of the industrial revolution and the manufacturing corporation.
 
Next comes the service firm and the rise of the financial institution.
 
The fifth age is the transnational corporation putting a girdle around the world and running rings around national governments.
 
Last scene of all that ends this strange eventful history is the mindful corporation -- sans machines, sans man, sans money, sans everything.
 

Mayer uses WhatsApp as his canonical example of a company with no assets and very few employees and yet a huge market cap (given its $22 billion purchase by Facebook), but just a short while before that, Silicon Valley was all abuzz about Instagram for the same reason, albeit at a lower price in relative terms.

Just wait until VR goes mainstream. The most valued bricks and mortar and real estate of today are digital. It's a lot cheaper than the real thing, and a whole lot less regulated, too. Tech companies do love their degrees of freedom.