Style transfer

Frank Liu developed a technique he calls “style transfer” in which he renders an image in the style of another. 

Bhautik Joshi riffed off of that for video, resulting in experiments like this rendering of clips from Blade Runner in the style of Van Gogh's Starry Night. 

Prisma has capitalized on this technique by taking it mobile. I've enjoyed using the app to render some photos in my camera roll in a variety of styles. I had previously found expensive apps and plugins on the desktop that could do something like this, but now it's available in a free mobile app. Any sufficiently advanced technology is indistinguishable from a free app on your phone, eventually.
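Under the hood, the neural approach to style transfer works by matching statistics of convolutional feature maps; the usual "style" statistic is the Gram matrix of a layer's channels, which captures texture while discarding spatial layout. A minimal NumPy sketch of that style loss, with random arrays standing in for real CNN activations (the function names and toy data are illustrative, not any particular library's API):

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a (channels, height, width) feature map:
    channel-by-channel correlations, which capture texture/'style'
    while discarding where things sit in the image."""
    c, h, w = features.shape
    flat = features.reshape(c, h * w)
    return flat @ flat.T / (c * h * w)

def style_loss(feats_a, feats_b):
    """Mean squared difference between the two Gram matrices."""
    ga, gb = gram_matrix(feats_a), gram_matrix(feats_b)
    return float(np.mean((ga - gb) ** 2))

# Toy feature maps standing in for CNN activations (hypothetical data).
rng = np.random.default_rng(0)
content = rng.standard_normal((8, 16, 16))
style = rng.standard_normal((8, 16, 16))

print(style_loss(content, content))       # 0.0 — identical style
print(style_loss(content, style) > 0.0)   # True — different styles
```

In a full implementation, an output image is iteratively optimized so that its Gram matrices match the style image's while its raw feature maps stay close to the content image's.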

I'm looking forward to when Prisma can handle video; it's only a matter of time before style transfer videos like the one above are flooding your social media feeds.

A market opportunity for human ingenuity remains, however, for those who can actually transfer not just the visual style but the entire cinematic grammar of one artist to another. What if Werner Herzog directed Toy Story? What if Stanley Kubrick directed Star Wars? Who wants to see Terrence Malick's take on Captain America?

I draw much too much pleasure from style transfer in prose, like imitations of Hemingway, Cormac McCarthy, and the like. James Wood once wrote of McCarthy:

To read Cormac McCarthy is to enter a climate of frustration: a good day is so mysteriously followed by a bad one. McCarthy is a colossally gifted writer, certainly one of the greatest observers of landscape. He is also one of the great hams of American prose, who delights in producing a histrionic rhetoric that brilliantly ventriloquizes the King James Bible, Shakespearean and Jacobean tragedy, Melville, Conrad, and Faulkner.

My favorite photography tip

Someone was asking me the other day about which of several cameras to buy to improve their photography. The typical photography enthusiast's response to that question is that the camera doesn't matter, and it's the stock response because it's true. However, I'm not immune to a bit of gadget lust so I'll always offer some recommendation if asked.

That said, one of the simplest tips for improving one's photography is one I learned from filmmaking, and it takes no skill to understand or apply. It's a concept that everyone who's ever watched a lot of movies knows, even if unconsciously.

Take photos where your subjects aren't looking directly into the lens.

In the movies, looking into the camera is known as breaking the fourth wall. It takes actors years of practice to stay aware of where the camera is while resisting the temptation to let it affect their performance. In one student film I worked on, one actor constantly made the mistake of either glancing at camera or actively avoiding the camera with his gaze, ruining multiple takes.

[Most people recognize bad acting, but few understand just how hard it is to be an actor. I had to take an acting class in film school even though I was in the directing program, and it was one of the most frightening and uncomfortable things I've ever done, and it gave me an appreciation for actors that will last until I'm six feet under.]

Occasionally a director will break the fourth wall intentionally, but in the vast majority of narrative movies, the actors never look at the camera. Not once. To do so would break the fictive dream.

The reason this tip works is that photos are stories, and whenever your subject looks into the lens, the story is almost always, “Someone posed for a photo at this place.” It's a story, to be sure, just not an interesting one. What's more, it draws attention to the invisible photographer, too, since the subjects are responding to a camera being held by a person.

The vast majority of photos I see on Instagram or Facebook are of people, and of those, the vast majority feature people looking into the camera. Selfies, posed group photos. They lack, for the most part, drama. They function as visual check-ins. Looking through a bunch of those in a news feed is like scanning a newspaper column which reads, “So and so were here. So and so were there. She was there. They were there.”

The moment your subject looks off camera, suddenly you are a photojournalist. For the viewer, it's as if they are transported to that place, and it puts them in a different state of engagement with the photo, a state of awakening. Suddenly they examine the body language and the arrangement of subjects within the scene (the mise en scène) to try to understand what is happening, just as viewers do when they watch TV and movies. Because the subject isn't looking into camera, the viewer doesn't read the body language as a pose, they read it as natural and thus more honest, worth deeper scrutiny.

As with most tips and rules, this one is not universal. Sometimes a gaze into camera is disarming, makes the viewer complicit, catches them in a moment of voyeurism. Sometimes you really do just want the subject gazing into the lens, as in much of portraiture. And sometimes the story you want to tell really is that some people were in a place together.

But if you're a beginning photographer and want to improve, that's my favorite tip because most photographers who are starting out take lots of pictures of people. At the next wedding or night out, try to shoot all your photos this way. It will force you to work a bit harder because when people pose for a head-on portrait the shooting angles are quite limited, but when you're photographing people who aren't looking at camera you can shoot them from almost any angle. It will cost you nothing.

There is no #nofilter

Our filter-friendly life is coming under so much scrutiny lately. First, hipster Barbie touched a nerve by parodying all the stereotypes of the Instagram hashtag #liveauthentic. Now, Chompoo Baritone, a Thai photographer based in Bangkok, has published a new series of pictures that reveal all the messy reality we usually crop from our social media shots.
It’s the predictable round up of relaxing work spaces, quiet alone time in the sun, glamorously plated fresh vegetables and—of course—a copy of indie lifestyle magazine Kinfolk. But in Baritone’s photos, the rest of the scene remains in the frame: curious pets and onlookers, messy plates of food, clothes piled on the bed.

Via Quartz. See Baritone's full album here.

This is why I find the #nofilter tag so amusing. Even if you don't apply one of Instagram's preset filters, the mere act of photography is a form of filtering. It's not just what you crop out of the photo, but how many pics you toss before finding one suitable for sharing with your social media following.

Kylie Jenner, someone you might call a “selfie professional,” admits to taking about 500 selfies for every one she publishes. You can see that as vain, if you'd like, but in a world where that digital photo is reputational currency, especially for a celebrity who most people won't ever see in person, it is also rational behavior.

Everyday messenger bag

I've backed a ton of Kickstarter projects in my day, and experience tells me that projects that require electronics and hardware almost never ship on time. They are almost always manufactured somewhere in China, and somewhere along the line, some part doesn't come back the way the creators hoped, and it's back to the drawing board, and the promised release date becomes a joke, albeit not an amusing one for either party.

Physical goods that don't require electronics, however, have shipped on time more often than not. Fashion items, for example. Thus I'm hopeful to receive one of these Everyday Messenger Bags by the end of 2015, hopefully before Christmas vacation.

I've long wanted a bag that would hold both a 13" to 15" MacBook Pro and a camera body plus a few lenses. This looks like it might fit the bill, and the latest update includes a potentially clever tripod carrying solution.

Check out the video below and/or the Kickstarter page for more detail.


Light

The smartphone is the dominant camera in the world now, combining best-in-class portability with good-enough quality. I still own an SLR, though, because in some special situations, the superior lens selection and larger sensor are worth the larger form factor (among other qualities).

Light is a company with a novel approach to bringing smartphone cameras up to SLR quality.

Rather than hewing to this one-to-one ratio, Light aims to put a bunch of small lenses, each paired with its own image sensor, into smartphones and other gadgets. They’ll fire simultaneously when you take a photo, and software will automatically combine the images. This way, Light believes, it can fit the quality and zoom of a bulky, expensive DSLR camera into much smaller, cheaper packages—even phones.
Light is still in the early stages, as it doesn’t yet have a prototype of a full product completed. For now it just has camera modules whose pictures can be combined with its software. But the startup says it expects the first Light cameras, with 52-megapixel resolution, to appear in smartphones in 2016.

Artist's rendering of what a Light camera array on a smartphone might look like.

I've been curious to see how smartphones continue to improve camera quality. This seems like one credible vector.
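One intuition for why merging many small captures can approach a large sensor: averaging N independent exposures of the same scene reduces random noise by roughly √N. A toy NumPy sketch (the noise level and frame count here are made-up numbers for illustration, not Light's actual specs):

```python
import numpy as np

rng = np.random.default_rng(1)
scene = 0.5      # true brightness of a flat gray patch
n_frames = 16    # e.g. sixteen small camera modules firing at once

# Each small sensor records the scene plus independent noise (sigma = 0.1).
frames = scene + rng.normal(0.0, 0.1, size=(n_frames, 64, 64))

single_noise = frames[0].std()           # noise of one capture
merged_noise = frames.mean(axis=0).std() # noise after averaging all frames

print(single_noise / merged_noise)  # roughly sqrt(16) = 4
```

The real system also has to align and merge images taken from slightly different positions, which is the hard part — simple averaging only works for perfectly registered frames.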

Deliberate underexposure with Nikon DSLRs

Nikon has been killing it with its DSLR sensors in recent years in terms of how much detail can be pulled out of the shadows, and Deci Gallen has a great piece on a creative way of shooting that exploits that capability.

As photographers, we strive for correct exposure but the ability of modern Nikon cameras to find details in shadows opens up a debate as to what correct exposure actually is. More and more, I find myself technically exposing wrong with post-processing in mind.

As a wedding photographer, my wife and I often find ourselves shooting portraits when the sun is highest in the sky: conditions generally considered to be unfavorable in portraiture. In the past these situations were addressed with fill flash, reflectors or frantically searching for open shade. The current range of Nikons gives us another option – creative underexposure.


Having the ability to draw details from shadows so cleanly has changed not only how we shoot and post-process, but also the equipment we need to take certain kinds of shots.

Our flash triggers have been mostly redundant for 2 years now and our flashguns only really come out on the dance floor. We don’t use reflectors at all. The extra couple of minutes spent in post is negated by the time saved setting up equipment while shooting — allowing us to spot and shoot scenes quickly, taking advantage of beautiful but often fleeting lighting conditions.

Check out the piece to see some examples of what's possible.

I had skipped a few generations of Nikon DSLRs and found myself reaching for mine less and less, given the weight of a fully loaded body, but I recently picked up a D750 and it has won back my mindshare from other cameras like my iPhone.

The D750 isn't in Nikon's pro line of DSLRs, with their built-in vertical grips and magnesium body construction, but that means it's much lighter. I love that it has integrated Wi-Fi so I can quickly get pictures from my DSLR to my iPhone. It's something Nikon should've added years ago and that all modern DSLRs should have as a default feature, and I doubt I'll ever buy another camera that doesn't check that box.

And yes, the shadow recovery is fantastic. I've pushed shadows in RAW photos out of the D750 up to 4 stops, and I've heard that 5 stops is possible. Even before reading Gallen's article I'd been shooting as he recommends, usually with exposure compensation of -0.3 to -0.7 turned on by default. To me, it's far more convenient to shoot this way and bring shadows up in Lightroom than to shoot two or three photos at different exposures and blend them using Photoshop or something like HDRSoft's Photomatix Pro. Call it the lazy man's HDR.
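For reference, a photographic stop is a doubling of light, so a 4-stop shadow push is a 16× gain applied to the linear sensor data. A toy sketch (the pixel values are hypothetical, and real raw processing also involves white balance, tone curves, and noise handling):

```python
import numpy as np

def push_exposure(linear_rgb, stops):
    """Brighten linear sensor values by a number of stops.
    One stop doubles the light, so the gain is 2**stops,
    clipped so nothing exceeds full white (1.0)."""
    return np.clip(linear_rgb * (2.0 ** stops), 0.0, 1.0)

shadow = np.array([0.02, 0.03, 0.05])  # a deep-shadow pixel, linear scale
print(push_exposure(shadow, 4))        # 16x gain: 0.02 becomes 0.32, etc.
```

This is also why shadow pushes amplify noise: the sensor's noise floor gets multiplied by the same 16× gain, so the cleaner the sensor, the further you can push.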

Total recall

Rob Smith uploaded photos from a holiday in France to Google+ which processed the pics using its AutoAwesome algorithm(s). When Smith went to look at the results, he spotted something peculiar with one of the photos.

It’s a nice picture, a sweet moment with my wife, taken by my father-in-law, in a Normandy bistro. There’s only one problem with it. This moment never happened.

I haven't opened Google+ in ages, so I had no idea its algorithms would try to combine the best elements of photos shot in a burst. In this case, it looked to be trying to capture the best smiles from all the faces in the photo to put into a single composite.

You may say that the AIs in the cloud helped me out, gave me a better memory to store and share, a digestion of reality into the memory I wish had been captured.

But I’m reasonably sure you wouldn’t say that if this were a photo of Obama and Putin, smiling it up together, big, simultaneously happy buddies, at a Ukraine summit press conference. Then, I think algorithms automatically creating such symbolic moments would be a concern.

And why am I saying “then”? I’m certain it’s happening right now. And people are assuming that these automatically altered photos are “what happened”.

This was fairly seamless work on the part of the AI. Search for panoramic photo fails and you'll see some of the Frankensteinian horrors that can result from digital stitching gone awry. I half suspect the Human Centipede movies were inspired by a botched panoramic pic.