But what if you’ve only got a single 2D shot? Since this is the year 2020, you’d think that by now we’d have Esper machines. That’s what Deckard used in the original Blade Runner, set in 2019, to zoom around in a 2D photo to discover data hidden in the third dimension:
We might not have Esper machines yet, but we’re getting closer. Last fall computer vision researchers Simon Niklaus, Long Mai, Jimei Yang and Feng Liu, of Portland State University and Adobe Research, released a paper called “3D Ken Burns Effect from a Single Image,” detailing a neural network they’d created to pull the trick off. An unaffiliated experimental coder named Jonathan Fly subsequently applied older depth-mapping techniques to the results in the research paper, yielding this:
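The core idea behind a depth-based Ken Burns move is simple to sketch, even if the paper's full pipeline is not: estimate a depth map, then shift each pixel by a disparity inversely proportional to its depth, so near objects slide more than far ones as the virtual camera moves. The toy below (my own illustration, not the authors' code) does this for a grayscale row of pixels, painting far pixels first so near ones occlude them; the gaps left behind are the disocclusions the paper's neural network learns to inpaint.

```python
import numpy as np

def parallax_shift(image, depth, camera_shift, focal=1.0):
    """Re-project a grayscale image for a horizontal camera move.

    Each pixel shifts by disparity = camera_shift * focal / depth,
    so nearer pixels (small depth) move farther -- the parallax cue
    a depth-based Ken Burns effect relies on. Holes (disocclusions)
    are left as zeros.
    """
    h, w = depth.shape
    out = np.zeros_like(image)
    # Disparity in whole pixels.
    disparity = np.round(camera_shift * focal / depth).astype(int)
    cols = np.arange(w)
    for y in range(h):
        # Paint far pixels first so nearer ones overwrite (occlude) them.
        order = np.argsort(-depth[y])
        src = cols[order]
        dst = src + disparity[y][order]
        valid = (dst >= 0) & (dst < w)
        out[y, dst[valid]] = image[y, src[valid]]
    return out

# One row: a near pixel (depth 1), a mid pixel (depth 2), two far pixels.
image = np.array([[10.0, 20.0, 30.0, 40.0]])
depth = np.array([[1.0, 2.0, 100.0, 100.0]])
print(parallax_shift(image, depth, camera_shift=2.0))
```

The near pixel jumps two columns, the mid pixel one, and the background barely moves; the zeros it leaves behind are exactly the revealed regions a single 2D photo has no data for.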
“Some images do well even with the older stuff, others are riddled with artifacts,” Fly writes. “I did use dramatic angles to make the failures stand out, in part because I do love a good artifact.”