Since the Talk

Exclusive demo: Blaise Agüera y Arcas shows how the new Photosynth lets you swoop through 3D space

Posted by: Kate Torgovnick May

About a decade ago, Blaise Agüera y Arcas and his team started on a not-so-small mission: “To reinvent the whole enterprise of photography for ordinary people.”

He revealed the first step back at TED2007, in the talk “How PhotoSynth can connect the world’s images.” In a viral talk dotted with ooohs and aaahs from the audience (his strategy? “I talked really fast”), Agüera y Arcas demoed Photosynth, which let users take multiple images of the same physical space and assemble them together. “[Each photo was] like a 2D poster hanging in space, and we tried to fit those posters onto something like the real 3D surface it represented,” says Agüera y Arcas. It was cool — but, he admits, a bit glitchy.

This week, Agüera y Arcas’ team, now at Microsoft Research, released the new Photosynth, a set of tools that lets anyone create a 3D experience that swoops down a path or spins around an axis. While these new “synths” look Steadicam-smooth, they’re made by everyday users with camera phones, snapping anywhere from a dozen to 100+ images, often sans tripod. And it’s a quantum leap from the already-jaw-dropping tech of his viral demo.

“[Photosynth 1] worked really well with pictures that hung on a wall — a close-up of a painting, for example. It worked less well if you were taking photos of real, three-dimensional environments. You can’t really reduce that to a series of posters,” says Agüera y Arcas. “The other problem with Photosynth 1 was that because we didn’t impose any constraints on the way the camera moves, or how you took your photos to make your synth, you ended up with these tangled spiderwebs of photos. It was generally unclear how to move around in it … It also took a really long time to shoot all of those pictures because, you know, you’re trying to cover space with your images.”

The team thought about these issues before releasing Photosynth as an app for mobile devices.

“That was the first full-360º, spherical panorama stitcher made for a mobile device, and so it was clearly revolutionary,” says Agüera y Arcas. (Apple and Google quickly followed suit.) “You could aim your iPhone around from one spot, and it would stitch — in real-time — images to cover the entire sphere, to make spherical panoramas.”

But still, Agüera y Arcas and his team — including Photosynth project lead David Gedye — were not completely pleased.

“Where we sort of failed, I think, was that the panoramas never stitch,” Agüera y Arcas says. “When you hold the mobile phone to make the panorama, you never do a good enough job for the result to be useful for 3-D reconstruction … You can’t hold the camera still enough — you always end up with seams and things.”

Rather than hammering away at this frustration, Agüera y Arcas turned the problem on its head. “We realized that we could really make hay out of the idea of camera movement during the capture of a panorama instead of fighting it,” says Agüera y Arcas. “So instead of insisting that you hold the camera totally fixed, we can say, ‘Look, there are a bunch of different ways of moving the camera.’”

To capture a synth, the photographer can move in many ways: walking with the camera to capture forward movement, scanning the camera sideways to capture a flat surface, moving the camera around a point to create a spin, or moving it in an arc to create a panorama. Says Agüera y Arcas, “We can use the movement of the camera to reconstruct parallax in 3D from that set of photos. Instead of taking the set of photos and trying to fuse them, as we did with the original panorama app, we can leave them unfused and reconstruct 3D, and generate these kind of swooshy, continuous things.”
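The core idea here — recovering depth from how features shift as the camera moves — can be illustrated with basic triangulation. This is a minimal sketch, not Microsoft’s actual Photosynth pipeline: it assumes a calibrated camera translated sideways by a known baseline (like the “scanning sideways” capture mode above), and the function name is our own invention.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Triangulate the depth of one feature from its parallax shift.

    Classic stereo relation: z = f * B / d, where f is focal length
    in pixels, B is how far the camera moved (meters), and d is how
    many pixels the feature shifted between the two frames.
    """
    if disparity_px <= 0:
        raise ValueError("feature must shift between frames to give parallax")
    return focal_px * baseline_m / disparity_px

# A feature that shifts 40 px between two frames taken 0.5 m apart,
# through a lens with a 1000 px focal length, lies 12.5 m away.
print(depth_from_disparity(1000, 0.5, 40))  # 12.5
```

Nearby objects shift more than distant ones as the camera moves, which is exactly the parallax a synth exploits: instead of fusing the frames into one flat image, the shifts between them become the 3D structure.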

The results are true three-dimensional panoramas, where the depth of field changes and things move in and out of focus as the viewer swipes or moves their mouse. This incredible view of St. Bonaventure Church in Philadelphia, midway through its demolition, allows you to spin around a column as the place eerily crumbles. An unreal flight over the Valley of Silence in Nepal, with Mount Everest on your left, leaves you feeling as if you’re airborne. And a very lifelike walk through Fremont, Seattle, could leave you feeling like you could knock on a door or step aboard a houseboat. Agüera y Arcas shows many, many more examples of what is possible with the new Photosynth in his demo, above.

Want to see the new Photosynth in action? Click on this image of St. Bonaventure Church to give it a road test.

Synths are clearly distinct from photos. But they are also a distinct form from videos, says Agüera y Arcas, and not just because each frame of a synth is high-def. “Video is purely one-dimensional. It’s meant to be played forward at one speed. This is something that you can swipe, a range of views,” he says. “It is about spatial exploration, primarily, not temporal exploration. Although given that you often take those pictures in sequence, you certainly can get hyperlapse-like things, where time and space both flow together.”

Yes, synths are “really interesting social media artifacts,” says Agüera y Arcas. But his team imagines that, down the road, synths could be linked together to form a spiderweb of 3D spaces, snapping together like Lego blocks.

The new Photosynth represents many years of work for his team at Microsoft. But Agüera y Arcas estimates that only about 15 to 20% of his lab are working on it right now. As for what the others are cooking up, we’ll just have to see. “Most of it I can’t really talk about yet,” Agüera y Arcas tells the TED Blog. “But it’s in a much broader, natural user interface, wearables, virtual reality kind of space.”

Watch Agüera y Arcas’ past talks: