Swiss researchers have discovered a way to take coherent 3D photographs using nothing more than a smartphone camera and a little arm waving.
To create a full 3D scan of an object, all users have to do is take a single picture and then move the phone around the object being photographed, according to Marc Pollefeys, a professor at the Institute for Visual Computing at ETH Zurich, a technical university in Switzerland.
The app, which Pollefeys demonstrated at a computer-vision conference in Sydney, Australia, on Dec. 4, automatically records images from the moving camera, decides which angles are needed to form a complete image and stitches them together, or tells the user what additional angles are needed.
The app uses the phone's inertial sensors to determine where the camera is in relation to the object and to decide which images to capture.
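The article doesn't describe the app's internals, but the basic idea of using inertial data to track viewing angles and spot coverage gaps can be sketched roughly. The following is a hypothetical illustration, not the ETH Zurich code: it integrates gyroscope readings into a heading around the object, then checks which angular sectors have been photographed.

```python
import math

# Hypothetical sketch (not the actual app's algorithm): dead-reckon the
# camera's heading around the object from gyroscope samples, then mark
# which angular sectors have been covered so the app can tell the user
# which additional angles are still needed.

def integrate_heading(gyro_z_samples, dt):
    """Integrate angular velocity (rad/s) about the vertical axis
    into a list of cumulative headings (rad)."""
    heading = 0.0
    headings = []
    for omega in gyro_z_samples:
        heading += omega * dt
        headings.append(heading)
    return headings

def coverage_bins(headings, n_bins=12):
    """Divide the full circle into n_bins sectors and mark which
    sectors the camera has passed through."""
    covered = [False] * n_bins
    for h in headings:
        sector = int((h % (2 * math.pi)) / (2 * math.pi) * n_bins)
        covered[sector] = True
    return covered

def missing_sectors(covered):
    """Return the indices of sectors not yet photographed."""
    return [i for i, c in enumerate(covered) if not c]
```

A real implementation would fuse accelerometer and gyroscope data (and visual tracking) for a full 3D pose rather than a single heading, but the coverage-checking idea is the same: sweep the phone until no sectors are missing.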
Other apps accomplish the same thing but rely on servers in the cloud to do most of the photo processing, which makes 3D scanning difficult inside museums or other buildings where a good wireless connection may not be available, Pollefeys said. "Only two years ago, such software ran only on massive computers. We were able to shrink the processes down to the smartphone level and make them highly efficient."
Because it captures all sides of an object, the app would allow users to take pictures of loved ones and decide afterward whether to display a face-forward or profile shot, for example, as well as allowing them to capture 3D images of statues or other objects.
The images can be examined in three-dimensional detail after the user is back in the office, and can be used in augmented-reality applications or for nearly any other purpose, without requiring expensive, compute-intensive photo assembly or post-production on high-performance hardware, or the skills of photographers and designers experienced in digital-image processing.
A video of the patent-pending app shows a researcher taking a 3D image of a bronze fountain statue while hardly moving the camera more than a few inches in any direction.
More information on Pollefeys' research and results is posted on the team's ETH Zurich site.
Image: ETH Zurich