Running time: 8:42
With the release of iPhone OS 3.1, application developers will finally be able to develop augmented reality (AR) apps. In other words, Terminator Vision is right around the corner. I recently talked to Chetan Damani, one of the founders of Acrossair, about their new AR application, Nearest Tube.
According to Damani, Nearest Tube uses the iPhone camera to display whatever the iPhone is pointed at, overlaid with data about the nearest London Tube stations. He says that the 3GS is a great platform for AR applications. "This only works on the iPhone 3GS phone, because what we need is the actual compass that's built into the 3GS. That compass allows us to see which direction the user's pointing their iPhone in. Without that compass, we wouldn't know which direction they're pointing in, so we wouldn't be able to overlay the data in the relevant spots. And the 3GS is quite a powerful phone. It's got enough megahertz, about 600 megahertz. It's got twice the memory of the 3G phone. And we haven't come up against any limitations in the performance at all. It's also got a very good OpenGL engine built into it, which allows us to create and render these graphics quite easily."
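The tilt sensitivity Damani mentions later falls out of how a magnetometer-based compass works: the heading is recovered from the horizontal components of the Earth's magnetic field, so how the device is held changes which components are usable. A minimal Python sketch of the flat-device case, with an assumed axis convention (not the iPhone's documented one):

```python
import math

def flat_heading(mx: float, my: float) -> float:
    """Compass heading in degrees (0 = north, 90 = east) from raw
    magnetometer x/y readings, assuming the device is held flat.

    Axis convention is an assumption for illustration: +x points out
    the top of the device, +y out its left side. A real implementation
    would also tilt-compensate using accelerometer data."""
    return math.degrees(math.atan2(my, mx)) % 360
```

Pointing the assumed +x axis at magnetic north gives a heading of 0; rotating the device clockwise increases the heading toward 360.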
He went on to say that it's pretty simple to build AR applications using the new 3.1 APIs, due out in September. "It's a pretty straightforward API. There's no complexity in there. All it does is switch on the video feed in the background. That's the only API that's published. All we're doing is using that video feed at the back. It just displays the video feed as if it's a live camera feed. And on top of that, all you need to do is calculate where the user is, utilize the compass to identify which direction the user is actually pointing their iPhone, and then you can just overlay the data on top of that. All you need is the geolocation data. So if you have the longitude and latitude information, you put that into a database, and then you can call that longitude and latitude information to plot it onto this 3D landscape. And it's not going to be that difficult to do. The difficulty really comes in when you want to try to create a much better user experience and when you want to start doing more complicated things with the augmented reality. So displaying information on top of an iPhone with a video feed isn't going to be so hard. The difficulty, as I mentioned, is going to be getting hold of the data in the first place and then trying to do something really interesting with that data."

One of the limitations developers are going to run up against, according to Damani, is that while the compass data is fairly accurate, the limitations of the GPS will make very fine-grained positional data difficult to get. "The compass is 3D, so that's the main issue with it. It's not an actual real compass that's sitting inside. It's something called a magnetometer. And what it does is it provides a complete 3D compass. That's why it's not 100 percent accurate when you hold it flat. But as you put a slant on it, you do get a bit more accuracy in there. So the compass isn't perfect. The other issue is on the GPS side of things: that's only accurate to about 20 feet.
So we can't accurately identify where that user's standing. It can only be accurate to within a 20-foot radius. So we're not able to identify objects which are very close to each other, or identify objects in a very small space. So, for example, I can show you where a store is, but I won't be able to show you the jeans section within a store."
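The pipeline Damani outlines (take the user's latitude and longitude, read the compass heading, and overlay each point of interest where it falls in the camera's view) can be sketched in Python. The field-of-view and screen-width numbers below are illustrative assumptions, not measured iPhone values:

```python
import math

def bearing_to(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing in degrees (0 = north, 90 = east)
    from the user's position to a point of interest."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360

def screen_offset(heading, bearing, fov_deg=55.0, screen_width=320):
    """Map a POI's bearing to a horizontal pixel position on the live
    camera view, or None if it lies outside the camera's field of view.
    fov_deg and screen_width are assumed values for illustration."""
    # Signed angular difference, normalized into -180..180.
    delta = (bearing - heading + 180) % 360 - 180
    if abs(delta) > fov_deg / 2:
        return None
    return screen_width / 2 + delta / fov_deg * screen_width
```

A POI dead ahead lands in the middle of the screen; one just inside the edge of the field of view lands near the screen border; everything else is culled.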
Damani says that the simplicity of the API will lead to a lot of basic applications, but the real value will come from developers who take the time to produce more polished apps. "There's a couple of different ways people can do this. It's not so difficult to do an application using an augmented reality view and then overlaying information on top of that. So you could simply do an augmented reality view which is showing the video feed at the back and then just putting a couple of dots on top of it, which is what other browsers and other application developers have started doing, like with Twittaround and applications like that. The only difference with our stuff is that we're using OpenGL, which just makes it look as if the object's floating in midair rather than an overlay on top of the screen. So as soon as 3.1 is actually launched, we'll probably see a number of apps with augmented reality going out onto the market.
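The "floating in midair" effect Damani attributes to OpenGL comes from placing labels at real 3D positions and letting perspective projection scale and shift them with distance, instead of stamping flat dots on the screen. A minimal pinhole-projection sketch of that idea, with an arbitrary illustrative focal length:

```python
def project(x, y, z, focal=1.0):
    """Pinhole-perspective projection of a camera-space point (x, y, z)
    onto the image plane. The division by depth z is what makes an
    object twice as far away appear half the size, so a label reads as
    anchored in the world rather than pasted on the screen.
    focal=1.0 is an arbitrary value for illustration."""
    if z <= 0:
        return None  # behind the camera: not visible
    return (focal * x / z, focal * y / z)
```

In a real app this projection (plus the model-view transform built from the compass heading) is what the OpenGL pipeline performs for every frame of the camera feed.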
It's all about who's going to have the most data and the most valid data. So there's the obvious types of apps which you're going to launch, and those are the find me my nearest bar, find me my nearest event, find me the nearest tube stop, find me the nearest ATM. And those sorts of apps are all going to be around. But they're only going to be useful when you're trying to look for things. So if we want to get users to use augmented reality a little bit more, we have to start introducing other bits of functionality, things like show me the offers available in a particular high street. Show me when I'm walking down a high street if there's a table available at a particular restaurant. And it's that sort of interactivity and providing that real-time data in this augmented reality view which is going to start getting people to use it a lot more, rather than just for show me where the nearest thing is."
Acrossair hasn't even released Nearest Tube yet, since it's waiting for the official iPhone OS 3.1 release, but Damani is already looking forward to more advanced products. "We're working on a lot of these augmented reality, out-of-the-package applications, closed applications. We'll be launching a couple of others. But where we see the real space is creating the whole browser. Having an augmented reality browser which users download onto their iPhone, and within that browser, they can look at any number of data feeds which someone opens up and then overlay all of that information based on where they're standing. So a typical use case for this might be: as a user's coming out of a tube stop, he doesn't really know where he wants to go, where his friends are located. He switches on the browser. He switches on the ability to view where all of his friends are located. He switches on the ability to see if there's any events happening in that area around him. He switches on the ability to see if there's any restaurants around him. And then he can also switch on any other data feeds and then just pull up the browser and be able to physically see where his friends are, where events are, and all sorts of other stuff. So it's just creating a single browser to have a look at all of this data in a single interface, rather than having multiple applications out there. I think the real power is in having that browser."