To make progress in Pokémon Go, a mobile game that’s taken over America in the past week, you can’t just sit on your couch. You have to get out of the house and travel around town. Some Pokémon can only be found in watery locations. Others are only available at night.
When you find a Pokémon, the game shows the creature superimposed on your actual surroundings. Move your phone around, and the animated image moves too, making it appear as though the Pokémon is located in the real world.
But making that illusion truly convincing will require better technology. That's partly a matter of better software, but augmented reality won't really come into its own until companies like Apple and Samsung create hardware built for seamlessly merging the real and virtual worlds.
Pokémon Go is augmented reality in only the most superficial sense. When I first started playing the game yesterday, a Pokémon showed up on top of my desk. Then when I backed up, it slowly slid toward me until it appeared to be bouncing around on the floor under the desk. As I walked over to the window, it appeared to be floating through the air and then bouncing on the ledge outside.
In short, the creature didn’t seem to be very firmly connected to the physical world. And that broke the illusion augmented reality tries to create.
The basic problem here is that my iPhone is only aware of its location and surroundings in the crudest sense. It has a GPS chip, a compass, and an accelerometer, which together tell it only very roughly where I'm located and which way my phone is pointing. But the iPhone's camera doesn't have the ability to recognize or precisely track objects — something that human beings take for granted.
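To see why heading data alone is so crude, consider a minimal sketch (all names and numbers are illustrative, not from any real AR toolkit) of how a game could place a creature on screen: anchor it at a compass bearing and map the difference between that bearing and the phone's current heading onto a horizontal pixel position. A heading error of just a few degrees — well within a phone compass's typical noise — shifts the sprite by dozens of pixels, which is exactly the sliding-around effect described above.

```python
# Minimal sketch: placing an AR sprite using only a compass heading.
# All names and numbers are illustrative, not from any real AR SDK.

def sprite_x(anchor_bearing_deg, phone_heading_deg,
             fov_deg=60.0, screen_width_px=750):
    """Map the angle between the anchored creature and the camera's
    heading onto a horizontal pixel position (center = width / 2)."""
    offset = anchor_bearing_deg - phone_heading_deg
    return screen_width_px / 2 + (offset / fov_deg) * screen_width_px

# A creature anchored due north (bearing 0), phone pointed straight at it:
centered = sprite_x(0.0, 0.0)   # sprite sits at screen center (375 px)

# A 3-degree compass error moves the sprite by 37.5 pixels,
# so it appears to drift even though nothing has moved:
drifted = sprite_x(0.0, 3.0)
print(centered, drifted)
```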
Technology companies have been working on fixes for this problem. Last year, Intel introduced RealSense, a new category of chips that enable 3D photography. A RealSense-equipped phone wouldn't just capture a 2D image of my desk; it would also instantly capture the desk's three-dimensional shape (using either a range-finding sensor or a pair of cameras working together).
This kind of advanced sensor would enable apps like Pokémon Go to offer much more realistic augmented reality. Pokémon Go would know the exact size, shape, and location of my desk, allowing the game to appropriately scale the Pokémon image and place it in a precise location in 3D space. If an object passed in front of the Pokémon, the augmented reality camera would recognize this and render the nearby object in front of it.
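With a per-pixel depth map, the occlusion decision described above becomes a simple comparison: show the virtual creature only where it is closer to the camera than the real surface at that pixel. A rough sketch of the idea (function and value names are hypothetical):

```python
# Sketch of depth-based occlusion: the virtual object is drawn only
# where it is nearer the camera than the real-world surface.
# Depths are in meters; names are illustrative, not a real AR API.

def composite_pixel(real_depth_m, virtual_depth_m,
                    camera_pixel, virtual_pixel):
    """Return the camera's pixel when the real surface is in front
    (e.g. a desk edge between you and the creature), otherwise the
    virtual pixel."""
    if real_depth_m < virtual_depth_m:
        return camera_pixel   # real object occludes the creature
    return virtual_pixel      # creature is in front; draw it

# A creature placed 2 m away, with a desk edge at 1.5 m in front of it:
print(composite_pixel(1.5, 2.0, "desk", "pokemon"))   # -> desk
# The same creature against a wall 4 m away:
print(composite_pixel(4.0, 2.0, "wall", "pokemon"))   # -> pokemon
```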
A character in this more sophisticated version of Pokémon Go could also interact with its surroundings. It could leap off my desk onto the floor and start walking down the hall, or hide under the desk.
While games are the most obvious application for augmented reality, they’re far from the only one. For example, if you had a smartphone with a built-in 3D sensor, you could take a photo of a room and have it automatically compute the size of objects in the shot. Ikea might make an app allowing you to visualize how new furniture would look in your living room. Or an architect could show a client proposed changes as they walk through a building.
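The measurement idea in that example follows from the pinhole camera model: once a depth sensor tells you how far away a surface is, an object's real-world size is just its size in pixels scaled by depth over focal length. A sketch with made-up numbers:

```python
# Sketch: estimating an object's real-world size from a single depth
# photo, using the pinhole camera model. All values are illustrative.

def real_size_m(size_px, depth_m, focal_length_px):
    """real size = (size in pixels * depth) / focal length,
    with focal length expressed in pixels so units cancel."""
    return size_px * depth_m / focal_length_px

# A couch spanning 700 pixels in the shot, 3 m from the camera,
# with a focal length of 1500 pixels:
print(real_size_m(700, 3.0, 1500.0))   # -> 1.4 (meters wide)
```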
In the 18 months since Intel announced its RealSense chips, there hasn't been much progress in integrating this kind of technology into smartphones. But I think it's only a matter of time before it becomes commonplace. Google has a project called Tango that aims to bring augmented reality to Android phones.
But there still seems to be room for Apple to set the standard for smartphone augmented reality.
Building a high-quality augmented reality platform will require combining hardware and software innovations. Smartphones will need both 3D sensors and a software platform that allows app developers to make effective use of them.
Apple is the only company that makes both a popular smartphone and a mobile operating system (the iPhone and iOS). That means Apple is the only company that can guarantee to app developers that there will soon be millions of phones out there with both hardware and software support for its augmented reality platform.
This kind of hardware-software integration has long given Apple a leg up in pushing new technologies like Apple Pay and Touch ID. I'd love to see the company follow the same approach to bring better augmented reality to a future iPhone.