Pokémon Go is the biggest breakout hit of the year, and though it may be starting to slip from its colossal popularity peak, it’s still a widely played game that millions enjoy on a daily basis. Part of what made it so eye-catching is its augmented reality feature. But as cool as that is, having a pseudo-hovering Pokémon superimposed over the real world doesn’t feel very much like reality.
To make the game more immersive, we’d need some way for the pocket monsters to interact with the environment they’re in, and have it react back. How would that be possible? A research team at MIT believes it’s found a way — through the use of micro-vibrations.
“Essentially, we’re looking at different frequencies of vibration, which represent a different way that an object can move. By identifying those shapes and frequencies, we can predict how an object will react in new situations,” Abe Davis, the lead researcher on the project, told Digital Trends. Along with fellow researchers Justin Chen and Fredo Durand, Davis has built upon the team’s previous work on visual microphones to draw even more data from standard video.
“One way to think about it is, if I point my camera at a bush and I watch the wind rustle that bush for a whole minute, I’m watching a bunch of tiny movements of the bush, which are responses to various forces,” Davis explained.
Those movements are categorized as vibrations operating at various frequencies. Software can then analyze the vibrations in the video, work out the forces that created those movements, and predict how larger forces, or different combinations of the same forces, might make the object react.
By recording the bush’s reaction to the wind, the software can eventually figure out how it might react to a brick — or Pikachu.
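The paper’s actual method is more sophisticated, but the core idea of pulling vibration frequencies out of video can be illustrated with a toy sketch. Here, a single tracked pixel’s motion over 300 frames is simulated as a 4 Hz oscillation plus noise (all values are hypothetical, not from the research), and a Fourier transform recovers the dominant vibration frequency:

```python
import numpy as np

np.random.seed(0)

# Hypothetical: motion of one tracked point over 300 frames at 30 fps,
# simulating an object vibrating at 4 Hz with a little camera noise.
fps = 30
t = np.arange(300) / fps
motion = np.sin(2 * np.pi * 4.0 * t) + 0.1 * np.random.randn(t.size)

# Fourier analysis: find the strongest vibration frequency in the signal.
spectrum = np.abs(np.fft.rfft(motion - motion.mean()))
freqs = np.fft.rfftfreq(motion.size, d=1 / fps)
dominant = freqs[np.argmax(spectrum)]
print(f"dominant vibration frequency: {dominant:.1f} Hz")  # → 4.0 Hz
```

In the MIT work, frequencies like this (and the motion shapes that go with them) characterize how the object tends to move, which is what lets the software simulate its response to new, unseen forces.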
Bringing pocket monsters to life
Extrapolating more than just visual data from video became a focus of Davis’ interest throughout his time at MIT, and it was ultimately the core of his dissertation. However, explaining just how visual data from a video can be used beyond the norm isn’t easy. When Pokémon Go was released, he saw a great way to break it down.
Davis is a Pokémon Go player, having reached level 19 at the time we conducted our interview. We were even introduced to his most powerful Pokémon — Fluffles, a CP 1,592 Arcanine, who’s been tearing up the gyms in his local area. Fluffles was caught at the SIGGRAPH conference where Davis and his fellow researchers first showed off vibration model technology.
… read the rest of this article by Jon Martindale on Yahoo Tech