Among the seemingly endless sessions at Google I/O, many were dominated by talks and interviews about Virtual Reality. Look at the VR landscape today and you are inundated with information about standalone devices that cost a ton of money, each providing a unique experience. Can you imagine getting the same experience with only your current device?
Well, that’s where Project Tango comes into play. Google began working on Project Tango back in November of 2012, and has now brought the project to a point where everyone can get a more in-depth look. Before jumping too far ahead, let’s take a look at what Project Tango is, according to the official definition:
Project Tango technology gives a mobile device the ability to navigate the physical world similar to how we do as humans. Project Tango brings a new kind of spatial perception to the Android device platform by adding advanced computer vision, image processing, and special vision sensors.
This may seem confusing, but we were able to catch up with members of the Project Tango team last year at Google I/O 2015, and they offered a great explanation of what this project is.
Project Tango uses three pieces of information to create these experiences in Virtual and Augmented reality:
- Motion Tracking
- Depth Perception
- Area Learning
In order for Project Tango to make the magic happen, these three capabilities rely on components built into the device being used: computer vision processors, depth perception sensors, and a motion-tracking camera. Once that hardware is in place, and an application has been developed to access it, you can be immersed in a one-of-a-kind experience with nothing more than your mobile phone.
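To make the idea concrete, here is a minimal sketch (not the actual Tango API; the function names and the flat-ground, yaw-only rotation are simplifying assumptions) of how motion tracking and depth perception combine: the motion-tracked pose tells the device where it is and which way it faces, and a depth reading projected along that heading pins a real-world surface to a point in world space.

```python
import math

def rotate_yaw(point, yaw):
    """Rotate a 3-D point (x, y, z) about the vertical axis by yaw radians."""
    x, y, z = point
    c, s = math.cos(yaw), math.sin(yaw)
    return (c * x + s * z, y, -s * x + c * z)

def depth_to_world(device_position, device_yaw, depth_m):
    """Place a depth sample in world space using the motion-tracked pose.

    device_position: (x, y, z) of the device, from motion tracking
    device_yaw:      heading in radians, from motion tracking
    depth_m:         distance to the surface, from the depth sensor
    """
    # The camera looks down its local -z axis, so a depth hit lies
    # depth_m straight ahead in the device's own frame.
    local_hit = (0.0, 0.0, -depth_m)
    dx, dy, dz = rotate_yaw(local_hit, device_yaw)
    px, py, pz = device_position
    return (px + dx, py + dy, pz + dz)

# A device at the origin, facing down -z, sees a wall 2 m away:
print(depth_to_world((0.0, 0.0, 0.0), 0.0, 2.0))  # (0.0, 0.0, -2.0)
```

Area learning is what makes these world-space points durable: by remembering visual features of a space, the device can relocalize and keep previously placed points anchored across sessions.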
Now that we’ve explained what Project Tango is, and what it needs in order to work, let’s talk about the Day 2 session at Google I/O. Johnny Lee, the Program Lead for the project, took to the stage and explained the “Why, What, Where” of Project Tango compared to this time last year.
The session started off by retracing the path Project Tango has traveled since its start over three years ago, in 2012, when there was nothing more than a concept drawing. We were then taken through the various milestones Tango has reached between its inception and now, with the major ones spanning both facets of the project: hardware and software.
With the release of Android N, compatible Project Tango devices will gain much more functionality, in part through new APIs introduced with Google’s latest software release. In addition to what Android N brings, Google is teaming up with Qualcomm to provide Snapdragon processors for Project Tango-capable devices. These processors will be specifically designed to help create a smooth transition between Augmented Reality and Virtual Reality experiences on your mobile device.
That brings us to one of the major demonstrations of the Project Tango session at Google I/O. A few apps were put to the test on stage, and none of them required any additional hardware beyond the developer version of the Project Tango tablet.
During the Project Tango session on Day 2 of I/O, these applications were used to showcase the power of what will be possible in the very near future:
- Dinosaurs Among Us
- and more…
Also in this session, it was announced that Lenovo will unveil the first consumer-focused Project Tango-enabled mobile phone at Lenovo TechWorld 2016 on June 9th.
You can learn more about what Project Tango is, and check out the development progress, here. Let us know what you think about Virtual and Augmented Reality, and how they could be integrated into your day-to-day life, in the comments below.