Whether I was gazing at a Masai tribesman or crystal clear water, the video I saw through a Gear VR headset looked clear and smooth. If I moved my head quickly enough, I could see the transition from a lower-grade to a higher-quality resolution, but for the most part the experience was fairly seamless. Since I was only viewing local video files, though, I can't say how the technology would hold up under typical streaming conditions. For what it's worth, they still loaded up quickly on the Gear VR.
Max Cohen, head of mobile at Oculus (which Facebook owns), also pointed out that the technology could make it possible to stream 6K video files. Since 360-degree videos have their pixels spread across the entire scene, they're not nearly as sharp as traditional 2D videos shot at the same resolution. So when comparing the same video in 4K and 6K, there was a noticeable bump in quality. I could make out individual blades of grass in a 6K video, for example, while the 4K version tended to look muddled.
Given that 4K streaming requirements are hefty for 2D videos (you'll need a connection between 12 Mbps and 15 Mbps at the least to stream them smoothly), adaptive streaming could make it possible for people with slower connections to enjoy the benefits of higher resolution video.
While Facebook is only discussing the technology around the Gear VR right now, I'd expect it to end up on the Oculus Rift eventually. It's also the sort of thing that every other VR platform will want, so now the question is whether Facebook will ever license it out. Expect to see plenty of similar solutions around VR moving forward, since focusing processing power on what you're viewing, as opposed to everything else in the scene, is a simple way to optimize. NVIDIA, for example, uses a similar method for anti-aliasing to smooth out 3D scenes without a huge performance hit.
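To make the core idea concrete, here's a minimal sketch of what viewport-adaptive streaming boils down to: split the 360-degree frame into tiles and fetch high-quality versions only for the tiles the viewer is currently facing. Everything here is hypothetical for illustration (the tile count, field of view, and quality tiers are assumptions, not Facebook's actual implementation), but it shows why the approach saves bandwidth: most of the sphere is behind you at any given moment.

```python
# Hypothetical sketch of viewport-adaptive tile selection for 360 video.
# Tile grid, quality tiers, and field of view are illustrative assumptions,
# not Facebook's or Oculus' actual scheme.

HIGH, LOW = "6K", "1080p"  # assumed quality tiers

def tile_qualities(view_yaw_deg, num_tiles=8, fov_deg=90):
    """Return a quality tier for each horizontal tile of a 360-degree frame.

    Tiles whose center falls inside the viewer's horizontal field of view
    get the high tier; tiles behind the viewer stay at the low tier.
    """
    tile_width = 360 / num_tiles
    qualities = []
    for i in range(num_tiles):
        center = i * tile_width + tile_width / 2
        # Shortest angular distance between the tile center and gaze direction
        delta = abs((center - view_yaw_deg + 180) % 360 - 180)
        qualities.append(HIGH if delta <= fov_deg / 2 else LOW)
    return qualities

# Looking straight ahead (yaw 0): only the two tiles spanning the forward
# 90-degree view are fetched in high quality; the other six stay low.
print(tile_qualities(0))
```

With eight tiles and a 90-degree field of view, only a quarter of the sphere streams at full quality at any moment, which is the rough intuition behind the bandwidth savings described above.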