NAB 2016: Virtual and Augmented Reality Recap

The horizons of virtual and augmented reality were expanded in big ways at this year’s NAB. This was the first year that VR/AR had its own pavilion, which was packed and constantly busy, with mind-blowing gear and a host of exciting new platforms and breakthroughs to sample. There were big showings from the heavy hitters (Oculus, HTC, Samsung, Nokia), as well as compelling products and demonstrations from a number of up-and-coming companies.

The Kaleidoscope VR community held a three-day immersive exhibition showcasing 30 different cinematic experiences, such as this Mike Tucker-directed experience titled “Tana Pura” set to music composed by Radiohead’s Jonny Greenwood (There Will Be Blood, Inherent Vice): mike-tucker.com/15/tanapura.

There are so many new developments worth discussing, but we’ve condensed here, for your convenient consumption, some of the most compelling NAB 2016 exclusive VR/AR debuts.

GoPro Releases GoPro Omni and the GoPro VR Platform and Apps
GoPro made a big splash this year in the VR/AR realm. Strictly for professionals, GoPro’s 16-camera Odyssey 360° camera rig made its NAB debut. GoPro also introduced a more accessible option closer to prosumer territory, the Omni, a new six-camera 360° video rig which the company plans to start shipping by August. Priced at $5000 USD, the full Omni package includes the rig, six Hero4 Black cameras, and a bundled Autopano software package from Kolor. Alternatively, you’ll be able to purchase the rig on its own for $1500 USD. Benefits of the Omni system are its price point, its built-in automatic genlock feature, and the intuitive Kolor software package optimized specifically for the system. Falling in line with GoPro’s general design principles, the Omni rig has a simple, rugged design and a small form factor, which we imagine will help facilitate widespread early adoption of this system. It’s easily held in your hand and can be securely attached to vehicles, gear, and whatever else.

GoPro also debuted their new VR app for web, mobile, and VR headsets. The platform allows users to share, view, and interact with VR content with headsets or by simply using their traditional screens (mobile, PC, TV) as a virtual portal into the original 360°/VR content from GoPro and a global community of artists. GoPro’s VR platform is free and available now on the web, and is set to directly compete with YouTube. Check out GoPro VR here: vr.gopro.com/browse/1.

Plus, GoPro had live demonstrations of a new motorcycle-mounted “custom solution” that provides live, wireless, broadcast-quality 360° immersive video, powered by GoPro’s HEROCast. This “custom solution” will be used in the coming months by MotoGP, AMA Pro Flat Track, and MotoAmerica.

Teradek’s Sphere
With Teradek’s new Sphere, live streaming 360° video content became a breeze (or at least a whole lot breezier). Just plug your 360° camera rig into the Sphere’s four HDMI inputs, and you can live-stream your 360° content to any compatible online video platform, including Wowza. The Sphere can stream 4K video at up to 10 Mbps with only four frames of latency. Through Teradek’s proprietary encoding software (iOS and OS X), you can remotely monitor the video in real time and calibrate color and stitching, too. Daisy-chained with a second unit, the system supports up to eight cameras simultaneously. The Sphere will ship by June of this year with a price tag of $3000 USD. Learn more about Sphere: teradek.com/pages/sphere

Also, Teradek did a slick and informative live stream at this year’s NAB. Check out their videos here: teradek.com/pages/nab.

Mobius POV Rig

Radiant Images and VRLIVE
The leading-edge cinema specialists at Radiant Images showcased their latest in VR gear and solutions at both the Band Pro booth and the Codex booth. Attendees had a chance to get hands-on with some of the latest VR camera systems, including the new Nokia OZO, Dark Corner from Sony, Mobius POV, and Headcase Codex VR.

Radiant Images partnered with the VRLIVE platform to live-stream 360° video of the event. For the Codex booth live-stream, they utilized the Headcase VR rig, which has also been used on such projects as a VR experience for Guillermo del Toro’s The Strain TV series at last year’s San Diego Comic-Con, and a special VR project with WWE at their Summerslam event last year. At the Band Pro booth, Radiant and VRLIVE used the Sony Dark Corner. More info about VRLIVE: radiantimages.com/cameras/vr360/1029-vrlive

Radiant Images had custom VR power and mobility solutions on display, as well.

The Foundry’s CARA VR Toolset
The Foundry, experts in software development for creatives, introduced their new CARA VR toolset. CARA VR is designed to work with The Foundry’s NUKE software range, an industry-leading environment for VFX work that has increasingly been used to create VR content. With the new CARA VR toolset, The Foundry aims to solve some of the biggest challenges in virtual reality content creation. Building on NUKE’s existing native support for compositing multi-camera live-action footage, CARA VR enables artists to create seamless immersive experiences with 360° video content using a more streamlined workflow and a more powerful toolkit.

CARA VR’s development has focused on four major areas: powerful stitching tools; plate correction that allows for footage stabilization and for matching exposure and color balance between cameras; compositing workflows enabling intuitive tracking, painting, and ray-trace rendering; and a live review process for real-time monitoring on the Oculus Rift headset, directly within the NUKE environment.
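To make the plate-correction idea concrete: the simplest form of exposure matching between overlapping cameras is a single gain factor that brings one plate’s average brightness in line with another’s. The sketch below is purely illustrative of that concept and is in no way CARA VR’s actual algorithm, which handles far more (color balance, stabilization, per-channel correction):

```python
import numpy as np

def match_exposure(source: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Match the mean brightness of `source` to `reference` with a single
    gain factor -- the crudest form of the per-camera exposure matching a
    stitching pipeline must do before blending overlapping plates."""
    gain = reference.mean() / source.mean()
    return source * gain

# Toy plates: one camera exposed a stop darker than the other.
dark_plate = np.full((2, 2), 2.0)
bright_plate = np.full((2, 2), 4.0)
matched = match_exposure(dark_plate, bright_plate)
```

Real tools fit such corrections per channel and per region rather than with one global gain, but the principle of equalizing plates before blending is the same.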

Andy Whitmore, chief product officer at The Foundry, says: “Our research team is dedicated to solving the big challenges that our industry faces with new technologies. We have a long history of bringing products to market that take away some of the pain points of content creation and make it easier for artists to focus on being more creative. Our customers were already using NUKE on VR projects, but CARA VR streamlines those workflows, making high-quality VR content creation faster and easier without leaving the NUKE environment. In order to make sure we developed tools that actually helped our customers tackle that challenge, we worked in close collaboration with them to understand their requirements and the challenges they face in working with this new medium.”

The public beta of CARA VR is available now via sign-up on The Foundry’s website, and the toolset will be available for purchase in the coming months.

DEEP Inc. Unveils Liquid Cinema
DEEP Inc. provided a preview of their new LIQUID CINEMA VR platform and authoring tool. LIQUID CINEMA is aimed at simplifying the creation and distribution of cinematic virtual reality experiences through an easy-to-use, affordable suite of VR authoring tools, players, and SDKs.

The LIQUID CINEMA authoring tool takes a unique approach to VR with a technique DEEP calls “Forced Perspective,” designed to ensure the viewer never misses a vital piece of the action, a problem unique to the VR experience, since viewers can normally look at any part of the virtual environment. Creators achieve this by customizing the layout, size, and behavior of visual and interactive elements, and such modifications can be made even after the content has been uploaded and gone live.
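The core of any forced-perspective trick is reorienting the 360° sphere so that key action sits in the viewer’s forward direction. For equirectangular video, a pure yaw rotation is just a circular shift of the image columns. The function name and the whole sketch below are hypothetical and purely illustrative of that geometry, not DEEP’s implementation:

```python
import numpy as np

def recenter_yaw(frame: np.ndarray, target_yaw_deg: float) -> np.ndarray:
    """Rotate an equirectangular frame horizontally so that content at
    `target_yaw_deg` (0 = current forward direction, positive = to the
    right) lands at the viewer's forward direction. A pure yaw rotation
    of the 360 sphere is a circular shift of the image columns."""
    width = frame.shape[1]
    shift = int(round(-target_yaw_deg / 360.0 * width)) % width
    return np.roll(frame, shift, axis=1)

# Toy 4-column "frame" with columns labeled 0..3.
frame = np.arange(4).reshape(1, 4)
# Bring the content 90 degrees to the right into the forward direction.
rotated = recenter_yaw(frame, 90.0)
```

In a real player this rotation would be applied as a playback-time transform on the sphere mesh or the view matrix, not by resampling the frame, which is consistent with DEEP’s claim that no re-rendering is involved.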

DEEP promises there is never any code writing involved with using LIQUID CINEMA and that the authoring tool does not involve time-intensive rendering. Plus, the toolset integrates fully into your existing workflow.

“LIQUID CINEMA was our response to build the missing tools that are needed to create cinematic VR experiences that actually utilize the special qualities of this medium,” says Emmy Award-winning DEEP CEO Thomas Wallner. “We want the pioneers of cinematic VR to test LIQUID CINEMA to show them how this will revolutionize how they create VR content moving forward. As a filmmaker, I can relate to the frustration. A tiny mistake in a burned-in graphic or title and you dread what comes next… rendering the entire video, creating and uploading multiple streaming assets and waiting for servers to update. You just lost an entire day. Never mind those minor tweaks during the making of your VR film that cost you days and weeks. LIQUID CINEMA’s end-to-end live rendering approach changes all that and lets you focus on what matters the most – making your film.”

More info on DEEP and LIQUID CINEMA at their website: deep-inc.com. And check out DEEP’s latest VR short film “Edge of Space” here: kaleidovr.com/showcase/edgeofspace.

Waves Audio’s Nx Technology with Intel’s RealSense Camera
Technical GRAMMY Award-winning company Waves Audio displayed their new Waves Nx system for 3D immersive audio. Waves Audio has for years been at the forefront of research and development in psychoacoustic signal processing algorithms, which they’ve applied to everything from movies to music, video games to phone apps, and other consumer- and professional-level applications. Now they are bringing their expertise to immersive virtual experiences.

Waves’ proprietary Nx software (OS X, Windows), combined with Intel’s RealSense SR300 camera, enables real-time playback of immersive 360° audio over headphones that adapts naturally to head position. Basically, put on your headphones, plug them into your computer, run the Nx software, and you’re immersed in a 360° virtualized audio environment that subtly and realistically simulates an ideal studio-listening situation, moving from binaural to full virtual stereo without dramatic changes to the frequency response. Turn or move your head, and the audio changes to match. A special algorithm inserts cues into the signal to convince the listener’s brain that sound is coming from virtual speaker positions in space, and the Nx software is able to create depth perception without unnaturally modifying or coloring the sound. For increased realism, Nx users even have the option of inputting their own head and ear measurements, so that critical delay and frequency changes are tailored specifically to the user’s head.
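One of the classic cues such a system must track is the interaural time difference: sound from a source off to one side reaches the nearer ear slightly earlier. The sketch below uses Woodworth’s well-known spherical-head approximation to show how head rotation changes that delay cue; it is a textbook illustration only, and says nothing about Waves’ proprietary processing:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, at roughly room temperature
HEAD_RADIUS = 0.0875    # m, a common average; Nx lets users supply their own

def interaural_time_difference(source_azimuth_deg: float,
                               head_yaw_deg: float) -> float:
    """Woodworth's spherical-head approximation of the interaural time
    difference (seconds, negative = left ear leads) for a virtual source.
    Turning the head changes the source's relative azimuth, so the delay
    cue follows the head, just as with a real loudspeaker."""
    rel = math.radians(source_azimuth_deg - head_yaw_deg)
    # Wrap to [-pi, pi] so the formula sees the shortest angular offset.
    rel = math.atan2(math.sin(rel), math.cos(rel))
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (rel + math.sin(rel))

# A source fixed 30 degrees to the left: facing forward, the left ear
# leads; turning the head 30 degrees toward the source drives the delay
# cue to zero, as if facing a real speaker head-on.
itd_facing_forward = interaural_time_difference(-30.0, 0.0)
itd_facing_source = interaural_time_difference(-30.0, -30.0)
```

A full binaural renderer layers level differences, spectral filtering from the ear shape, and room reflections on top of this delay cue, which is where individual head and ear measurements come in.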

It’s an incredibly neat toy and offers some unique benefits for those who find themselves mixing and mastering from a laptop on the go. Waves Nx supports all types of headphones on all media formats. Waves even claims that using Nx helps reduce hearing fatigue on long listening sessions. Combined with its tight integration with the RealSense camera, it will be very interesting to see how this technology becomes more intertwined with the full VR experience. Learn more about Waves Nx: waves.com/plugins/nx
