Project #SonicScape: A Better Way to Edit Immersive Content

We see the world in 360°, yet we've been looking at a cropped view of it for far too long. Thanks to advances in VR/AR devices, cameras, and microphones, we can now see and hear the world captured all around us: front, back, left and right, above and below, in full spherical 360. As the technology continues to advance, the environment in which we edit this immersive content needs to be reimagined. In the Adobe Design Lab, our team of designers and developers set out to do just that.

Project #SonicScape is a prototype aimed at creating the best UX for placing and editing 360/VR video, audio, and graphics. The 3D prototype lets editors see and understand their 360 footage as they edit it: they can import 360 video and audio assets into the scene, change their orientation, and position text, graphics, and sound effects, all in one interface. Project #SonicScape takes the guesswork out of editing immersive content by visualizing where the audio is, along with its frequency and intensity, making it that much easier to bring immersive content to life.
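
To give a rough sense of what "visualizing where the audio is" can involve, here is a minimal sketch, not the prototype's actual implementation, assuming the spatial audio arrives as first-order ambisonic B-format (W, X, Y, Z channels): correlating the omnidirectional channel with the three directional channels yields a direction and intensity that an editing UI could draw as a cue on the 360 sphere.

import numpy as np

def audio_cue(w, x, y, z):
    # w, x, y, z: equal-length NumPy arrays holding one block of B-format
    # samples (omni pressure channel plus the X/Y/Z figure-of-eight channels).
    ix = np.mean(w * x)   # correlate pressure with each directional component
    iy = np.mean(w * y)
    iz = np.mean(w * z)
    intensity = np.sqrt(ix**2 + iy**2 + iz**2)
    azimuth = np.degrees(np.arctan2(iy, ix))                   # left/right, front/back
    elevation = np.degrees(np.arctan2(iz, np.hypot(ix, iy)))   # above/below
    return azimuth, elevation, intensity

# Example: a 100 ms tone encoded 45 degrees to the left comes back at roughly (45, 0).
fs = 48_000
t = np.arange(int(0.1 * fs)) / fs
src = np.sin(2 * np.pi * 440 * t)
az = np.radians(45)
print(audio_cue(src, src * np.cos(az), src * np.sin(az), np.zeros_like(src)))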

Subscribe: http://www.youtube.com/user/adobe

LET’S CONNECT
Facebook: http://facebook.com/adobe
Twitter: http://twitter.com/adobe
Instagram: http://www.instagram.com/adobe