Alongside announcements of new Creative Cloud applications, Lightroom CC, and machine-learning features powered by Sensei, Adobe is today previewing an entirely new tool it calls Project SonicScape, shown for the first time at Adobe MAX 2017.

Project SonicScape is an experimental feature that builds on Adobe's recently expanded support for immersive 360-degree and VR experiences. Adobe Premiere now features real-time VR playback support and VR-enabled motion graphics templates. Project SonicScape goes a step further by allowing you to edit 3D audio in VR through an immersive visual experience.

The magic begins by capturing 360-degree video and audio. Spatial audio imported into Project SonicScape is shown onscreen, with different frequencies represented by colored dots. This new visualizer moves beyond the traditional waveform by showing not just the audio frequencies themselves, but their position in 3D space. By clicking and dragging, you can move sounds in space to change their perceived direction.

In terms of functionality, Project SonicScape is still fairly rudimentary, but the product is not yet shipping and will likely be refined and revised significantly in the coming months. In a brief demo given to 9to5Mac, the VR interface looked like a fluid and intuitive way to tackle what has previously been a tricky problem.
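Adobe hasn't published how SonicScape repositions sounds internally, but the basic idea of changing a sound's perceived direction by dragging it is commonly implemented with a panning law. As a rough illustration only (the function and the stereo simplification below are my own assumptions, not anything from Adobe), a constant-power pan maps an azimuth angle to left/right gains:

```python
import math

def pan_gains(azimuth_deg):
    """Constant-power stereo pan.

    azimuth_deg: -90 (hard left) .. 0 (center) .. +90 (hard right).
    Returns (left_gain, right_gain); left**2 + right**2 == 1, so the
    total signal power stays constant as the source is dragged around.
    """
    # Map azimuth to an angle from 0 (left) to 90 degrees (right).
    theta = math.radians((azimuth_deg + 90.0) / 2.0)
    return math.cos(theta), math.sin(theta)
```

A full spatial-audio editor would use something like ambisonics or HRTF convolution rather than simple stereo gains, but the principle is the same: the dragged 3D position drives per-channel gains or filters that shift the perceived direction of the sound.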