MUSIC LENS - UNITY AR PROJECT
Is there a way we can elevate the experience of live performances further by making them more magical, visual and immersive? To find answers and explore this space, I took this on as a weekend passion project and built a proof of concept.
Proof of concept video
Overview
I tend to go to as many music concerts as possible. Last year, while attending Rock am Ring, I felt that the visual engagement through lights and pyrotechnics was limited to the area around the stage. Can this problem of limited visual engagement be solved with XR?
Focusing on one specific problem and making a clear problem statement
There were various ways to tackle this problem: for example, elevating the experience from the audience's point of view, the performer's, or both. I decided to design the experience so that both the audience and the performer feel more visually engaged.
I chose a very specific live-performance context. Electronica artists tend to use a pre-planned workflow consisting of live music, instruments, audio effects, automated mixing and visuals. Let's call our artist "Apex".
Apex likes to use an Ableton Push as her live instrument, connected to a laptop. She also uses a third-party plugin to generate cool real-time visuals, which are usually projected on a big screen. Since the big screen is usually behind her, she misses all the visuals and some key moments that the audience connects to and enjoys. Is there an easy way to bring the visualisation to Apex?
Idea
Music Lens: To make the audience's experience more immersive, introduce a visualisation that stays in sync with the music. I chose a wave as the basic visualisation because it communicates the BPM of a song really well. (Also, it's pretty easy to make. ssshhhh...)
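To give a feel for how little is needed for the beat sync, here's a minimal Unity C# sketch that spawns one ripple per beat from a known tempo; the class and field names (and the ripple prefab) are illustrative assumptions, not part of the actual project.

```csharp
using UnityEngine;

// Minimal sketch: trigger one ripple per beat from a known tempo.
// "bpm", "ripplePrefab" and "origin" are illustrative names, not from the project.
public class BeatRippleSpawner : MonoBehaviour
{
    public float bpm = 128f;          // tempo of the track being performed
    public GameObject ripplePrefab;   // expanding ring/wave effect
    public Transform origin;          // where ripples start (the instrument)

    float nextBeatTime;

    void Update()
    {
        if (Time.time >= nextBeatTime)
        {
            nextBeatTime = Time.time + 60f / bpm;   // seconds per beat
            Instantiate(ripplePrefab, origin.position, Quaternion.identity);
        }
    }
}
```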
Virtual Screens: To allow the performer to see the state of the DAW (Ableton Live) and the visualisation. For this part, I needed an object anchor that augmented information can hook onto. Since performers usually have their own unique-looking instruments, I decided to use the Ableton Push 2 as the object anchor and augment a visual layer on top of it.
To sketch 3D XR ideas, I use a printout of an axonometric (isometric) grid and draw various ideas on it. Here I drew a rough arrangement of the virtual screens and what the Music Lens would look like.
Out of options A and B for the virtual screens, option A made more sense, as we don't want a virtual screen to come between the performer and the audience.
For the Music Lens, I felt that if it originates from the instrument itself and then ripples out across the live arena, it will communicate a deeper meaning: the music performed through that instrument propagates throughout the arena like a sound wave.
Prototype
Making a point cloud of the target object (the Ableton Push 2 instrument):
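The write-up doesn't name the scanning tool, but a common route is Vuforia's Object Recognition, where the scanned point cloud becomes an Object Target in Unity. Assuming that setup (Vuforia Engine 10+; older versions expose TrackableBehaviour/ITrackableEventHandler instead), a rough sketch of reacting to the tracked instrument could look like this:

```csharp
using UnityEngine;
using Vuforia;

// Sketch only: show/hide the augmentation when the scanned Object Target
// (the Push 2 point cloud) is found or lost. Assumes Vuforia Engine 10+.
public class PushTargetHandler : MonoBehaviour
{
    public GameObject augmentation;   // virtual screens parented to this target

    ObserverBehaviour observer;

    void Awake()
    {
        observer = GetComponent<ObserverBehaviour>();
        observer.OnTargetStatusChanged += OnStatusChanged;
    }

    void OnStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
    {
        bool tracked = status.Status == Status.TRACKED ||
                       status.Status == Status.EXTENDED_TRACKED;
        augmentation.SetActive(tracked);
    }

    void OnDestroy()
    {
        if (observer != null)
            observer.OnTargetStatusChanged -= OnStatusChanged;
    }
}
```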
Building virtual screens as a prefab in Unity:
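One way the prefab can be wired up at runtime is with a small script that keeps the two screens hovering above the tracked instrument, following option A from the sketch above; the offsets and names below are illustrative, not the actual prefab values.

```csharp
using UnityEngine;

// Rough sketch of the screen layout at runtime: two quads hover above the
// tracked instrument and stay out of the line between performer and audience
// (option A from the sketch). Offsets and field names are illustrative.
public class VirtualScreens : MonoBehaviour
{
    public Transform instrumentAnchor;   // the tracked Push 2 target
    public Transform dawScreen;          // quad showing the Ableton Live state
    public Transform visualScreen;       // quad mirroring the generated visuals
    public Vector3 offset = new Vector3(0f, 0.25f, -0.15f); // up and toward the performer

    void LateUpdate()
    {
        dawScreen.position    = instrumentAnchor.TransformPoint(offset + Vector3.left * 0.2f);
        visualScreen.position = instrumentAnchor.TransformPoint(offset + Vector3.right * 0.2f);
        dawScreen.rotation    = instrumentAnchor.rotation;
        visualScreen.rotation = instrumentAnchor.rotation;
    }
}
```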
Creating waves using particle editor:
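The wave itself was built in the particle editor; as one possible way to keep it reacting to the music, here is a hypothetical companion script that modulates the particle size with the audio output level (the AudioSource reference and scaling values are assumptions, not part of the original prototype).

```csharp
using UnityEngine;

// Hypothetical companion script: the wave itself is authored in the particle
// editor as an expanding ring; this scales its start size with the current
// loudness so the ripples breathe with the music. Names/values are assumptions.
[RequireComponent(typeof(ParticleSystem))]
public class AudioReactiveWave : MonoBehaviour
{
    public AudioSource music;         // the track being monitored
    public float sizeMultiplier = 4f;

    ParticleSystem waves;
    readonly float[] samples = new float[256];

    void Awake()
    {
        waves = GetComponent<ParticleSystem>();
    }

    void Update()
    {
        // RMS loudness of the most recent output buffer.
        music.GetOutputData(samples, 0);
        float sum = 0f;
        foreach (float s in samples) sum += s * s;
        float rms = Mathf.Sqrt(sum / samples.Length);

        var main = waves.main;
        main.startSize = 0.5f + rms * sizeMultiplier;   // bigger ripples on loud hits
    }
}
```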
And that's it! The outcome is attached at the beginning of the page. This concept went viral on Reddit and in many Facebook groups. I got a lot of messages asking me to turn it into a more polished app. Someday, I might do it!
Hi, I'm Sourabh Pateriya.
Currently I'm building Soundverse Inc., which I founded in June 2023. I'm a product leader who has led teams at Spotify, Samsung and Tobii, with experience in generative AI, music-tech, extended reality, eye-tracking, voice assistants and mobile. I hold 10+ patents.