Conceptual Work: Eye-Tracking Influencing the AR User Experience
All the conceptual videos were made in Adobe After Effects; the prototypes were built in Unity.
AR Prototyping Work
Augmenting 3D assets on top of the real world (prototyped using Unity and ARKit 2; runs on iPhone 6s+).
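The original prototype used Unity with ARKit 2; below is a minimal native ARKit/SceneKit sketch of the same idea, anchoring a 3D asset to a detected horizontal plane. The asset name "ship.scn" is a placeholder, not from the original project.

```swift
import ARKit
import SceneKit

// Minimal sketch: place a 3D asset on a detected horizontal plane.
class ViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = .horizontal   // detect real-world surfaces
        sceneView.delegate = self
        sceneView.session.run(config)
    }

    // Called when ARKit detects a new anchor (e.g. a plane).
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor,
              let asset = SCNScene(named: "ship.scn")?.rootNode.clone() else { return }
        node.addChildNode(asset)   // the asset now tracks the real-world plane
    }
}
```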
Real-time point-cloud generation using 6d.ai; note how the rain is occluded by the TV (prototyped in Xcode, using a particle system for the rain and the 6d.ai API for real-time SLAM).
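The 6d.ai API was proprietary and is no longer available, so this sketch reproduces only the occlusion effect in plain SceneKit: a proxy mesh writes to the depth buffer but not the color buffer, hiding particles behind it while the real object stays visible in the camera feed. In the original, 6d.ai's live SLAM mesh played the occluder role instead of a hand-placed box.

```swift
import SceneKit

// An invisible occluder: writes depth but no color, so anything rendered
// behind it (e.g. rain particles) is hidden.
func makeOccluder(width: CGFloat, height: CGFloat, depth: CGFloat) -> SCNNode {
    let box = SCNBox(width: width, height: height, length: depth, chamferRadius: 0)
    let material = SCNMaterial()
    material.colorBufferWriteMask = []   // invisible, but still occludes
    box.materials = [material]
    let node = SCNNode(geometry: box)
    node.renderingOrder = -1             // draw before the particles
    return node
}

// Simple downward particle rain emitted from a plane overhead.
func makeRain() -> SCNNode {
    let rain = SCNParticleSystem()
    rain.birthRate = 500
    rain.particleLifeSpan = 2
    rain.particleSize = 0.005
    rain.emitterShape = SCNPlane(width: 3, height: 3)
    rain.acceleration = SCNVector3(0, -9.8, 0)   // gravity-like fall
    let node = SCNNode()
    node.addParticleSystem(rain)
    return node
}
```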
Modelled an array of cubes in Blender, then imported it into SceneKit (Xcode). The prototype runs on iPhone 6s+.
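Loading a Blender export into SceneKit comes down to a single scene load; Blender typically exports Collada (.dae), which Xcode converts to .scn. A minimal sketch, with "cubes.scn" as a placeholder asset name:

```swift
import SceneKit

// Load the imported Blender model and display it in an SCNView.
func loadCubeArray(into sceneView: SCNView) {
    guard let scene = SCNScene(named: "art.scnassets/cubes.scn") else {
        print("cube array asset not found")
        return
    }
    sceneView.scene = scene
    sceneView.allowsCameraControl = true   // orbit the imported model
}
```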
Eye-direction detection using ARKit in Xcode (built on open-source code that estimates gaze direction from eye orientation).
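The original relied on open-source gaze-estimation code; the sketch below shows the underlying ARKit primitive it builds on. ARFaceAnchor (TrueDepth camera required) exposes per-eye transforms and a lookAtPoint, which can be projected into world space to form a gaze ray.

```swift
import ARKit

// Derive a world-space gaze direction from a tracked face anchor.
func gazeDirection(from faceAnchor: ARFaceAnchor) -> simd_float3 {
    // lookAtPoint is in face-anchor coordinates; transform it to world
    // space, then subtract the face position to get a unit direction.
    let lookAtWorld = faceAnchor.transform * simd_float4(faceAnchor.lookAtPoint, 1)
    let facePosition = simd_float3(faceAnchor.transform.columns.3.x,
                                   faceAnchor.transform.columns.3.y,
                                   faceAnchor.transform.columns.3.z)
    return simd_normalize(simd_float3(lookAtWorld.x, lookAtWorld.y, lookAtWorld.z) - facePosition)
}
```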
Prototyped the idea of virtual desktops in AR: here I've placed three interactable web views in front of me. Implemented in SceneKit/ARKit using Xcode.
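A sketch of one such panel: a WKWebView used as the diffuse contents of an SCNPlane. Assigning a UIView as material contents works in practice but is not officially documented, so treat this as a prototyping trick rather than a supported API; the sizes and URL handling here are assumptions, not the original code.

```swift
import SceneKit
import WebKit

// Build one floating web panel for the AR "virtual desktop".
func makeWebPanel(url: URL, position: SCNVector3) -> SCNNode {
    let webView = WKWebView(frame: CGRect(x: 0, y: 0, width: 800, height: 600))
    webView.load(URLRequest(url: url))

    let plane = SCNPlane(width: 0.8, height: 0.6)   // ~80 x 60 cm panel
    plane.firstMaterial?.diffuse.contents = webView
    plane.firstMaterial?.isDoubleSided = true

    let node = SCNNode(geometry: plane)
    node.position = position   // e.g. ~1 m in front of the camera
    return node
}
```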
Hi, I'm Sourabh Pateriya.
I'm currently building Soundverse Inc., which I founded in June 2023. I'm a product leader who has led teams at Spotify, Samsung, and Tobii, with experience in generative AI, music tech, extended reality, eye-tracking, voice assistants, and mobile. I hold 10+ patents.