AR Subtitles  2022-03-30

AR Subtitles

Personally, whenever I'm listening to music, I want the lyrics visible to reference. Similarly, when you're listening to a podcast, or really any audio media, a visual aid of some sort would often be useful, such as when an art and design podcast mentions a specific piece of visual art. That's the motivation behind this AR prototype. The goal was to design an AR app that can surface subtitles or visual aids while you're listening to various media.

The UI for this project was shamelessly inspired by Leap Motion's Project North Star demos. I particularly like how the hand-anchored menu UX turned out, but there's a lot of room for improvement with this prototype that I'd like to explore: supporting 3D assets instead of just 2D visual aids and text, querying APIs such as Google Translate or a lyrics database (the subtitles are currently included as SRT and JSON files embedded within the build), and UX improvements to make it generally more seamless and unobtrusive.
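For illustration, here's a minimal sketch of how embedded SRT subtitles could be parsed and synced to playback in Unity. The class and member names (SubtitleDisplay, SubtitleCue, subtitleAsset, subtitleLabel) are hypothetical, not the actual project code, and the JSON-based visual aid cues would work along the same lines.

```csharp
// Hypothetical sketch: sync an embedded .srt file (imported as a TextAsset)
// to an AudioSource and show the active cue on a world-space text label.
using System;
using System.Collections.Generic;
using System.Globalization;
using UnityEngine;
using UnityEngine.UI;

public class SubtitleDisplay : MonoBehaviour
{
    public TextAsset subtitleAsset;   // .srt file bundled in the build
    public AudioSource audioSource;   // the music/podcast being played
    public Text subtitleLabel;        // world-space UI text in front of the user

    struct SubtitleCue { public float start, end; public string text; }
    readonly List<SubtitleCue> cues = new List<SubtitleCue>();

    void Awake()
    {
        // Tiny SRT parser: blocks are separated by blank lines; the second
        // line of each block holds "HH:MM:SS,mmm --> HH:MM:SS,mmm".
        var blocks = subtitleAsset.text.Replace("\r", "")
            .Split(new[] { "\n\n" }, StringSplitOptions.RemoveEmptyEntries);
        foreach (var block in blocks)
        {
            var lines = block.Split('\n');
            if (lines.Length < 3) continue;
            var times = lines[1].Split(new[] { " --> " }, StringSplitOptions.None);
            cues.Add(new SubtitleCue
            {
                start = ParseTime(times[0]),
                end = ParseTime(times[1]),
                text = string.Join("\n", lines, 2, lines.Length - 2)
            });
        }
    }

    void Update()
    {
        // Show whichever cue spans the current playback time, or nothing.
        float t = audioSource.time;
        subtitleLabel.text = "";
        foreach (var cue in cues)
            if (t >= cue.start && t <= cue.end) { subtitleLabel.text = cue.text; break; }
    }

    static float ParseTime(string s)
    {
        // "00:01:23,456" -> seconds
        var parts = s.Trim().Replace(',', '.').Split(':');
        return int.Parse(parts[0]) * 3600f
             + int.Parse(parts[1]) * 60f
             + float.Parse(parts[2], CultureInfo.InvariantCulture);
    }
}
```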

Also, in the future, I'd really like to port this prototype to an AR system with color passthrough (or better yet, optical passthrough), because between the size of the Quest itself, the quality of its video feed, and the fact that the feed is black and white, Oculus' current passthrough implementation is obviously not ideal for casual use, e.g. listening to a podcast while going about your day.

This prototype was developed for the Quest 2 using Unity, with the AR passthrough implemented via the Oculus Integration SDK.
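As a rough sketch of what the passthrough setup looks like, the snippet below assumes an OVRCameraRig with passthrough capability enabled on OVRManager and an OVRPassthroughLayer component set to Underlay; exact property names and setup steps can vary between versions of the Oculus Integration SDK, so treat this as illustrative rather than the project's exact code.

```csharp
// Illustrative sketch (not the project's actual code): enable Quest passthrough
// at runtime and let the user toggle the camera feed on and off.
using UnityEngine;

public class PassthroughToggle : MonoBehaviour
{
    public OVRPassthroughLayer passthroughLayer; // component on the OVRCameraRig

    void Start()
    {
        // Make sure the Insight passthrough system is running, then show the feed.
        OVRManager.instance.isInsightPassthroughEnabled = true;
        passthroughLayer.textureOpacity = 1.0f;
        passthroughLayer.edgeRenderingEnabled = false;
    }

    void Update()
    {
        // Toggle passthrough with the A button, e.g. to drop back into plain VR.
        if (OVRInput.GetDown(OVRInput.Button.One))
            passthroughLayer.hidden = !passthroughLayer.hidden;
    }
}
```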