AR Object Annotations 2022-04-10
After working with Unity for a while, I wanted to switch gears and play with ARKit/RealityKit natively on iOS, mostly because when I learned Swift and iOS development in college, Apple didn't have any AR SDKs to speak of. As a brief exploration of RealityKit, this turned out really well, and I'm excited to dive deeper into the framework in the future.
This app simply scans the environment and tries to detect objects using whatever ML model you want (currently it's just using Apple's InceptionV3 model). When the model detects an object above a certain confidence threshold, the app queries Wikipedia's API, grabs the first sentence of the matching entry, and displays it as a 3D annotation anchored to the object in world space.
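For a rough idea of how that flow fits together, here is a minimal sketch of classification via Vision plus the Wikipedia lookup. It assumes an Inceptionv3 Core ML model bundled with the app; the 0.8 confidence threshold, the `ObjectLookup` type, and its helper methods are illustrative names and values, not the ones from the actual prototype.

```swift
import CoreML
import CoreVideo
import Foundation
import Vision

/// Sketch: classify a camera frame, then look up the detected label on Wikipedia.
final class ObjectLookup {
    private let request: VNCoreMLRequest

    init(model: MLModel, onMatch: @escaping (String) -> Void) throws {
        let visionModel = try VNCoreMLModel(for: model)
        request = VNCoreMLRequest(model: visionModel) { request, _ in
            // Only act on the top classification once it clears the confidence gate.
            guard let best = (request.results as? [VNClassificationObservation])?.first,
                  best.confidence > 0.8 else { return }
            onMatch(best.identifier)
        }
    }

    /// Run the classifier against a single camera frame from the AR session.
    func classify(_ pixelBuffer: CVPixelBuffer) {
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
        try? handler.perform([request])
    }

    /// Fetch the lead extract for a detected label from Wikipedia's REST summary
    /// endpoint and keep only the first sentence for the annotation text.
    static func firstSentence(for label: String) async throws -> String? {
        let title = label.replacingOccurrences(of: " ", with: "_")
            .addingPercentEncoding(withAllowedCharacters: .urlPathAllowed) ?? label
        guard let url = URL(string: "https://en.wikipedia.org/api/rest_v1/page/summary/\(title)") else {
            return nil
        }
        let (data, _) = try await URLSession.shared.data(from: url)
        let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
        let extract = json?["extract"] as? String
        return extract?.split(separator: ".").first.map { String($0) + "." }
    }
}
```

The annotation text returned here would then be attached to an anchor placed at the detected object's position in the RealityKit scene.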
UX-wise, I definitely stole the idea of showing a 2D interface when the user holds their phone flat, and the AR/3D interface otherwise, from a tweet I saw, but I unfortunately can no longer find said tweet to credit the designer or developer. That said, I do love how it turned out, although Apple's built-in isFlat orientation check doesn't report true until the device is almost perfectly horizontal, so implementing a custom controller that shows that UI when the phone is only "relatively tilted downward" would probably feel better.
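One way such a custom controller might look is a CoreMotion-based check on the gravity vector, sketched below. The 0.6 cutoff and the `FlatnessMonitor` name are my own assumptions for illustration; the point is just that a looser threshold triggers the 2D UI well before the device is fully flat.

```swift
import Combine
import CoreMotion

/// Sketch: a "relatively flat" detector driven by CoreMotion's gravity vector,
/// as an alternative to UIDeviceOrientation's strict isFlat behavior.
final class FlatnessMonitor: ObservableObject {
    @Published private(set) var isRoughlyFlat = false

    private let motionManager = CMMotionManager()

    func start(threshold: Double = 0.6) {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 30.0
        motionManager.startDeviceMotionUpdates(to: .main) { [weak self] motion, _ in
            guard let gravity = motion?.gravity else { return }
            // gravity.z approaches -1 when the screen faces straight up;
            // a looser cutoff shows the 2D UI before the phone is fully flat.
            self?.isRoughlyFlat = gravity.z < -threshold
        }
    }

    func stop() {
        motionManager.stopDeviceMotionUpdates()
    }
}
```

A SwiftUI view could observe `isRoughlyFlat` and cross-fade between the 2D and AR interfaces as it changes.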
This prototype was developed natively for iOS using the following frameworks: SwiftUI, Core ML, RealityKit, and Vision.