Thanks to a third-party developer, iPhone users will be able to reach out and touch, pinch, and otherwise interact with digital objects when iOS 11 launches later this year. UploadVR reports that Clay VR is integrating its functionality into Apple's ARKit, and the tech should be working within the next week or two.
In lay terms, this means that developers will be able to add interactivity to their ARKit-based Augmented Reality apps with relative ease – and without any additional hardware. As Jamie Feltham reported:
Developers may already be familiar with Clay: it's an SDK that allows smartphone apps to track the user's hand in 3D with just the phone's camera. It can recognize more than 30 gestures users make with their hands, allowing for controller-free navigation of experiences.
While Clay was originally designed for use in Virtual Reality, it’s already being used in some AR/Mixed Reality environments:
Currently you can see a similar solution with Microsoft’s HoloLens, where users pinch their fingers to interact with virtual objects and interfaces in the real world.
It's worth repeating: Clay's solution won't require additional hardware. Apple's control of the full iPhone stack — hardware, software, and everything in between — has long been one of its greatest advantages in the smartphone market. Now it's shaping up to be perhaps the single most important factor in launching the biggest Augmented Reality platform we've ever seen. Experiencing bleeding-edge AR on iPhone promises to be as simple as launching an app that happens to have some Augmented Reality features already baked in.
Made With ARKit is the best thing to hit Twitter since Joel Embiid. If you don’t follow pro hoops, it might just be the best thing ever.
iOS 11 with ARKit is still a few months away from leaving beta, but developers are already cranking out some really cool examples of what's possible with Apple's new Augmented Reality API. Some of them are practical. Others, like Minecraft, are just plain awesome: