Apple Applies for AR Glasses Patent

Jack Purcher, Patently Apple:

Apple acquired Metaio, the creator of ‘Thermal Touch’ and a new Augmented Reality Interface for Wearables and beyond, back in 2015. Their technology is thought to be behind Apple’s push into augmented reality and ARKit. This year a Metaio patent application surfaced under Apple for moving furniture in augmented reality. Apple was also granted a patent for indoor navigation that covered new capabilities for a future iDevice camera, allowing it to recognize building names or paintings and then add AR identifying markers to the user’s iDevice photos. Today another original Metaio patent application under Apple has surfaced relating to augmented reality. More specifically, it covers a method for representing points of interest in a view of a real environment on the screen of an iPhone with interaction functionality. The buzz is that the patent covers AR smartglasses, as noted in our cover graphic, something that Apple has been adding to a series of new and updated trademarks of late (one, two, three, and four).

Worth noting that Apple applies for, and is granted, plenty of patents that never translate into shipping products.

Mira Prism AR Headset is Exactly What Apple Glasses Won’t Be

Jonathan Nafarrete, writing about Mira’s just-launched Prism, a $99 Augmented Reality headset that works with iPhone:

The design of the headset is beautiful, something I would expect from a company like Apple when it comes to attention to detail. Prism’s clear lenses easily remove and reattach with the gratifying snap of a magnet hold. The cradle for the iPhone is just as simple, allowing for varying device sizes to fit with an expanding spandex fabric.

Judging only from the press photos in Nafarrete’s post, the headset is neither beautiful nor anything like an Apple product. It more resembles science lab safety glasses than something from Jony Ive’s lab.

This is a simple headset powered by one of the most popular devices, all for the price of $99. But there’s only one thing missing from this equation to make Prism a hit with mass consumers—content. Mira shared that they’ve partnered with select immersive content studios to create an initial suite of interactive AR experiences for developers to learn from and consumers to play, all included when the Prism headset ships. Experiences range from solo challenges to multiplayer games, including mind-bending mixed-reality puzzles, holographic chess and digital warship battles, which make gaming in AR as dynamic as it is social.

“Only one thing missing”: The most important thing of all. Content is still king.

Prism won’t be the only iPhone-compatible AR headset to pop up between now and when (or if) Apple’s own glasses drop. It’ll be interesting to see how third-party hardware makers attempt to leverage ARKit-powered experiences made to be viewed on iPhone screens, and whether any of them get enough stuff right to pique Apple’s interest on the acquihire front.

I can’t imagine Apple buying another company’s hardware design (let alone one this ugly). But Apple does have a history of acquiring smaller companies and integrating their tech into future products of their own. If Mira – or some other company – finds a great way to leverage ARKit for a headset experience, don’t be surprised to see it incorporated into Apple’s own AR glasses.

You know, if Apple’s actually making its own AR glasses.

Apple’s AR Glasses Will Cost Less Than $1,000. Maybe only $699.

Former analyst and newfound venture capitalist Gene Munster made some waves a week or so ago with his five-year forecast for Apple. Munster, a longtime Apple watcher, had this to say about the rumored Apple AR glasses:

Our best guess is that Apple Glasses, an AR-focused wearable, will be released mid FY20…We believe Apple sees the AR future as a combination of the iPhone and some form of a wearable. With an average sale price of $1,300 we expect initial demand to be limited at just over 3m units compared to 242m iPhones that year. This equates to 2% of sales in FY20, increasing to 10% ($30B) in FY22, when we expect the ASP to be about $1,000. This growth curve is modeled after the iPad’s initial growth.
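Quick back-of-the-envelope math on Munster’s own numbers (my arithmetic, not his): just over 3 million units at a $1,300 ASP works out to roughly $4 billion in first-year revenue, while $30 billion at a ~$1,000 ASP implies something like 30 million units in FY22, a roughly tenfold ramp in two years.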

Munster’s got the pricing all wrong; there’s no way any such glasses debut at $1,300 (ASP). If and when Apple ships AR glasses, they’ll sell for under $1,000.* My best guess right now is that they start at $699. Here’s why:

1. I agree with Munster’s notion that Apple sees a potential AR wearable working in combination with iPhone (and/or Apple Watch, see below). In this model, iPhone is the computer and the glasses are a “dumb” peripheral handling input and output but not running an OS or apps. The current top-of-the-line iPhone, a 256 GB iPhone 7 Plus, sells for $969; there’s no way Apple launches a peripheral that costs 30% more than the phone it’s useless without.

2. Apple glasses would be built atop current and forthcoming Apple technology, both for ease of adoption/use and to keep production and service costs down. What kinds of tech are we talking about?

– High-end displays based on the ProMotion-equipped OLED panels everybody assumes this Fall’s iPhone will ship with.

– Advanced wireless connectivity to shuttle data between the phone and glasses with high reliability, minimal interference, and supreme ease of use. Like the W1 chip-based system in current AirPods, but on steroids.

– Miniaturization techniques to create high-powered, energy-efficient embeddable systems for wearable computing. Like the S1 and S2 SoCs found in Apple Watch. But, you know, on steroids. Even as a peripheral that doesn’t run apps or an OS on its own, AR glasses will need their own computers to send and receive data (to/from an iPhone running an app, and also from cameras and sensors embedded in the glasses themselves), and turn it into imagery to display to the user.

Eventually, these miniaturized computers may be robust enough to power standalone glasses and/or an Apple Watch that renders the iPhone itself unnecessary. One day you may leave the house with your watch but no phone, and your glasses could run off the watch’s computer. But we’re not there yet.

– Personal audio technology that supports immersive listening, on-demand voice recognition, and wireless data with super-low latency to keep the audio and visuals in sync. Again, see also: AirPods with W1 (and — gulp! — Siri).

– ARKit and the ecosystem of developers already working on the first AR apps for iOS 11.

Apple won’t literally re-use iPhone display panels in AR glasses. But they will leverage their investments in — and knowledge of — their existing hardware businesses to keep the cost of producing glasses as low as possible. Remember, Tim Cook’s a master of the supply chain.

Same goes for the software, developer ecosystem, back-end systems (app store, iTunes credit card database, etc.), and other important bits that make all of this stuff run: It doesn’t make any more sense to shove iOS into a pair of glasses than it would have to cram it into a watch. But just as watchOS was built on iOS, which was built on OS X, so too will the mythical glasses run a mythical glassesOS built atop the very real OS foundation that already exists.

(Important Aside: AR glasses would represent a paradigm shift jarring enough to the mainstream iOS user that asking him to adapt to Computerized Glasses! and learn to navigate a new OS entirely inside of the glasses would be tantamount to blasting him to Mars without a spacesuit or survival guide. Just like the first iPhones relied on a desktop computer to manage apps, photos, music, and so on, so too will the first Apple glasses lean on iOS devices for all that stuff. They won’t represent a brand-new, standalone platform. That’s just not how Apple does it.**)

And all of that stuff saves time in getting to market with killer hardware and a legit app ecosystem. And saving time saves money, which helps keep the retail cost down.

Which brings me back to point #1: The glasses will depend on an iPhone (or, possibly, iPad) to run. And there’s no way Apple prices an iPhone peripheral well above the top-of-the-line iPhone. I know Apple views themselves as a maker of premium goods and charges accordingly, so their glasses may cost more than whatever they’re competing with come 2020. But they won’t cost $1,300.

$699. $999, max.

*I have no inside knowledge. I could be wrong. I very well may be wrong. But I think I’m right, so I’m presenting it as such, for the sake of good writing.

** I could be doubly wrong, etc., etc.; see above. But Apple followed the same concept with Apple Watch, which today relies on iPhone but likely won’t for much longer.

The New iPhone Will Let You Interact With Virtual Objects

Thanks to a third-party developer, iPhone users will be able to reach out and touch, pinch, and otherwise interact with digital objects when iOS 11 launches later this year. UploadVR reports that Clay VR is integrating its functionality into Apple’s ARKit, and the tech should be working within the next week or two.

In lay terms, this means that developers will be able to add interactivity to their ARKit-based Augmented Reality apps with relative ease – and without any additional hardware. As Jamie Feltham reported:

Developers may already be familiar with Clay; it’s an SDK that allows smartphone apps to track the user’s hand in 3D with just the phone’s camera. It can recognize more than 30 gestures users make with their hands, allowing for controller-free navigation of experiences.

While Clay was originally designed for use in Virtual Reality, it’s already being used in some AR/Mixed Reality environments:

Currently you can see a similar solution with Microsoft’s HoloLens, where users pinch their fingers to interact with virtual objects and interfaces in the real world.

It’s worth repeating: Clay’s solution won’t require additional hardware. Apple’s control of the full iPhone stack — hardware, software, and everything in between — has long been one of their greatest advantages in the smartphone market. Now it’s shaping up to be perhaps the single most important factor to date in launching the biggest Augmented Reality platform we’ve ever seen. Experiencing bleeding-edge AR on iPhone promises to be as simple as launching an app that just happens to have some Augmented Reality features already baked in.
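To make that concrete, here’s a minimal, hypothetical Swift sketch of the kind of stock iOS 11 ARKit setup these apps are built on: a view controller that detects a horizontal plane and lets you place and poke virtual objects with ordinary screen taps, no extra hardware required. Clay’s pitch, per UploadVR, is to swap those taps for camera-tracked hand gestures; its SDK isn’t shown here, and the class and object names below are placeholders, not anything from Apple, Mira, or Clay.

```swift
import UIKit
import SceneKit
import ARKit

// Hypothetical example class -- not Apple, Mira, or Clay code.
// Stock iOS 11 ARKit: world tracking, plane detection, and touch-based
// interaction with virtual objects. A hand-tracking SDK like Clay would
// layer camera-recognized gestures on top of the same scene.
class ARInteractionViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)

        // Interactivity via a plain tap gesture -- no additional hardware.
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        sceneView.addGestureRecognizer(tap)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Track the device's position and look for horizontal surfaces.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)

        // If the tap lands on a virtual object already in the scene,
        // "touch" it: a quick scale-up so the user sees a response.
        if let hit = sceneView.hitTest(point, options: nil).first {
            hit.node.runAction(SCNAction.scale(by: 1.2, duration: 0.2))
            return
        }

        // Otherwise, place a small virtual cube on the detected real-world plane.
        guard let plane = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first else { return }
        let cube = SCNNode(geometry: SCNBox(width: 0.05, height: 0.05,
                                            length: 0.05, chamferRadius: 0.005))
        cube.simdTransform = plane.worldTransform
        sceneView.scene.rootNode.addChildNode(cube)
    }
}
```

Even this bare-bones version illustrates the point: the only input devices are the phone’s own screen and camera.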