Although they are virtual, the elements (earth, air, fire, and water) are as important in virtual worlds as they are in the real one. To handle them, these virtual environments must replace common-sense perception with technology: spatial positioning, object detection, distance perception, and more. Combine all of this and you have an operating system capable of simulating reality within unreality.
Apple is making it.
AR was everywhere at WWDC22
Apple made no mention of Augmented Reality (AR) glasses at WWDC 2022, though most of us believe it is working on them.
And while AR enjoyed a few mentions during the company's keynote address on Monday, Apple didn't really promote the technology. That is quite a contrast with previous years; it was six years ago that CEO Tim Cook said: "We are high on AR in the long run; we think there is a great thing for customers and a great business opportunity."
This year, it clearly said much less.
While the company may not have said much on stage, the developer sessions during the show tell a different story. At almost every turn, they offer both overt and more subtle examples of the philosophy and technology needed to support AR.
Even the ability to support multiple windows in SwiftUI apps may be significant as the company moves toward new ways of interacting with data. That impressive wide-angle-camera-powered Desk View tool could be a tip of the proverbial hat toward new usability modes that show the user's face and an overhead view of their desk at the same time.
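To make the multi-window point concrete, here is a minimal sketch of what the new SwiftUI support looks like, assuming a hypothetical app with "Browser" and "Inspector" scenes (the type and view names are illustrative, not from any Apple sample):

```swift
import SwiftUI

// Hypothetical app demonstrating SwiftUI's multi-window scene support.
@main
struct DemoApp: App {
    var body: some Scene {
        // The primary window.
        WindowGroup("Browser", id: "browser") {
            BrowserView()
        }
        // A second, independently openable window scene.
        WindowGroup("Inspector", id: "inspector") {
            InspectorView()
        }
    }
}

struct BrowserView: View {
    // The openWindow environment action opens a scene by its identifier.
    @Environment(\.openWindow) private var openWindow

    var body: some View {
        Button("Open Inspector") {
            openWindow(id: "inspector")
        }
    }
}

struct InspectorView: View {
    var body: some View { Text("Inspector") }
}
```

Declaring windows as addressable scenes like this is exactly the kind of abstraction that could later map onto floating panels in a headset environment rather than rectangles on a desktop.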
Who needs a real keyboard at your fingertips when you can get a virtual one? When will we wear our Macs like sunglasses?
What Apple is really talking about
Apple's WWDC developer sessions are characterized by the features they promote, enable, or advise on, many of which seem to prepare the ground for growth across its platforms. Just look at the session calendar and you will find relevant sessions, such as:
- Using machine learning with Metal to create more realistic gaming experiences.
- MetalFX, a powerful API for enabling high-performance, high-quality graphics effects.
- A session on ARKit 6 in which developers learn to create AR experiences rendered in 4K HDR for more photorealistic scenes.
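For context on that last bullet, opting into ARKit 6's higher-resolution camera feed is a one-property change. The sketch below assumes an iOS 16+ device; the 4K video format is hardware-dependent, which is why the API returns an optional:

```swift
import ARKit

// Sketch: request ARKit 6's 4K camera feed where the device supports it.
func makeConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()
    // recommendedVideoFormatFor4KResolution is nil on unsupported hardware.
    if let format = ARWorldTrackingConfiguration.recommendedVideoFormatFor4KResolution {
        configuration.videoFormat = format  // render the AR pass at 4K
    }
    configuration.environmentTexturing = .automatic
    return configuration
}

// Typically run from an ARView or ARSCNView:
// arView.session.run(makeConfiguration())
```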
Helping computers understand where they are and what they can see is also an essential element of AR creation. RoomPlan shows how Apple is developing that technology, and even the growing range of object classes understood by its powerful Visual Look Up tool reflects a deepening understanding of the surrounding environment. Live Text in video makes every word the camera can capture actionable across your apps.
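RoomPlan itself is strikingly simple to adopt. A minimal sketch, assuming an iOS 16+ device with a LiDAR scanner (the host view controller name is hypothetical):

```swift
import RoomPlan
import UIKit

// Sketch: scan a room with RoomPlan and observe the parametric model it builds.
final class ScanViewController: UIViewController, RoomCaptureSessionDelegate {
    private let captureView = RoomCaptureView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        captureView.frame = view.bounds
        view.addSubview(captureView)
        captureView.captureSession.delegate = self
        // Begin scanning; ARKit world tracking drives the capture under the hood.
        captureView.captureSession.run(configuration: RoomCaptureSession.Configuration())
    }

    // Called repeatedly as RoomPlan refines its model of walls, doors, and furniture.
    func captureSession(_ session: RoomCaptureSession, didUpdate room: CapturedRoom) {
        print("Detected \(room.walls.count) walls and \(room.objects.count) objects")
    }
}
```

The output is not a raw mesh but a structured, semantic floor plan, which is precisely the kind of machine-readable scene understanding an AR operating system would need.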
If you can read the room, you can read the street, I guess.
And there's more. Look at the growing crop of location data available in Maps and MapKit, where you can now explore entire cities in detailed 3D. The LiDAR camera gives you depth, Ultra Wideband (UWB) gives you precise relative positioning, and Apple backs enabling technologies like Universal Scene Description. Take a look at those sessions again and many of them carry dual responsibilities, supporting Apple's existing platforms while also underpinning those we think are slowly coming into the light.
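The UWB piece is exposed to developers through the Nearby Interaction framework. A minimal sketch of ranging between two U1-equipped devices; exchanging discovery tokens (for example over MultipeerConnectivity) is assumed to happen elsewhere, and `peerToken` stands in for the token received from the other device:

```swift
import NearbyInteraction

// Sketch: measure live distance to a peer device over UWB.
final class RangingController: NSObject, NISessionDelegate {
    private let session = NISession()

    func startRanging(with peerToken: NIDiscoveryToken) {
        session.delegate = self
        // Configure the session to range against the peer's discovery token.
        session.run(NINearbyPeerConfiguration(peerToken: peerToken))
    }

    // Distance (in meters) and direction updates arrive as the devices move.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        for object in nearbyObjects {
            if let distance = object.distance {
                print("Peer is \(distance) m away")
            }
        }
    }
}
```

Centimeter-level spatial awareness between devices is exactly the "distance perception" ingredient the opening paragraph describes.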
More than what is visible
I've just scratched the surface of what we're seeing, but the takeaway is simple: Apple isn't ready to discuss its larger plans yet, but at WWDC 2022 it equipped its developers with the tools they need to create increasingly sophisticated AR experiences.
The deepest part of this puzzle concerns intelligent (machine) perception and understanding of the immediate reality around the computer. Solving it will let Apple offer tools for building automated solutions for many roles. What we might end up calling RealityOS for the consumer experience could very easily become an "industrial OS" for intelligent manufacturing. Once you have perception of depth, position, and objects, you have an automation opportunity.
Apple has all these things, and it also makes the silicon to run them.
I'm already getting ahead of the reality we're in now. But while Apple seemed only to whisper about its AR plans during its keynote, dig deeper and you'll see that where it counts, in its interactions with developers at the event, the technology to support AR is everywhere.
Meanwhile, everyone's favorite mysterious Apple rumor machine, Ming-Chi Kuo, believes the firm will shed more light on its AR plans at a special media event in January 2023, the 16th anniversary of the iPhone's introduction in January 2007. That would be a move with a certain historical resonance, and it speaks to the company's growing confidence in the platform it seeks to build. In my opinion, the company already has a huge pile of developer technology in place to support this next step. It just needs a little more time to polish things.
Apple’s invention engine remains “in full throttle,” says Morgan Stanley. I agree.
Follow me on Twitter, or join me in the AppleHolic's bar & grill and Apple Discussions groups on MeWe.
Copyright © 2022 IDG Communications, Inc.