Apple is building a transformative platform for AR

Apple has shared some details about upcoming accessibility features, and in doing so dropped some big hints about how it thinks about augmenting everyday reality. Will we learn more about this at WWDC 2022, and how will it be implemented?

Making life accessible, turning reality into data

Two upcoming accessibility enhancements seem to suggest Apple’s approach: door detection and live captions. Here’s what they do:

  • Door detection: Using the iPhone's camera, the device can detect a door, navigate the user to it, tell them whether it is open or closed, explain how to open it, and read signage such as door numbers.
  • Live captions: Your Apple device will listen to any audio content and give you a real-time transcript of the conversation.

Both are impressive features on their own, but consider them for a moment and they become quite amazing. I look at it this way: once an Apple device can make a real-time transcript of what it hears, why shouldn't it be able to translate that transcript into other languages?
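The transcribe-then-translate idea can be sketched as a simple pipeline. This is purely illustrative: the `transcribe` and `translate` functions here are hypothetical stubs standing in for on-device speech recognition and translation models, not Apple's actual APIs.

```python
# Conceptual sketch of a caption-then-translate pipeline.
# transcribe() and translate() are stand-ins for on-device models;
# here they are simple stubs so the flow of data is clear.

def transcribe(audio_chunks):
    """Stand-in for speech recognition: yields one caption per chunk."""
    for chunk in audio_chunks:
        yield chunk["text"]  # a real system would run ASR on raw audio

# Toy translation table (a real system would use a translation model).
TRANSLATIONS = {"Hello": "Hola", "Goodbye": "Adiós"}

def translate(caption):
    """Stand-in for on-device translation of a single caption line."""
    return TRANSLATIONS.get(caption, caption)

def live_translated_captions(audio_chunks):
    """Chain the two steps: every transcript line gets translated."""
    return [translate(line) for line in transcribe(audio_chunks)]

audio = [{"text": "Hello"}, {"text": "Goodbye"}]
print(live_translated_captions(audio))  # ['Hola', 'Adiós']
```

The point of the sketch is that the second step adds almost nothing once the first exists: translation is just one more transform applied to text the device is already producing.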

What could this mean?

We know Apple has the technology to do this – we use it every time we translate a web page. That process is near-instant, so why not apply the same translation to the transcription your Apple device is already producing?

It could work in both directions, too: your device could speak on your behalf in a language you don't know, enabling you to join complex conversations across multiple languages.

Apple has been working on the technology behind door detection for some time. You can try some of it yourself: open Photos and search your images for "lamp post," and you'll surface every photo you've taken that includes one.
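That kind of search reduces to a lookup over machine-generated labels. The sketch below is a toy illustration of the idea, not Apple's implementation: the photo names and labels are invented, and in a real system the labels would come from an on-device vision model rather than being hand-written.

```python
# Toy index mapping photos to recognized-object labels, illustrating
# how searching a photo library for "lamp post" can surface matches.
# A real system would generate these labels with machine vision.

PHOTO_LABELS = {
    "IMG_001.jpg": {"street", "lamp post", "bicycle"},
    "IMG_002.jpg": {"beach", "sunset"},
    "IMG_003.jpg": {"lamp post", "dog"},
}

def search_photos(query, index=PHOTO_LABELS):
    """Return the photos whose label set contains the query term."""
    return sorted(name for name, labels in index.items() if query in labels)

print(search_photos("lamp post"))  # ['IMG_001.jpg', 'IMG_003.jpg']
```

Once the labels exist, "find the lamp posts in my photos" and "find the lamp post in front of me" are the same lookup applied to different image sources – which is the leap the next paragraph makes.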

Now, I don't know about you, but if your device can recognize items in photos, it will be able to recognize those same items elsewhere, using the same machine-vision intelligence.

Sight + intelligence + context =?

This means that just as a blind or visually impaired person may use door detection to find and open a door, it is reasonable to assume they will be able to use similar technology to recognize any other object Apple's machine vision can put a name to. Picture the conversation:

“Hey Siri, where are the oranges in the vegetable shop?”

“Three steps to your right, in the second box from the front. They cost $1.”

Door detection tells us this will happen, because the technology to enable it already exists. It just needs to be built out.

So, what's revolutionary about all this? It means Apple has already assembled many of the building blocks that enable its technologies to recognize and communicate about the world around us. Once technology recognizes that world, it can help guide our interactions and enhance our decisions with information we can use.

A blind or visually impaired person buying $1 oranges may be told the same fruit is available at half price down the street. A field service engineer may find their device has already opened the troubleshooting manual for the hardware they are looking at.


We have two technologies here, ostensibly designed for accessibility, that give the company's devices a contextual understanding built around vision and sound. That understanding lets the device provide the user with relevant, useful information about what they see and hear.

That information could be a direct answer to a question or, echoing what Apple does with Siri Suggestions, be driven by the device's knowledge of the kind of help you typically request.

The expansion of human experience has begun

You don't have to be an enterprise professional to recognize that this opens up a range of opportunities for powerful consumer tools and services, with deeply powerful enterprise applications in machine-vision intelligence and Industry 5.0 across multiple sectors.

One of the great things about these applications is that, because they are built on accessibility technology, they also enable people who are not yet as well represented as they should be to take a more active part.

This is what I call augmented reality. And that’s what I think we’re going to learn a lot more about at WWDC 2022.

Not surprisingly, information has begun leaking out of Apple, via company executives, about the display technologies and design challenges involved in creating the most logical vehicle for all of this: Apple Glass.

Step by step, the building blocks of this years-long effort are falling into place, faster and faster. I can already hear the critics getting ready to be wrong again.

Follow me on Twitter, or join me in the AppleHolic's bar & grill and Apple discussion groups on MeWe.

Copyright © 2022 IDG Communications, Inc.
