If you missed it, Apple gave us our first official look at iOS 16 yesterday. Of course, that's not quite how the company put it: as it did last year, Apple highlighted several upcoming accessibility improvements to its operating systems, saying only that they would arrive in "software updates across Apple platforms later this year." That's essentially code for iOS 16, iPadOS 16, watchOS 9, and macOS 13.
While Apple's announced features are remarkable improvements for people with vision, speech, or motor impairments, they also hint at broader advances (especially in AI and machine learning) that we'll probably see across the next generation of Apple's operating systems. Reading between the lines of the announcements, here are some of the big improvements we can expect to see in iOS 16:
Live Captions = better speech recognition
Android has had a live caption feature since version 10, and now, three years later, Apple is catching up. With this setting enabled, your iPhone or Mac (if it has Apple Silicon) will automatically generate captions for virtually any audio content, including videos, FaceTime calls, phone calls, and more. It's a natural extension of the on-device speech processing introduced in iOS 15 last year, but it suggests a big leap in that feature's sophistication.
We can hope this means Siri will better understand your commands and dictation, but it's easy to imagine these capabilities showing up elsewhere. Take the Notes app, for example, where one could picture a "transcribe" feature that creates text from any audio recording or video. If Apple is billing it as an accessibility feature, the live caption transcription will have to be rock solid, and that opens up a whole world of possibilities for the rest of iOS 16.
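For the curious, a hypothetical "transcribe" feature like the one imagined above could already be sketched with Apple's public Speech framework, whose on-device mode (added in iOS 13) is the same kind of local processing Live Captions appears to build on. The function name and flow here are illustrative, not Apple's implementation:

```swift
import Speech

// Hypothetical sketch: transcribe an audio file entirely on-device,
// the way a Notes "transcribe" button might work.
func transcribe(audioFileAt url: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(),
              recognizer.isAvailable else { return }

        let request = SFSpeechURLRecognitionRequest(url: url)
        // Keep all processing local, as Live Captions promises to do.
        if recognizer.supportsOnDeviceRecognition {
            request.requiresOnDeviceRecognition = true
        }

        recognizer.recognitionTask(with: request) { result, _ in
            if let result = result, result.isFinal {
                print(result.bestTranscription.formattedString)
            }
        }
    }
}
```

The gap between this developer API and a system-wide Live Captions feature is exactly the kind of robustness improvement the announcement implies.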
Apple Watch mirroring = AirPlay improvements
Another accessibility feature coming later this year will let you mirror your Apple Watch on your iPhone's display and control your watch from your iPhone, where you can take advantage of additional accessibility features.
Apple Watch mirroring has another interesting implication, however. Apple says the feature "uses hardware and software integration, including advances built on AirPlay." That suggests AirPlay itself is gaining capabilities it doesn't have today.
Notably, it appears to let devices communicate control input in a way AirPlay doesn't now. AirPlay pushes audio and video between devices and allows basic remote control (play/pause, volume, and so on), but letting AirPlay-compatible devices send advanced touch-control signals seems new, and it could lead to some incredible features.
Here's a killer scenario: if Apple can mirror your Apple Watch to your iPhone and let you fully interact with it, it could probably mirror your iPhone to your Mac or iPad and do the same. That alone would be a game-changing feature.
Door Detection = Real-World AR Object Recognition
Apple has been quietly improving its object recognition for some time. For example, you can search for all sorts of things in the Photos app and get pictures containing them, and iOS 15 added a neat Visual Look Up feature that uses the camera to identify plants and animals, famous landmarks, artwork, and more.
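Visual Look Up itself has no public API, but the kind of on-device recognition that powers Photos search is already exposed to developers through the Vision framework's built-in image classifier (iOS 13+). A minimal sketch, with an illustrative confidence threshold of my own choosing:

```swift
import Vision

// Sketch: classify the contents of an image on-device with Vision's
// built-in taxonomy, the underlying capability behind Photos search.
func classify(image: CGImage) throws {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: image)
    try handler.perform([request])

    // Keep only reasonably confident labels (0.3 is an arbitrary cutoff).
    let labels = (request.results ?? [])
        .filter { $0.confidence > 0.3 }
        .map { "\($0.identifier) (\($0.confidence))" }
    print(labels.joined(separator: ", "))
}
```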
Now Apple has announced that the Magnifier app will be able to detect doors in real time, including judging how far away they are and reading any text on them. It's limited to devices with LiDAR (which is how the distance is measured), but it points to a vast improvement in object recognition.
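Apple hasn't published how Door Detection actually works, but two of its likely building blocks are already public APIs: ARKit's LiDAR scene depth (for distance) and Vision's text recognizer (for reading signs). A hedged sketch of how a developer could combine them today:

```swift
import ARKit
import Vision

// Sketch: enable LiDAR depth in an AR session. Each ARFrame's
// sceneDepth?.depthMap then gives per-pixel distance in meters.
func configureSession(_ session: ARSession) {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        config.frameSemantics.insert(.sceneDepth)  // LiDAR devices only
    }
    session.run(config)
}

// Sketch: read text from the camera feed, e.g. a room number on a door.
func readText(in frame: ARFrame) throws {
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        for text in observations.compactMap({ $0.topCandidates(1).first?.string }) {
            print(text)
        }
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage)
    try handler.perform([request])
}
```

The missing piece, recognizing the door itself, is exactly what Apple's announcement implies it has now solved, and what developers don't yet have.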
The most obvious use case is augmented reality glasses or goggles, which aren't expected until next year at the earliest. But Apple already has a powerful ARKit framework for developers building AR apps, and it includes the ability to recognize and track certain everyday items. Rolling out new technology piece by piece like this wouldn't be out of character for Apple, which has been laying this groundwork for some time.
It seems reasonable to assume that door detection is a natural extension of Apple's existing augmented reality and object detection work. So don't be surprised to see new ARKit features demoed for developers at WWDC. It may start with the Magnifier feature in iOS 16, but as Apple steers its AR software tools toward eventual AR glasses, it's bound to show up in bigger projects.