Apple’s decision to invest in iPhone photography was incredibly wise. Within three years, smartphone cameras will deliver better image quality than you can get from a DSLR.
Terushi Shimizu, President and CEO of Sony Semiconductor Solutions (SSS), said, “We expect that still images will surpass the image quality of single-lens reflex cameras in the next few years.” This prediction matters not just because pro photographers (and video makers) increasingly use the iPhone for professional work, but because machine vision has reached a tipping point that enables intelligent enterprise and industrial applications.
Throughout the history of the iPhone, Apple has focused on using the device as a camera, and it has created an extensive ecosystem to support that use. Think of the Photos app, built-in document scanning, AI-powered person recognition, machine-driven text detection and translation, and, most recently, the ability to identify flowers and animals using Visual Look Up. Apple’s efforts were arguably accelerated with the iPhone 7 Plus, which was the first iPhone to include multiple lenses and zoom functionality.
See your image
Sony, which holds 42% of the global smartphone image sensor market and has three of its top IMX sensors inside the iPhone 13, believes sensor size will double in high-end devices by 2024.
This will enable a “new imaging experience,” Sony said. That will inevitably include AI-boosted zoom effects and Super HDR, and should extend to 8K video capture on smartphones. I think it will also extend to true 3D image capture good enough to support a truly immersive 3D experience. These predictions make it clear that Apple’s Cinematic mode is a stalking horse for future capabilities that evolving smartphone camera sensors will be able to exploit.
But this kind of machine vision intelligence is the consumer-facing front end of much more complex activities that should translate into attractive enterprise opportunities. I’ve written before about the Triton sponge, which enables surgeons to more accurately track a patient’s bleeding during surgery, and I think everyone now understands how camera intelligence, combined with augmented reality, can optimize performance across delivery, warehousing, and logistics chains. Industry is also embracing smartphone-quality imaging intelligence – factories use error-detection systems mounted on iPads and other devices to monitor production and maintain quality control, for example, and AR-based retail experiences continue to improve.
But today’s uses should be considered alongside applications now coming onstream, particularly around autonomy and augmented reality. Apple is thought to be working toward the launch of its AR glasses – probably early next year – which could be equipped with an array of eight cameras, possibly supplied by Sony, which I believe has worked with Apple on and off on this project for years. Robert Scoble has anticipated many of these plans in a very interesting thread; those cameras will likely be used to analyze and enhance the reality around you, as well as to provide virtual experiences you can safely explore.
A person wearing a set of AR glasses will rely on much the same set of technologies as a car that uses the visual acuity of a machine to drive itself on a public highway. Image sensors integrated with the kind of AI we already use on the iPhone will be fundamental to the development of the Apple Car. The recently announced Door Detection feature is a fantastic example of how AI and vision can work together to help a person understand and navigate the world.
Similar combinations of technologies (extended by supporting technologies such as UWB and LiDAR) will be used to help vehicles achieve autonomous road comprehension and navigation. In health care, we can easily predict that improving smartphone cameras will make it increasingly possible to support remote patient care and semi-autonomous automated surgery.
Of course, given the proliferation of such imaging-based applications, it is reasonable to expect accelerated innovation in CMOS sensor development, which is clearly what Sony is banking on. The impact? Eventually, the camera you wear on your glasses will take pictures as good as those you’d get today from a DSLR – and it may respond to a “Hey, Siri” command, driven by a processor faster than the one in your Mac.
Follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.
Copyright © 2022 IDG Communications, Inc.