Future iPhone could take better underwater photos with automatic optical analysis
September 22, 2020
A future iPhone may be able to take clearer photographs underwater, with Apple looking into ways to process and adjust an image when it detects the camera sensor is submerged.
While underwater photography can be interesting to look at, with imaginative shots showing what isn’t normally seen, it takes a lot of effort for the photographer to get the perfect shot. Simply putting a camera underwater could work in some locations with extremely clear water, but not every body of water offers ideal conditions.
Water-resistant hardware can protect camera equipment from damage underwater, and even amateur photographers can get a decent image with a water-resistant iPhone in some situations closer to the surface.
Even with the perfect hardware, problems remain, such as a lack of available lighting and reduced visibility over distance in murky water. There is also the issue of color shifts, with underwater shots often carrying a greenish tint compared to above-water photography.
In a patent granted to Apple on Tuesday by the US Patent and Trademark Office, titled "Submersible electronic devices with imaging capabilities," Apple proposes a solution that relies on analyzing several environmental factors. The filing has surfaced before, having initially been published in application form in 2019.
After automatically detecting that a photograph is being taken underwater, the proposed device can make a number of changes to improve the chances of a good image. Detection can draw on several data points for later processing, such as readings from a color ambient light sensor, which measures ambient light spectra both above and below the waterline to determine how much light the water is absorbing.
The device's depth below the surface, its distance from the subject, water pressure, and its orientation can be combined with ambient light sensing and detection of backscattered light to characterize, in theory, how murky the water is. Once that is determined, the device can employ multiple strategies to overcome the challenges presented by that particular situation.
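The sensor-fusion idea described above can be sketched in a few lines. This is an illustrative toy model, not Apple's actual algorithm: the `SensorReadings` fields, the per-metre absorption estimate, and the 50/50 weighting between absorption and backscatter are all assumptions made here for demonstration.

```python
from dataclasses import dataclass

@dataclass
class SensorReadings:
    surface_lux: float    # ambient light measured above the waterline
    submerged_lux: float  # ambient light measured below the waterline
    depth_m: float        # device depth below the surface, in metres
    backscatter: float    # normalized backscattered-light reading, 0..1

def estimate_turbidity(r: SensorReadings) -> float:
    """Combine light absorption and backscatter into a rough 0..1
    turbidity score (hypothetical weighting, for illustration only)."""
    if r.surface_lux <= 0 or r.depth_m <= 0:
        return 0.0
    # Fraction of surface light lost per metre of depth travelled.
    absorption = 1.0 - (r.submerged_lux / r.surface_lux) ** (1.0 / r.depth_m)
    # Murky water both absorbs and backscatters light, so blend the two cues.
    return min(1.0, 0.5 * absorption + 0.5 * r.backscatter)
```

A device could then pick a processing strategy by comparing this score against thresholds, with clear water scoring near zero and murky water scoring closer to one.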
For example, the device may introduce more illumination on a subject if there isn't enough ambient light. Post-shot edits could include adjusting the image's tint to mitigate light absorbed or reflected by the water, artificially coloring the water a more desirable blue instead of a murky shade, and generally enhancing contrast and other image attributes to make the assumed subject easier to see.
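As a rough illustration of the tint adjustment mentioned above, a simple gray-world white balance can push back against the green-blue cast of underwater light, which is skewed because water absorbs red wavelengths first. This is a generic stand-in technique chosen here for demonstration, not the method described in the patent.

```python
def correct_underwater_tint(pixels):
    """Gray-world white balance over an RGB image given as a list of
    (r, g, b) tuples with values in 0..255: scale each channel so its
    mean matches the overall gray level, reducing a color cast."""
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    gray = sum(means) / 3
    gains = [gray / m if m > 0 else 1.0 for m in means]
    return [tuple(min(255, round(p[c] * gains[c])) for c in range(3))
            for p in pixels]
```

Feeding in green-tinted pixels yields output whose red, green, and blue channel means converge, which is the kind of neutralizing shift a post-shot tint correction aims for.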
The patent lists its inventors as Po-Chieh Hung, Prashanth S. Holenarsipur, and Serhan O. Isikman. The patent application was originally filed on September 27, 2017.
This isn’t Apple’s only attempt at improving underwater photography for mobile device users. A patent titled “Underwater User Interface” from April 2020 suggested that anyone using a device underwater needs to have simplified controls while submerged.
Beyond simplifying the interface itself, the patent covers changes to app flows, such as removing "Are you sure?" dialog boxes to reduce the user's "cognitive burden," as well as to cut processor demands and preserve battery life. It also accounts for a reduced ability to input commands, since a touchscreen's effectiveness is severely diminished underwater.
Naturally, Apple is also working to improve the water resistance of its devices, which it has explored in other patents, such as replacing the Lightning port with exposed contacts and magnetically attached and aligned cables.