New iPhone 12 Pro and Pro Max feature will tell blind users when and how far away people are

Apple has bundled an innovative new accessibility feature into the latest beta of iOS: a system that detects the presence of people and their distance from the iPhone's camera, so that blind users can maintain social distance effectively, among many other uses.
 
The feature emerged from Apple's ARKit, for which the company developed "People Occlusion," which detects people's shapes and lets virtual objects pass in front of and behind them. The accessibility team realized that this, combined with the accurate distance measurements provided by the lidar units on the iPhone 12 Pro and Pro Max, could be an extremely useful tool for people with visual impairments.
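For developers curious what this looks like under the hood, ARKit exposes both capabilities as frame semantics on a tracking configuration. A minimal sketch (assuming an app that has already set up an ARSession; not Apple's own Magnifier code) might look like this:

```swift
import ARKit

// Sketch: enabling People Occlusion and lidar scene depth in ARKit.
let configuration = ARWorldTrackingConfiguration()

// People Occlusion: segments people in the frame so virtual content
// can be drawn in front of and behind them.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}

// Scene depth: per-pixel distance data from the lidar scanner,
// available only on lidar-equipped devices like the 12 Pro and Pro Max.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
    configuration.frameSemantics.insert(.sceneDepth)
}

// An app would then start tracking with:
// session.run(configuration)
```

The capability checks matter here: person segmentation and scene depth are only supported on certain hardware, which is exactly why this accessibility feature is limited to the lidar-equipped Pro models.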
 
For instance, during the pandemic, one immediately thinks of the guidance to stay six feet away from other people. But knowing where others are and how far away they are is a basic visual task that we use all the time to plan where we walk, which line to join at the grocery store, whether to cross the street, and so on.
 
The new feature, which will live in the Magnifier app, uses the Pro and Pro Max's lidar and wide-angle camera to give the user feedback in a variety of ways.
 
First, it tells the user whether there are any people in view at all. If someone is there, it then says how far away the nearest person is in feet or meters, updating regularly as they approach or move farther away. The sound plays in stereo, corresponding to the person's direction in the camera's view.
 
Second, it lets the user set tones corresponding to certain distances. For example, if they set a distance of six feet, they would hear one tone when a person is more than six feet away, and another when someone is inside that range. After all, not everyone needs a constant feed of exact distances if all they care about is staying two paces apart.
 
The third feature, perhaps especially useful for people who have both vision and hearing impairments, is a haptic pulse that speeds up as a person gets closer.
 
Last is a visual feature, for people who just need a little help discerning the world around them: an arrow that points to the detected person on the screen. Blindness is a spectrum, after all, and any number of vision problems could make a person want a bit of help in that regard.
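The distance-based feedback modes described above boil down to simple mappings from a measured distance to a signal. A purely hypothetical sketch (these function names and parameters are illustrative, not Apple's implementation) of the threshold-tone and haptic-rate logic:

```swift
// Hypothetical sketch of the feedback logic described in the article.
// None of these names correspond to real Apple APIs.

enum ProximityTone {
    case near   // person is inside the user's chosen distance
    case far    // person is beyond it
}

// Threshold tones: one tone beyond the chosen distance, another inside it.
func tone(forDistance meters: Double, threshold: Double) -> ProximityTone {
    return meters < threshold ? .near : .far
}

// Haptic pulses: the pulse rate rises as the person gets closer,
// dropping to zero beyond some maximum sensing range.
func hapticPulseRate(forDistance meters: Double) -> Double {
    let maxRate = 10.0    // assumed fastest pulses per second when very close
    let maxRange = 5.0    // assumed range in meters beyond which haptics stop
    guard meters > 0, meters < maxRange else { return 0 }
    return maxRate * (1 - meters / maxRange)
}
```

The linear mapping here is only one plausible choice; the actual feature may scale its feedback differently.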
 
The system needs a decent image from the wide-angle camera, so it won't work in pitch darkness. And while limiting the feature to the high end of the iPhone line reduces its reach considerably, the ever-increasing usefulness of such a device as a sort of vision prosthetic likely makes investing in the hardware more palatable to people who need it.
 
This is far from the first tool of its kind — plenty of phones and dedicated devices have features for finding objects and people, but it's not often that one comes built in as a standard feature.
 
People Detection should be available to iPhone 12 Pro and Pro Max devices running the iOS 14.2 release candidate that was just released today. Details will presumably appear soon on Apple's dedicated iPhone accessibility page.
 