The iPhone can now detect people around you, telling you where they are and how far away. Blind and low-vision users can use it to maintain social distance and even navigate lines.
New AI tool describes the environment for people with visual impairments
Apple added People Detection to the Magnifier app in the latest iOS 14.2 update. It uses the camera and LiDAR sensor in the iPhone 12 Pro models and the 2020 iPad Pro, and it could change how visually impaired users navigate a space.
“Even after the pandemic, I can see applications for this technology,” Aaron Preece, editor in chief of the American Foundation for the Blind’s AccessWorld, told Lifewire via email. “For example, finding a path through a huge crowd that even a guide dog can’t navigate.”
People Detection uses two key features of the iPhone 12 Pro. One is the LiDAR sensor, a kind of laser-based radar built into the iPhone's camera array. It lets the iPhone detect the position of objects around it and is used, for example, to enhance the camera's background-blurring Portrait mode.
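Apple hasn't documented how People Detection works internally, but iOS 14's public frameworks expose the same building blocks: ARKit can stream the LiDAR depth map through its sceneDepth frame semantic, and the Vision framework can locate people in the camera image. The Swift sketch below combines the two to estimate how far away a detected person is; the class name and structure are illustrative assumptions, not Apple's implementation.

```swift
import ARKit
import Vision

// A sketch combining ARKit's LiDAR depth map with Vision's human
// detector to estimate the distance to a nearby person. The class
// name and overall structure are illustrative, not Apple's own code.
final class PersonDistanceEstimator: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // .sceneDepth is only available on LiDAR-equipped devices
        // (iPhone 12 Pro models, 2020 iPad Pro).
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = [.sceneDepth]
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depth = frame.sceneDepth else { return }
        let request = VNDetectHumanRectanglesRequest { [weak self] request, _ in
            guard let person = request.results?.first as? VNHumanObservation else { return }
            // The bounding box is normalized (0...1); sample the LiDAR
            // depth map at its center to get a distance in meters.
            let center = CGPoint(x: person.boundingBox.midX, y: person.boundingBox.midY)
            if let meters = self?.depthInMeters(at: center, in: depth.depthMap) {
                print(String(format: "Person about %.1f m away", meters))
            }
        }
        try? VNImageRequestHandler(cvPixelBuffer: frame.capturedImage).perform([request])
    }

    // Reads one 32-bit float depth value from the LiDAR depth map.
    private func depthInMeters(at point: CGPoint, in depthMap: CVPixelBuffer) -> Float? {
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }
        guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let x = min(Int(point.x * CGFloat(width)), width - 1)
        // Vision's origin is the lower-left corner, so flip y.
        let y = min(Int((1 - point.y) * CGFloat(height)), height - 1)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
        let row = (base + y * rowBytes).assumingMemoryBound(to: Float32.self)
        return row[x]
    }
}
```

A production app would also handle camera orientation and run Vision requests off the main thread, but the core idea, fusing a 2D person detection with per-pixel LiDAR depth, is what lets the phone announce a distance rather than just a presence.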