Apple’s People Detection, accessibility, and the voice-first enterprise


To understand the future of voice-first computing, you should look at accessibility on every platform, and that is particularly true of Apple's.

Computers are getting out of the way

Apple’s investments in accessibility are almost as old as the company itself: it opened its first Office of Disability in 1985.

Apple has consistently been ahead of the curve in making its software accessible, something it regards as a human right, and that work has won a great deal of recognition from key advocacy groups worldwide.

Fundamental to this work is an effort to build alternative user interfaces: the GUI, Multi-Touch, and, of course, what looks like Apple’s next big user interface, voice, embodied in VoiceOver.

Apple’s work with VoiceOver has been both iterative and revolutionary. It first appeared on the iPhone with the iPhone 3GS in 2009, took a big step forward in 2019 with Voice Control, and now has a new companion feature that’s even more deeply transformative: People Detection on the iPhone 12 Pro and Pro Max.

What is People Detection?

People Detection is a new accessibility feature that is currently available only on Apple’s high-end iPhones. It draws together several of the company’s technologies — on-device machine learning running on the Neural Engine, People Occlusion in ARKit, VoiceOver, and the iPhone’s LiDAR scanner — to identify people in the camera’s view and tell you how far away they are.
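
Apple has not exposed People Detection itself as a developer API; it lives inside the Magnifier app. The building blocks it relies on are public, though, and the rough shape of the feature can be sketched with ARKit’s person segmentation and depth output plus a VoiceOver announcement. The sketch below is illustrative only: the class name, the per-frame announcement, and the assumption that the segmentation and depth buffers describe the same pixels are mine, not Apple’s implementation.

```swift
import ARKit
import UIKit

// A minimal sketch of how a third-party app could approximate person-distance
// feedback with public ARKit APIs. This is NOT Apple's People Detection code.
final class PersonDistanceAnnouncer: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Person segmentation with depth needs a recent A-series chip;
        // full scene depth additionally needs a LiDAR-equipped device
        // such as the iPhone 12 Pro or Pro Max.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) else { return }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // estimatedDepthData holds per-pixel depth (in meters) for the pixels
        // the segmentation buffer classifies as belonging to a person.
        guard let segmentation = frame.segmentationBuffer,
              let depth = frame.estimatedDepthData,
              let distance = nearestPersonDistance(segmentation: segmentation, depth: depth) else { return }

        // Let VoiceOver speak the result, mirroring the spoken feedback
        // People Detection gives in the Magnifier app.
        let message = String(format: "Person %.1f meters away", distance)
        UIAccessibility.post(notification: .announcement, argument: message)
    }

    private func nearestPersonDistance(segmentation: CVPixelBuffer, depth: CVPixelBuffer) -> Float? {
        CVPixelBufferLockBaseAddress(segmentation, .readOnly)
        CVPixelBufferLockBaseAddress(depth, .readOnly)
        defer {
            CVPixelBufferUnlockBaseAddress(segmentation, .readOnly)
            CVPixelBufferUnlockBaseAddress(depth, .readOnly)
        }

        guard let segBase = CVPixelBufferGetBaseAddress(segmentation),
              let depthBase = CVPixelBufferGetBaseAddress(depth) else { return nil }

        // Assumption: both buffers cover the same field of view; iterate over
        // the overlapping region to be safe.
        let width = min(CVPixelBufferGetWidth(segmentation), CVPixelBufferGetWidth(depth))
        let height = min(CVPixelBufferGetHeight(segmentation), CVPixelBufferGetHeight(depth))
        let segRowBytes = CVPixelBufferGetBytesPerRow(segmentation)
        let depthRowBytes = CVPixelBufferGetBytesPerRow(depth)
        var nearest = Float.greatestFiniteMagnitude

        for y in 0..<height {
            let segRow = segBase.advanced(by: y * segRowBytes).assumingMemoryBound(to: UInt8.self)
            let depthRow = depthBase.advanced(by: y * depthRowBytes).assumingMemoryBound(to: Float32.self)
            for x in 0..<width where segRow[x] != 0 {   // non-zero = person pixel
                let d = depthRow[x]
                if d > 0, d < nearest { nearest = d }
            }
        }
        return nearest == .greatestFiniteMagnitude ? nil : nearest
    }
}
```

In practice you would throttle the announcements (ARKit delivers frames many times per second) and round or convert the distance for the user’s locale; Apple’s shipping feature also offers haptic and tonal feedback, which this sketch leaves out.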
