Microsoft Seeing AI is getting a massive update this week on Apple’s iOS devices. Available now on the App Store, version 4.0 of the tool that helps visually impaired users has plenty of cool new features. Not least, this is the first update for the app since Apple released its new iPhone 12 series of phones, including the iPhone 12 Pro Max.
Perhaps the biggest addition in Microsoft Seeing AI 4.0 is support for the LiDAR scanner on Apple’s newest iPhones. It lets users explore unfamiliar locations through a 3D spatial audio map: objects in the room are announced from the direction they sit in, so users with visual impairments can hear the space around them.
Elsewhere, Microsoft is pairing the LiDAR scanner on the iPhone 12 Pro and Pro Max with haptic feedback, so users can point the phone at an object and feel its distance. Other improvements include a redesigned main screen and better descriptions in the Scene channel.
It’s an extensive update that goes beyond those standout changes. With that in mind, it’s worth checking out the full release notes below:
- “The new World channel, available on devices with a LiDAR scanner running iOS 14, enables you to explore an unfamiliar space in 3D, using spatial audio. When wearing headphones, you will hear objects around you announced from their location in the room. You can also find a particular object by placing an audio beacon on it. We are keen to hear your feedback on this early experiment, and invite you to work with us as we explore this new area together with the community.
- On iPhone 12 Pro and Pro Max, the haptic proximity sensor enables you to point the LiDAR scanner and feel the distance to things around you
- The main screen has been visually redesigned to improve contrast and widen the camera’s field of view
- Improvements to image descriptions on the Scene channel, and when browsing photos on your phone
- Improved text recognition accuracy on the Document channel
- Seeing AI is now available in seven additional languages: Czech, Danish, Finnish, Greek, Hungarian, Polish, and Swedish
- Plus, various bug fixes under the hood”
Announced as a prototype and launched in 2017, Microsoft Seeing AI is a smartphone app that uses computer vision to describe a visually impaired user’s surroundings and environment.
Once the app is installed, users point their iPhone camera at a person and let the AI take over: the app says who the person is and describes their current emotion. Seeing AI also works on items such as products, telling the user what the product is. With the new LiDAR support, it can now also announce objects and their distances.
The AI runs natively on the device, so no internet or cloud connection is needed. Since launch, Microsoft has added handwriting and currency recognition, as well as tap-to-describe on iOS.
Users can get Microsoft Seeing AI with the new update from Apple’s App Store.