January 19, 2021
Borough of Manhattan Community College (BMCC/CUNY) Computer Information Systems (CIS) Professor Hao Tang has been developing an iPhone app that uses assistive technology to help blind and visually impaired (BVI) individuals navigate unfamiliar places and facilities.
Tang’s research project, “Exploring Virtual Environments by Visually Impaired using a Mixed Reality Cane without Visual Feedback,” recently received a $7,500 CUNY Community College Research Grant (CCRG). Four students worked with Tang on the development of the project in 2019 and 2020 and two additional students will be joining the effort in the Spring 2021 semester.
The Mixed Reality Cane uses simple auditory and haptic (touch-based) feedback on an iPhone to help the cane’s users interact with virtual objects and environments as they walk through real-world situations. The visually impaired can use the app to plan their trips and learn the physical layout of unfamiliar environments in virtual reality, which encourages them to travel independently.
“The app uses the iPhone on a selfie stick to simulate a long white cane in virtual reality (VR) and applies augmented reality (AR) techniques to track the iPhone’s real-time position, both location and orientation,” said Tang.
AR is an interactive experience in which objects that reside in the real world are enhanced by computer-generated perceptual information. VR is a simulated experience that can be similar to, or completely different from, the real world.
Tang says the integration of virtual and augmented reality alerts BVI users with auditory and vibrating feedback through an iPhone whenever the virtual cane comes into contact with objects such as a wall or desk.
“If there is a virtual door five feet away in the VR environment, then the virtual cane hits the door and the user hears the audio feedback and vibration so he or she knows the cane hit a door,” said Tang.
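At its core, the interaction Tang describes is a proximity test between the tracked cane tip and the virtual geometry of the scene. A minimal sketch of that idea follows; it is purely illustrative (the actual app runs on iOS with AR tracking), and the function names, scene layout, and contact threshold here are all hypothetical.

```python
import math

# Illustrative only: a simple proximity check standing in for the app's
# collision detection. When the tracked tip of the virtual cane comes
# within a small threshold of a virtual object, the app would play audio
# and trigger vibration so the user knows what the cane hit.

CONTACT_THRESHOLD = 0.05  # meters; assumed value for illustration


def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))


def check_cane_contact(cane_tip, objects):
    """Return the name of the first virtual object the cane tip touches,
    or None if there is no contact."""
    for name, position in objects.items():
        if distance(cane_tip, position) <= CONTACT_THRESHOLD:
            return name
    return None


# Example: a virtual door about five feet (~1.52 m) ahead of the user.
scene = {"door": (0.0, 0.0, 1.52)}
print(check_cane_contact((0.0, 0.0, 1.50), scene))  # tip within 5 cm -> "door"
print(check_cane_contact((0.0, 0.0, 0.5), scene))   # too far away -> None
```

In the real app, a positive contact test would be wired to the iPhone's audio and haptic hardware rather than a printed result.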
He says the cane could make it possible for BVI individuals to complete facility orientation and mobility training remotely, which is especially valuable given the ongoing COVID-19 pandemic.
“We can build a 3D model of a specific facility, such as BMCC’s Fiterman Hall, and import it into our app,” said Tang. “The BVI’s movement in the real world will be synchronized to a long cane in the virtual reality environment.”
Tang and his team plan to improve the app’s design and publish it on the Apple App Store. He will also be filing a patent through the CUNY Technology Commercialization Office.
- CIS Professor Hao Tang receives CCRG grant for project
- Cane uses simple auditory and haptic feedback on iPhone to help users navigate
- App applies both virtual and augmented reality techniques to track phone’s position