AbleData does not produce, distribute or sell any of the products listed on this website, but we provide you with information on how to contact manufacturers or distributors of these products. If you are interested in purchasing a product, you can find companies who sell it below.
---- PROTOTYPE --------- PURPOSE: To create a prototype of an adapted Kinect sensor to help improve the indoor navigation of individuals with visual disabilities.

Engineering students at the University of Konstanz in Germany have developed a prototype that improves indoor navigation for individuals with visual impairment using the Microsoft Kinect camera, a vibrotactile waist belt, and markers from the AR-Toolkit. Navigational Aids for the Visually Impaired (NAVI) consists of a helmet-mounted Kinect sensor connected to a computer in a backpack, a special belt containing vibration motors to warn the user of obstacles ahead and to the sides, and a Bluetooth headset to provide verbal feedback. Altogether, the NAVI device helps individuals who are blind or have low vision navigate to a specific location with tactile and verbal warnings of objects in their path. The system can even detect bar-coded signs and provide further information to the user.

The Kinect depth image is mapped onto three pairs of Arduino LilyPad vibration motors located at the left, center, and right of the waist. These pairs of vibration motors are hot-glued into a fabric waist belt and connected to an Arduino 2009 board. To increase the impact of the vibration motors, each was put into the cap of a plastic bottle. The Arduino in the waist belt is connected via USB to a laptop that is mounted on a special backpack with holes for cables and a fan.

The students also wanted to use the RGB camera of the Kinect, so they placed several markers from the AR-Toolkit on the walls and doors of the building, thereby modeling a particular route from one room to another. The markers are tracked continuously along the way, and the software the students developed provides synthesized auditory navigation instructions to the person. These navigation instructions vary based on the person's distance to the marker.
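The depth-to-vibration mapping described above can be sketched roughly as follows. This is an illustrative outline, not the students' actual code: the zone boundaries, maximum range, and intensity scale are assumptions, and the real system drove the motors through an Arduino rather than returning values.

```python
# Sketch: map a Kinect-style depth frame onto three waist-belt motor zones.
# Assumed: depth values in millimetres, 0 meaning "no reading", and a
# 0-255 PWM-style intensity per zone (stronger = obstacle closer).

def depth_to_motor_intensities(depth_frame, max_range_mm=3000):
    """Split a depth frame (rows x cols) into left/center/right thirds
    and return one intensity per zone based on the nearest obstacle."""
    rows = len(depth_frame)
    cols = len(depth_frame[0])
    zone_width = cols // 3
    intensities = []
    for z in range(3):
        zone = [depth_frame[r][c]
                for r in range(rows)
                for c in range(z * zone_width, (z + 1) * zone_width)]
        # Ignore zero readings (no depth data) when finding the nearest point.
        valid = [d for d in zone if d > 0]
        nearest = min(min(valid), max_range_mm) if valid else max_range_mm
        # Closer obstacle -> stronger vibration (linear scaling assumed).
        intensities.append(int(255 * (1 - nearest / max_range_mm)))
    return intensities  # [left, center, right]
```

In the actual prototype these three values would be sent over USB serial to the Arduino in the belt, which pulses the corresponding LilyPad motor pairs.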
For example, if the user walks toward a door, the output will be “Door ahead in 3”, “2”, “1”, “pull the door” as the distance to the marker on the door decreases. The software was written in C# and .NET. The students used the ManagedOpenNI wrapper (https://github.com/kobush/ManagedOpenNI) for the Kinect and the managed wrapper of ARToolkitPlus (http://code.google.com/p/comp134artd) for marker tracking. Voice synthesis was done using Microsoft’s Speech API (http://msdn.microsoft.com/en-us/speech/default). All input streams were glued together using Reactive Extensions for .NET (http://msdn.microsoft.com/en-us/devlabs/ee794896).

TITLE: Project NAVI, a Kinect Hack That Helps Visually Impaired Navigate Indoors. WEBSITE: medGadget. REF: http://www.medgadget.com/archives/2011/03/project_navi_a_kinect_hack_tha....
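The distance-dependent countdown described above can be sketched as a simple threshold lookup. This is a hedged illustration only: the distance thresholds (in metres) are assumptions, and the real system spoke the prompts through Microsoft's Speech API in .NET rather than returning strings.

```python
# Sketch: choose the spoken prompt for a tracked AR marker based on
# the user's distance to it, mimicking the countdown
# "Door ahead in 3", "2", "1", "pull the door".

def prompt_for_marker(marker_label, distance_m, action="pull the door"):
    """Return the prompt to speak at this distance, or None if the
    marker is still too far away to announce. Thresholds are assumed."""
    if distance_m > 3.5:
        return None                      # too far: stay silent
    if distance_m > 3.0:
        return f"{marker_label} ahead in 3"
    if distance_m > 2.0:
        return "2"
    if distance_m > 1.0:
        return "1"
    return action                        # within reach: give the action cue
```

In practice the tracker would call this on every frame and only speak when the returned prompt changes, so the user hears each step of the countdown once.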