Orientation & Mobility

A blind pedestrian is using a guide dog and five technologies for navigation. This figure illustrates the need for an integrated navigational system.
The guide dog aids with mobility and obstacle avoidance.
The compass provides the user with heading information when stationary.
The GPS receiver integrates with a GIS database (digital map) to provide position and heading information during outdoor navigation.
The talking signs receiver gives orientation cues by identifying the direction and location of important landmarks in the environment.
The digital sign system (DSS) receiver picks up barcodes from signs and sends them to a database to facilitate indoor navigation.
The BrailleNote accessible computer, the hub, represents the ‘‘brain’’ of the system, allowing Braille input and speech and Braille output.

Are sighted individuals this necessary?
Illustration by Jorge Martins
Assistive Technology for Orientation and Mobility
Moving in 3D space requires orientation and navigation capabilities. For that purpose, we gather, interpret, and build up knowledge about our environment in a multifaceted skill. The set of skills, techniques, and strategies used by blind people to travel independently and safely is known as Orientation and Mobility (O&M).
Though closely related, there is a crucial difference between orientation and mobility:
Orientation is knowing where one is in space and where one wants to go
Mobility is being able to carry out a plan to get there
The function of any sensory aid is to detect and locate objects and provide information that allows the user to determine (within acceptable tolerances) the range, direction, dimension, and height of objects. It makes non-contact trailing and tracking possible, enabling the traveler to receive directional indications from strategically located physical structures in the environment, with additional object identification where possible.
A survey of assistive technology solutions for pedestrians with visual impairment reveals that most existing solutions address only a specific part of the travel problem. A technology-centered approach, with limited focus on user needs, is one of the major concerns in the design of most systems. State-of-the-art sensor technology and processing techniques are being used to capture details of the surrounding environment. The real challenge is in conveying this information in a simplified and understandable form, especially since the alternate senses of hearing, touch, and smell have far less perceptual bandwidth than vision. Many systems are still at the prototyping stage and need to be evaluated and validated by real users. The goal is to convey the required information promptly, through the preferred interface, to ensure safety, orientation, and independent mobility.
Since the 1960s, evolving technology has helped many researchers build electronic devices for navigation. These can be categorized as follows:
Vision Enhancement involves capturing input from a camera, processing the information, and presenting the output on a visual display. In its simplest form it may be a miniature head-mounted camera with the output on a head-mounted visual display.
Vision Replacement involves displaying the information directly to the visual cortex of the human brain or via the optic nerve.
Vision Substitution is similar to vision enhancement but with nonvisual output, typically tactile, auditory, or some combination of the two. Since the senses of touch and hearing have a much lower information capacity than vision, it is essential to process the information down to a level that the user can handle.
O & M devices are broadly categorized as:
Electronic Travel Aids (ETAs): devices that transform information about the environment that would normally be relayed through vision into a form that can be conveyed through another sensory modality.
Electronic Orientation Aids (EOAs): devices that provide orientation prior to, or during the travel. They can be external to the user and/or can be carried by the user (for example, infrared light transmitters and handheld receivers).
Mixed EO&TAs
Recommended reading
Assistive Technology for People with Visual Loss, Delhi Journal of Ophthalmology, 2020
Orientation and Mobility / Blind Cane Travel, Blind to Billionaire: Videos on use of Cane for mobility, 2020
Wearable Technologies: Concepts, Methodologies, Tools, and Applications (Critical Explorations), by Information Resources Management Association, 2018
ICTs for Orientation and Mobility for Blind People: A State of the Art (Paper), 2018
Electronic Travel Aids for Blind Guidance: An Industry Landscape Study, Sutardja Center, 2015
ICTs for Orientation and Mobility for Blind People: A State of the Art (Book), 2013
Tactile Wayfinder: A Non-Visual Support System for Wayfinding, 2008
Orientation and Mobility for Children Who Are Blind or Visually Impaired, FamilyConnect
Orientation and Mobility Training: The Way to Go, TSBVI, 1998
O&M Living History -- Where Did Our O&M Techniques Come From?, 1996
About this Page
We present a glimpse of the orientation and mobility aids in the following sections:
Dogs and Canes - Friends of the Blind: Guide dogs and canes (most often the white cane) have been the blind traveler's most popular companions for O&M. Yet the debate over which is better continues.
Electronic Travel Aids and Electronic Orientation Aids: ETAs: Various ETAs and EOAs, commonly referred to together as ETAs, have evolved from the simple white cane to various levels of mechanical, electro-mechanical, and IT- and AI-enabled smart wearable aids.
Torch-like ETAs were some of the first prototypes; more specifically, ultrasound-based ETAs. The problem with torch-like devices is that one of the user's hands is occupied handling the ETA. Moreover, blind users rarely give up the white cane, since it is the most reliable device for preventing falls, so both arms and hands end up occupied by different tools. Because of this, cane-like products started to appear, the laser cane being the first of them. The main problem of many mobility devices is that the hands are occupied in using them. Thus, other implementations have been proposed, where sensors are mounted on a belt while the computation unit is carried in a bag. To make ETAs less obtrusive, researchers have also tried other wearable possibilities such as the head, the chest, the tongue, or devices simply worn in vests or shoes. Several ETAs carried by the user simultaneously use pre-installed hardware in the environment, such as GPS, mobile networks, or indoor positioning systems.
Innovations: Way beyond the era of dogs and canes, innovation continues to push the horizon of O&M aids forward. These are at various stages of readiness, ranging from individual skill to laboratory prototype to early commercialization. We cover a few promising ones including SmartCane, Tongue Click Sonar, Canetroller, EyeCane, and VL Eyes using LiFi. We also outline DIY projects for smart glasses.
Technology: Provides a brief overview of visual assistive technology for O&M.
Dogs and Canes - Friends of the Blind
The only, absolute and best friend that a man has, in this selfish world, the only one that will not betray or deny him, is his DOG.
- King Frederick of Prussia, 1789
Dogs naturally come to the aid of people with blindness or low vision; these are the guide dogs. In contrast, a cane - colored, suitably designed - can provide a different type of independence. The debate continues on which support is better, or whether both should be used (as in the cover picture of this page). VisionAware takes a non-committal approach, saying: Choose the Option That Is Best for You, while others continue to debate.
We present some facts.
Source
Advantages of Cane Travel
A cane is easily replaceable and affordable. At a cost ranging from free to $40, you can keep a spare on hand in case of emergencies.
Canes give you tactile information about your environment. You can stop and smell the flowers when you know exactly where the flower box planter is on the sidewalk.
You can learn your environment faster and more thoroughly. The tactile information you gain from the cane finding fixed landmarks helps you understand the terrain you are exploring and provides concrete objects to ensure your orientation is correct.
Disadvantages of a White Cane
Increased interference from the public wanting to assist – kindhearted people always want to help by grabbing your arm, cane or clothing but sometimes their help isn’t helpful. (Hint: Always ask first!)
Cane travel can be more cumbersome and not as fluid. A cane gets stuck in cracks and you get a poke in the stomach – ouch!
Weather impacts cane travelers. A 6" or more snowfall with a cane can really wreak havoc getting around, as it is difficult to tap or sweep the cane and some landmarks may not be available to check orientation.
Cane User (Left) vs Dog User (Right)

Advantages of Guide Dog Travel
Faster and more graceful travel in general—with a dog you breeze by people and obstacles without much change in pace or direction.
A guide dog can be a bridge to the general public, opening opportunities for conversation and making new connections. Many handlers have made new friends talking "dogs" with fellow commuters and folks who are interested in learning about guide dogs.
Guide dogs can be a deterrent to potential personal attacks. While guide dogs are not trained to attack, a thief may think twice before trying to take your purse, wallet or smart phone.
Disadvantages of Guide Dog Travel
Time and responsibility of daily care for a guide dog – feeding, watering, relieving, grooming and playtime are all a part of a guide dog handler’s day.
2-3 week commitment to train with a new guide dog – it may be nice to get away from it all and have your meals prepared and your room cleaned, but it is still time away from work, family and other responsibilities.
Expenses incurred with a guide dog – big dogs eat lots and vet bills are not inexpensive.
Dog attacks are increasing and can ruin a dog's confidence and ability to work. With the increase in pet-friendly hotels and apartments, therapy dogs, emotional support dogs and the like, we are running into more and more dogs in our daily travels. Dog encounters can be dangerous, with one serious act of aggression ending a dog's working life.
Dog hair on clothing and in the home - lots of grooming, a lint brush, and tips for getting dog hair off fabric surfaces are a must.
Source
Further reading

Also known as seeing eye dogs, these are assistance dogs trained to lead blind or visually impaired people around obstacles. Although dogs can be trained to navigate various obstacles, they are red-green color blind and incapable of interpreting street signs.
The human does the directing, based on skills acquired through previous mobility training. The handler might be likened to an aircraft's navigator, who must know how to get from one place to another, and the dog is the pilot, who gets them there safely.
Breeds
Guide dog breeds are chosen for temperament and trainability. The most popular breed used globally today is the Labrador Retriever. This breed has a good range of size, is easily kept due to its short coat, is generally healthy and has a gentle but willing temperament. Crosses such as the Goldador (Golden Retriever/Labrador), combine the sensitivity of the Golden Retriever and the tolerance of the Labrador Retriever.



A white cane is a device used by many people who are blind or visually impaired.
A white cane primarily allows its user to scan their surroundings for obstacles or orientation marks, but is also helpful for onlookers in identifying the user as blind or visually impaired and taking appropriate care.
The latter is the reason for the cane's white color, which in many jurisdictions is mandatory.
Mobility canes are often made from aluminium, graphite-reinforced plastic or other fibre-reinforced plastic, and can come with a wide variety of tips depending upon user preference.
White canes can be either collapsible or straight, with both versions having pros and cons. The National Federation of the Blind in the United States affirms that the lightness and greater length of straight canes allow greater mobility and safety, though collapsible canes can be stored more easily, giving them an advantage in crowded areas such as classrooms and public events.
Blind people have used canes as mobility tools for centuries, but it was not until after World War I that the white cane was introduced.
In 1921 James Biggs, a photographer from Bristol who became blind after an accident and was uncomfortable with the amount of traffic around his home, painted his walking stick white to be more easily visible.
Long cane: Designed primarily as a mobility tool used to detect objects in the path of a user. Cane length depends upon the height of a user, and traditionally extends from the floor to the user's sternum.
Often long canes can be folded.

Guide cane: A shorter cane, generally extending from the floor to the user's waist, with more limited potential as a mobility device. It is used to scan for kerbs and steps.
Identification (ID) cane: Used primarily to alert others that the user is visually impaired, but not to the extent where they require a long cane or other variant. It has no use as a mobility tool.
Support cane: Designed primarily to offer physical stability to a visually impaired user, the cane also works as a means of identification.
Kiddie cane: This variant functions exactly the same as an adult's long cane but is designed for use by children.
Green cane: Used in some countries, such as Argentina, to designate that the user has low vision, while the white cane designates that a user is completely blind.

Red-and-White cane: For deafblindness

Red and White Canes – and persons with deafblindness
White canes are often used by blind pedestrians and/or those with a visual impairment, and this is well known in most countries around the world.
However, persons with deafblindness (with both sight and hearing impairments) use a red and white striped cane to navigate, therefore, it is very important to raise awareness of this symbol.
In some countries the red and white cane is already well recognized, but in some other countries, there is still much work left to do, to increase the recognition by the general public and authorities.
Read the story of Alegria, an older Spanish woman with deafblindness, to understand the importance of the red and white cane.
Electronic Travel Aids and Electronic Orientation Aids: ETAs
Various ETAs and EOAs, commonly referred to together as ETAs, have evolved from the simple white cane to various levels of mechanical, electro-mechanical, and IT- and AI-enabled smart wearable aids. A multitude of services has also been built around O&M. These are covered below under Torch-like, Cane-like, Wearable, and Infrastructure ETAs.
The key parameters of ETAs are listed below with their Space of possibilities:
Energy Management: Active, Passive
Technology: Ultrasounds, Incandescent light, Infrared (IR), Laser, Global Position System (GPS), Compass, Mono/Stereo Vision, RFID, WLAN, Gyroscope, GSM/GPRS/UMTS (Mobile Phone Network), Bluetooth, Inertial sensors
Hardware Use: Belt, Cane, Chest, Hand, Head Mounted, Neck, Phone, Shoe, Skin, Tongue, Worn, Earphones, External (implemented over the urban space)
Information Channel: Tactile Electro Mechanic, Vibration, Sounds (unstructured, stereo or mono), Synthetic Voice, Recorded Voice, Braille, Bone conduction, Mechanical guidance, Direct Stimulation of Brain Cortex
Information Structure: Discrete (from binary to 'N' symbols), 1-D, 2-D, 3-D, or n-D
Field of Application: Indoor, Outdoor
Also, most ETAs are carried by the user. Avoiding obstacles and providing safe travel depend on the specific movements of the user, and carried ETAs fit this kind of problem much better. In the historical review we find that the first ICTs applied to blindness belonged to this group (as does non-ICT technology such as the cane). As we will see, most of them encode the information in sounds, a technique called sonification.
Source
Wearable Technologies: Concepts, Methodologies, Tools, and Applications (Critical Explorations), by Information Resources Management Association, 2018
Torch-like ETAs
Torch-like ETAs were some of the first prototypes; more specifically, ultrasound-based ETAs. We have seen some of these examples in history, such as the G-5 Obstacle Detector or the Signal Corps device (even if limitations in technology forced them to be built as "bag-like" ETAs). After those devices, Kay developed, in the late 1960s, the Ultrasonic Torch and the Torch, setting up the paradigm of this approach. Now we have a wide range of them.

Ray: hi-tech mobility for the blind and visually impaired. Ray was designed to provide a handheld, lightweight, and compact supplement to traditional canes for the blind. It is a small, extremely sensitive electronic mobility aid that senses obstacles and alerts the user by emitting audible or vibrating signals (or both). Ray features easy 2-button operation and is so compact that it fits in the palm of your hand, slides into a pocket for fast storage, or can be worn around the neck.
Using an ultrasonic emission similar to the cone of light of a flashlight, Ray can recognize obstacles up to a distance of 2.85m away and announce them to the user via an audible or vibrating signal.
Ray is intended as a complement to traditional canes for the blind, not as a replacement. While Ray recognizes obstacles in your path, it cannot detect drop-offs such as curbs.
A special Escape mode enables the user to locate small gaps such as door entrances or passageways through a crowd of people (again the user can choose between audible or tactile feedback.)
This paper evaluates the efficacy of using far infrared thermal imaging with a haptic display to simplify the problem of quickly identifying the presence and location of people relative to a blind user. The idea is to use a haptic display that images the output of an infrared camera. Each pixel of the display is a binary up or down determined by comparing the IR camera output to a threshold set just below human skin temperature.
A prototype device was constructed consisting of a 50x50 tactile array from KGS Corp. and an infrared camera mounted in a textbook-sized frame as seen in Fig. 1. The frame can be held and aimed at a target with hand straps on each side. The prototype device also attaches, via a tether, to a notebook computer. The device was tested by five blind users to establish the efficacy of the system in a variety of real world scenarios.
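As an illustration of the thresholding step described above, here is a minimal sketch (not the authors' code) of how an infrared temperature frame might be compared against a threshold set just below skin temperature and mapped onto a binary pin array. The 50x50 resolution echoes the KGS display, while the frame size, threshold value, and function names are assumptions.

```python
import numpy as np

SKIN_TEMP_THRESHOLD_C = 33.0   # set just below typical skin temperature
PIN_ROWS, PIN_COLS = 50, 50    # resolution of the tactile pin array

def ir_frame_to_tactile(frame_c: np.ndarray) -> np.ndarray:
    """Map an IR temperature frame (degrees C) onto a binary pin array.

    A pin is raised (1) if the warmest pixel in its patch exceeds the
    threshold, otherwise lowered (0).
    """
    h, w = frame_c.shape
    pins = np.zeros((PIN_ROWS, PIN_COLS), dtype=np.uint8)
    for r in range(PIN_ROWS):
        for c in range(PIN_COLS):
            patch = frame_c[r * h // PIN_ROWS:(r + 1) * h // PIN_ROWS,
                            c * w // PIN_COLS:(c + 1) * w // PIN_COLS]
            pins[r, c] = 1 if patch.max() > SKIN_TEMP_THRESHOLD_C else 0
    return pins

# Example: a synthetic 200x200 frame with one warm, person-sized blob
frame = np.full((200, 200), 22.0)
frame[60:140, 80:120] = 34.5
print(ir_frame_to_tactile(frame).sum(), "pins raised")
```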

Prototype device in hand held mode. The infrared camera is mounted on top of the tactile display. The tactile display shows the image of two people in front of the user

Future hand held device with integrated miniature microbolometer based infrared camera and fingertip tactile array. The device will also include a thermal threshold thumbwheel. The battery and processor will be in a small belt worn box attached via a cord.

A portable electronic travel device that uses ultrasound to detect objects and provide tactile or auditory feedback, vibrating or chirping more rapidly as the user approaches an object. When used with a cane or dog guide, it can help a blind person avoid obstacles and overhangs; locate landmarks; locate items such as mailboxes or trash cans; and find paths through crowds, at ranges from 20 inches to 26 feet. It has two large button controls; five default ranges (from ½ meter to 8 meters); user-friendly advanced settings that allow the user to select range presets, change the type of auditory feedback, and manage an optional remote unit; and a durable plastic case. The optional remote unit for instructors can receive the same tactile feedback as the student.

Obstacle Avoidance using Haptics and a Laser Rangefinder is a virtual white cane that scans the environment with a laser rangefinder and presents this information to the user through a haptic interface. Using the virtual white cane, the user is able to poke at obstacles several meters ahead without physical contact with the obstacle. Because a haptic interface is used, the interaction is very similar to how a regular white cane is used. The length of the virtual cane can be chosen by the user, but it is still limited. The wheelchair is controlled with a joystick in the right hand, while the environment is sensed with the Falcon (haptic interface) in the other hand.
This is particularly useful for blind people who also have a motion impairment.

The Novint Falcon, joystick and 2D LiDAR
Cane-like ETAs
The problem with torch-like devices is that one of the user's hands is occupied handling the ETA. Moreover, blind users rarely give up the white cane, since it is the most reliable device for preventing falls, so both arms and hands end up occupied by different tools. Because of this, cane-like products started to appear, the laser cane being the first of them.

Developing safer alternatives since 2010, the UltraCane is a primary electronic mobility aid for use by people who are blind or visually impaired and is THE ONLY electronic mobility aid that utilizes state of the art narrow beam technology, allowing the user to safely avoid obstacles and navigate around them, both in the user's forward path and just as importantly, giving valuable protection at head/chest height. No other electronic mobility aid utilizes this technology.
The UltraCane detects street furniture and other obstacles within 2m or 4m (depending on setting) and it does this by emitting ultrasonic waves from two sensors. It also detects up to 1.5m ahead at chest/head height, giving tactile feedback to the user through two vibrating buttons on the handle over which the user places their thumb.
The two buttons, when vibrating, indicate the direction of the obstacle; the frequency of the vibration lets the user know the proximity of the obstacle.
This type of feedback stimulates a spatial mind map in the brain, enabling the user to obtain information about the layout of their immediate environment and surroundings and guides them safely through and around obstacles.
The UltraCane enables the user to make decisions much more quickly, thus allowing them to move around more safely, confidently and effectively.
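A minimal sketch of the two ideas described above: distance derived from ultrasonic time-of-flight, and a vibration rate that rises as the obstacle gets closer. The specific rates, the 4 m range, and the function names are illustrative assumptions, not the UltraCane's actual firmware.

```python
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 degrees C

def echo_time_to_distance(echo_time_s: float) -> float:
    """Distance to an obstacle from the round-trip time of an ultrasonic pulse."""
    return SPEED_OF_SOUND_M_S * echo_time_s / 2.0

def vibration_rate_hz(distance_m: float, max_range_m: float = 4.0,
                      min_rate: float = 1.0, max_rate: float = 20.0) -> float:
    """Map proximity to a pulse rate: closer obstacles produce faster vibration."""
    if distance_m >= max_range_m:
        return 0.0                      # nothing within range: no feedback
    closeness = 1.0 - distance_m / max_range_m
    return min_rate + closeness * (max_rate - min_rate)

# An echo returning after 11.7 ms corresponds to roughly 2 m
d = echo_time_to_distance(0.0117)
print(f"{d:.2f} m away -> vibrate at {vibration_rate_hz(d):.1f} Hz")
```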

The SmartCane device is an electronic travel aid which fits as the handle of a white cane. As the white cane can only detect obstacles up to knee height, SmartCane complements its functionality by detecting obstacles from knee to head height. It detects obstacles using ultrasonic waves and conveys their presence through intuitive vibratory patterns. It is powered by a rechargeable Li-ion battery, like a cell phone, and can be used in both indoor (1.8 m) and outdoor (4 m) detection modes. It has been designed to accommodate the varying types of grip commonly used by the visually challenged.
Smartcane users have found that it is an extremely useful device in detecting different types of obstacles such as tree branches, side of a truck, hanging cloth strings, protruding equipment from walls of an office corridor such as air conditioners, railings, construction equipment and accessories. In other words it helps users to navigate independently without sighted assistance even in unstructured environments. Many users found it extremely useful in avoiding awkward collisions, way finding in narrow pathways and even to detect movements of human beings and animals.
Its varying-intensity tactile feedback helps in proactive planning of mobility pathways without collisions. Users of SmartCane found the device easy to use, ergonomically convenient, and easy to train on and adapt to. After a few days of use and adaptation, users found that it reduces travel time compared to a normal white cane.
The SmartCane was developed by computer scientist Rohan Paul when he was an undergrad at the Indian Institute of Technology Delhi. After nearly a decade of testing, the device launched last year; all or part of its $50 cost is often subsidized by nonprofits or the government, and so far 10,000 Indians have picked one up.

2 modes of obstacle detection - Indoor & Outdoor

Close-up of SmartCane
SmartCane Device: An Introduction [hindi]

This device by G-Technology Group increases mobility independence with safe navigation around potential hazards in the path of travel that are beyond the reach of a white cane. With its adjustable detection range, it can be used either indoors or outdoors and by slow or fast walkers.

Interactive Smart Cane With GPS Navigation
Kürsat Ceylan, a blind man from Istanbul, Turkey, has developed WeWALK, a smart cane that detects obstacles, provides lighting, and helps those with visual impairments navigate their paths wherever they are. This innovative cane connects to a proprietary app through Bluetooth, available on both the Apple App Store and Google Play.

The current method that visually impaired individuals use to navigate is by using their white canes and tactile paving, which can cost up to 30 million dollars to implement in a city like Toronto. This method of navigation is expensive and inefficient.
Today, that changes. EyeCane is proposing a socially assistive "robotic guide dog" that guides a visually impaired individual to their desired locations - automatically!
This patented prototype can navigate away from obstacles, has a GPS locator to guide the person to their destination, and includes heart rate monitors in case of medical emergency. This invention will also help the aging population with mobility inside houses, shopping malls, hospitals, and other critical infrastructure, both indoors and outdoors.
SONAR GUIDE: Mobility Aid for Wheelchairs & Walkers
SONAR GUIDE by G-Technology Group alerts the wheelchair and walker user to curbs, step-ups, drop-offs, and low-level obstacles in their path of travel. Users are alerted to potential hazards by audible warning tones and LED lights.
Sonar Guide can be installed on wheelchairs or walkers.
Sonar Guide detects obstacles in the user's path of travel below waist level at 3 different distance settings, customizable on request. Peripheral obstacles outside of the sensor's range will not be detected. Sonar Guide will detect changes in elevation (drop-offs and step-ups) of 3 or more inches.
When an obstacle is detected, the alert LED will turn red and you will hear a voice in the headset say “Obstacle Right”, “Obstacle Left”, or just “Obstacle”.
When drop-off or step-up is detected, the LED will turn yellow and you will hear a voice in the headset say one of the following: “Dropdown Right”, “Dropdown Left”, “Step-Up Right”, “Step-Up Left”, or either just “Dropdown”, or just “Step-Up”.

Structural Diagram

Quantum Edge Adapter

Dolomite Adapter

Nitro Adapter
Wearable ETAs
The main problem of many mobility devices is that the hands are occupied in using them. Thus, other implementations have been proposed, the first being the Navigational Aid for the Blind (1990), which works with two infrared emitters. The best-known ultrasound-based ETA in this group is the so-called NavBelt.
Sensors are mounted on a belt, while the computation unit is carried in a bag. Other implementations hang from the neck and use other information channels, such as the Guelph Project "Haptic Glove". This device, based on stereo vision processing, hangs from the user's neck, and the data is sent to the user through tactile gloves. To make ETAs less obtrusive, researchers have tried other possibilities such as the head, the chest, the tongue, or devices simply worn in vests or shoes.

BrainPort Vision Pro is an oral electronic vision aid that provides electro-tactile stimulation to aid blind persons in orientation, mobility, and object recognition.
BrainPort translates digital information from a wearable video camera into electrical stimulation patterns on the surface of the tongue. Users feel moving bubble-like patterns on their tongue which they learn to interpret as the shape, size, location and motion of objects in their environment. It is seeing with your tongue.

Wearable Assistive Devices for the Blind
Assistive devices can be worn on different parts of human body as the schematic above suggests:
Head: Brainport (tactile on tongue), The vOICe (speech), SonicGuide / KASPA
Tongue: BrainPort (tactile on tongue)
Eyes: Glasses: OrCam MyEye (speech)
Fingers and hands: EyeRing / FingerFinder, Finger Braille
Wrist and forearm: Finger Braille Wristwatch, Metro Dot, Sili Eyes
Torso: Vests and belts: Path Force Feedback Belt, Tactile Wayfinder
Feet: Shoe-integrated tactile
OrCam MyEye is a voice-activated device that attaches to virtually any glasses. It can instantly read text from a book, smartphone screen, or any other surface, recognize faces, and help you shop on your own and lead an independent life.
OrCam MyEye conveys visual information audibly, in real-time and offline. It is used in over 50 countries in 25+ languages.
OrCam MyEye is not designed for mobility, but it can greatly improve the mobility experience by identifying objects and people and reading signs around you.

Finger Braille System: DeafBlind
Finger Braille is one of the tactile communication media of deafblind people. In finger Braille, the index, middle, and ring fingers of both hands are likened to the keys of a Braille typewriter. A sender taps Braille code on the fingers of a receiver as if typing on a Braille typewriter.

Finger Braille Code
Example of tactile communication with persons with deafblindness.

Sili Eyes is an assistive navigator for the blind that uses GSM and GPS coordinates. It helps users determine their current location and navigates them using haptic feedback. In addition, the user can get information about the time, date, and even the color of objects in front of them in audio format. The device is embedded in a silicone glove so that it is wearable.

The vOICe vision technology for the totally blind offers the experience of live camera views through image-to-sound renderings. Images are converted into sound by scanning them from left to right while associating elevation with pitch and brightness with loudness.
In theory this could lead to synthetic vision with truly visual sensations ("qualia"), by exploiting the neural plasticity of the human brain through training.
The vOICe also acts as a research vehicle for the cognitive sciences to learn more about the dynamics of large-scale adaptive processes in the human brain. Neuroscience research has already shown that the visual cortex of even adult blind people can become responsive to sound, and sound-induced illusory flashes can be evoked in most sighted people.
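The left-to-right scanning scheme described above can be sketched in a few lines: each image column becomes a short burst of sound in which row position sets pitch and pixel brightness sets loudness. This is a simplified illustration of the mapping, not The vOICe's actual implementation; the frequency range and timing parameters are assumptions.

```python
import numpy as np

def image_to_soundscape(gray: np.ndarray, duration_s: float = 1.0,
                        sample_rate: int = 22050,
                        f_low: float = 500.0, f_high: float = 5000.0) -> np.ndarray:
    """Scan a grayscale image left to right; row height -> pitch, brightness -> loudness."""
    rows, cols = gray.shape
    samples_per_col = int(duration_s * sample_rate / cols)
    t = np.arange(samples_per_col) / sample_rate
    freqs = np.linspace(f_high, f_low, rows)   # top rows map to higher pitch
    out = []
    for c in range(cols):
        column = gray[:, c].astype(float) / 255.0        # brightness 0..1
        tones = np.sin(2 * np.pi * np.outer(freqs, t))   # one sine per row
        out.append(column @ tones / rows)                # loudness-weighted mix
    return np.concatenate(out)

# Example: a bright diagonal line produces a falling pitch sweep
img = np.eye(64) * 255
audio = image_to_soundscape(img)
print(audio.shape)
```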

SonicGuide and the KASPA system are head-mounted and similar to radar systems: a laser or ultrasonic beam is emitted in a certain direction in space and is reflected back from objects that it encounters on its way. A sensor detects the reflected beam, measures the distance to the object, and conveys that information to the user through audio or tactile signals.
Virtual Leading Blocks for Finger Braille (DeafBlind)
It is a navigation system, which consists of a wearable interface for Finger-Braille, one of the commonly used communication methods among DeafBlind people in Japan, and a ubiquitous environment for barrier-free application consisting of floor embedded active radio-frequency identification (RFID) tags.
The wearable Finger-Braille interface using two Linux-based wristwatch computers has been developed as a hybrid interface of verbal and nonverbal communication in order to inform users of their direction and position through the tactile sensation.
The system mimics the watermelon-splitting game, wherein a blindfolded player tries to hit and split a watermelon in front of them with a stick.

The player is guided by the voices of the surrounding people. The essence of watermelon splitting is to choose verbal and/or nonverbal instruction depending on the player's position and movement. Verbal instruction is auditory information such as "right". Nonverbal instruction is provided by auditory intonation, strength, and frequency, such as "right, right, right, ...".

Cognitive Guidance System (CG System)
The CG System is a guidance system for blind people moving through structured environments. It uses a Kinect sensor and stereoscopic vision to calculate the distance between the user and an obstacle, with the help of stereo vision, the vanishing point, and fuzzy rules. The vanishing point provides a main direction in structured spaces. Since the guidance system anchors to a spatial reference system, the vanishing point is used like a virtual compass that helps the blind user orient toward a goal.


A Path Force Feedback Belt (PF belt), designed by Fradinho Oliveira, helps blind people navigate outdoors along their route. It has two video cameras that take the video stream and generate a 3D model of the user's surrounding area, which then triggers vibrations at one or more of the 12 points on the waist belt.

The Path Force Feedback belt with individual force feedback actuators covering 360º around the user

Finger-Braille with Wristwatch Computers
These tactile devices convey information using intermittent, alert-like signals and are used to convey simple patterns to the blind, for example when approaching an obstacle. They are typically integrated with a Braille watch.

The bandage-sized tactile display is an innovative touch stimulation device based on Electro-Active Polymer (EAP) soft actuator or artificial muscle technology. It is soft and flexible and can be wrapped around the finger like a band-aid. This new wearable display could be used as a Braille display or as a multi-purpose tactile display to convey visual information to the blind.
Tactile feel is produced by actuating the 20 contact points independently. Both vibration and upward/downward patterns can be generated using an external user computer interface. The Japanese Finger-Braille interface is a wearable assistive device to communicate information to the deaf-blind.

Digital maps and route descriptions on a PDA have become very popular for navigation. A visual support for wayfinding, however, is not reasonable or even possible all the time. A pedestrian must pay attention to traffic on the street, a hiker should concentrate on the narrow trail, and a blind person relies on other modalities to find her way.
Tactile Wayfinder is a non-visual support for wayfinding that guides and keeps a mobile user en route by a tactile display. It has a belt with vibrators that indicates directions and deviations from the path in an accurate and unobtrusive way.

(a) One vibrator is used for direction presentation. (b) Two adjacent vibrators are activated simultaneously for the presentation of a direction in between. (c) Interpolation over the intensities of two adjacent vibrators is applied and allows a smooth, continuous, and accurate presentation

In the first step, the traveler is on the track. The belt vibrates in the front. From Step 2 on the traveler deviates from the path and the tactile stimulation moves continuously to the right from the perspective of the traveler. In Step 4 the traveler begins to turn in the direction of the route and the vibration wanders back to the center, until the traveler is back on the track again as indicated in Step 5.
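A minimal sketch of the interpolation idea in (c) above: a continuous bearing to the next waypoint is split between the two nearest vibrators on the belt, so the perceived direction moves smoothly around the waist. The eight-vibrator layout and linear weighting are assumptions for illustration, not the Tactile Wayfinder's actual control code.

```python
def belt_intensities(bearing_deg: float, n_vibrators: int = 8):
    """Split a bearing between the two nearest vibrators on a ring.

    Vibrator i sits at i * (360 / n) degrees; the intensities of the two
    neighbours are linearly interpolated and sum to 1.0.
    """
    spacing = 360.0 / n_vibrators
    pos = (bearing_deg % 360.0) / spacing
    lower = int(pos) % n_vibrators
    upper = (lower + 1) % n_vibrators
    frac = pos - int(pos)
    intensities = [0.0] * n_vibrators
    intensities[lower] = 1.0 - frac
    intensities[upper] = frac
    return intensities

# A waypoint 30 degrees to the right with 8 vibrators (45 degrees apart):
print(belt_intensities(30.0))   # vibrator 0 at 1/3 and vibrator 1 at 2/3 intensity
```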

FingerReader, an evolution of the EyeRing reading solution for blind people, is an aid for reading printed text with real-time response. It is worn on the index finger for close-up scanning: the device scans the printed text one line at a time, and the response comes as tactile feedback and audio. FingerReader is a continuation of EyeRing, which was presented for detecting one particular object at a time by pointing at it and scanning it with the camera on top of the ring.

The Metro Dot is a bracelet-type transportation card in Braille for the visually impaired. It relays information like your subway station and how many more stations to go before your stop arrives. The bracelet transmits vibrations to let the user know when to get off the train. Electronic signals are sent to the surface, raising the permanent magnet to make a Braille pattern on the silicone rubber surface.
The destination and train travel information are transmitted to the subway through the rail tracks, used as a conductor. The Metro Dot is able to provide station location information by catching a series of electronic signals that are being sent to the subway’s receiver antenna.

Shoe-Integrated Tactile Display for Directional Navigation
This is an on-shoe tactile display that enables users to obtain information through the sense of touch of their feet. A 16-point array of actuators stimulates the sole of the foot by inducing different vibration frequencies. Experiments to study how people understand information through their feet were conducted with 20 voluntary subjects. Results show that some shapes and patterns are discriminable and that tactile-foot stimulation could be used for a wide range of applications in human-machine interaction. In particular, results show that it is possible to exploit podotactile information for navigation in space.

Distribution of mechano-receptors in the foot sole

Conceptual representation of a FAI cell stimulation device for the foot sole

Shoe-integrated tactile display: back and forth
Carried and External (Infrastructure) ETAs
It is worth presenting a mixed group of ETAs, which are carried by the users but simultaneously use pre-installed hardware in the environment. Most of them use radio waves to determine where the user is, and this information is transmitted to the user by means of verbal information or other methods. The paradigm of this family of aids is the Remote Guidance System (2010). This system needs a human-operated tele-center to guide the user and help him or her avoid obstacles. It implements two-way information transmission, so it is important to have a network with wide coverage and bandwidth; indeed, the implemented solution is based on mobile phone networks. It provides mobility information to the user through vocal messages. Other systems are the Metronaut (1997), based on laser identification of tags with a cane, and the Electronic Guide Stick (2006), which interacts with sensors implanted in the curb, acting as a guide. The Smart Bat Cane (2005) and the Vibrator Cane (2003) use radio frequencies to identify features of the external world.
As of today most of these ETAs depend on GPS, Mobile Network or Indoor Positioning Systems.

Braille Signage @ Public Places


The term ADA sign typically refers to facilities signage used to mark specific building rooms, spaces or features. This type of signage provides visually impaired and blind persons greater access to public buildings, and is regulated by the Americans with Disabilities Act (ADA) in order to prohibit discrimination against those with disabilities.

Tactile paving is a system of textured ground surface indicators found on footpaths, stairs, and railway station platforms, to assist pedestrians who are vision impaired.
Tactile warnings provide a distinctive surface pattern of truncated domes, cones or bars, detectable by a long cane or underfoot, which are used to alert the vision-impaired of approaching streets and hazardous surface or grade changes. There is disagreement between the design and user community as to whether installing the aid inside buildings may cause a tripping hazard.
A system of tactile paving was first instituted in Asia, starting at Okayama in Japan at pedestrian crossings and other hazardous road situations; followed by Singapore especially on the Mass Rapid Transit (MRT), Light Rail Transit (LRT) and its sidewalks, and on South Korean city subways.

A set of yellow truncated domes can be seen on the down-ramp in a parking lot

ADA compliant color contrast detectable warning installation on a high traffic area in New York City.

A visual example of a 24 satellite GPS constellation in motion with the Earth rotating. Notice how the number of satellites in view from a given point on the Earth's surface changes with time.
Source: Global Positioning System
GPS is a navigation system using satellites, a receiver and algorithms to synchronize location, velocity and time data for air, sea and land travel.
The satellite system consists of a constellation of 24 satellites in six Earth-centered orbital planes, each with four satellites, orbiting at 20,000 km above Earth and traveling at a speed of 14,000 km/h. Each satellite in the network circles the earth twice a day, and each satellite sends a unique signal, orbital parameters and time. At any given moment, a GPS device can read the signals from six or more satellites.
While we only need 3 satellites to produce a location on earth’s surface, a 4th satellite is often used to validate the information from the other three. The fourth satellite also moves us into the third-dimension and allows us to calculate the altitude of a device.
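As a rough illustration of how a position is computed from satellite ranges, the sketch below solves the trilateration problem by iterative least squares. It ignores receiver clock bias (which is why the fourth satellite matters in practice) and uses made-up satellite positions; it is not how a real GPS receiver is implemented.

```python
import numpy as np

def trilaterate(sat_positions, ranges, iterations=10):
    """Estimate a 3-D position from satellite positions and measured ranges
    using Gauss-Newton iteration (receiver clock bias ignored for simplicity)."""
    sats = np.asarray(sat_positions, dtype=float)
    r = np.asarray(ranges, dtype=float)
    x = np.zeros(3)                       # initial guess
    for _ in range(iterations):
        diffs = x - sats                  # vectors from satellites to the guess
        dists = np.linalg.norm(diffs, axis=1)
        J = diffs / dists[:, None]        # Jacobian of range w.r.t. position
        residual = r - dists
        dx, *_ = np.linalg.lstsq(J, residual, rcond=None)
        x = x + dx
    return x

# Four satellites at illustrative positions (km) and ranges to a receiver at (1, 2, 3)
sats = [(20000, 0, 0), (0, 20000, 0), (0, 0, 20000), (12000, 12000, 12000)]
true_pos = np.array([1.0, 2.0, 3.0])
ranges = [np.linalg.norm(true_pos - np.array(s)) for s in sats]
print(trilaterate(sats, ranges))          # approximately [1, 2, 3]
```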
There are five main uses of GPS:
Location — Determining a position.
Navigation — Getting from one location to another.
Tracking — Monitoring object or personal movement.
Mapping — Creating maps of the world.
Timing — Making precise time measurements possible.
While GPS is the greatest navigational infrastructure available around the world, it has been adopted in several ways, with devices and / or apps, to aid the mobility of the blind.

Wayfinder Access
GPS software on a device (typically a cell phone) helps answer "Where am I?" The interface lets users explore unfamiliar areas as well as identify, select, and navigate to points of interest using pre-recorded prompts and a screen reader. It allows users to save destinations and select car, taxi, or pedestrian routes. It also provides information such as street crossings, points of interest, and favorites within a vicinity, as well as speed, altitude, and coordinates.

Kapten PLUS GPS
Personal navigation device that can be used to determine location as well as plot routes to local businesses or a specific address. Also features an MP3 player, as well as a memo recorder and an FM radio. Can be controlled either by pressing keys, or by issuing voice commands.

Talking Signs® is a Remote Infrared Audible Signage (RIAS) system used in wayfinding, orientation, and mobility for the blind community. It provides directional voice messages, making confident, independent travel possible for visually impaired or print-handicapped individuals.
The Talking Signs® Transmitter acts like a homing device and will guide a person to the location where the transmitter is placed. Through a handheld receiver, the silent infrared signal from the transmitter is converted to an audible message. This allows a person to locate the signal, from a distance, and be guided to locations such as the entrance or exit of a building, office, or restroom; as well as, to more specific locations like a drinking fountain, public phone, elevator or information desk.
ClickAndGo NextGen Talking Signs
It leverages iBeacons for the sole purpose of delivering audible landmarking cues in both indoor and outdoor environments. For pedestrians traveling in airports, on college campuses, in transit stations, etc., iBeacons are installed to convert visual signage or prominent visual landmarks into audible messages.
The above video shows a cane traveler locating a bus stop, entering and exiting a bus, and walking outdoor and indoor routes on a college campus.
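Conceptually, a beacon-based talking sign is just a lookup from the nearest beacon to a spoken message. The sketch below illustrates that idea with a hypothetical beacon table and received signal strengths; the identifiers, messages, and selection rule are assumptions, not ClickAndGo's actual system.

```python
# Hypothetical beacon-to-message table; a real deployment would key on beacon UUIDs
BEACON_MESSAGES = {
    "b1": "Main entrance, doors ahead",
    "b2": "Bus stop 14, routes 522 and 764",
    "b3": "Elevator to platform level on your right",
}

def nearest_announcement(scans):
    """Pick the message of the strongest (closest) beacon from a BLE scan.

    `scans` is a list of (beacon_id, rssi_dbm) tuples; higher RSSI means closer.
    """
    heard = [(rssi, bid) for bid, rssi in scans if bid in BEACON_MESSAGES]
    if not heard:
        return None
    _, closest = max(heard)
    return BEACON_MESSAGES[closest]

print(nearest_announcement([("b7", -80), ("b2", -58), ("b1", -73)]))
# -> "Bus stop 14, routes 522 and 764"
```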

Mobile Crowd Assisted Navigation for the Visually Impaired is a web app for smartphones, built on the Google App Engine, developed to navigate visually impaired people between two points online. The app's primary objective is to assist a visually impaired or blind user in navigating from point A to point B through reliable directions given by an online community. The phone streams live video to a crowd of sighted users through the project website. The crowd then gives directions from the website with the push of one of four arrow keys, indicating left, right, forward, or stop. The aggregation of these directions is relayed back to the user as audio.

BrailleNote GPS
GPS system that runs on the BrailleNote and VoiceNote electronic notetakers. Allows user to automatically create routes for either walking or riding in a vehicle; understand street layout before traveling to a new city, using the "virtual explore" mode; generate detailed information about speed, direction, and altitude; and calculate the distance and direction to a street address or intersection.
Inclusivity is a basic requirement of any smart city. Here are examples of smart city solutions that can help visually impaired persons navigate urban sites and fully experience city life.
GPS Wayfinding Apps: The most popular solutions to navigate unfamiliar spaces are GPS-powered navigation apps such as Seeing Eye GPS or BlindSquare. The user can enter their destination via voice command or type with the help of VoiceOver function, which is available on nearly every device. The GPS signals can tell users their location, calculate routes and transmit directions on mobile devices by sound or vibration signals.

Beacon-powered Smart Cities: Beacons are small transmitters placed around buildings that send real-time site information directly to mobile devices. They can be installed in public buildings, offices or small locations like bus stops.
Beacons send accurate location data to mobile devices in the area and work both indoor and outdoor.

Tactile and Talking City Maps: Tactile city maps are easy to read and can offer useful details on distances, building structure, street gradients and other topographical features. Combine this with audio components and you get rich additional information like street and place names and even live transport data.
This creates tactile and talking maps, tailored to the needs of blind people, like the ones developed by LightHouse. Audio-tactile technology allows the maps to maintain a clean design and provide easily changeable information. The information can be accessed with a smart pen and doesn't rely on the user's Braille skills.

4 Ideas From 4 Continents: Helping the Blind Navigate Cities
Warsaw, Poland: Beacons show the way
Japan: 3D maps for the blind
Nigeria: Ultrasound guides
Denver, USA: Transit for all


Innovations
Way beyond the era of dogs and canes, innovation continues to push the horizon of O&M aids forward. These are at various stages of readiness, ranging from individual skill to laboratory prototype to early commercialization. We cover a few promising ones including SmartCane, Tongue Click Sonar, Canetroller, EyeCane, VL Eyes using LiFi, and DIY projects for smart glasses.
In this section, we present a few recent innovations and ongoing futuristic projects.
A lot more information on innovations can be found in:

Elegant Design

Fits Easily on white Cane

Adjustable sensor mechanism

Ergonomic grip
SmartCane: AssisTech, IIT Delhi
A person with blindness can easily detect obstacles on the ground, surface textures, potholes, etc. while travelling with a standard white cane. However, a white cane cannot detect overhanging objects like tree branches, sign boards, open glass windows, etc. Also, at times using a white cane could result in scratching a parked vehicle or bumping into another person.
The SmartCane solves these challenges and empowers the visually impaired through independent and safe mobility.

Two modes of obstacle detection
Here's how a SmartCane works:
An electronic travel aid which fits on the top fold of the white cane
Enhances the white cane's capability by detecting objects from knee to head height in front of the person
Uses ultrasonic ranging to detect obstacles, and conveys distance information to the user through distinct vibratory patterns
Helps users avoid collisions with overhanging and protruding objects, such as tree branches, signboards, the underside of parked vehicles, and open glass windows, thereby enabling them to navigate different social settings with safety and confidence
Informs about the presence of objects before the cane actually touches them, thus helping to prevent unwanted contact
Source

User Centric Approach
MAVI: Mobility Assistant for Visually Impaired
Mavi is an ambitious project aimed at enabling mobility for visually impaired individuals, especially in India.
A major challenge in this regard is the wide range of complexities that arise in the Indian scenario due to non-standard practices.
After the grand success of the SmartCane, we identified more serious problems which could be solved using current technologies. The three major problems we want to tackle with MAVI are:
Safety
The MAVI device intends to provide real time safety against stray animals, potholes or obstructions in the walkway which could be potentially missed/dangerous for the cane.
Social Inclusion
Visually impaired individuals are able to recognize their friends with Face Recognition technology.
Navigation
While we are targeting signboard reading currently, there is a possibility of including SLAM technology in future versions.
A mix of computer vision, web-based, and cloud-based techniques is being used to achieve this. The idea is to have a system that can work both offline and online, since network conditions aren't very reliable for a huge majority of users.
Source
Mobility Assistant for Visually Impaired, Assistech, IIT Delhi


MAVI device mounted on user with a chest strap

Block Diagram for different components of MAVI

MAVI Device mounted on the SmartCane

3D model of the current MAVI prototype
Computer Vision Components of MAVI

Texture Classification
Texture classification is used to warn the visually impaired about oncoming potholes and to identify sidewalks that can be used.

SignBoard Detection
Signboards make life simpler for all of us. Being able to process signboards while walking is a critical requirement for independent mobility.

Face Detection
The delight of meeting a friend on the way is often missed by a visually impaired person. Face detection/recognition enables social inclusion for the visually impaired (see the sketch after this list).

Animal Detection
When using the cane to guide oneself, it is important to be aware of any animals in the way. It is a scenario very common in the Indian environment.
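As a small illustration of the face-detection component mentioned above, the sketch below uses OpenCV's stock Haar cascade to count faces in an image. MAVI's actual pipeline is not described here, so this is only a stand-in, and the image path is hypothetical.

```python
import cv2

# Standard Haar cascade shipped with OpenCV (not MAVI's actual model)
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_faces(image_path: str):
    """Return bounding boxes of faces found in an image file."""
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return [(int(x), int(y), int(w), int(h)) for (x, y, w, h) in faces]

# Example (hypothetical path): announce how many people are in front of the user
boxes = detect_faces("street_scene.jpg")
print(f"{len(boxes)} face(s) detected ahead")
```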

INCLUNAV: Making Indoor Navigation Smooth and Accessible
Indoor wayfinding and service accessibility are challenging, especially when visiting a large indoor facility such as a healthcare centre, transport terminal, museum, or shopping complex for the first time. Information unavailability and poor accessibility in such facilities have made these challenges a mainstream problem for indoor spaces. INCLUNAV provides a compact and scalable solution.
Source
Welcome to Inclunav, Assistech, IIT Delhi

Annotate Tool
Annotate as many buildings as you want
Upload floor plan, tag floor elements
Tag information such as rooms, doors, services, and washrooms, and floor connections like lifts and stairs, on the floor plan
Test smoothly using our testing interface
Make it live to the users
Inclunav - Indoor Navigation Solution, 2020
Navigate Tool
Smooth and accessible indoor navigation
Easy localization to help the user find a location inside the building
Quick search for rooms and nearby services while navigating
Easy multi-floor navigation assistance
Mobile App and Web versions available
ONBOARD: An RF based Bus Identification System
Public transport is the only viable mobility option to seek education, work and social connectivity for a majority of blind and visually impaired persons. They face major difficulties in independently accessing public buses since they cannot read the route number and are unsure about the physical location of the bus and its entry/exit door. Despite constantly seeking help from sighted fellow travelers, blind persons frequently miss their desired bus, are unable to reach the entry gate and frequently get hurt in the process. The possibility of these events adds to the fear and anxiety of the visually impaired traveler.
OnBoard offers a simple solution to this problem. It consists of two modules: a mobile-like hand-held user module, and a bus module installed inside each bus near the front door, with a large speaker facing outwards. Both modules communicate with each other wirelessly.

How OnBoard Works

Query
When a user standing at the bus stop hears a bus approaching, they can get its route number by pressing the query button. Each bus, on receiving the query, responds to the handheld device with its route number. On receiving these route numbers, the user module reads them out sequentially.
Selection
The user can select the desired route by pressing the select button when the desired bus number is read out by the module. This selection is sent to all bus modules in the vicinity, but only the bus with the selected number reads out the route number on the bus module's speaker. Hearing the audio cue, the user can easily navigate towards the entry door of the bus without any external assistance.
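The query/selection exchange described above can be sketched as a tiny simulation: the handheld broadcasts a query, every bus in range answers with its route number, and only the selected bus announces itself. The message flow follows the description above, but class names and interfaces are assumptions; the real system works over RF.

```python
from dataclasses import dataclass

@dataclass
class BusModule:
    route_number: str

    def on_query(self) -> str:
        # Every bus in range answers a broadcast query with its route number
        return self.route_number

    def on_select(self, route_number: str) -> None:
        # Only the selected bus plays an audio cue on its outward-facing speaker
        if route_number == self.route_number:
            print(f"Bus {self.route_number}: announcing route over speaker")

class UserModule:
    def query(self, buses):
        routes = [bus.on_query() for bus in buses]       # broadcast query
        for route in routes:
            print(f"Handheld reads out: route {route}")  # sequential read-out
        return routes

    def select(self, buses, route_number: str):
        for bus in buses:                                # broadcast selection
            bus.on_select(route_number)

buses = [BusModule("522"), BusModule("764"), BusModule("B-12")]
handheld = UserModule()
handheld.query(buses)
handheld.select(buses, "764")
```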
Key Features
Low Power
No Structural Support required at Bus Stop
Easy Selection
Audio Cues
RF Technology
Unsupervised boarding with a user, on BEST Buses, Mumbai, 2015

User Identifying Buses in the Vicinity

The Bus Module mounted near the Entry Door helping the User Home in, towards the waiting Bus
Navigation: Tongue Click Sonar
Daniel shares his life story, his journey behind spreading the echolocation concept, and his message to the blind community across the world that nothing is impossible and that seeing the world could be only a matter of a few tongue clicks away.
Further reading
Enabling People with Visual Impairments to Navigate Virtual Reality with a Haptic and Auditory Cane Simulation, Yuhang Zhao, Cynthia Bennett, Hrvoje Benko, Ed Cutrell, Christian Holz, Meredith Ringel Morris, Mike Sinclair, 2018 ACM Conference on Human Factors in Computing Systems (CHI), April 2018
Technologies coming soon in 30 Apps, Devices and Technologies for People With Vision Impairments by American Academy of Ophthalmology, 2020
Canetroller
Traditional virtual reality (VR) mainly focuses on visual feedback, which is not accessible for people with visual impairments. Microsoft created Canetroller, a haptic cane controller that simulates white cane interactions, enabling people with visual impairments to navigate a virtual environment by transferring their cane skills into the virtual world. Canetroller provides three types of feedback:
Physical resistance generated by a wearable programmable brake mechanism that physically impedes the controller when the virtual cane comes in contact with a virtual object;
Vibrotactile feedback that simulates the vibrations when a cane hits an object or touches and drags across various surfaces; and
Spatial 3D auditory feedback simulating the sound of real-world cane interactions. We designed indoor and outdoor VR scenes to evaluate the effectiveness of our controller. Our study showed that Canetroller was a promising tool that enabled visually impaired participants to navigate different virtual spaces.

(A) A blind user wearing the gear for our VR evaluation, including a VR headset and Canetroller, our haptic VR controller. (B) The mechanical elements of Canetroller. (C) Overlays of the virtual scene atop the real scene show how the virtual cane extends past the tip of the Canetroller device and can interact with the virtual trash bin. (D) The use of Canetroller to navigate a virtual street crossing: the inset shows the physical environment, while the rendered image shows the corresponding virtual scene. Note that users did not have any visual feedback when using our VR system. The renderings are shown here for clarity.
At EyeCane, we strive to improve the navigation experience of those who are visually impaired without the use of expensive haptic technology. We have invented and patented the most innovative solution in navigation for the visually impaired. We plan to solve the navigation problem for visually impaired individuals for good.
The Problem
Current Technologies require white canes and tactile paving
This method of navigation is expensive and inefficient
Costs about 30 Million USD to implement in large cities
The Solution
A robotic cane which will revolutionize navigation and replace tactile paving with thermoplastic paint lines.
Patented technology which will have obstacle detection, GPS locator, and Medical Sensor
Help the ageing population with mobility, Accessibility , and more
Source

A map of the Jay Street train station outlining where new accessibility features will be implemented.
Source
New York City pilots new accessibility measures for public transit
Not all accessibility-minded innovation need revolve around technology. New York City, for instance, is testing a number of new low-tech features at a Brooklyn subway stop in an effort to learn more about how public transit can be made more accessible to riders with disabilities.
From the New York Post: “The MTA will test out... Braille and ‘tactile’ signage, interactive station maps and multiple cell phone apps aimed to assist the visually impaired navigate stations. Diagrams will also be posted throughout the station informing riders who rely on elevators and escalators how to exit in case of an outage.”
These types of initiatives are critical to ensuring people with disabilities are able to get around as independently as their fellow travelers without disabilities. They also serve as a great reminder that tremendous progress and innovation can happen with pre-existing technology!
Sources
LiFi is a wireless technology that holds the key to solving challenges faced by 5G. LiFi can transmit at multiple gigabits, is more reliable, virtually interference-free, and uniquely more secure than radio technologies such as Wi-Fi or cellular.
LiFi is a mobile wireless technology that uses light rather than radio frequencies to transmit data. The technology is supported by a global ecosystem of companies driving the adoption of LiFi, the next generation of wireless that is ready for seamless integration into the 5G core.
Radio frequency communication requires radio circuits, antennas and complex receivers, whereas LiFi is much simpler and uses direct modulation methods similar to those used in low-cost infrared communications devices such as remote control units. LED light bulbs have high intensities and therefore can achieve very large data rates.
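To make the idea of direct LED modulation concrete, the following Python sketch simulates simple on-off keying: bits drive the LED on or off, and the receiver thresholds the light samples back into bits. Real LiFi links use far faster and more sophisticated modulation and coding; this only illustrates the principle.

```python
# Illustrative on-off keying (OOK): data bits directly modulate LED intensity,
# and the receiver thresholds the sampled light level back into bits.

def encode(data: bytes) -> list[int]:
    """1 -> LED on, 0 -> LED off, most significant bit first."""
    samples = []
    for byte in data:
        for i in range(7, -1, -1):
            samples.append((byte >> i) & 1)
    return samples

def decode(samples: list[int]) -> bytes:
    """Group received samples back into bytes, thresholding each sample."""
    out = bytearray()
    for i in range(0, len(samples), 8):
        byte = 0
        for bit in samples[i:i + 8]:
            byte = (byte << 1) | (1 if bit > 0 else 0)
        out.append(byte)
    return bytes(out)

message = b"LiFi"
assert decode(encode(message)) == message  # round-trip over the simulated light channel
```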

R-LiFi Nano v2 is an easy-to-use, plug-and-play evaluation platform for a wide array of visible light communication applications in consumer, wearables, industrial, medical and Internet of Things (IoT) markets. LiFi Nano consists of two modules, LiFi transmit module connected to the lighting LED and LiFi receive module connected to the host communicating system.
VL Eyes using LiFi, SCEM, Mangaluru, India
This is a path-breaking smart-glasses technology for the blind from India.
VL Eye Glasses help blind persons move around freely using LiFi technology. These path-breaking, innovative eyeglasses were conceptualized, researched and designed by the RDL Visible Light Communication and Research Centre at the Sahyadri Innovation Hub, Sahyadri Engineering College, Mangaluru. They enable a visually impaired or blind person to move around freely within indoor environments such as a home, an office or any unfamiliar space, with freedom and without facing obstacles. This breakthrough allows the visually impaired to see the faces of loved ones, read, work, study, and participate in virtually any Activity of Daily Living indoors.

Light is the source of energy and life. But for RDL technologies, light is the source of inspiration and a medium to develop the technology that has far-reaching implications.
And RDL has invented this product by using LiFi technology.
Smart Glass Projects
Most blind people retain some form of vision, often limited to the perception of light and movement. The smart glasses developed at the University of Oxford take advantage of this residual vision to help blind users get their bearings and move through unfamiliar environments. The glasses use a system of cameras and software to detect nearby objects and present them in a form the user can recognize.
Their creator, Dr. Stephen Hicks, has built a prototype and is seeking funding for industrial production. If successful, the glasses could reach the market later this year at a price comparable to that of a mid-range smartphone.
DIY Smart Glasses
Smart glasses with varied functionality can be built with off-the-shelf components and straightforward Python programming. The following blog gives a step-by-step explanation:
Anti Collision Glasses for the Blind
and the accompanying videos explain the build with illustrations.
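In the spirit of those DIY builds, here is a minimal Python sketch of an obstacle-warning loop, assuming an HC-SR04 ultrasonic sensor and a buzzer wired to a Raspberry Pi and driven through the gpiozero library; the pin numbers and distance threshold are arbitrary assumptions and may differ from the blog's actual build.

```python
# Minimal obstacle-warning loop for DIY anti-collision glasses (illustrative only).
# Assumes an HC-SR04 ultrasonic sensor and a buzzer on a Raspberry Pi; the GPIO
# pin numbers and the warning threshold are arbitrary choices for this sketch.
import time
from gpiozero import DistanceSensor, Buzzer

sensor = DistanceSensor(echo=24, trigger=23, max_distance=2.0)  # range capped at 2 m
buzzer = Buzzer(17)

WARN_DISTANCE_M = 0.8   # warn when an obstacle is closer than this

while True:
    if sensor.distance < WARN_DISTANCE_M:   # distance to nearest object, in metres
        buzzer.on()    # could be replaced with patterned beeps for graded warnings
    else:
        buzzer.off()
    time.sleep(0.1)
```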
Importance of Assistive Technology for Orientation and Mobility
1. Information is needed regarding the presence, location, and preferably the nature of obstacles immediately ahead of the traveler, from ground level to head height and over a wide enough area horizontally to cover the width of the traveler's body. The minimum distance or range over which this information is needed is a comfortable stopping distance at normal walking speed; a greater range is desirable.
2. Information regarding the path or surface on which the traveler is walking is highly desirable; this includes texture, gradient, upcoming steps (both up and down), and boundaries to left and right (including step-downs at sidewalk edges).
3. Information regarding the position and nature of objects to the sides of the travel path is desirable. This includes hedges, fences, doorways, trees, etc., forming part of the shoreline on either side of the path.
4. Other information to enable the traveler to maintain a straight course is extremely helpful, notably the presence of some type of aiming point in the distance, often provided in practice by distant traffic sounds. Knowledge of absolute or relative direction of travel is also helpful.
5. Information on landmark location and identification is needed. This can include information under the above categories (especially 3) and also the ability to positively identify specific environmental features such as building entrances, room numbers, elevators, restrooms, floor numbers, intersections, etc.
6. Sufficient information must be provided by one means or another to allow the traveler to build up a mental map, image, or schema for the chosen route to be followed, including turns and other discontinuities.
Visual Assistive Technology for O & M
Various types of technology for orientation and mobility are:
Electronic mobility aids: These devices use ultrasonic waves reflected off obstacles ahead of the individual to warn of what lies in their path. The usefulness of these devices is debated, and they often need to be used in conjunction with a long cane or a service dog.
Electronic Orientation Aids (EOAs): These are devices that provide pedestrians with directions in unfamiliar places by defining the route and selecting the best path; tracing the path to approximately calculate the user's location; and providing mobility instructions and path signs to guide the user and build up her/his mental picture of the environment.
Position Locator Devices (PLDs): These are devices that determine the precise position of their holder, such as devices using GPS technology. The focus here is on the most significant and latest systems that provide critical services for visually-impaired people, including obstacle detection, obstacle avoidance and orientation services with GPS features. A brief description is provided for the most significant electronic devices.
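To illustrate the kind of computation a position locator device performs once a GPS fix is available, the sketch below computes the great-circle distance and initial bearing from the user's position to a destination using the haversine formula; the example coordinates are arbitrary.

```python
# Illustrative core of a position locator: distance and initial bearing from
# the user's GPS fix to a destination, using the haversine formula.
from math import radians, sin, cos, asin, atan2, sqrt, degrees

EARTH_RADIUS_M = 6371000

def distance_and_bearing(lat1, lon1, lat2, lon2):
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)

    # Haversine great-circle distance
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    distance = 2 * EARTH_RADIUS_M * asin(sqrt(a))

    # Initial bearing (0 degrees = north, measured clockwise)
    y = sin(dlmb) * cos(phi2)
    x = cos(phi1) * sin(phi2) - sin(phi1) * cos(phi2) * cos(dlmb)
    bearing = (degrees(atan2(y, x)) + 360) % 360
    return distance, bearing

# Arbitrary example fix and destination
d, b = distance_and_bearing(19.0760, 72.8777, 19.0790, 72.8800)
print(f"Destination is {d:.0f} m away, bearing {b:.0f} degrees")
```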
Fusion of Artificial Vision and GPS (FAV & GPS): An assistive device for blind people that improves the mapping of the user's location and the positioning of surrounding objects using two functions: one based on a map-matching approach and one on artificial vision. The first function helps in locating the required object and allows the user to give instructions by moving her/his head toward the target; the second performs automatic detection of visual targets. The device is wearable and mounted on the user's head; it consists of two Bumblebee stereo cameras installed on the helmet for video input, a GPS receiver, headphones, a microphone, and an Xsens MTi tracking device for motion sensing.
Cognitive Guidance System (CG System): A guidance system proposed for blind people traveling through structured environments. The design uses a Kinect sensor and stereoscopic vision to calculate the distance between the user and an obstacle, with Mamdani-type fuzzy decision rules and vanishing-point estimation to guide the user along the path.
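As a toy illustration of the idea behind Mamdani-type fuzzy rules, the Python sketch below fuzzifies an obstacle distance (as a depth sensor such as the Kinect might report) into "near" and "far" and picks a guidance instruction; the membership functions, thresholds and instructions are invented for illustration and are not taken from the cited system.

```python
# Toy illustration of Mamdani-style fuzzy reasoning: obstacle distance is
# fuzzified into "near"/"far", rules fire to different degrees, and the
# strongest rule selects the instruction. All shapes and values are invented.

def near(d):
    # Membership in "obstacle is near": 1.0 at 0 m, falling to 0.0 beyond 2 m.
    return max(0.0, min(1.0, (2.0 - d) / 2.0))

def far(d):
    # Membership in "obstacle is far" (complement of "near" in this toy model).
    return 1.0 - near(d)

def guidance(distance_m):
    rules = {
        "stop and scan": near(distance_m) if distance_m < 0.5 else 0.0,
        "veer around the obstacle": near(distance_m),
        "continue straight": far(distance_m),
    }
    # Crude defuzzification: pick the instruction whose rule fires most strongly.
    return max(rules, key=rules.get)

for d in (0.3, 1.2, 3.0):
    print(f"{d} m -> {guidance(d)}")
```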
Mobile Crowd Assisted Navigation for the Visually-impaired (Mobile Crowd Ass Nav): A smartphone web app called Mobile Crowd Assisted Navigation, running on a Google engine, was developed to navigate visually-impaired people between two points online. The aim of this framework is to offer accessible, efficient and flexible crowd services to visually-impaired users. GPS, compass, accelerometer and camera are used onboard, and the smartphone streams video and sensor information to a crowd server to be used by the volunteers.
Canes: The white cane and the guide dog are the most popular aids. The white cane is the simplest, cheapest and most reliable, and thus the most popular navigation aid. However, it does not provide all the necessary information, such as speed, volume and distances, which is normally gathered by the eyes and is necessary for the perception and control of locomotion during navigation.
Source
A survey on Assistive Technology for visually impaired, Internet of Things, 2020
