Assistive Technology for People with Low Vision

Every morning you wake up, open your eyes and get out of bed. Glancing out of the window, you notice how the day is starting. A neighbor walks a dog, children head to school, cars pass by on the street. You check the weather: is it sunny, rainy or cloudy? These observations are made in the blink of an eye. Without giving them another thought, you go to the bathroom and look in the mirror. You see your face for a few seconds, then move on to the next activity. Maybe you have breakfast while reading the morning news, or maybe you get dressed in a hurry and rush off to your duties.
Our perception of the environment is shaped by the messages we receive from the external world, and the majority of these impressions reach us through our eyes. The ability to see enables us to perform many activities quickly and automatically. When vision is hindered, however, information coming from the outside becomes fragmented: the surrounding scenery is out of focus, the weather is uncertain, the letters in the newspaper you read every morning are blurred, and running to work is no longer possible.

This kind of scenario is the daily life of people with visual impairment. The image of the world becomes incomplete and faded, and from this perspective many actions and tasks can seem uncertain and difficult to perform. According to a WHO report (2023), at least 2.2 billion people worldwide have a near or distance vision impairment.
The degree to which vision impairments can impede everyday functioning, together with the scale of their occurrence, shows how crucial it is to ensure access to assistive technology for people living with low vision.

Current technologies follow two broad approaches. One set of techniques relies on enhancing whatever vision a person still has. The other strategies employ the sense of touch or auditory perception to convey information from the environment.

Technology based on visual cues

Technology that enhances the visibility of objects relies mainly on increasing size, improving lighting, reducing glare and strengthening contrast. Several types of devices of this kind are currently available.
• Magnifiers – enlarge the view of objects using lenses or cameras. Examples: screen magnifiers, software that makes on-screen content bigger (a minimal sketch of the idea follows this list); large screens; handheld magnifiers such as a magnifying glass.
• Products that use color or contrast – their main characteristic is increased contrast or bright color, which makes them more noticeable. Examples: color-contrasting strips or tapes on stairs to mark changes in surface level; doors painted in lighter or darker colors, or door frames in contrasting colors, to signal an entrance or exit.
• Products that reduce glare – limit reflections that can obscure vision. Example: anti-glare spectacles.
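
To make the first item on this list more concrete, below is a minimal sketch of the idea behind a software screen magnifier: grab a region of the screen and scale it up before showing it. It assumes Python with the Pillow library (ImageGrab is supported on Windows and macOS); a real magnifier tracks the cursor and redraws continuously, and the coordinates used here are only illustrative.

# Sketch of a software screen magnifier: capture a region of the
# screen and enlarge it. Assumes the Pillow library is installed.
from PIL import ImageGrab

def magnify_region(center_x, center_y, radius=150, zoom=3):
    """Grab a square region around (center_x, center_y) and enlarge it."""
    box = (center_x - radius, center_y - radius,
           center_x + radius, center_y + radius)
    region = ImageGrab.grab(bbox=box)              # screenshot of the region
    return region.resize((region.width * zoom,
                          region.height * zoom))   # scale up by the zoom factor

if __name__ == "__main__":
    # Enlarge the area around screen coordinates (400, 300)
    # and open it in the default image viewer.
    magnify_region(400, 300).show()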

Technology based on tactile sensations

The principle of this technology is the use of different textures so that objects can be identified by touch. It includes devices such as:
• Tactile markers and Braille labelers – help distinguish objects or spaces by giving them different tactile impressions. They can be used in various environments (home, workplace, school, etc.). Examples: personal items such as a glass or toothbrush marked with a rubber band; marks embroidered on parts of clothing so garments are easier to identify; Braille watches; furniture with textured surfaces; styrofoam letter stickers placed on items to tell them apart; Braille books and notetakers.
• Braille embossers – printers that produce Braille materials.
• Braille keyboards – enable typing on smart devices.

Audio technology

This technology relies on auditory perception and includes, among others, the devices listed below.
• Audio devices – provide information in audio form. Examples: talking watches and clocks that announce the date and time; smart televisions that narrate their on-screen content.
• Optical Character Recognition (OCR) – software that converts printed text into machine-readable text, which can then be read aloud by a speech synthesizer (see the sketch after this list).
• Voice assistant apps – support performing tasks in various environments. At work or school they help manage a calendar and send reminders about appointments; at home they can control smart home devices to regulate heating or lighting; during leisure time they allow listening to music, audiobooks and podcasts.
• GPS and navigation apps – applications providing auditory directions and location information.
• Smartphone apps for object recognition – applications that use the smartphone camera to report on objects in the surroundings. Example: a barcode reader app that identifies products in shops or pharmacies and provides details about them (the decoding step is sketched after this list). QR code reader apps are also available.
• Bone conduction headphones – headphones that transmit sound waves through the skull instead of the ear canal. The waves make the bones of the skull vibrate, allowing the listener to hear the device's audio. This technology keeps the ears free, which allows full awareness of the surroundings.
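
To illustrate how the OCR item above works in practice, here is a minimal sketch of an OCR reading pipeline. It assumes Python with the pytesseract wrapper (which needs the Tesseract OCR engine installed) and the pyttsx3 text-to-speech library; the file name is only an example, and a real reading app would add layout analysis, language selection and better voices.

# Sketch of an OCR-to-speech pipeline: recognize printed text in an
# image, then read it aloud with a speech synthesizer.
from PIL import Image
import pytesseract
import pyttsx3

def read_aloud(image_path):
    """Recognize printed text in an image and speak it."""
    text = pytesseract.image_to_string(Image.open(image_path))  # OCR step
    engine = pyttsx3.init()                                      # text-to-speech engine
    engine.say(text)
    engine.runAndWait()                                          # block until speech finishes
    return text

if __name__ == "__main__":
    read_aloud("letter.png")  # hypothetical scanned page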
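
The barcode reader example can be sketched in a similar way. The snippet below shows only the decoding step, assuming the pyzbar library (a wrapper around the ZBar decoder) and Pillow; looking the code up in a product database and reading the result aloud would come on top of this, and the photo name is hypothetical.

# Sketch of the decoding step behind a barcode/QR reader app.
from PIL import Image
from pyzbar.pyzbar import decode

def read_codes(image_path):
    """Return the barcodes and QR codes found in a photo as plain strings."""
    results = decode(Image.open(image_path))        # detect and decode codes in the image
    return [r.data.decode("utf-8") for r in results]

if __name__ == "__main__":
    for code in read_codes("product_photo.jpg"):    # hypothetical product photo
        print(code)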
