A few days ago, I was watching Professor Sheena Iyengar's TED talk, "The art of choosing". Something the camera captured stuck with me for hours afterwards. She was running her fingers over the notes she had prepared for her talk. Then the hard reality struck me: she was reading Braille.
In her talk she referred to a couple of websites. Hours later, I wondered how the visually challenged use the internet. I closed my eyes and tried to use my personal computer. I could not access a single webpage. Then I tried my mobile phone. Since it has a touch screen, tapping it did no good while my eyes were closed. I realized that a significant portion of today's technology relies on visuals. Is technology marginalizing a large and important section of end users?
Out of curiosity, I searched and learned about how the visually challenged access the internet. The good news is that engineers have always been thinking of ways to make the impossible possible. The most popular aids for surfing the internet seem to be screen-reading software such as VoiceOver on OS X and JAWS for Windows. It isn't as simple as it sounds. Imagine how long it would take to read out loud every word on a webpage! Wouldn't reading the enormous amount of data on a single page take a lot of time? Websites like Facebook offer accessibility settings to make each page work better with a screen reader. There is an incredible amount of smart work that engineers have put into this to improve the internet experience for the visually impaired. But all this concerns personal computers and laptops. How does it work for mobile phones and apps?
BrailleType is an Android app that lets the visually challenged type on a touch screen in Braille. It is a multi-touch technique: the user taps the screen at multiple locations simultaneously, and the app recognizes which letter was typed. But this only covers input to the phone. What about reading the touch screen? How can users figure out where each icon is? Should they always depend on a sighted person to do this for them? Thankfully, the answer is no. There is a lot of effort being put into enabling tactile feedback on touch screens. There is also a technique called Slide Rule, proposed by researchers at the University of Washington. It recognizes four simple gestures: a one-finger tap, a two-finger tap, a flick, and an L-shaped gesture. Each gesture commands the device to perform a specific task.
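To make the idea concrete, here is a minimal sketch (not BrailleType's actual code) of how simultaneous touches might be decoded into a Braille letter. It assumes the screen is divided into a two-column, three-row grid matching the six-dot Braille cell, with dots 1-3 down the left and 4-6 down the right; the dot-pattern table covers only the letters a-e of standard Braille.

```python
# Illustrative sketch, assuming a 2-column x 3-row grid layout for the
# six Braille dots. This is NOT BrailleType's real implementation.

# Partial table: standard six-dot Braille patterns for the letters a-e.
BRAILLE_LETTERS = {
    frozenset({1}): "a",
    frozenset({1, 2}): "b",
    frozenset({1, 4}): "c",
    frozenset({1, 4, 5}): "d",
    frozenset({1, 5}): "e",
}

def dot_for_touch(x, y, width, height):
    """Map a touch coordinate to one of the six Braille dot positions.

    Dots 1-3 run down the left half of the screen, dots 4-6 down the
    right half, mirroring a standard Braille cell.
    """
    col = 0 if x < width / 2 else 1
    row = min(int(y / (height / 3)), 2)  # row 0, 1, or 2
    return col * 3 + row + 1             # dot number 1-6

def decode_touches(touches, width, height):
    """Turn a set of simultaneous (x, y) touches into a letter, if known."""
    dots = frozenset(dot_for_touch(x, y, width, height) for x, y in touches)
    return BRAILLE_LETTERS.get(dots)

# Two fingers on the upper-left and middle-left of a 480x800 screen
# touch dots 1 and 2, which is the Braille pattern for "b".
print(decode_touches({(100, 100), (100, 400)}, 480, 800))  # -> b
```

The key idea is that the app never needs to know where the finger landed precisely; it only needs to classify each touch into one of six coarse regions, which is far more forgiving for a user who cannot see the screen.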
These efforts gave me hope that technology is becoming friendlier to the visually challenged. There is clearly a strong, continued interest in developing better touch screens and apps to improve the technology experience for them.
Kudos to all the engineers for their vision and effort in making technology inclusive! As an engineer, I feel compelled to be more sensitive and thoughtful about making technology accessible to everyone.