Regulations promulgated under the Americans with Disabilities Act (“ADA”) state that a reader is deemed a reasonable workplace accommodation for a blind person. In the past that meant a human. Now, it is AI: text to speech.
From the time my vision started to significantly deteriorate when I was in middle school, to now in my mid-40s, AI has slowly and steadily transformed how I do my work and how I engage with human intelligence. I now eagerly await the day that AI can be my perfect sighted assistant.
Let me talk about how AI has impacted my life in three ways – reading, searching for information, and navigation. In all of these areas, I utilize a combination of AI and human intelligence. However, I look forward to the day when AI alone could handle the essentials, and human intelligence could be a choice – a supplement that enriches my experience, but not a requirement for essential functions.
Text to speech revolutionized the world of reading. Braille was extremely slow. Audio books had to be recorded by human beings, and the selections were limited. Now, I can read virtually any text document, email, web page, and even spreadsheets and slide decks through screen reading software. I can speed it up. It never gets tired. It never has a hoarse voice, only the voice I select. It never gets sick or has a headache. I use AI for my work. For pleasure, I listen to dramatic readings of my favorite novels, recorded by professionals who love to enrich the experience of reading for anyone – not just blind individuals – who wants that unique, human audio experience.
When it comes to searching for information, I utilize the devices that we all do – smartphones, my home assistant, voice search, and similar AI assistants. As we all know, such search functionality is great for simple questions. My home assistant can tell me one recipe or a quick health fact; the weather, the time, a phone number, and so on. But if I ask a more nuanced question, or ask it to read out the top five search results, it cannot. The combination of AI and text to speech – when I search on my computer and my text to speech reads the results aloud – works better.
The problem arises, though, when search results get more complicated. Some links are harder to access, some websites have more images than text, and unlabeled graphics block text to speech recognition. That is when I turn to my human assistants. At this point they are essential to my ability to function in this area, but AI helps me give them better instructions and be more efficient with their time.
And finally, the world of navigation. I have been saying forever that the only thing I cannot do is drive – at least until there are self-driving cars. Unfortunately, my dream of self-driving cars has been undermined by accidents and regulatory challenges. Nonetheless, the world of navigation has come a long way. I rely on rideshare apps incessantly to get places, to let me know when I am approaching my destination, and to confirm I am on the right track, regardless of whether the driver speaks English. Such verification was impossible in the old days of cabs alone, and thus AI has increased my safety and independence. I also can give directions from navigation platforms to sighted humans while they are driving, just like any other passenger might – a feat that has greatly impressed my aging mother, who was forced to pull over and look at maps constantly when I was growing up.
In contrast to driving navigation, I have tried using audio navigation while walking around, with less fruitful results. For a blind person, simultaneously listening to the sounds of traffic, the sounds of my environment, and spoken directions has proven challenging.
In summary, I would say we have made the most progress in the first category – plain reading. I can listen to a novel in a beautiful human voice that makes dramatic passages sound glorious when I choose to, and get through my daily work with my text to speech AI just fine. The only time I run into a problem with reading is with what we cannot read – images.
The same challenge presents itself with searching for information. Much of the stumbling block in making the web accessible, as a myriad of lawsuits has demonstrated, has been image detection and labeling. Therein lies the rub, and the continuing need for human beings to describe images to me and navigate the plethora of non-text content on the web.
In the world of navigation, AI has come a long way, but I still await that perfect sighted assistant. The perfect robot companion could describe every cafe and pizza place I walk by in case I want to make a stop; alert me to pedestrians idling on the sidewalk in my way; tell me when to cross the street; and show me how to find the entrance to the address I am looking for.
We are still a long way from the perfect AI auxiliary aid for a blind person, but I believe it is the goal we should aspire to. A robot that fulfills essential needs across a variety of disabilities would open the door to truly equitable access in employment and society.
To learn how financial services professionals are using (or aren't using) AI, download the ebook, AI insights survey: Adopters, skeptics, and why it matters.
The opinions provided are those of the author and not necessarily those of Fidelity Investments or its affiliates.
1063730.1.0