New developments in accessible technology have made it easier than ever for the visually impaired to connect.
Gus Chalkias challenges me to spell my name. Chalkias, who lost his vision at age 28 and now runs the demo center at the Computer Center for Visually Impaired People at Baruch College in midtown Manhattan, hands me an iPhone with a darkened screen. I’ll have to do it by touch and by physical memory of the letters’ locations.
“Trail one finger across the screen,” he says. An automated voice barks in rapid-fire, staccato speech: Camera. Messenger. Calendar. Thursday, August 20. “Slower,” he says. Stocks. Camera. YouTube. Stocks. I’m shocked by how quickly it’s talking. I can’t keep up.
“Take one finger and swipe it right,” Chalkias advises. “Your finger’s staying on the screen too long. Keep the contact briefer.” A blip—like leveling up in a video game—tells me I’m in the right place. “The lower third of the screen is now your keyboard,” he says. “Type your name.” I find the W. Then R, T, Y, and L. Finally, E. WLIJJ is the best I can do—three minutes later—before I accidentally close the window. Gus spells “Jessica” in eight seconds.
The demo center, a suite of charcoal cubicles filled with tactile keyboards, is a place to train blind and visually impaired users on accessible apps and add-ons. The center also hosts public events. The last one, “Love is Blind, and So Am I,” tested the accessibility of online dating sites.
Chalkias has cloudy blue eyes and tufts of blonde hair. He’s wearing an oversized purple shirt and pants with a tear over the knee. He sits with his body turned towards me and laughs conspiratorially. He’s invited me to come learn about the products that are shaping the landscape of accessible technology for Americans with visual impairments—a demographic that numbered 20.6 million in 2014, according to a survey from the Centers for Disease Control and Prevention.
Though technology can aid blind users’ daily lives, Chalkias tells me, it still falls short when it comes to helping them navigate their worlds.
Empowerment through independence
Chalkias was diagnosed with retinitis pigmentosa, a degenerative ocular condition, when he was 17. “I knew my vision was going to change, but I hoped that it wouldn’t—and there was no way to prepare for the transition,” he says. His vision was stable until he was 28. “When I did my own research online, I skipped over the inevitable blindness,” he laughs. Then, over the course of two or three months, his vision worsened so rapidly that he had to quit his job as an accountant. “My eyes were getting really tired by the end of the day,” he says.
For a few months, he had a revised schedule with shortened hours. But his fatigue started setting in earlier and earlier in the day. “So then I said, ‘I gotta go, I gotta go. I can’t,’” he says. “It was too hard to subject myself to that daily question of, ‘Will this be a good day? Will this not be a good day?’” His vision didn’t degrade incrementally; for a while, it fluctuated. “I’d have 20/20 vision one day, and then be using a cane the next week,” he notes. He’s now in his 40s and almost completely blind.
“When I lost my vision, my whole concept of self was devastated. I walked around thinking that I was broken and defective for a number of years,” he says. “It took a lot of work to undo that.” Over time, that sense of doubt—projected and internalized—chips away at self-esteem. He thinks that’s one of the reasons that so few people with disabilities enter the workforce. (The Bureau of Labor Statistics put the figure at just over 17 percent in 2014.)
“I think there are still a lot of misconceptions about what people with disabilities are capable of,” he adds. “That’s hard to overcome.”
But now he’s adjusted to life as a blind person—and that’s largely because of assistive technology. He’s lived in New York since he was four years old, almost 40 years now, and doesn’t want to leave. “Here,” he says, “I never have to worry, ‘Will I get home? Will I be able to find my way back?’” His first accessible cell phone, which he bought in 2004, cost $1,700. Now, free iPhone apps help him on a daily basis.
Wayfinding apps and getting around
One of his favorite apps is Where The Hell Am I?, a no-frills GPS locator that he uses when he’s traveling by car or walking down an unfamiliar street. But in Manhattan, which he calls an “urban canyon” bordered by craggy skyscrapers, his cell phone signal sometimes flickers in and out—leaving him roaming without any guidance.
This is where the burgeoning field of responsive crosswalk infrastructure, such as the prototype for Responsive Street Furniture, could come into play. It works like this: Users log on to a website and select from a range of potential accommodations, including seating areas, brighter street lights, audio cues, and more time to cross. The site stores users’ data, and when they pass an intersection equipped with the technology, the environment adapts to fit the preselected preferences.
The problem is, existing wayfinding apps don’t help once you get inside a building. “It’ll get me to the vicinity of where I want to go, but it won’t get me to a door,” Chalkias says.
Chalkias is about to start his fourth year as a graduate student at Hunter College, where he has to navigate labyrinthine buildings connected via bridges that weave over Lexington Avenue. When he first started school in 2012, the complexity of the task threw him into a panic. Nothing was intuitive. Different elevators have different button sequences: sometimes there are four columns, sometimes there are two. “There were times when I was on the elevator, looking for the ‘3,’ and I couldn’t find it, and the elevator’s going up and down, and up and down,” he says. “I was near hysterics.”
Chalkias found a low-tech solution to that particular problem: He worked with a mobility instructor, a service offered by the New York State Commission for the Blind. The instructor helped him identify various routes from his subway stop to his classroom, considering physical landmarks, elevators, and stairs, so that he can navigate independently. “When I go through the turnstiles in the north building, I know that about 10 feet from there is a corridor that I have to turn right on to go to the elevators, and then I can trail a wall until it ends.” The hum of escalators also serves as an aural cue.
To type or not to type
A lot of blind people are overwhelmed and intimidated by smartphones and computers—for instance, Chalkias left his first iPhone in the drawer for eight months because he was worried about not being able to feel the keypad. He encounters many people at the demo center who want to access websites but don’t want to learn how to type. But he says it’s not as hard as it seems. “Once you figure out the basic layout of the keys, you know relatively where to touch the keyboard,” he adds.
To help locate the position of the keys, his Apple keyboard is outfitted with adhesive dots. The keys are flush and flat, so Chalkias has marked 4, 8, and 12 on the function keypad. Instead of having to start with the Escape key and count over five, he can use the peel-off dots as tactile wayfinding cues. PC keyboards tend to be a bit easier to maneuver around, because the function keys are already grouped into clumps of four. This is the same idea that informs the standard placement of ridges on the F and J keys of any keyboard—people can position their hands and type by feel.
But some visually impaired users do still prefer devices that have buttons. Blaze EZ—a handheld multimedia entertainment player—mimics many functions of a smartphone, but it has a standard touch-tone keypad. The pads can offer the comfort of familiarity.
“It’s what a lot of people grew up with,” says Karen Luxton Gourgey, director of the Computer Center for Visually Impaired People. “People are used to memorizing things, especially if you grew up blind or visually impaired. Your memory is one of the things that you work on and you learn to use it as a very important tool.”
Social networking and entertainment
Many features of a standard iPhone become immediately accessible to visually impaired users who turn on the VoiceOver option—which speaks text on the screen. Users can also choose from a list of potential speaking speeds and pitches. (Speeds range from “tortoise” to “hare.”)
Hovering over display text reads it aloud. Chalkias uses this function to hear summaries of the salacious young adult novels that he devours via the Audible app. “I’m very big on escapist reading, like YA or paranormal books,” he says. “I don’t like to interact with reality any more than I have to.”
Another app, the KNFB reader, scans printed material—magazine articles, or even receipts or invoices—and reads it back. “Most of us who are blind are used to synthesized speech, so we can crank it up really fast,” says Danielsen. “You wouldn’t be able to understand it if you started today, but you could work your way up there.”
Major corporations and universities are launching pop-up think tanks devoted to accessible technology. For instance, New York University and AT&T recently co-sponsored a three-month competition—the ConnectAbility Challenge—to develop apps for people with various physical disabilities. More than 60 projects from all over the world vied for $100,000 in prize money, Wired reported. (Chalkias was one of four test subjects.)
One entrant was the Alt Text Bot. Send a photo tweet to @alt_text_bot, and the account uses an algorithm to scan and describe the shared picture. (For instance, “Man standing in front of a sunset with a bridge in the background.”) Chalkias was thrilled. “It’s really frustrating when someone posts a picture on Facebook with no description, and then people comment, ‘This is awesome!’” he says. “I want to know what’s awesome!”
Apple products became accessible around 2008, says Danielsen, after the Massachusetts Attorney General declared that educational products released in Massachusetts must be accessible to people with disabilities. The ruling most directly impacted iTunes U, but was applied to various devices.
“We start with education because it’s something that’s definitely covered by disability laws,” Danielsen explains. Some commercial products may not be covered by legislation such as the Rehabilitation Act, the Americans With Disabilities Act, and the Individuals With Disabilities Education Act—but educational materials are. “It’s a way to put pressure in the education space, and the education space in turn turns to industry and says, ‘We need accessible products from you,’” Danielsen continues. “That’s how we’re going to be able to serve our students.”
Chalkias appreciates that developers are tackling the issue of accessibility, which may not have previously been on their radar. Sometimes, though, those good intentions are undermined by design that is less than functional. Take the Dot, which markets itself as an accessible smartwatch with a braille display.
The product, inspired by the CEO’s blind college classmate, received rapturous reviews from many media outlets. But Fast Company took aim, criticizing the fact that it only displays four letters at a time:
Imagine reading a word as simple as "ency-clop-edia." Just this one word would take four refreshes of the watch. Even a short, 140-character tweet would take 35 screen refreshes to read.
Gourgey, the center director, echoes that sentiment. “That sounds absolutely obnoxious,” she says. “I need to be able to read a text as easily as you do. Frankly, with phones as they are now, I can pretty much do that,” she adds. “I would throw the thing out the window.”
A few weeks ago, with the help of the VoiceOver function, Chalkias went to Astoria Park and took a photo of the sunset. He couldn’t see the clouds, but made out traces of the bright glare. “I posted it on Facebook and people were like, ‘You posted that yourself?’” he laughs. “I was like, ‘Mmm hmm, that’s right!’ all proud.” He pulls up the photo for me. The sun slinks below orange sherbet clouds, underneath the bridge and flanked by two silhouetted trees. It’s beautiful.
He decides to snap one of me. VoiceOver helps him frame and focus the picture. “One face, small face, face near top-right edge,” it reports. I laugh and duck out of the frame. “Zero faces,” the phone blares. I stop squirming. “Face centered,” it tells him. He clicks the shutter button and shows me the photo. “How did I do?” he asks. I tell him that my face is centered, and in crisp focus—and I’m smiling.