Emily Ackerman is a Ph.D. student in chemical engineering at the University of Pittsburgh and a disability rights activist.
One afternoon last month, as I was crossing a busy four-lane street that runs through the University of Pittsburgh campus, I looked up to see a robot blocking my path.
This wasn’t unexpected. Over the summer, several four-wheeled, knee-high robots had been roaming campus, unmarked and usually with a human handler several feet behind. Recently they’d multiplied, and now they were flying solo. They belonged to Starship Technologies, I learned, an autonomous delivery service rolling out on college campuses across America.
As a chemical engineering Ph.D. student at the University of Pittsburgh who uses a power wheelchair, I figured it wouldn’t be long before I met one of these bots in a frustrating face-off on a narrow sidewalk. What I didn’t realize was how dangerous, and dehumanizing, that scenario might be.
The robot was sitting motionless on the curb cut on the other side of Forbes Avenue. It wasn’t crossing with the rest of the pedestrians, and when I reached the curb, it didn’t move as the walk signal was ending. I found myself sitting in the street as the traffic light turned green, blocked by a non-sentient being incapable of understanding the consequences of its actions.
I managed to squeeze myself up onto the sidewalk in a panic, climbing the curb outside the curb cut for fear of staying in the street any longer—a move that causes a painful jolt and could leave me stuck halfway up if I’m not careful.
Then I did what a lot of upset people do: I sent off a thread of angry tweets about the experience.
“i (in a wheelchair) was just trapped *on* forbes ave by one of these robots, only days after their independent roll out. i can tell that as long as they continue to operate, they are going to be a major accessibility and safety issue. [thread]” (Emily Slackerman Ackerman, @EmilyEAckerman, October 21, 2019)
The response was larger than I expected: Messages from around the world flooded my mentions and inbox. Most expressed support; some users called on other universities to review their involvement in the same program and pushed for a deeper discussion about the need for diversity in tech. A local news station picked up the story. After Starship and the university reached out to me, I spent hours the next day talking with their leadership teams. We discussed what went wrong, the steps that had already been taken to update the robots’ programming, and the future of Pitt’s involvement in the program. After our call, Starship released a statement of commitment to the disability community. Its robots were back on campus four days later, now under heavier human surveillance.
My incident may have had lots of only-in-2019 elements, but it was just the latest in a long fight for fair public access for people with disabilities. The social movement gained traction in the 1970s, when Denver activists staged a sit-in to protest the lack of accessible public transportation and Berkeley grad students led a campaign to install curb cuts in the city. Though these efforts succeeded on a small scale, accessible public transportation, curb cuts, and other accommodations weren’t guaranteed nationwide—and therefore didn’t happen in many places—until the passage of the Americans with Disabilities Act (ADA) in 1990.
However, anyone who’s lived under the ADA knows that it’s not without its flaws. Most stations in America’s largest subway system, New York’s, remain inaccessible to wheelchair users. Older buildings and small towns nationwide suffer from an acute absence of ADA-compliant infrastructure. And commercial air travel still isn’t covered by the ADA, despite decades of calls to action from the disability community. The airline industry is regularly in the headlines for mistreating disabled people and mishandling our adaptive equipment. In September 2019 alone, 813 wheelchairs and scooters were damaged by the 17 most popular U.S. airlines.
Now, thanks to the presence of these new delivery robots, the regular walk to my office has earned a spot on my ever-growing “Things to Worry About Daily” list. But don’t mistake this story for a protest against one particular company, or a warning about our possible autonomous future. In fact, the disabled community as a whole could benefit greatly from an autonomous delivery service for food or medicine.
Instead, my experience is representative of a much larger, evolving problem. The advancement of robotics, AI, and other “futuristic” technologies has ushered in a new era in the ongoing struggle for representation of people with disabilities in large-scale decision-making settings. These technologies come with their own set of ethical design challenges, with more unknown consequences than ever before. And we have yet to have an honest, critical conversation about it.
Whether intentional or not, inequity for people with disabilities exists at every scale, and each day is a fight to receive or maintain fair access. Everyone knows what to do when a sidewalk is inaccessible, but the knowledge needed to address accessibility within existing, complex technological systems such as sidewalk delivery robots is specialized. Too often, companies are allowed to essentially self-regulate, an arrangement that pairs badly with a “move fast, break things” ethos. (See, for example, how startups introduced dockless electric scooters before adequately preparing their host cities for the sidewalk-blocking hazards the vehicles would pose.) When technological advancement comes at the expense of a marginalized group in this culture, it seems the best we can expect is a product update with a brief message of commitment to the community. While this can be meaningful and result in positive change, it’s not enough to erase the dehumanizing and dangerous experiences of those who inspired the change.
We need to build a technological future that benefits disabled people without disadvantaging them along the way. Companies must practice accountability from their positions of power. The most critical step is increasing participation—not only by opening feedback channels with their users but also by hiring disabled engineers and programmers in all stages of the development process. Accessible design should not depend on the ability of an able-bodied design team to understand someone else’s experience or foresee problems that they’ve never had. The burden of change should not rest on the user (or in my case, the bystander) and their ability to communicate their issues.
As we move into the autonomous future, it’s more important than ever to consider the social and ethical consequences of how technology is designed. After all, truly good design has to account for the fact that no two humans are the same, including no two members of any minority group. A solution that works for most at the expense of another is not enough. As an engineer, I recognize that this leads to very difficult design questions with no clear answer—a nightmare for people in my profession. But as a disabled engineer, I know that refusing to take up that challenge will only perpetuate the cycles of injustice that are built into our world.