On Tuesday, the U.S. Department of Transportation released new guidelines for emerging self-driving car technology. Said Secretary Anthony Foxx in a press release about the guidelines: “Automated vehicles have the potential to save thousands of lives, driving the single biggest leap in road safety that our country has ever taken.”
Also on Tuesday, in Charlotte, North Carolina, where Foxx served as mayor from 2009 to 2013, Keith Lamont Scott, an African American, was shot dead by a police officer while either in or at his car. Police say that Scott exited his vehicle with a gun, “posing a threat”; Scott’s daughter and witnesses claim that he was sitting in his car reading a book when police shot him. The truth is probably somewhere in between, but history suggests that the account favoring the victim will be ruled out.
This occurred during the same week that millions watched, from several angles, an African-American man named Terence Crutcher being gunned down by police in Tulsa, Oklahoma, while at his car. Crutcher’s SUV was stopped in the middle of a road at the time, and some reports say he needed help with his vehicle. Instead, he got Tasered and shot by Tulsa police while his hands were up.
President Barack Obama has not yet commented on the police killings, but he did comment on the new rules for self-driving cars, or “highly automated vehicles,” as the Transportation Department calls them. Wrote Obama in an op-ed:
Right now, too many people die on our roads—35,200 last year alone—with 94 percent of those the result of human error or choice. Automated vehicles have the potential to save tens of thousands of lives each year. And right now, for too many senior citizens and Americans with disabilities, driving isn’t an option. Automated vehicles could change their lives.
Hundreds of African Americans have been killed by police—185 this year alone, according to The Guardian—often the result of police error (a Tulsa deputy shot and killed an African-American man last year when he mistook his own gun for a Taser) or choice. And right now, for too many African Americans, the simple act of driving a car is a risky option whenever there is any kind of a police encounter. If automated vehicles can change and save so many lives, we should talk seriously about how they can protect black lives.
The new guidelines for automated cars say this: “The self-driving car raises more possibilities and more questions than perhaps any other transportation innovation under present discussion.”
Cool. Here are some questions:
- Are black people less likely to get pulled over by police in self-driving cars?
- Do automated vehicles come with bulletproof seatbelts, to shield African Americans restrained in passenger seats, like Philando Castile was?
- Do they come equipped with Facebook Live-connected cameras throughout the car that can be activated when a police officer approaches?
- Can the car testify in court on behalf of a black person killed by police, when the victim’s narrative differs from that of the police—especially when the police need to cover their own asses?
- Will police still force black people to put out their cigarettes while riding in automated cars—and can the cars prevent black people from later ending up dead in jail, like what happened to Sandra Bland?
- Can the automated vehicle protect a black woman from being pulled out of a car by police and slammed onto the pavement, like what happened to Breaion King?
I dug through the 116-page “Federal Automated Vehicles Policy” paper in search of answers. Strangely, there is no information on how the cars make black people safer from racism. I did learn how automated cars can avoid a crash, but nothing about how to avoid a Crash. (And we really need to teach the next generation about how to avoid Crash; we really, really do.)
Foxx acknowledges in the policy guide’s introductory message that important questions not addressed in the current document will emerge. Some of those questions, writes Foxx, include, “What ethical judgments will [automated vehicles] be called upon to make? What socioeconomic impacts flow from such a dramatic change?”
These are serious questions that deserve immediate attention, especially from the perspective of racial justice. By “ethical judgments,” Foxx likely means something along the lines of whether a self-driving car should swerve to avoid a collision if it means possibly hitting a pedestrian. Fine. But I’m also curious about how to code problem-solving skills into an automated vehicle for an ethical dilemma like the one below, which is excerpted from the U.S. Department of Justice’s investigation into the Ferguson police department:
[I]n the summer of 2012, a 32-year-old African-American man sat in his car cooling off after playing basketball in a Ferguson public park. An officer pulled up behind the man’s car, blocking him in, and demanded the man’s Social Security number and identification. Without any cause, the officer accused the man of being a pedophile, referring to the presence of children in the park, and ordered the man out of his car for a pat-down, although the officer had no reason to believe the man was armed. The officer also asked to search the man’s car. The man objected, citing his constitutional rights. In response, the officer arrested the man, reportedly at gunpoint, charging him with eight violations of Ferguson’s municipal code. One charge, Making a False Declaration, was for initially providing the short form of his first name (e.g., “Mike” instead of “Michael”), and an address which, although legitimate, was different from the one on his driver’s license. Another charge was for not wearing a seat belt, even though he was seated in a parked car. The officer also charged the man both with having an expired operator’s license, and with having no operator’s license in his possession. The man told us that, because of these charges, he lost his job as a contractor with the federal government that he had held for years.
Or like this one, from the Justice Department’s investigation of the Baltimore Police Department:

For example, a 2011 complaint described an incident in which two white officers told an African-American man who had double-parked his car and was blocking the street to “move this car, n****r!” The man was double parked in order to assist his aunt into her home in Northeast Baltimore and was not charged with any offense. The man’s complaint—the one complaint BPD correctly categorized as a “racial slur” in the more than six years of data we examined—was assigned to be investigated at the command level and administratively closed six months later. The file BPD provided has no record of the investigation or any attempt to identify the officers involved.
The DOJ investigation into Baltimore police turned up the fact that “African Americans accounted for 82 percent of all BPD vehicle stops, compared to only 60 percent of the driving-age population in the City and 27 percent of the driving-age population in the greater metropolitan area.”
It’s not trivial to wonder whether cops will continue this trend by stopping automated cars occupied by people with suspicious skin tones. (While we’re on this, if police order an automated car to pull over and it doesn’t, due to some technical glitch, who’s at fault? That will be one police chase that African Americans will not want to be a party to.)
The point here is that, while the car represents freedom and mobility for many people, it has become the locus of police-driven killings for too many African Americans, regardless of whether laws and safety guidelines are being followed. Philando Castile was wearing his seatbelt when police killed him. Terence Crutcher’s hands were visibly up when police shot him. In the Keith Lamont Scott case in Charlotte, even if Scott had a gun, as the police say, North Carolina is an open-carry state. If, in any of those scenarios, an automated car could have helped them avoid death, then maybe this new technology would truly be a game-changer.
Sure, self-driving cars weren’t created with these particular safety issues in mind. But how safe is a highly automated vehicle in the face of highly automated racism?