Housing Secretary Ben Carson appears in Baltimore in July. Julio Cortez/AP

The Department of Housing and Urban Development plans to revise the “disparate impact” rule, which could fundamentally reshape federal fair housing enforcement.  

Updated: August 19, 2019


The Trump administration will introduce a new rule on Monday that may reshape the way the government enforces fair housing law, making it harder for people to bring forward discrimination complaints under the Fair Housing Act.

The proposed regulation from the U.S. Department of Housing and Urban Development would replace an Obama-era rule on disparate impact, a legal theory that has guided fair housing law for decades. Disparate impact refers to practices or policies that have an adverse impact on minorities without discriminating against them in explicit terms. The Supreme Court has recognized this form of bias as prohibited under the Fair Housing Act. But the new rule from HUD would substantially raise the burden of proof for parties claiming discrimination.

The new regulation also goes further: The HUD rule carves out unprecedented guidance for the automated decision-making systems that power the housing market. These are the algorithms lenders and landlords use to deliver judgments on credit risk, home insurance, mortgage interest rates, and more. Under the new dispensation, lenders would not be responsible for the effects of an algorithm provided by a third party, a standard that critics say would build an industry backdoor to bias.

“This is a proposal to very dramatically revise and effectively destroy an existing 2013 civil rights regulation,” says Megan Haberle, deputy director for the Poverty & Race Research Action Council. “This is a core part of the Fair Housing Act, and very early fair housing cases across the country have recognized the discriminatory effects standard.”

Housing Secretary Ben Carson signaled last June that the department was rethinking the disparate impact doctrine. The new rule, a version of which was leaked to Politico, will be published in the Federal Register on Monday, triggering a 60-day comment period before it can be officially implemented.

Under the current rule, disparate impact cases proceed under a three-part burden-shifting test: The plaintiff makes an allegation, the defendant offers a rebuttal, then the plaintiff responds. The new rule would set a five-point prima facie evidentiary test on the plaintiff's side alone. This means that a party looking to bring a discrimination case under the Fair Housing Act would need to establish some level of evidence at the pleading stage. To bring forward an accusation of implicit discrimination, plaintiffs would need to demonstrate, before any discovery process, that the policy itself is flawed.

Under the five-point burden test, plaintiffs would need to 1) prove that a policy is “arbitrary, artificial, and unnecessary” to achieve a valid interest; 2) demonstrate a “robust causal link” between the practice and the disparate impact; 3) show that the policy negatively affects “members of a protected class” based on race, color, religion, sex, familial status, or national origin; 4) indicate that the impact is “significant”; and 5) prove that the “complaining party’s alleged injury” is directly caused by the practice in question.

“This shifts so much of the responsibility to the plaintiff to make allegations nearly impossible, without having gone through a discovery process, of this tight causal link between the policy and the effect,” says Urban Institute senior fellow Solomon Greene.

In addition, the new HUD rule would establish three new defenses for landlords, lenders, and others accused of discrimination based on models and algorithms. The first defense would enable defendants to indicate that a model isn’t the cause of the harm. The second would allow the defendant to show that a model or algorithm is being used as intended, and is the responsibility of a third party. Finally, the new rule would allow the defendant to call on a qualified expert to show that the alleged harm isn’t a model’s fault.

Critics say that this new development gives lenders and landlords a big loophole. Many if not most financial institutions are not capable of developing their own in-house credit-risk algorithms; instead, they turn to third-party vendors. By putting the onus of fairness on these vendors, HUD is establishing a perverse incentive for banks and vendors alike to decline to study the outcomes of automated decision-making systems, according to Jacob Metcalf, a researcher for the nonprofit research institute Data & Society and founder of Ethical Resolve, a data-ethics consultancy.

“As long as the bank or lender is buying this tool from a third party that claims it has been adequately tested for algorithmic fairness, then the bank or lender is shielded from liability,” Metcalf says. “That’s a problem because there are no established standards—and the HUD rule doesn’t set out to establish any standards—about disparate impact.”

If a bank isn't liable for discrimination that results from an algorithm that it uses, then it doesn’t have any incentive to shop for a company that will guarantee that its algorithms won’t discriminate. By the same token, vendors who make automated decision-making systems don’t have an incentive to invest the time and labor to demonstrate transparently that their products are safe from a liability perspective. Investing the time and money it takes to give that kind of guarantee would put a company at a comparative disadvantage. There’s an incentive for all involved to simply not know what kind of disparate impact algorithms might be generating.

Meanwhile, a plaintiff has no way of knowing what data a vendor uses to model credit risk. A plaintiff might not be able to determine which vendors are responsible for what algorithmic effects. Third parties would be able to shield their practices behind trade secrets; any plaintiff looking to suss out whether an algorithm has a discriminatory impact might wind up “lost in a web of vendor relationships,” Metcalf says—with little recourse, especially prior to the discovery stage.

This is the first federal regulation to directly address algorithms and disparate impact. Attorneys couldn’t point to any case law that addresses algorithmic models and disparate impact, either. It’s not a wholly unreasonable idea for a regulation, Metcalf says: Many banks don’t have the resources to gauge the liability of an algorithm, after all. But without sufficient due-diligence standards, vendors will have every incentive to drag their feet. And as long as their models aren’t blatantly discriminatory, the vendors likely wouldn’t be held responsible for disparate impacts, either.

If HUD instead required banks to run tests on the models they use, then vendors would have an incentive to design platforms that provide those reports as a service. That would be a value-add for lenders, Metcalf says, since banks would be willing to pay more for a disparate-impact report that was reliable, easy to run, and a safeguard to keep them in compliance.
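The proposed rule doesn’t spell out what such a test would involve, but the underlying arithmetic is simple. The sketch below is a hypothetical illustration rather than anything HUD, Metcalf, or any vendor prescribes: it computes approval rates by group from a batch of loan decisions and flags any group whose rate falls below four-fifths of the best-served group’s, a rule-of-thumb threshold borrowed from employment discrimination analysis. The data, group labels, and function names are invented for the example.

```python
# Hypothetical illustration only: one way a lender or vendor might audit a
# scoring model's outcomes for disparate impact. The data, group labels, and
# 0.8 threshold (the "four-fifths" rule of thumb borrowed from employment
# law) are assumptions; the proposed HUD rule sets out no such standard.
from collections import defaultdict

def approval_rates(decisions):
    """Share of applications approved for each applicant group."""
    approved = defaultdict(int)
    total = defaultdict(int)
    for group, outcome in decisions:
        total[group] += 1
        approved[group] += outcome
    return {group: approved[group] / total[group] for group in total}

def adverse_impact_ratios(rates):
    """Each group's approval rate relative to the best-served group's rate."""
    best = max(rates.values())
    return {group: rate / best for group, rate in rates.items()}

# Invented loan decisions produced by a third-party scoring model:
# (applicant group, 1 = approved, 0 = denied)
decisions = [
    ("A", 1), ("A", 1), ("A", 0),
    ("B", 1), ("B", 0), ("B", 0), ("B", 1), ("B", 0),
]

rates = approval_rates(decisions)
ratios = adverse_impact_ratios(rates)
flagged = [group for group, ratio in ratios.items() if ratio < 0.8]

print("approval rates:", rates)      # group B is approved less often
print("impact ratios:", ratios)      # B's rate is 60% of A's
print("flagged for review:", flagged)
```

Nothing in the proposed rule would require a bank or vendor to run even this kind of check, which is Metcalf’s point.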

But under the proposed rule, it falls on the plaintiff to determine, case by case, how an algorithm affects them by suing the company or companies responsible for making the algorithm—without any standard in place for what algorithmic fairness means.

“How do you build a model to avoid disparate impact?” Metcalf says. “How often should it be tested? When does it need to be retested? How do you know if it’s appropriate from one population to another? Maybe it’s fair for the population of Ann Arbor. Maybe it’s unfair for the population of Detroit. How do you know which population it was trained on?”

He adds, “If HUD isn’t going to answer those questions, it’s a get-out-of-jail-free card. They’re creating the liability loopholes that all of the potential plaintiffs will fall through by default.”

Civil rights organizations are already gearing up for a fight over the rule. The National Fair Housing Alliance, Leadership Conference on Civil and Human Rights, NAACP Legal Defense Fund, and others are joining forces under the banner of Defend Civil Rights. This new coalition aims to oppose efforts by the Trump administration to dial back regulations that safeguard minorities from discrimination, according to a civil rights attorney familiar with the project who couldn’t speak on the record before the group’s launch on Monday.

Defend Civil Rights will not only advocate for protections in housing: The coalition also plans to address education, labor, healthcare, environmental justice, and other fronts. For example, Secretary of Education Betsy DeVos has proposed canceling an Obama-era push to eliminate racial disparities in school discipline.

“When it comes to policymaking, most institutions, whether they’re lending institutions or landlords, have long since abandoned explicit racial [or other] discrimination. Disparate impact is really the best tool we have to level the playing field,” Greene says.

Defenders of the administration’s efforts say that it’s necessary to bring the department’s regulations in line with the Supreme Court’s 2015 decision in Texas Department of Housing and Community Affairs v. The Inclusive Communities Project. Francis Riley, a partner at Saul Ewing Arnstein & Lehr who represents defendants in the civil rights arena, says that the decision will constrain claims from plaintiffs.

“It puts the courts front-and-center to control claims that move on to discovery,” Riley says. “If [plaintiffs] are using a defendant’s [Home Mortgage Disclosure Act] data, or regional HMDA data, that shows that a particular area is not being served by the defendant, that is not enough. They have to actually assert, and in more than just a perfunctory way, that the lender has a policy or practice that they are effectively enforcing which has the goal of discriminating against those individuals.”

Riley says that the new rule will still allow plaintiffs to pursue landlords and lenders who are guilty of unfair housing practices. Those who discriminate should be hauled into court to answer for wrongful discrimination, he says. But he says the rule will prevent the plaintiff’s bar from bringing forward cases based on statistical data alone. While he thinks the new HUD regulation is a step in the right direction, he notes that it conflicts with other existing standards used by other federal agencies.

“We know what HUD’s doing,” Riley says. “What’s the [Consumer Financial Protection Bureau] going to do? What’s the [Federal Housing Administration] going to do? All of these departments have fair housing divisions.”

The language of the new regulation relies heavily on the text of former Justice Anthony Kennedy’s 5–4 decision for the majority in Inclusive Communities. The court ruled that disparate impact is “cognizable under the Fair Housing Act,” affirming prior decisions by eleven federal appellate courts that relied on this doctrine. Kennedy’s decision did not rely on the Obama-era HUD rule on disparate impact, which codified practices across the department. But the Trump administration saw the decision as a reason to revise the housing department’s rule.

In the Inclusive Communities decision, the court considered an ongoing challenge from the Dallas area. There, the state housing agency had long distributed housing tax credits, which are used to build low-income housing, in a way that concentrated construction in mostly black areas. This is a straightforward example of disparate impact, with none of the complications of machine learning or artificial intelligence that the new HUD rule anticipates.

On Friday, advocacy groups such as the National Low Income Housing Coalition and the National Community Reinvestment Coalition condemned the rule in strong terms. Civil rights attorneys worry that the new standard will unwind the protections afforded by the Fair Housing Act.

“Fundamentally, if this rule is adopted, and disparate impact is no longer available as a legal bulwark against facially neutral or unintentionally discriminatory policies,” Greene says, “we’re in a lot of trouble.”
