Monday, August 26, 2019

Sign the petition: Fight racist robots

Tell HUD that you do not support algorithmic discrimination in housing

Department of Housing and Urban Development:
"The HUD proposal that gives landlords and lenders permission to discriminate by algorithm will undo the Fair Housing Act and undermine policies that prohibit disparate impact. Using biased and flawed technology should not be a part of the application process for housing. This proposal will encourage racist and sexist practices in housing. I do not support this proposal."

Add your name:

Sign the petition ►

Dear Katy,

Imagine being told you cannot get a place to live, even though you meet all the requirements. The landlord or lender reaches this life-altering decision based on the output of an unaccountable, opaque computer algorithm. And you can't dispute the decision.

Trump's Department of Housing and Urban Development (HUD) just proposed a rule that would open the door to this type of discrimination, and we have to raise our voices against it.

Before Trump, HUD's central mission was to support and improve the lives of low-income families and communities. Now, the department is shredding that mission and giving landlords and mortgage lenders license to use computer algorithms to discriminate.

Tell HUD: Don't dismantle the Fair Housing Act. Click here to sign the petition.

Under HUD's proposal, landlords and lenders could use algorithms to decide whether applicants are worthy of housing. These algorithms would do everything from checking an applicant's credit score to scouring their social media to build a personality profile or track how often they go out to clubs or bars.1

Take facial recognition algorithms as an example of how this could go wrong: they routinely misidentify Black people. And algorithm-based job recruiting tools have shown bias against women.2 Algorithmic decision-making programs mirror how our society functions, but they tend to supercharge its racism and misogyny. HUD is ignoring the fact that many algorithms are faulty and biased.

This is more than unacceptable; it's a blatant attempt to undermine 51 years of civil rights protections in housing. We have to show HUD that we not only notice but also care. The proposal is open for public comment right now. We know Big Tech will do anything to keep its pockets lined with government money, so we have to speak out.

Tell HUD: Stop encouraging discrimination by algorithm. Click here to sign the petition.

Housing discrimination explicitly works against communities of color and low-income communities, preventing individuals and families from building wealth generation after generation. Federal civil rights law attempts to right the wrongs of history. It is currently illegal for businesses to discriminate outright, but policies and practices are also illegal if they have a disparate impact. Disparate impact occurs when policies or practices that seem neutral on their face have a disproportionately negative effect on minority groups.3 Using demonstrably racist and sexist algorithms will have a disparate impact on the very people HUD is supposed to protect and will make it harder to bring housing discrimination claims under the Fair Housing Act.

This proposal is part of the Trump administration's attempts to roll back civil rights protections and undermine equality. We have to fight back and make sure they hear us loud and clear.

The government's public comment period about this technology just opened. When you sign this petition, we'll count your signature in our comment to HUD on this issue. Click the link below to sign:

https://act.credoaction.com/sign/no-algorithims-in-housing?t=10&akid=33761%2E12967895%2Ez6VUkd

Thanks for fighting back.

Jelani Drew, Campaign Manager
CREDO Action from Working Assets

Add your name:

Sign the petition ►

References:

  1. Andrew Selbst, "A New HUD Rule Would Effectively Encourage Discrimination by Algorithm," Slate, Aug. 19, 2019.
  2. Karen Hao, "This is how AI bias really happens—and why it's so hard to fix," MIT Technology Review, Feb. 4, 2019.
  3. Emily Badger, "Who's to Blame When Algorithms Discriminate?" The New York Times, Aug. 20, 2019.

CREDO action

© 2019 CREDO. All rights reserved.

