I recently wrote about the ProPublica and New York Times exposé on housing discrimination, in which Facebook advertisers excluded people from seeing housing ads based on their “ethnic affinity.” Now both landlords and lenders are pressuring the Department of Housing and Urban Development to make it easier for businesses to discriminate against potential tenants using the same automated tools for which Facebook was legally sanctioned. While a case from earlier this year in Connecticut is testing the limits of racial discrimination in housing, we are only now beginning to understand how technology can harm individuals of certain ethnicities and economic means.

One primary principle in housing law, established through a Supreme Court case in 2015, is the “disparate impact” standard, which holds that a policy may be illegal if it disproportionately affects one community over another. Still, there are policies enacted around public housing that are patently discriminatory, such as the recent scandal of Detroit’s facial recognition software being deployed around the city’s public housing. While landlords and housing commissions tout such systems as a means of security, privacy groups are rightfully worried that this technology is being used to invade privacy and potentially feed criminal justice databases under the guise of “security.”

Still, many cities are opting for high-security options, seemingly oblivious to how such biometric data is being used and abused internationally.