Maybe it’s not so bad to have algorithmic overlords — at least when they are pressured into protecting people rather than exploiting them.
Earlier this week, Facebook declared that it will no longer let certain kinds of advertisers engage in racial profiling. Specifically, it will prohibit credit, housing, and employment advertisers from using “ethnic affinity” categories — marketing profiles that correlate strongly with race — in deciding whom to exclude from their audience.
The reform didn’t happen out of the blue. Pressure came first from a ProPublica investigative experiment last October, in which the news organization successfully submitted to Facebook a housing advertisement that excluded minority affinity categories. This led to a lawsuit claiming that Facebook was violating federal fair-housing and civil-rights laws.
Interestingly, educational advertisers, such as for-profit colleges that have been known to target poor African Americans, have been left out of the new policy. That’s possibly because there’s no federal law against this kind of profiling. In other words, Facebook is starting to act like a private regulator, policing only those activities that are already unlawful.
A similar story can be told about Google, which in 2016 announced — after Facebook had already made a similar move — that it would get rid of payday-lending advertisers, and has since reported removing 1.7 billion so-called “bad ads,” including 5 million for payday loans. So Google has taken on the role of policing as well.
This might be a desirable development, especially amid growing concern that President Donald Trump’s administration may rein in the Consumer Financial Protection Bureau, the federal agency charged with protecting the American public from financial predators. But it also raises some difficult questions.
First, both Google and Facebook rely on algorithms to vet advertisers. Can those algorithms actually prevent bad actors from gaming the vetting process?
Payday lenders have shown themselves to be crafty — for example, by charging undisclosed or inflated fees to appear to qualify as prime lenders. I was involved in such a case, working as a data consultant for the Illinois Attorney General’s Office, which extracted a $3.5 million settlement from a company that advertised low interest rates but actually charged an exorbitant “insurance†fee. Such subterfuge could easily slip past an algorithm designed to filter out high-interest-rate lenders.
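To see how, consider a minimal sketch in Python (the 36 percent cap, the loan figures, and the function names are all invented for illustration, not drawn from the Illinois case) of a filter that trusts the advertised rate and therefore approves a loan whose hidden fee pushes its effective cost far above the cap:

```python
# Illustrative only: a naive vetting filter that trusts the advertised APR,
# versus the effective APR once an undisclosed flat fee is counted.

APR_CAP = 0.36  # hypothetical cutoff an ad platform might enforce

def passes_vetting(advertised_apr: float) -> bool:
    """Naive filter: checks only the rate the advertiser discloses."""
    return advertised_apr <= APR_CAP

def effective_apr(principal: float, advertised_apr: float,
                  hidden_fee: float, term_years: float) -> float:
    """Annualized cost of the loan once the hidden fee is included."""
    interest = principal * advertised_apr * term_years
    return (interest + hidden_fee) / (principal * term_years)

# A lender advertises 25% APR on a $1,000 one-year loan...
print(passes_vetting(0.25))                  # True: the ad is approved
# ...but an undisclosed $400 "insurance" fee makes the real cost 65% APR.
print(effective_apr(1000, 0.25, 400, 1.0))   # 0.65: far past the cap
```

The point of the sketch is that the filter never sees the fee at all; it can only judge what the advertiser chooses to disclose.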
Another potential issue: If the government outsources regulation to private companies, federal regulators might atrophy, leaving them unable to do their jobs even if a new administration wants to revive them. This is troubling because private companies ultimately have no accountability to the public and offer little transparency; they are acting primarily out of concern for their reputations.
That said, let’s welcome Facebook’s new policies. If Trump chooses to pull back on consumer protection, this might be as good as it gets. —Bloomberg