Decisions Driven by Undisclosed “Big Data” May Escape Civil Rights Scrutiny, Researchers Warn

At MIT’s EmTech conference last week, a researcher warned that “big data” analytics can have discriminatory impacts.

Companies and governments can now obtain insights far beyond the immediate meaning of the data they collect. Analysis of demographic information, shopping habits, and location histories tells a richer story. Facebook “likes” can predict ethnicity, sexual orientation, use of addictive substances, age, and gender with high confidence. These insights can be startling: the retailer Target accurately inferred a teen girl’s pregnancy from her purchase history. And, whether they are accurate or not, these inferences will increasingly affect people’s lives.
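To give a rough sense of how such inferences are made, here is a minimal sketch using entirely synthetic data. The number of people and pages, the “signal” pages, and the choice of a logistic regression model are illustrative assumptions, not a description of Facebook’s or any other company’s actual systems.

```python
# Minimal sketch: predicting a hidden attribute from binary "like"-style data.
# All data here is synthetic; the setup is an assumption for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_people, n_pages = 2000, 50
# Each row is one person's 0/1 vector of "likes" across 50 pages.
likes = rng.integers(0, 2, size=(n_people, n_pages))

# Pretend the hidden attribute happens to correlate with a handful of pages.
signal_pages = [3, 7, 19, 42]
log_odds = likes[:, signal_pages].sum(axis=1) - 2.0
attribute = (rng.random(n_people) < 1 / (1 + np.exp(-log_odds))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    likes, attribute, test_size=0.25, random_state=0
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"Held-out accuracy predicting the attribute: {model.score(X_test, y_test):.2f}")
```

The point is not the particular model: any pattern that correlates with a sensitive trait, however indirectly, can be enough to predict it at scale.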

Corporate and governmental decision-making is starting to rely on these new techniques. Companies strive to predict creditworthiness and may decline to show credit advertisements to ostensibly poor individuals. The FBI uses the date, time, type, and location of recent crimes to predict future crime and focus officer patrols. Researchers say that other government services and public goods will soon be similarly reshaped by these technologies.

These decisions can easily reinforce inequalities. For example, researchers Kate Crawford and Jason Schultz pointed out that when police use past crime data to single out an area for heightened enforcement, future crimes in that area become disproportionately likely to be detected, included in statistics, and used to shape future resource allocation decisions.
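This feedback loop can be made concrete with a toy simulation. The numbers below are our own illustrative assumptions, not a model of any real department: two areas have exactly the same underlying crime rate, but recorded crime depends on patrol presence, and patrols are reallocated each period in proportion to what was recorded.

```python
# Toy simulation of the enforcement feedback loop described above.
# Assumptions (ours, for illustration): equal true crime in both areas,
# detection probability proportional to patrol share, and patrol shares
# reallocated each period according to recorded -- not true -- crime.
import random

random.seed(1)

true_incidents = 100       # identical true incidents per period in each area
patrol_share = [0.6, 0.4]  # area A starts with slightly more patrol attention

for period in range(10):
    recorded = []
    for share in patrol_share:
        # More patrols mean a larger fraction of incidents gets detected and logged.
        recorded.append(sum(random.random() < 0.5 * share
                            for _ in range(true_incidents)))
    total = sum(recorded) or 1
    # Next period's allocation follows the statistics, not the underlying reality.
    patrol_share = [r / total for r in recorded]
    print(f"period {period}: recorded={recorded}, "
          f"patrol_share={[round(s, 2) for s in patrol_share]}")
```

The recorded statistics track where the patrols were sent, not where crime actually occurred, so the initial imbalance tends to perpetuate itself even though the two areas never differ in actual crime.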

An immediate problem is that these systems’ secrecy disempowers both individuals and regulators:

Big data systems suffer from many of the same weaknesses as government administration systems . . . Individuals or groups that are subjected to predictive privacy harms rarely receive any meaningful notice of the predictions before they occur or are implemented, and even then, providers are unlikely to share the evidence and reasoning for the predictions that were made. Certainly, there is currently no legal requirement that the provider archive any audit trail or retain any record of the basis of the prediction.

Discriminatory practices are difficult to combat without evidence. And even with transparency, traditional protections against the collection, use, and disclosure of data may not apply to these new practices; they may not stop businesses or government from sharing their best guesses about what can be inferred from the available data.

Schultz and Crawford conclude that “individuals who are privately and often secretly ‘judged’ by big data should have similar rights to those judged by the courts with respect to how their personal data has been used,” an idea they call “procedural data due process.”

As a society, we are at “the very start of a conversation about how to do this better,” said Crawford.
