Who We Are As Data Might Soon Become More Important Than Who We Are As People

In our digital worlds, there is a gap between the real you and the algorithmic “you.” Online, you live in a realm functionally distinct from the world you thought you knew: one where your data assigns you a gender different from your own, or a citizenship unlike the one in your passport.

In some cases, it’s easy to check out the algorithmic you. For example, the plug-in Citizen-Ex by U.K. artist James Bridle uses your browser’s metadata to calculate your algorithmic citizenship—and the answer might surprise you. National citizenship is normally seen as binary: You either are, or are not, a citizen of a country. But Bridle’s plug-in assigns you a percentage-based citizenship where you can be 54.8 percent Irish, 43.7 percent American, 1.49 percent German and even 0.01 percent Estonian, as I currently am.
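
To make the mechanics concrete, here is a rough sketch of how a percentage-based citizenship could be computed. This is not Bridle’s actual code, and the step that resolves each web request to a country is assumed; the idea is simply to tally where your traffic terminates and normalize the counts.

```python
# Hypothetical sketch of percentage-based "algorithmic citizenship."
# Not Citizen-Ex's real implementation: the country-per-request data
# is assumed to come from some geolocation step not shown here.

from collections import Counter

def algorithmic_citizenship(request_countries):
    """Turn a list of per-request country codes into percentage 'citizenship'."""
    tally = Counter(request_countries)
    total = sum(tally.values())
    return {country: 100 * count / total for country, count in tally.items()}

# An invented browsing session that leans Irish and American:
session = ["IE"] * 5480 + ["US"] * 4370 + ["DE"] * 149 + ["EE"] * 1
print(algorithmic_citizenship(session))
# {'IE': 54.8, 'US': 43.7, 'DE': 1.49, 'EE': 0.01}
```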

I say “currently” because our algorithmic selves alter minute by minute and byte by byte, depending on how we’re using the internet. Last night, after chatting with friends living in England, you might have skewed British. This morning, chatting with your cousin in Spanish, Mexican.

This is citizenship by algorithm. The concept comes from leaked 2013 documents from the National Security Agency that outlined the agency’s PRISM program, which gives analysts access to the data caches of private companies like Google, Microsoft, Yahoo, AOL and Apple. NSA avoided the privacy protections against domestic surveillance that U.S. citizens normally enjoy by creating a new, datafied template for citizenship online. In other words, instead of halting domestic spying, the federal government redefined what it means to be a U.S. citizen by creating an algorithmic citizen made only of data.

Your right to privacy from government surveillance modulates according to how NSA interprets your data via its algorithmic logic of citizenship. U.S. citizens whose data appears foreign are classified as foreigners, and they therefore lose their constitutional right to privacy.

This applies to a lot of Americans: To NSA, Americans are likely to be perceived as foreign if they have an IP address outside the U.S., talk to people outside the U.S., use languages other than English, encrypt their communications, or even have friends who are “reasonably believed” to be foreign. In fact, you’re considered algorithmically foreign if an NSA agent is only “51 percent confident” of your “foreignness.”
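
As a purely illustrative sketch (the real criteria and their weighting are classified; the signals and scoring below are invented), a bare-majority confidence rule of this kind might look something like this:

```python
# Invented illustration of a "51 percent confident" foreignness rule.
# Only the bare-majority threshold comes from the leaked documents;
# the signals and the naive averaging are assumptions for the sketch.

FOREIGNNESS_THRESHOLD = 0.51

def foreignness_confidence(user):
    """Naively average a few binary signals into a confidence score."""
    signals = [
        user["ip_outside_us"],
        user["talks_to_people_outside_us"],
        user["uses_non_english_languages"],
        user["encrypts_communications"],
        user["has_friends_believed_foreign"],
    ]
    return sum(signals) / len(signals)

def is_algorithmically_foreign(user):
    return foreignness_confidence(user) >= FOREIGNNESS_THRESHOLD

# A U.S. citizen who encrypts email and chats with relatives abroad:
citizen = {
    "ip_outside_us": False,
    "talks_to_people_outside_us": True,
    "uses_non_english_languages": True,
    "encrypts_communications": True,
    "has_friends_believed_foreign": False,
}
print(is_algorithmically_foreign(citizen))  # True: 3/5 = 0.6 >= 0.51
```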

But it doesn’t stop at nationhood: You also have an array of algorithmic genders, races, class statuses, sexual orientations and even degrees of celebrity. And like citizenship by algorithm, these additional interpretations are functionally disconnected from how we understand ourselves. Our algorithmic identities are simply fabrications made by the institutions that profile us in order to sell products and/or exert digital control.

You can check what gender and age Google thinks you are, based on your search queries and the websites you visit, on Google’s ad settings page. But be prepared: Google’s algorithmic gender and age identifications will probably seem wrong. (For example, you might be a 30-year-old woman, but Google thinks you’re a 65-year-old man.)

This “error” actually has nothing to do with your real age or gender, because Google is measuring something completely separate from the human notion of identity. These models are created by categorizing certain search terms and websites and then parsing our data to determine what algorithmically fits and what doesn’t. So if you’re a woman who is algorithmically interpreted as a man, that merely means your data aligns more closely with Google’s model of a man than with its model of a woman.
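
A toy sketch can make this clearer. Google’s actual models are proprietary, so everything below (the profiles, the categories, the similarity measure) is invented; the point is only that the label is assigned by closeness to a template, not by anything about your life:

```python
# Invented sketch of classification by model fit: assign whichever
# labeled template profile a user's behavior most resembles.

import math

# Hypothetical "template" profiles: weights over content categories.
PROFILES = {
    "man, 65+": {"finance": 0.5, "golf": 0.4, "news": 0.1},
    "woman, 25-34": {"fitness": 0.4, "travel": 0.4, "news": 0.2},
}

def cosine(a, b):
    """Cosine similarity between two sparse category vectors."""
    dot = sum(a.get(k, 0) * b.get(k, 0) for k in set(a) | set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def algorithmic_identity(history):
    """Return the profile label that the user's category counts fit best."""
    return max(PROFILES, key=lambda label: cosine(history, PROFILES[label]))

# A 30-year-old woman whose browsing skews toward finance and golf sites:
history = {"finance": 12, "golf": 7, "news": 3}
print(algorithmic_identity(history))  # "man, 65+" -- the model she best fits
```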

Because algorithms draw from our data, not our lived experience, it largely doesn’t matter to these institutions if we’re incorrectly identified. (And as much as it sometimes may seem, Google is not invested in explicitly maintaining the patriarchy.) Instead, Google wants to provide advertisers with a consumer base of users who are seen to be profitably man-ish. Similarly, NSA really doesn’t care if a user is a citizen or a foreigner, as algorithmic citizenship itself is only a legal caveat that protects the agency from constitutional overreach.

But it still raises the question: What would the real world look like if users were identified based only on their algorithmic selves?

This is already happening to some extent. Google’s gender and age audience analytics determine which users are targeted with content and advertisements, as well as how websites interpret who is visiting their site. For example, if your data suggests you’re algorithmically wealthy, you might be shown higher prices for hotels or flights on a site like Orbitz.com, because your profile suggests you can pay. Or, as happened to a Wisconsin man this week, you might be denied parole because an algorithm has identified you as a likely reoffender.
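
A deliberately crude sketch (the attribute and markup below are invented; no site’s actual pricing logic is public) shows how an inferred trait can steer the price a visitor is quoted:

```python
# Invented sketch of segment-based pricing driven by an inferred attribute.

BASE_PRICE = 240.00  # hypothetical base fare

def quoted_price(profile):
    """Mark up the quote when the visitor's data suggests they can pay."""
    markup = 1.15 if profile.get("algorithmically_wealthy") else 1.0
    return round(BASE_PRICE * markup, 2)

print(quoted_price({"algorithmically_wealthy": True}))   # 276.0
print(quoted_price({"algorithmically_wealthy": False}))  # 240.0
```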

A life algorithmically ordered and reordered, often beyond our comprehension, ushers us into a dangerous terrain of lopsided knowledge. On this terrain, we have little to no idea how we are defined, while commercial firms and governmental agencies use our data to determine privacy rights, targeted content, plane-ticket prices and our position in society. As humans continue to produce increasing amounts of mineable information, who we are as data might soon become more important than who we are as people.