Imagine a world in which intimate details of your life could be bought, sold, and traded in a shady multibillion dollar industry: where lists of rape survivors could be served up for a fee, where alcoholics could be outed at the click of a button, and where every sex toy and package of condoms you'd ever purchased would be on record for any entity interested in that information.
You don't have to imagine that world, because you're living in it. Welcome to the world of data brokers, an industry that has exploded thanks to the internet and everything it offers for tracking consumers. Unlike the major credit bureaus -- Experian, TransUnion, and Equifax -- data brokers face no tight government regulation of where they get their data, how they use it, or whom they distribute it to.
Who are data brokers, and where are they getting their information?
They've been around for decades, collecting information on buying and spending habits, demographics, and more. Relying on a sleazy network of information exchanges, data brokers use tools like store credit cards, public records (including DMV records, title records, voting registration records, and more), and other traceable patterns to compile profiles.
Those profiles are used to slot people into various categories, creating bundles that can be sold for direct marketing campaigns and similar advertising programs. They're also used by potential employers, lenders, and others interested in probing into your past to make decisions about your future. Think your prospective landlord is just getting a standard credit report before renting to you? It's possible, but your landlord could also be looking at more extensive data to find out who you are. Information from data brokers is used in identity verification (those goofy questions they ask about streets you've lived on and pet names) and fraud prevention as well.
Clients of data brokers can ask for lists of, say, 20-something women with a history of buying sex toys, or 35-year-old white men who like outdoor sports, or any number of other clusters. They use that information to create targeted advertisements, direct mailings, and more, hoping to zero in on your precise interests. (This explains why I'm now getting 18 million bulb catalogs after ordering a bushel of narcissus last fall.)
Even if you keep a pretty low profile, data brokers are probably still watching you. They might not have detailed information, but they've got something; 11 of the September 11 hijackers turned up in the databases of one major data broker, for example. For data brokers, collection isn't about creating incredibly detailed profiles of every person in the US as much as it is about creating broad identity categories that provide information about spending patterns and other habits. A broker doesn't need to know whether you personally have purchased, say, lube with your condoms -- it just needs to know that people in your spending category generally do, so it can target you with lube advertisements.
This isn't just about what you buy. It's also about who you are. Regularly picking up prescriptions for, say, Lamictal and Celexa? You or someone you know must be bipolar, and that information is being filed away somewhere. Searching for information about local AA meetings and alcoholism resources online? That information is being stored too.
Do public records, including newspaper reporting and police reports, identify you as a rape victim or survivor? Well, big data has that too, and it will sell that information, for the right price, along with the addresses of women's shelters, home addresses of law enforcement officers, and more, as Jezebel notes. Companies even compile lists of older adults suffering from dementia and other diseases of aging -- which, the World Privacy Forum points out, makes them vulnerable to exploitative advertising and lending offers.
There are few controls on how data brokers use this kind of information. In theory, consumer privacy is protected when data is used for insurance, credit, housing, and employment decisions; but most of this data is used for marketing, where there are no checks at all, leaving the system wide open to abuse. The World Privacy Forum and other advocacy organizations want to see Congress acting to restrict the release and use of this kind of information, arguing that it violates privacy expectations...and puts people in danger.
Women's shelters, for example, are supposed to be protected by law. What happens when their addresses are widely known? Not only does that make it easier for predators to find victims, it also makes women understandably nervous about seeking shelter in a "shelter" that isn't so safe after all. Likewise, lists of people in professions like law enforcement or, say, abortion services, are not the kind of thing we want widely distributed, because providing their home addresses puts them and their families at risk.
Revealing medical information that should be private, like mental health status, a history of addiction, or other health issues, is also troubling. As is providing information about people who have been victims of rape and incest -- what possible reason could marketers have for needing that information? Why does the privacy and safety of victims need to be violated, yet again?
Data scientists argue that this is simply how things work, and that people should become accustomed to a surveillance culture. They're also fond of overcollecting, on the grounds that it's easier to work with more data than you need than to go back and try to collect data retroactively. That might be true in the case of, say, health surveillance by the CDC, but it's not in this instance. Clear checks and balances are needed to create a system where people can exert some autonomy over their private information, and enjoy a reasonable amount of security when conducting their daily business.
Want to avoid data collectors? Well...Julia Angwin identified over 200 in her newly released book "Dragnet Nation," and only a handful of them were even willing to send her overviews of the data they'd collected on her. Fewer still provided options for correcting erroneous data, or opting out altogether. Every time you use Google, swipe your credit card, click online ads, write about your cats on Twitter, or add a new Facebook like, someone's collecting that information, and deciding where you fit in the vast matrix of social categories in the United States.