"— I am a decent honest lawful person, why should I care if someone gathers information about me? I have nothing to hide, and I prefer useful targeted ads and personalized services". This is a popular opinion. Why is it completely wrong?
More and more people are starting to care about the personal data that is harvested by the state, by big and small businesses, by local, foreign, and international companies. Yet even more people are still nonchalant about who knows what about them and how this information is used. Quite a few even think that sharing personal data is a good thing: it makes ads more relevant and the world more secure overall.
When companies claim that you profit from sharing your data, remember that making a profit is the first and often the only reason why companies exist. There is just no point in doing their job for them.
Let's put together the reasons why you should be alert and proactive about protecting your personal data. As you will soon see, there is no shortage of them: personal information is a hot commodity these days.
Targeted ads. Sometimes it is simple: you searched online for a bike or a fridge and closed Google without even clicking a link, so it is obvious to the algorithm that you might still be interested. Sometimes it is more complicated: a company knows that you have a small kid and offers you an investment product that would help fund their college education. But you know who else knows that you have a kid? You. And you most likely already have some plans for their future, so no life-changing help here either.
These are only a couple of examples of targeted ads, and perhaps there are situations when they are actually useful. But keep in mind that such situations are few, and there are many, many more when giving away your personal data is unquestionably a bad idea, as you will very soon see for yourself.
Marketing research. This one is not about you personally but about understanding consumers in general. Still, as noted above, it's not your job and not your problem.
Social studies, all kinds of science, statistics. People must be understood and studied for many reasons, and it is nice to be digitized for something other than cold-hearted moneymaking for a change. But scientific research needs people's data too, and that data can just as easily be mishandled and put you at risk, even if unintentionally. In the end, it's your decision: do you put scientific progress above or below your personal interests?
Bank scoring, credit scoring. Banks decide how credible, prosperous, trustworthy, and likely to repay a loan you are by using far-from-obvious criteria along with the obvious ones (purchases, job, salary, home address, travel, financial history).
Other types of scoring (insurance, medical coverage, etc.). Robots, robots everywhere. Companies are obsessed with Big Data. We take everything we know about a person, feed it to a neural network, the artificial intelligence does its magic, and voila: we have a number that shows your likelihood of buying a dog this year.
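To make the "feed it to a neural network" part less abstract, here is a minimal sketch of how such scoring works under the hood. Everything in it is invented for illustration (the features, the data, even the choice of a simple logistic regression instead of a real neural network), but the pattern is exactly this: personal data goes in, one confident-looking number comes out.

```python
# A hypothetical illustration of data-driven scoring: the features,
# data, and model choice are all made up, but the pattern is real.
from sklearn.linear_model import LogisticRegression

# Each row is a person: [age, monthly_spend_on_pets, park_visits_per_month]
X = [
    [25, 0, 1],
    [34, 120, 8],
    [41, 15, 2],
    [29, 200, 12],
]
y = [0, 1, 0, 1]  # 1 = bought a dog within a year (toy labels)

model = LogisticRegression().fit(X, y)

# "Score" a new person: the output is just a probability,
# yet downstream systems often treat it as a hard fact.
print(model.predict_proba([[31, 90, 6]])[0][1])
```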
Commercial profiling and scoring. An HR manager at a company where you're looking to work clicks a button and receives an assessment of you based on the music you listen to, photos of you on Facebook, and so on. And if those photos happen to show you in a bathing suit with a cocktail in hand, it apparently means you are a sociopath inclined to public nudity, and an alcoholic, or at least that's what some contorted algorithm may decide.
Police profiling. There are so many stories about false alarms in which innocent people become suspects.
Yes, public security is important: we want the police to search for criminals, suspects, runaway convicts, and dangerous people. But there is a huge gray zone for abuse here. There are countries where smart face-recognizing street cameras are used to hunt for participants of public protests or members of the political opposition.
Corporate security. A good thing too, of course. No one should be able to steal and trade commercial secrets and get away with it.
Workforce productivity control. This is a very real thing. Are you ready to have your typing patterns analyzed, your eye movements recorded, and your brain waves scanned after two minutes of illicit procrastination (as is already common in China)?
Criminal activity (scams, fraud, etc.). As obvious as it sounds. Everything that is known about you can be used against you, starting with your grandmother's maiden name and including, but not limited to, the color of your pants.
So we've established it: your data is valuable to many different people, organizations, and governments. Is that enough to start worrying? Let's see.
All your data can be leaked or stolen, and it likely will be. It is just doomed to end up for sale on the darknet and to be used against you. Go change your passwords now, and do not write them down on a sticky note stuck to your monitor.
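If "change your passwords" sounds vague, here is one concrete step. The sketch below uses only Python's standard library; the function name and the 20-character default are our own choices for illustration, not any standard.

```python
# One way to generate a strong, unique password per site instead of
# reusing one and writing it on a sticky note.
import secrets
import string

def generate_password(length: int = 20) -> str:
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # store the result in a password manager
```

Better yet, let a password manager generate and remember them for you, one unique password per service.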
Algorithmic discrimination. It is absolutely everywhere. Even if you think it's only fair that medical insurance is more expensive for smokers, there are other clearly outrageous examples. Feel free to conduct a little experiment: try to order a taxi on a rainy Friday night using an iPhone that's low on battery. Then try the same route on a sunny Tuesday afternoon. The difference will be stunningly obvious. Another example: check the price of the same plane tickets from your cell phone and from a freshly installed obscure browser in incognito mode.
Always keep in mind that any company's pricing policy is aimed at maximizing its profits; the only question is what methods it uses. Be very picky about who you trust, and even then, don't share anything that you don't have to.
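Nobody outside these companies can see their actual pricing logic, so the sketch below is deliberately crude and entirely hypothetical: the signals, weights, and function name are invented. It only illustrates how innocent-looking data points, like your battery level, could nudge a price.

```python
# An entirely hypothetical sketch of signal-based pricing. No vendor
# publishes such logic; the signals and weights here are invented
# purely to show how harmless-looking data can move a price.
def quote_price(base_fare: float, *, battery_level: float,
                is_premium_device: bool, is_raining: bool,
                hour: int) -> float:
    multiplier = 1.0
    if battery_level < 0.15:      # a dying phone means you can't wait around
        multiplier += 0.3
    if is_premium_device:         # proxy for willingness to pay
        multiplier += 0.1
    if is_raining:                # fewer alternatives, higher tolerance
        multiplier += 0.25
    if hour >= 22 or hour <= 2:   # late Friday night
        multiplier += 0.2
    return round(base_fare * multiplier, 2)

print(quote_price(10.0, battery_level=0.08, is_premium_device=True,
                  is_raining=True, hour=23))   # 18.5
print(quote_price(10.0, battery_level=0.80, is_premium_device=False,
                  is_raining=False, hour=14))  # 10.0
```

Same route, same base fare, almost double the price, and every input came from data you never knowingly shared.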
Mistakes. False positives. Robots make them all the time, and they can cost people a lot. Do you think that the more the robots know, the better the decisions they make? No. Not yet, at least. Big data is not just a technology, it's a fashion, a frenzy, a modern madness. Artificial intelligence today is like a kid with a box of matches, while the ecstatic adults watch in amazement as it sets fire to everything that burns.
A while ago the founder of Xsolla, a big international fintech company, announced that they were firing 150 employees. All of them received this message: "My big data team analyzed your activities in Jira, Confluence, and Gmail, your chats, dashboards, documents, and you were marked as uninvolved and unproductive". Most experts discussing this case agreed that such quantitative metrics cannot be used to evaluate skilled workers like software developers or managers. Besides, among the fired people were a company bartender and a hostess; why they would ever need Jira or online dashboards in their work is beyond me.
We are not calling on you to become a technophobe and reject technological progress. But there are things you can do to manage the use of your personal data and mitigate the risks.