Mark Zuckerberg spoke to The New York Times about the latest scandal around Facebook.
Several years ago, an analytics company bought information about 50 million Facebook users from an app developer. The company claims to have used this data to influence political campaigns and presidential elections. You'll find the full coverage of the events in the second part of this article.
Facebook's CEO said that the social network would "dramatically reduce the amount of data that developers have access to".
Facebook will "do a full forensic audit" of all apps that "have any suspicious activity".
In addition, apps that a user has not launched for three months will lose access to the user's data. "One of the steps we're taking is making it so apps can no longer access data after you haven't used them for three months," Zuckerberg said.
Developers who want access to sensitive data for their apps will have to sign a contract with the social network and have "a real person-to-person relationship" with Facebook. As examples of sensitive data, Zuckerberg named religious beliefs and sexual orientation.
One of the most intriguing aspects of this situation is whether your likes on Facebook really make it possible to profile you and influence your choices. The "likes" in question are not your reactions to posts and comments; they are the pages, companies, people, books, shows, and so on that you've added as "liked" on your Facebook page.
The power of likes is a complicated issue, so we'll just quote two articles here.
Cambridge Analytica has marketed itself as classifying voters using five personality traits known as OCEAN — Openness, Conscientiousness, Extroversion, Agreeableness, and Neuroticism — the same model used by University of Cambridge researchers for in-house, non-commercial research. The question of whether OCEAN made a difference in the presidential election remains unanswered.
Cambridge Analytica has claimed that OCEAN scores can be used to drive voter and consumer behavior through "microtargeting," meaning narrowly tailored messages. Alexander Nix, the company's CEO, has said that neurotic voters tend to be moved by "rational and fear-based" arguments, while introverted, agreeable voters are more susceptible to "tradition and habits and family and community."
Notably, Facebook banned Cambridge Analytica from the platform only in March 2018.
Remarkably reliable deductions could be drawn from simple online actions. For example, men who "liked" the cosmetics brand MAC were slightly more likely to be gay; one of the best indicators for heterosexuality was "liking" Wu-Tang Clan. Followers of Lady Gaga were most probably extroverts, while those who "liked" philosophy tended to be introverts. While each piece of such information is too weak to produce a reliable prediction, when tens, hundreds, or thousands of individual data points are combined, the resulting predictions become really accurate.
Kosinski and his team tirelessly refined their models. In 2012, Kosinski proved that on the basis of an average of 68 Facebook "likes" by a user, it was possible to predict their skin color (with 95 percent accuracy), their sexual orientation (88 percent accuracy), and their affiliation to the Democratic or Republican party (85 percent). But it didn't stop there. Intelligence, religious affiliation, as well as alcohol, cigarette and drug use, could all be determined. From the data, it was even possible to deduce whether someone's parents were divorced.
The strength of their modeling was illustrated by how well it could predict a subject's answers. Kosinski continued to work on the models incessantly: before long, he was able to evaluate a person better than the average work colleague, merely on the basis of ten Facebook "likes." Seventy "likes" were enough to outdo what a person's friends knew, 150 what their parents knew, and 300 "likes" what their partner knew. More "likes" could even surpass what a person thought they knew about themselves.
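The core statistical idea in the excerpts above — that many individually weak signals combine into an accurate prediction — can be illustrated with a toy simulation. This is purely a sketch: the 55-percent per-"like" signal strength and the simple majority-vote rule are assumptions for illustration, not the actual models used by Kosinski or Cambridge Analytica.

```python
import random

random.seed(0)

def simulate_accuracy(n_likes, n_people=2000, signal=0.55):
    """Each 'like' agrees with a hidden binary trait with probability
    `signal` (barely better than a coin flip). The trait is predicted
    by a majority vote over all likes; returns the fraction of people
    classified correctly."""
    correct = 0
    for _ in range(n_people):
        trait = random.randint(0, 1)
        # each like leans only weakly toward the true trait
        likes = [trait if random.random() < signal else 1 - trait
                 for _ in range(n_likes)]
        prediction = 1 if sum(likes) * 2 >= n_likes else 0
        correct += (prediction == trait)
    return correct / n_people

# accuracy climbs from near-chance toward near-certainty
for n in (1, 11, 71, 301):
    print(n, round(simulate_accuracy(n), 2))
```

With a single like the prediction is barely better than guessing, but by a few hundred likes the majority vote is almost always right — the same pattern the quoted research describes, even though each individual signal (liking MAC, liking Wu-Tang Clan) is weak on its own.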