Europe declared war on Google and Facebook, Israel declared war on unrealistic ads: AdGuard's digest
It looks like "Old Europe" rebelled against the brave new world of personal data abuse.
EU regulators decided to replace the transatlantic data transfer pact (known as the EU–US Privacy Shield) with a law that would not let Facebook and other global companies transfer user data gathered in Europe to the States. Facebook (a.k.a. Meta) reacted by listing this as a potential risk in its annual report. The move could make them "unable to offer a number of their most significant products and services, including Facebook and Instagram, in Europe".
The press reacted with strong headlines à la "Meta threatens to leave Europe without Facebook and Instagram". German Economy Minister Robert Habeck assured everyone that people would be fine without Facebook and Instagram: "After being hacked I've lived without Facebook and Twitter for four years and life has been fantastic". French Finance Minister Bruno Le Maire said that "life is very good without Facebook", "we would live very well without Facebook", and that "digital giants must understand that the European continent will resist and affirm its sovereignty". Facebook, for its part, says it is ready to negotiate and work on data protection.
And it is not only Facebook: back in January, Austria's Data Protection Authority banned the use of Google Analytics on European websites for violating GDPR norms. Even before Austria, the Netherlands had voiced a possible ban on Google Analytics. Two weeks after Austria, the French Data Protection Authority (CNIL) followed suit and stated that GA's transferring of Europeans' data to the U.S. violates the law.
"It's interesting to see that the different European Data Protection Authorities all come to the same conclusion: the use of Google Analytics is illegal. There is a European task force and we assume that this action is coordinated and other authorities will decide similarly", one expert notes.
Apple: oops, we did it again
It's often hard to say whether some actions of the Big Tech companies are honest mistakes or malicious tricks.
If you update your iPhone to a beta version of iOS 15.4, it will once again ask whether you'd like to "help improve Siri and dictation by allowing Apple to record and analyze your voice commands", even if you said "No" earlier. It looks like Apple does not always take no for an answer: it turned out that people who had opted out could be recorded anyway. We've already written about how often voice assistants are activated by false triggers, phrases that only remotely resemble the commands they are supposed to react to. So think twice before introducing a voice assistant into your daily routine, and keep the risks in mind.
Of course, Apple said it had been a bug. Well, they always say that an employee accidentally left the prototype of the new iPhone in a bar. Sure, sure, accidents happen.
Still, weird things do happen: it was recently discovered that when Apple fires an employee, their job title is changed to "Associate" in the databases other companies use to verify candidates' CVs when hiring. People become the lowest link in the food chain, and Apple isn't even hiding it. If a company treats its own employees like that, will it respect its customers?
If we have to see ads, let them be realistic
Advertising is manipulative and often downright deceptive. Israel's parliament wants to do something about it, starting small: "Advertisers are to inform the public when using photoshop on images of models". Unrealistic beauty standards provoke eating disorders, they say.
We believe there should be more initiatives like this. Advertising is not going anywhere for a long while, but it should at least be more realistic and honest.
Murphy's law of mobile apps
The law goes like this: the more interesting and tempting features an app promises, the higher the risks of using it. It promises to make you rich? It's probably a scam. It lets you track your ex? It'll track you instead. There are exceptions, but they are few.
Take the app called WAMR, for example. It allows reading deleted WhatsApp messages, and it is potentially dangerous, experts warn. It works by intercepting notifications: when someone sends you a message and then deletes it, the app saves the text from the notification. Naturally, it has to have access to the messenger's notifications, which leaves plenty of room for abuse.
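The mechanics are simple enough to model in a few lines. Here is a toy Python sketch (our own illustration, not WAMR's actual code; the class and method names are hypothetical) showing why notification access is all such an app needs: it caches every message body a notification carries, so a later "delete for everyone" has nothing to erase on its side.

```python
class NotificationCache:
    """Toy model of a WAMR-like app. It never touches the messenger
    itself; it just stores every message body it sees in notifications."""

    def __init__(self):
        self.seen = {}  # message id -> text, kept even after deletion
        self.live = {}  # what the messenger currently shows

    def on_notification(self, msg_id, text):
        # Fired whenever a new message notification arrives.
        self.seen[msg_id] = text
        self.live[msg_id] = text

    def on_delete(self, msg_id):
        # The sender deletes the message inside the messenger...
        self.live.pop(msg_id, None)

    def recover(self, msg_id):
        # ...but the listener still has the copy from the notification.
        return self.seen.get(msg_id)


cache = NotificationCache()
cache.on_notification(1, "wait, forget I said that")
cache.on_delete(1)
print(cache.recover(1))  # the "deleted" text is still there
```

The same idea is why granting notification access on Android (via a NotificationListenerService) is such a sensitive permission: whoever holds it sees the content of every notification, not just WhatsApp's.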
Just listen to your moral sense: if it hints that something must be wrong with an app, consider not using it, for your own sake. Look at WAMR: isn't it mean and disrespectful to force your way to a message someone didn't want you to read?
Your phone can be tracked by a wallpaper
Even when tech giants are not plotting against users' data safety and privacy and just want to ship a cute feature, they can leave huge security holes, because they can afford to cover the potential damages. Maybe we even envy them a little: small companies normally have to be extremely careful on that score. And of course, people willing to abuse vulnerabilities are always nearby, waiting. Any new technology can and will be used against you: that's our edition of the Miranda warning.
A wallpaper, for instance: what can be more harmless than just a picture on your screen? However, wallpapers on Android devices can be employed for fingerprinting by building a unique description of colors in the picture:
"Android 12's highly anticipated Material You design system features wallpaper-based color theming and advanced customizations powered by color extraction. These UI enhancements allow users to select a wallpaper from which an optimal palette of colors is automatically generated and applied to the device’s look and feel globally.
Unfortunately, such personalization can carry a high price in compromised privacy.
The WallpaperManager class provides methods for interacting with wallpapers, including getDrawable for retrieving the current system wallpaper as a drawable resource.
Byte arrays can be used to restore original images from Android wallpapers, which are highly likely to contain personal information or details uniquely important to the user".
To prevent wallpaper tracking, never use private or personal images as wallpapers, especially on devices running Android 8.1 and earlier. Stick to preinstalled wallpapers, or better yet, pick one wallpaper and do not change it.
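To get a feel for why extracted colors make a good fingerprint, here is a minimal Python sketch (our own illustration; the function name and quantization scheme are hypothetical, and on a real device the pixels would come from the wallpaper retrieved via WallpaperManager.getDrawable). It quantizes each pixel's color into coarse buckets, builds a histogram, and hashes it into a stable identifier: the same wallpaper always yields the same value, while different wallpapers almost never collide.

```python
import hashlib
from collections import Counter

def wallpaper_fingerprint(pixels, buckets=4):
    """Build a coarse color histogram from RGB pixels and hash it.

    `pixels` is an iterable of (r, g, b) tuples. Quantizing each
    channel into a few buckets keeps the histogram stable across
    minor recompression while staying near-unique per image.
    """
    step = 256 // buckets
    hist = Counter((r // step, g // step, b // step) for r, g, b in pixels)
    # Serialize the histogram deterministically, then hash it.
    canonical = ",".join(f"{k}:{v}" for k, v in sorted(hist.items()))
    return hashlib.sha256(canonical.encode()).hexdigest()


# Two different "wallpapers" (toy pixel data) produce different
# fingerprints; the same one always produces the same fingerprint.
wp_a = [(200, 30, 30), (210, 25, 35), (40, 40, 200)]
wp_b = [(10, 220, 10), (12, 218, 14), (40, 40, 200)]
print(wallpaper_fingerprint(wp_a))
print(wallpaper_fingerprint(wp_b))
```

Note that the hash survives small pixel-level noise (both reddish pixels in `wp_a` fall into the same bucket), which is exactly what makes such a signature useful for tracking a device across apps.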
What's the difference between a password and an eye?
Right: the former can be changed, the latter is with you for as long as you breathe. That's why we are in no hurry to embrace biometric identification and other technologies involving body pattern recognition. Your face may alter as you age, but biometric parameters almost never change. Once they are harvested, they can be abused and leaked forever.
Besides, it's not people who profit from these new technologies right now; it's businesses and states. For users, they are no more than a toy.
The new version of the MoviePass app will use the facial recognition and eye tracking tech in your phone to make sure you're actually watching ads. By watching them you earn virtual currency that can be spent on movie tickets. Seriously, is it worth it?
Eye tracking reveals a lot about human behavior. And of course, we do not know what other data will be harvested or how it will be used. Giving all that away for movie tickets sounds a bit like selling Manhattan for a handful of beads (whether or not that actually happened).
We know you folks are reasonable and experienced; you are here because you know that privacy matters, information costs a lot, and the risks are real. But many of you are power users, geeks, and developers who like to test and taste new things. We love that too, but sometimes it's just not worth it.