Privacy as a business model


The original version of this article (in French) was published in quarterly newsletter no. 22 (October 2021) of the Chair “Values and Policies of Personal Information”.

The usual approach

The GDPR is the most visible text on this topic. It is not the oldest, but it is at the forefront for a simple reason: it carries huge sanctions (for companies, up to 4% of consolidated worldwide group turnover). Consequently, this regulation is often treated as a threat. We seek to protect ourselves from legal risk.

The approach is always the same: list all the data being processed, then find a legal basis that allows the same old habits to continue. This is what produces the long, dry texts that end users are asked to accept with a click, most often without reading them. And abracadabra, a legal magic trick: you have the user’s consent, and you can carry on as before.

This way of doing things poses various problems.

  1. It implies that privacy is a cost, a risk, something undesirable. Communication around the topic can create a disastrous impression. The message on screen says one thing (generally, “we value your privacy”), while reality says the opposite (“sign this 73-page contract now, without reading it”). The user knows very well, when signing, that everyone is lying. No, they have not read it. And no, nobody is respecting their privacy. It is a phony contract signed between liars.
  2. The user is positioned as an enemy. Someone you have to pressure, more or less forcefully, into signing a document in which they undertake not to sue is an enemy. This creates a relationship of distrust with the user.

But these texts could be seen in a completely different light if we simply decided to change our point of view.

Placing the user at the center

The first approach means satisfying the legal team (avoiding lawsuits) and the IT department (a few banners and buttons to add, but in reality nothing changes). What about trying to satisfy the end user?

Let us consider that privacy is desirable, preferable. Imagine that we are there to serve users, rather than trying to protect ourselves from them.

We are providing a service to users, and in so doing, we process their personal data. Not everything that is available to us, but only what is needed for said service. Needed to satisfy the user, not to satisfy the service provider.

And since we have data about the user, we may as well show it to them and let them act on it. By displaying things in an understandable way, we build trust. By giving power back to the user (to delete or correct, for example), we put them in a more comfortable position.

You can guess what is coming: by placing the user back in the center, we fall naturally and logically back in line with GDPR obligations.

And yet, this part of the legislation is far too often misunderstood. The GDPR defines a limited number of cases in which processing personal data is lawful. Firstly, at the user’s request, to provide the service they are seeking. Secondly, for a whole range of legal obligations. Thirdly, for a few well-defined exceptions (research, police, justice, absolute emergency, etc.). And finally, if none of these grounds applies, you have to ask the user for explicit consent.

If we are asking for the user’s consent, it is because we are about to damage their privacy in a way that does not serve them. Consent is not the first condition of all personal data processing. On the contrary, it is the last. If there really is no legitimate ground, permission must be asked before processing the data.

Once this point has been raised, the key objection remains: the entire economic model of the digital world involves pillaging people’s private lives in order to model and profile them, sell targeted advertising for as much money as possible, and predict user behavior. In short, if you want to exist online, you have to follow the American model.

Protectionism

Let us try another approach. Consider the GDPR as a text that protects Europeans, imposing our values (such as respect for privacy) on a world that ignores them. The legislation tells us that companies that do not respect these values are not welcome in the European Single Market. From this point of view, the GDPR has a clear protectionist effect: European companies respect it, while others do not. A European digital ecosystem can come into being, with protected access to the most profitable market in the world.

From this perspective, privacy becomes a positive thing for both companies and users. A bit like hygiene standards for a restaurant owner: they demand meticulous, serious work, but that work protects customers, and it is in the owner’s interest to have an exemplary reputation. Furthermore, it is better that the rules be mandatory, so that the bottom-feeders who do not respect the most basic ones disappear from the market.

And here, it is exactly the same mechanism. Consider that users are allies and put them back in the center of the game. If we have data on them, we may as well tell them, show them, and so on.

Here, a key element comes into play. As long as Europe’s digital industry remains stuck on the American model and rejects the GDPR, it finds itself in exactly the opposite position. The business world does not like complying with standards whose utility it does not understand. It lobbies the supervisory authorities for softer rules, delays, adjustments, exceptions, and so on. In doing so, it asks that the weapon created to protect European companies be disarmed and left on standby.

It is a Nash equilibrium. It is in the interest of all European companies to use the GDPR’s protectionist effect to their advantage, but each believes that if it moves first, it will lose out to those who do not respect the standards. Normally, escaping this kind of toxic equilibrium takes a market-regulation initiative, ideally a concerted effort to push everyone in the right direction at once. For now, the closest thing to such an initiative is the increasingly heavy sanctions being handed down all over Europe.
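To see why this is a Nash equilibrium, here is a stylized two-firm payoff matrix (the numbers are purely illustrative assumptions, not taken from the article); each firm can either comply with the GDPR or ignore it, and each cell gives the payoffs as (row firm, column firm):

\[
\begin{array}{c|cc}
 & \text{Comply} & \text{Ignore} \\
\hline
\text{Comply} & (3,\,3) & (0,\,2) \\
\text{Ignore} & (2,\,0) & (1,\,1)
\end{array}
\]

Both (Comply, Comply) and (Ignore, Ignore) are equilibria: against a complying rival, complying pays 3 rather than 2, but against an ignoring rival it pays 0 rather than 1, so nobody wants to move first. The industry is stuck in the inferior equilibrium, and sanctions work by lowering the Ignore payoffs until compliance becomes the best response in every case.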

Standing out from the crowd

Of course, today’s digital reality is often not that simple. Data travels and changes hands; it is collected in one place but exploited in another. To show users how their data is processed, many things often need to be reworked, and the processing needs to be organized around the end user rather than around the activity.

And even so, there are cases where this kind of transparent approach is impossible. Take, for example, the data collected for targeted-ad profiling. This data is nearly always passed on to third parties, to be used in ways that have no direct connection with the service the user signed up for. This is the typical use case for which consent is sought (without it, the processing is illegal), but where transparency is clearly impossible and informed consent unlikely.

Two major categories are taking shape. The first includes digital services that can place the user at the center, present themselves as allies, and demonstrate a very high level of transparency. The second includes digital services that are incapable of presenting themselves as allies.

So clearly, a company’s position on privacy can be a positive feature that sets it apart. By aiming to defend user interests, we improve compliance with the regulation, instead of trying to comply without understanding. We form an alliance with the user. And that is precisely what changes everything.

Benjamin Bayart
