The GDPR framework, Privacy by Design principles and compliance audits have long structured companies' response to the challenges of personal data protection. But in 2025, the reality has changed: the very notion of "sensitive data" has broadened, user expectations have hardened, and technical tools are struggling to keep up. Faced with this new situation, one question arises: how can we rethink our approach to sensitive data so that it becomes a lever for lasting trust rather than a mere regulatory imperative?
Sensitive data no longer has the same definition
Originally, sensitive data referred to identifiable and highly personal elements: health data, ethnic origin, political opinions. Today, that definition looks far too narrow. The digital footprint has expanded considerably, and the massive cross-referencing of information can turn a seemingly "banal" piece of data into a highly revealing one. A browsing history, a purchasing pattern or an occasional geolocation point takes on a whole new dimension once aggregated, analyzed and interpreted.
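To make this concrete, here is a minimal sketch in Python, using synthetic records and invented field names purely for illustration: each attribute taken alone identifies almost no one, but their cross-referenced combination singles out every individual in the toy dataset.

```python
# Minimal sketch: individually "banal" attributes become identifying once
# combined. The records below are synthetic and purely illustrative.
from collections import Counter

records = [
    {"postcode": "75011", "birth_year": 1987, "gender": "F", "purchase": "vitamins"},
    {"postcode": "75011", "birth_year": 1987, "gender": "M", "purchase": "novel"},
    {"postcode": "75011", "birth_year": 1992, "gender": "F", "purchase": "running shoes"},
    {"postcode": "69003", "birth_year": 1987, "gender": "F", "purchase": "baby formula"},
    {"postcode": "69003", "birth_year": 1975, "gender": "M", "purchase": "novel"},
]

def uniqueness(records, keys):
    """Share of records made unique by the combination of the given attributes."""
    combos = Counter(tuple(r[k] for k in keys) for r in records)
    unique = sum(1 for r in records if combos[tuple(r[k] for k in keys)] == 1)
    return unique / len(records)

# Each attribute alone identifies almost no one...
print(uniqueness(records, ["postcode"]))                          # 0.0
print(uniqueness(records, ["birth_year"]))                        # 0.4
# ...but the cross-referenced combination singles out every record.
print(uniqueness(records, ["postcode", "birth_year", "gender"]))  # 1.0
```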
This gradual shift calls for a rethinking of the very concept of sensitivity. The sensitivity of a piece of data no longer lies solely in its nature, but in the context in which it is used and the intentions behind that use. The same data can be innocuous or highly critical depending on how it is exploited. This is a paradigm shift that requires us to rethink not only what we protect, but why we protect it.
Tools designed for yesterday, facing tomorrow's challenges
Companies have largely structured their governance around GDPR obligations: data mapping, privacy impact assessments (PIAs), Privacy by Design... These approaches have driven a welcome rise in maturity. Yet they were designed for a relatively static, centralized processing model, and they struggle to keep up with the speed, scale and granularity of contemporary data processing.
Automation, generative AI, distributed cloud and event-driven architectures have dramatically expanded the exposure surface. Processing cycles are accelerating, data flows are becoming dynamic, and responsibilities are becoming diffuse. In this changing environment, compliance on paper is no longer enough. Protection now has to be thought of as an embedded, proactive mechanism, capable of adapting to usage in real time.
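As a rough illustration of what an embedded, usage-aware control could look like, the sketch below evaluates each data-access request at call time against a declared purpose and the user's consent, rather than relying on a static authorization list. The field names, purposes and rules are hypothetical and do not describe any particular product.

```python
# Hypothetical sketch: a purpose-aware check evaluated at the moment of access,
# rather than a static, document-level authorization. Field names, purposes and
# rules are invented for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class AccessRequest:
    field: str          # e.g. "location_history"
    purpose: str        # declared purpose of the processing
    consented: bool     # did the user consent to this purpose?

# Sensitivity is contextual: the same field may be allowed for one purpose
# and denied for another.
ALLOWED_PURPOSES = {
    "location_history": {"fraud_detection"},
    "purchase_history": {"order_support", "fraud_detection"},
    "email": {"order_support", "service_notifications"},
}

def decide(request: AccessRequest) -> bool:
    """Allow access only if the purpose is permitted for the field and consented to."""
    allowed = ALLOWED_PURPOSES.get(request.field, set())
    return request.consented and request.purpose in allowed

# The same field yields opposite decisions depending on the context of use.
print(decide(AccessRequest("location_history", "fraud_detection", consented=True)))  # True
print(decide(AccessRequest("location_history", "ad_targeting", consented=True)))     # False
```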
The rise of "Ethics by Design" approaches
This is where "Ethics by Design" approaches come into their own. They go beyond hiding or limiting access to data: the aim is to guarantee confidentiality throughout processing, without ever exposing the data in the clear. Confidential computing, for example, keeps data encrypted in memory and processes it only inside hardware-isolated enclaves, while homomorphic encryption allows certain calculations to run directly on encrypted data, opening the way to uses that were previously impossible without compromising security.
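Confidential computing relies on hardware enclaves that cannot be reproduced in a few lines of code, but the neighbouring idea of computing on data that the processing party never decrypts can be sketched with additively homomorphic encryption. The toy Paillier scheme below uses deliberately tiny, fixed primes and is insecure by construction; a real system would use a vetted cryptographic library and production-grade key sizes.

```python
# Toy sketch of additively homomorphic encryption (Paillier scheme) to
# illustrate "computing without decrypting". The fixed primes are tiny and
# insecure; this is illustration only, not a usable implementation.
import random
from math import gcd

def lcm(a, b):
    return a * b // gcd(a, b)

def generate_keys(p=10007, q=10009):
    n = p * q
    lam = lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)            # valid because we use g = n + 1
    return n, (lam, mu)

def encrypt(n, m):
    g = n + 1
    r = random.randrange(2, n)
    while gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(n, private_key, c):
    lam, mu = private_key
    l = (pow(c, lam, n * n) - 1) // n      # L(x) = (x - 1) / n
    return (l * mu) % n

def add_encrypted(n, c1, c2):
    # Multiplying ciphertexts adds the plaintexts: the party doing this
    # computation never sees the underlying values.
    return (c1 * c2) % (n * n)

n, priv = generate_keys()
c_sum = add_encrypted(n, encrypt(n, 42_000), encrypt(n, 38_500))
print(decrypt(n, priv, c_sum))  # 80500, computed without decrypting the inputs
```

Only the holder of the private key, here the data owner, can read the result; the party performing the addition only ever handles ciphertexts. That is the kind of property "Ethics by Design" seeks to make native rather than bolted on.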
These technologies do not solve everything, but they embody a different way of thinking about protection: not as an external constraint, but as a native property of systems. This calls for a change in posture on the part of both companies and solution providers. Yesterday, the goal was to be compliant. What will count tomorrow is being trustworthy by design.
When users demand more than compliance
This shift is also being driven by growing pressure from users. Faced with scandals, leaks and opaque algorithms, they are no longer content to know that a company is "GDPR compliant". They want to understand how their data is processed, for what purpose and with what guarantees. The demand for transparency, explainability and respect is becoming a strong expectation, even a deciding factor, well beyond consumer uses alone.
This expectation, particularly strong among younger generations, creates a new space for differentiation. It pushes organizations to make digital ethics a visible, consistent and lasting marker: not waiting to be challenged, but demonstrating their commitment in both their practices and the design of their services.
Towards a renewed approach to sensitive data
What we still call "data protection" must now be thought of as an architecture of trust. This does not mean abandoning existing frameworks, but going beyond them, by integrating the new dimensions of risk, usage and perception.
It means designing services that do not confront users with binary choices ("refuse everything" or "accept everything"), but that integrate respect for privacy into the experience itself. It means creating environments where security is not negotiated after the fact, but built in from the outset. And, above all, it means recognizing that sensitive data is not static: it is constantly evolving and redefining itself, and it is up to our tools, practices and principles to keep pace.

Sasha Belguenbour
Senior Data Governance Consultant
Micropole, a Talan company


