Because We May Have New Technology, But We Don’t Have New Values.

European Union Competition Commissioner Margrethe Vestager’s use of regulation to protect human rights against abuses of power by technology companies has made her a target of both certain corporate interests and Donald Trump’s tweet shade. On Nov. 21, Vestager delivered the inaugural Giovanni Buttarelli Memorial Lecture – named for the visionary, humanist European Data Protection Supervisor who died in August at 62 – at the International Association of Privacy Professionals Data Protection Conference. Vestager will soon assume a new role as Executive Vice-President of the European Commission with responsibility for a “Europe fit for the Digital Age.”

Margrethe Vestager

From time to time, our world goes through huge changes. As a society, we find that we have big choices to make, about the sort of world we want to live in. And in times like those, we need leaders with vision – people who see far, and who can help us to understand what those changes will mean for our most fundamental values.

And Giovanni was that person. He helped us to see that protecting people’s privacy was one part of a much bigger question – the question of how to preserve human dignity in this digital age. He left us too soon, at the age of just 62. But the ideas that he made us think about are still a vital guide, to a future built on data.

You often hear it said these days that data is the new oil. And there’s some truth in that. A century ago, when oil transformed our economies, it put so much more power in the hands of those who had it, that it soon became impossible to compete in any other way.

And today, the same is becoming true of data. With access to the right data, you get so much new insight and understanding that it can be hard for others to compete without it. That’s why, as competition enforcers, we’re constantly alert to the risk that controlling data will become a way to shut out competition.

But treating data as the new oil doesn’t do justice to the complex role it plays in our lives. Data is much more than just a commodity – our personal data also defines who we are. And controlling what happens with that data is a fundamental part of our human freedom – and it always has been.

Throughout our history, people have believed that human beings have the right to choose what information we share about ourselves, and with whom. We’ve defined the closeness of our relationships with others by the sort of things that we tell them – and disapproved of those who betray confidences. And all the way back to the Hippocratic Oath, we’ve expected that people who learn private things about us – our doctors, our lawyers, our priests – will be bound by strict duties to keep those things secret.

Because we may have new technology. But we don’t have new values.

And that’s fortunate. Because those values give us a vital reference point, to help us keep our bearings, at a time when data is being collected and used in so many new ways.

Very often, we still see the digital world that we live in through the lens of the past, as just an easier and more convenient version of the old physical world. We can imagine, for instance, that looking something up on the Internet is a bit like opening a book – just faster and with much more information there to find.

And we can easily forget that on the Internet, data flows both ways. When we buy a product online, we share data about our interests. When we chat on social media, advertisers build a profile of us. And whenever we search Google, Google is also searching us.

So a lot of us still share our data online in ways that we’d never dream of in the physical world. We find ourselves surprised every time a new story comes out about the way our data is being used.

And not having control of our data makes us vulnerable. It allows businesses and politicians to understand us better – but it also gives them new opportunities to learn to manipulate us. It allows platforms to filter what we see of the world, to match what we’ve shown an interest in before – so it gets ever harder to stumble on the new ideas that help us grow. And artificial intelligence that’s trained on biased data can reproduce our society’s prejudices – only now, those prejudices come with the superficial sense of objectivity that you get when decisions are taken by a computer.

So protecting that data is an absolute necessity, to build a digital world that works well for humans. And competition policy has an important contribution to make.

Because competition puts consumers in control. It means that, if we don’t like the deal that we’re getting, we can always walk away, and find something that meets our needs better. As consumers, we can use that power to insist that companies cut prices, or invest in developing innovative products. But we can also use it to demand a better deal on anything that we care about – including our privacy.

Three years ago, for instance, we looked at Microsoft’s purchase of LinkedIn. And we found that in the markets where LinkedIn faced competition, one important way that its rivals stood out was by offering better privacy protection. That’s why we only approved the merger after Microsoft had given us commitments not to block LinkedIn’s competitors.

But competition only works when we can actually compare what different companies are offering, and pick the one that meets our needs. And that can be difficult, when companies are secretive or vague about what they plan to do with our data.

Strong privacy rules, like the GDPR, can help. Because they help consumers to know what data is being collected, and what it’s going to be used for.

But even then, it’s not always easy for consumers to stay on top of the complex privacy policies of the dozens of websites and apps that we use each day.

And here, too, competition can help. As long as we keep our markets open for competition, there will be room for Europe’s businesses to come up with new services that can make it easier to control what’s happening with our data.

That could mean keeping an eye on who has that data, and what they’re doing with it. It could mean helping us compare different providers, so we can pick the one that offers the best privacy.

These are just examples, of course. I don’t know exactly what kind of services might emerge, to help consumers take more control. And that’s OK. Because it isn’t the job of a competition enforcer to say exactly how the market should meet people’s needs. Competition rules don’t tell companies exactly what price they should charge, or what new ideas to invest in – or how they should protect data. They give consumers the power to decide for themselves.

That’s why competitive markets can be so creative. Because they let us tap into the ideas and imagination, not just of one or two people, but of a whole ecosystem of innovative businesses.

But it’s also why competition enforcement can never be the whole answer, when it comes to making digitisation work for everyone.

It’s good that competition helps us, as consumers, to bargain for a better deal. But we should never have to bargain for the fundamental standards of privacy that we have a right to expect. What we need instead are rules and regulations that set the framework for the market, so that every business meets those fundamental standards.

And of course we have rules of that kind in Europe. The privacy rules in the GDPR give Europeans control of their data – and they’re inspiring many other countries around the world.

But rights aren’t worth much, unless you can enforce them. So it’s also important that Europe’s data protection authorities have the powers and resources they need to effectively enforce these rules. Because only then will the right to control our own data become a reality in people’s daily lives. And only then can we begin to restore Europeans’ trust that the digital world will work fairly for them.

And to tackle the challenges of the data economy, we need both competition rules and privacy regulation.

Neither of those things can take the place of the other. But in the end, we’re all dealing with the same digital world. And both the competition rules and the rules on data protection are fundamentally there for the same reason – to protect our rights as consumers.

Earlier this week, the IAPP published a manifesto for “Privacy 2030”, which draws together Giovanni Buttarelli’s ideas on the future of privacy. In the introduction, Omer Tene of the IAPP explains that for Giovanni, the question was always “privacy and…” – privacy and national security, privacy and competition, privacy and ethics.

And that’s an approach that all of us can learn from, as we work out how to tackle the challenges of digitisation. There’s always an “and”. Digitisation affects so many different parts of our lives that all of our policies and our actions are intertwined. And it’s only by taking a unified view that we can hope to face up to the challenges of a digital world.

This is why I’m very glad you’ve invited me here. Because it’s vital that we keep talking to each other, about the things we have in common. And in the next Commission, it will be my job to do exactly that, as Executive Vice-President for a Europe fit for the Digital Age.

As we work on a European approach to artificial intelligence, we’ll need to make sure that AI systems respect people’s privacy. As competition enforcers, if we find that some businesses are using their control of data to deny others a chance to compete, then those companies might have to share the data they hold – in a way that’s fully in line with the data protection rules.

In this time of fast and radical change, all of us have a lot to learn from each other. And if we work together in the spirit that Giovanni Buttarelli showed us, we can achieve his cherished aim – a digital future that works for human beings.