Margrethe Vestager. “We may have new technology but we do not have new values”

We were loudly cheering for Margrethe Vestager on day 3 of Web Summit. Since she is one of the five idols from our PILOT collection – and the one responsible for the creation of THIDOS – seeing her live on stage was probably our most anticipated moment.

And, of course, she didn’t disappoint. Margrethe Vestager is an incredibly fierce person, with a very rational and well-structured way of speaking. She made us proud to have someone like her representing our citizens’ rights. She is the kind of person who makes people believe in politics again.

Below is the full interview she gave to journalist Laurie Segall on the central stage of Web Summit, lightly edited for clarity.

It’s super exciting to be here with you. What an extraordinary time to have the role that you have. I was doing some research and someone said: “She will have more leverage than anyone else in the world. She will be the most powerful regulator on big tech on the planet.” I know you are expanding your role. How do you feel about it? Let’s do the emotions first. How does it feel?

I would have felt alone if it wasn’t for this (pointing at the audience). They say it takes a village to raise a child. I think it takes a community like this to make tech serve humans. Because this is what we are going to do.

At a time when we feel humanity and technology are clashing in a very interesting way, what would you say, in this new expanded role, would be your most important priority?

The first priority will always be humans. That technology should serve us and that we need humans to understand what is going on, to create what is going on and to do that as a community.

The challenge that we have is to turn the tech community into the user community: much more diverse, and one that reflects the world we want to live in. Look at the data we have, but also the data that we are missing – we have to make that available. And we need to have the necessary skills and sufficient investment. But the bottom line of it all is that we may have new technology, but we do not have new values. Dignity, integrity, humanity, equality: that’s the same.

You’ve been known to take on big tech companies: Apple, Amazon, Google. Have you witnessed them changing? Do you actually think they are changing their behavior and, if not, do you think they will face more fines? Back in the US, they have been talking about breaking them up. What do you think will happen to make them change?

First of all, they’re different. They’re different in their approach, they’re different in their approach to law enforcement. They’re also different when they partner up – because we do that as well, for instance, when it comes to fighting illegal content.

But, if I’ve seen change, basically it is to have even bigger ambitions. If you look at new Google services being launched, if you look at Facebook’s Libra plans, if you look at Apple’s streaming services, you see still bigger ambitions. It also shows me that we have reached a phase where competition law enforcement can only do part of the job. We need to look for the right level of democracy framing tech and say “this is how you should serve us”.

Let’s talk about Facebook. There has been a lot of controversy around their cryptocurrency Libra. Should they be allowed to proceed given their history of handling privacy?

First of all, we’re trying to figure out what it is. We can do that. We can look at it even though it doesn’t exist yet. But this is not just for me, as Commissioner for Competition. This is for the entire Commission. Because we have questions not just tied to the Facebook product and the dominance of Facebook, but colleagues of mine are asking questions about money laundering, terrorism financing, financial stability. All the things that may come when you create a currency that is not under the control of an authority.

I’ve been covering tech for a long time. You’re actually feared in Silicon Valley. There’s a total filter bubble in tech sometimes. You see the US government saying a lot of things and FTC fines out there. But there’s really a fear that you may come in and make real changes. Do you think that US regulators are really punishing tech companies? Because there’s a really interesting contrast between you and America.

That may have been. But what I sense is that there has been a very renewed interest and also engagement by US authorities: to start asking questions, to start engaging, to start investigating, to say “maybe there is an enforcement role for us here as well”, and that is very, very welcome.

There is this ongoing debate, going back to Facebook, about political ads. I’m sure you’ve been talking a lot about this. Twitter has drawn a hard line, making the decision to ban them. Jack Dorsey said they won’t allow political ads on the platform. Mark Zuckerberg said that, despite the fact that they could be spreading lies, it shouldn’t be up to the tech companies to judge what’s true. It’s not their responsibility. You think a lot about this type of thing. Who do you think is right?

We have discussed in depth, at length, in the real world what we want to accept and what we won’t accept. I simply do not understand why it is not the same thing in the digital world. Because this is something that we have discussed, and it has been back and forth, and it has taken us ages, and we say “this is the framework for political advertising, this is the framework for our political debate”. Why should that be any different in the digital world? I don’t understand.

And the second thing is that maybe we should do even more. Because the risk is that we completely undermine our democracy. Democracy is supposed to take place in the open, where a political ad can be fact-checked and contradicted, where different opinions can be offered and can be supported. But if it’s only in your feed, it’s only between you and Facebook. And it’s microtargeting of who you are. That’s not democracy. That’s just privatized, de facto manipulation of who you’re going to vote for. Democracy is supposed to take place in the open.

Since people listen to you in Silicon Valley, what would be your message to Mark Zuckerberg?

I, of course, listen very closely when Mark Zuckerberg gives evidence in hearings in the US Senate and congressional committees. What is inspiring is not just what you say, it is also what you do. He’s an amazing creator of an amazing company and if he, himself, puts action behind his words, we would see change rapidly and that would be very welcome.

What kind of action?

You see other companies. They say, as you mentioned Twitter, “we will not have political advertising”. That, of course, is not the end of that story, because you may still have bots and all of that. But it is an important step forward, where the company shows “these are our values, this is what we believe in, we’re not just a neutral sort of stream of whatever comes here, we have values here”. I’m not the CEO of Facebook, I’m not to judge. But I think the time has come where they also should put action behind their words.

Part of your new role will be setting the agenda for regulations regarding artificial intelligence. We are most certainly heading towards a future where biased AI can have tremendous implications for humanity. How do you see your role in regulating that? What kind of regulations do you see putting in place for that?

It’s very tricky. So, the first thing that we will do is to listen very carefully to stakeholders everywhere. But we’ll also try to listen fast, because, as we’re speaking, AI is developing and that is wonderful. Because I see no limits as to how artificial intelligence can support what we want to do as humans.

Take climate change. I think we can be much more effective fighting climate change if we use artificial intelligence. I think we can spare people the stressful and awful waiting time between being examined by a doctor and getting the result of that examination. And also have more precise results. I think the benefits of using artificial intelligence have no limits. But we need to be in control of the cornerstones so that we can trust it. That it has human oversight and, very importantly, that it doesn’t have biased actions.

I was recently speaking to someone who was talking about the AI recommendation models that are being built. And he was saying: these companies – Google, Facebook, Twitter – know more about us than our doctors, our lawyers. If they’re not acting in our best interest, they should face lawsuits, and that’s the way we should start protecting ourselves in the future. Is that what you think about it? Are you considering any types of laws and regulations that make platforms more accountable for what’s on the platform?

This is part of my mission letter. The president-elect has asked me to take charge of exactly this discussion and figure out if we should push for new legislation to make people more responsible.

But there are more ways to deal with what you’re referring to. Because one thing is that you have, let’s call it citizens’ rights, the GDPR. But that doesn’t solve the question of how much data can be collected about you. We also need better protection and better tools to protect ourselves from leaving a trace everywhere we go. Because otherwise, you know, I don’t think we are ready to be naked, without a gadget, without our phones, without the things that we like.

Maybe we would like to be much more able to choose what kind of trace we leave behind. And that side of the equation will have to be part of the discussion as well. How can we be better protected from leaving that trace of data which allows companies to know so much more about any one of us than we might even realize ourselves?

So how can we? What does it actually look like? I think a lot of people are actually frustrated at this point. I remember interviewing Zuckerberg during Cambridge Analytica and it was unbelievable how little we knew about what was happening. So, you in this seat, what does that look like?

First of all, I, myself, am very happy that I have digital rights. My problem is that I find it very difficult to enforce them. The only real result of me reading terms and conditions is that I get distracted from the article I wanted to read, so I just tap to accept the terms and conditions. So we need those to be understandable, so that we know what we’re dealing with, and we need software and services that enable us not to leave the same kind of trace that we otherwise do.

You know, I am kind of a liberal. I really hope that the market will also help us here. Because it’s not just for politicians to deal with. It is in the interaction with the market that we can find solutions. Because one of the main challenges in dealing with AI is that, of course, there is a risk that we will regulate for yesterday. And then it’s worth nothing.

At what point do you think tech companies should be broken up? You said that this was the last thing, but at what point would you advocate for companies like Facebook and Google to be broken up?

Two different perspectives. First, from a competition point of view, you would have to do something where breaking up the company was the only solution to the illegal behavior, to the damage that you have provoked. And we don’t have that kind of case right now. I will never exclude that it could happen, but so far we don’t have a problem so big that breaking up a company would be the solution.

The second perspective is size in itself. But the problem in that debate is that the people advocating it don’t have a model for how to do this. You know the story about that ancient kind of creature: when you chop off one head, seven come out. So there is a risk that you do not solve the problem, you just get many more problems, and you don’t have a way of, at least, trying to control them.

So I’m much more in the line of thinking that when you become that big, you get a sense of responsibility, because you are, de facto, the rule setter in the market that you own. And we could be much more precise about what that dominance entails. Because otherwise there’s a risk that many interesting companies have no chance of competing.

You’ve recently been talking about tech companies dodging tax obligations in countries where they have a lot of user data but don’t make it clear how that data is monetized. One of the things that you said is that it is not you searching Google, but Google searching you. What did you mean by that?

If there is an imbalance between what you get and what you pay, then very often you’re the product. It is you that’s being searched, because with all the data that you leave behind, they make you part of the product. Whenever you make a small query, that query is not just a window into the internet, it is also a window into you. Part of the amazing technology that they hold is the knowledge to not only predict what you want, but to lead you into making choices that otherwise you would not have made.

One of the reasons I love a good magic show is that this is labeled. You’re going to be manipulated. You know this is a trick. There are cards somewhere in a sleeve, there is something happening. You know. And you find it fun and you are amazed. If that were the case all the time – if you knew “this is a prediction, we take all the data and the knowledge we have about you” – then you would have a different relationship. But here, at least for me, I still have the sense that “this is the right answer, this is the truth I am presented with”, and not a prediction of what I want to do next.

As we’re heading into 2020, what do you think is the biggest threat facing technology?

If technology becomes closed in on itself and not open to the many, many, many people who have great ideas. In Europe, over the last ten years, you have developed a vibrant, dynamic startup community with a lot of scale-up potential as well. You see small startups, you see the giant businesses, you see how there is an appetite to use technology for all kinds of different things. If tech is something that is only embedded in giant companies that are beyond our democracies, then I think we lose trust in technology.

Part of my mission is that we build trust in technology by making sure that we can reach for the potential, but we can also do something to control the dark side.

Do you think that’s where we’re heading?

Well, I will do my best. But I think, with great teams and determination, democracies can find that kind of solution, yes.

I have to ask you this. He talks about a lot of people, but he spoke about you recently. President Donald Trump accused you of being the person who hates the United States more than anyone he has met. What would be your response?

That he can only meet people who really love the US.

You have Donald Trump saying you hate the US, you have Tim Cook saying the judgement on Apple was “total political crap”. You’re just this person who ruffles feathers left and right, from the most powerful companies and people. Where does this come from? Where does this drive come from? I know you’ve grown up in a very religious family. You just go up against everyone. Are you ever fazed?

Both my parents are priests in the Danish church, but they were never that religious. What they were was open to contradiction. When I was very small, we only had a weekly paper. My dad was active in local politics and there was a very, very angry piece in that paper saying that he was wrong, and I was devastated. And my dad was enthusiastic. This was exactly what he wanted to achieve: that someone contradicted him.

So I’ve been told that saying what you think, doing what you find to be the right thing – and having people contradict you – is not a mistake, an error or something bad. This is just the way the world is. So you should be able to stand there and do what you find to be the right thing.

When I started covering startups, back in 2009, I was an optimist. I remember covering companies when they were just coming up. And things got really complicated. A lot of people are really wondering about this time: are we really gonna come out of this good or bad? Going back to the beginning of this conversation – what a fascinating time to sit in the seat that you’re in. Are you an optimist? Do you think we’ll come out of this and win back trust?

Yes, I am an optimist, because I think it’s a moral obligation. Pessimists really never get anything done. Because it’s worse tomorrow, so why bother?
