This is Data Decade, a podcast by the ODI.
Emma Thwaites: Hello! I’m Emma Thwaites, and welcome back to Data Decade. In this series we’re building up to the ODI Summit in November by looking at the last 10 years of data, the next decade ahead, and the transformational possibilities of data in the future.
We’ve explored data in arts and culture and how data is shaping our cities, who looks after data, and how we can trust the data that’s reported in the media. But in this episode, we’re going to take a look at the responsible use of data. Since the ODI was founded, the amount of data created globally has grown exponentially. Data has become increasingly important in our daily lives, both in our own leisure time and at work. Helping people to understand data better – and how to use it responsibly – can allow businesses to thrive, and also prevent people from being misled or deceived.
So it’s become more important to be able to think critically about data in different contexts, and to examine the impact of different approaches when collecting, using and sharing data and information. But how do we do this? And who defines what responsible data is?
Emma Thwaites: So thanks for joining us. And over the next 30 minutes, we’re going to explore how data is used responsibly, and how to better understand the data that’s used and shared around us every day. And joining me to talk about this is Di Mayze, the Global Head of Data and AI at WPP.
Emma Thwaites: Hi. And one of my fellow directors at the ODI, Stuart Coleman, the Director of Learning and Business Development.
Emma Thwaites: Welcome, both of you. It’s really great to have you along for the discussion. So Di, I’m going to you first. When we think about ways in which data can be used responsibly, or indeed not used responsibly, I wonder what you think the main things are that we’ve seen over the last 10 years.
Di Mayze: So at WPP, we talk about ‘just because you can, doesn’t mean you should’. And that as a philosophy goes beyond GDPR, to really think about the unintended consequences of what you’re doing and making sure you’re not harming society. But I think it’s really important not to make data the enemy. There’s not really such a thing as bad data, unless it’s really unclean or obviously not obtained in legitimate ways.
And let me give an example of that. So we heard – oh, I read it in a book actually – that a ride-hailing app had done some research and they knew that people with a low phone battery would be willing to pay more. Now the responsible response to that may depend on your stakeholders, because if you work for that company and you are targeted on generating as much revenue per passenger as possible, why not charge people more when you know they’re prepared to pay more?
But then you think about the ethics of that. You think about what that might look like in a newspaper if that’s played back to you, or if somebody doesn’t get that taxi because they haven’t got the money to do surge pricing. Then suddenly it doesn’t seem so responsible.
And that same data set could be used for the exact opposite reason. What if those people were prioritised? What if that company made it their mission that we get you home safely and we arrive with a car that’s got a charger for you? So I’m really keen to make sure that responsible data use is about thinking about what you’re doing with the data and using it purposefully, but not making data the enemy.
Emma Thwaites: It’s interesting, isn’t it? Cause given that example, responsibility or responsible use of data is very much moot. You know, it kind of depends on your perspective, as you’ve outlined. Do you think it’s a bit of a red herring to even try to define it? Is it the right term, ‘responsible use of data’?

Di Mayze: I think it absolutely is, because when AI and the machines are making more decisions for us, if we are not conscious about what we’re asking the machines to do, and we’re not acting responsibly, then all sorts of things could happen.
And also, the language is important. There’s a lot of talk about the demise of third-party cookies. But if you think about what they are, they track people. Now, who wants to be tracked? That’s a horrible word. You wouldn’t want to be tracked around a shopping centre, having picked up one item, for someone to follow you around going, well, how about this? How about this? How about this?
So I think being very conscious about the word and what that means when it’s associated with data brings a level of discussion and dialogue across multiple departments, so they can all make sure that we are using it responsibly.

Emma Thwaites: I guess in your industry, in your role, you must have quite a heightened awareness of this issue. Because the advertising industry, particularly over the last 10 years, has found itself in the firing line quite a lot, for either real or imagined irresponsible use of data.
I wonder how you kind of handle that day-to-day?

Di Mayze: Well, as the first ever Global Head of Data and AI for WPP, I was in a privileged position to set the data strategy for WPP. And to really minimise the harm that we could create – for us, ourselves, the planet, for our clients – we decided: what if we didn’t obsess about personal data? What if we didn’t obsess about sensitive data? What if we celebrated all the other data that is available?
And for anybody who works in a retail environment or for a brand that sells in store, weather is a much more useful data source than somebody’s email address. So we celebrate that.
So we’ve created a whole philosophy for the whole business to be part of, one that celebrates the less personal and sensitive and intimate data. And I think that’s really freeing.

Emma Thwaites: And do you think that – cause WPP, you know, is a parent company for lots of smaller companies – do you think that across the WPP group, that philosophy, that way of looking at things, has been widely adopted? Or are you kind of at the beginning of that journey?

Di Mayze: We’re three years in and it is going phenomenally well, because it means everyone’s got a part to play. So you can be a citizen data scientist, you can be a storyteller, a strategist; you can think about what weather data might tell you, or where somebody’s at and what they might need, and obsess less about whether it’s a homemaker who likes coupons – which, let’s face it, no one wants to be described in that way.

Emma Thwaites: Great. I’m gonna come to you, Stuart. I think, you know, Di’s covered an awful lot there in that couple of answers. It does sound like the world of data – indeed, you know, responsible data use and sharing – has changed. Obviously we’ve heard about one particular sector, but I wonder, has the business sector more broadly kept pace with all of those changes?
Stuart Coleman: I think it’s ahead of the changes. I mean, there’s a lot of obsession about the scale at which data is growing. Obviously it’s generated predominantly by human activity and you know, all sorts of facts and statistics are regularly thrown out about how much more data we have. But the reality is, there is a handful of companies who control or who leverage the most value from the most data.
If you look at the market capitalisation of the largest 10 companies in the world, approximately half are companies that are less than 30 years old and whose business models are entirely predicated on data. So I think we live in an age a bit like that of the early railroads in the US, where monopolies sprang up because there was a relentless pace to try and generate value from this new asset.
We have a handful of companies who do that. Now it’s great to hear that, you know, massive kind of groups of operating companies like WPP, who are often close to the edge of this, are doing good, responsible things in the area. But the reality is, your question is like, are companies keeping up?
Well, I think there’s a few companies that are way ahead of everyone else and a few sectors that are way ahead. As an example, you only have to see the resources these companies are assigning to data. I mean, I was dealing with a global bank, who shall remain nameless, the other day. They have five Chief Data Officers. So, I mean, that kind of tells you. And, you know, the average UK company – in most countries, the average company – is an SME.
They don’t have the resources for this. So, no, we live in an asymmetrical world of data. And I think we need change around data literacy. I mean, the word responsible implies an obligation. And, you know, if you take that further, it implies a duty of care.
So who is responsible? If we generate the data, is it us? Is it the organisations who kind of have the data that we generate because of the nature of the business model we’ve engaged in – whether we are the product, as I think Douglas Rushkoff said about 10 years ago?
You know, it’s kind of well known and understood now that platform business models operate off our data. We kind of accept it cause there’s not many alternatives. But there are alternatives springing up. I mean, it’s interesting that Di picked up on the battery scenario, you know, with an algorithmically run taxi firm.
There’s some great new kind of emerging business models in the gig economy space – Driver’s Seat Coop is one I came across recently. I mean, they basically are empowering drivers to get access to their data in a way that helps them negotiate better working environments, better working conditions, or potentially better jobs and journeys.

Emma Thwaites: Also, some of the things that Di spoke about were, from my perspective, about almost rebalancing power.
We’ve ended up in a situation where there are a relatively small number of companies who, because they have the expertise and the technical capability, hold a lot more of the power than we might be comfortable with them holding. To the point about the railways in America, it’s great to see pockets of that rebalancing emerging, but how do we make sure that that happens more broadly, not just across the business environment but across society? That people are able to take back some of that control. That the relationship between the consumer and the organisation becomes a bit more even and equal where their data is concerned.

Stuart Coleman: I think there’s lots of different ways that that needs to come about. I don’t think there’s a single answer to that. I definitely think regulation needs to keep up, and I don’t think you can regulate the data monopolies in the same way you regulate other businesses.
There’s lots starting to be written, or that has been written, about this. But I think you need to think about it differently. You maybe need to have principles that are enforceable through judgements, as opposed to rules, for example – because rules in this space can’t always be easily applied and can be outmanoeuvred.
I mean, data doesn’t fit into jurisdictional, geographical boundaries like railroads do, for example. So I think you can encourage and change things through education and participation. The Ada Lovelace Institute published a paper about participatory data stewardship last year. And that talks about-
Stuart Coleman: It is, isn’t it?
Emma Thwaites: I mean, we understand what that means. But-
Stuart Coleman: Well, it means – I think the principle is data generated about us, by us, used for us.

Di Mayze: I love that.

Stuart Coleman: Okay. So, I mean-
Stuart Coleman: Yeah. I mean, I made up the last bit, Di.
Di Mayze: Okay. I like the first bit.
Stuart Coleman: So, to Emma’s point, it’s quite a long mouthful.

Emma Thwaites: But it’s – in our world, we understand it, cause we are working in data and around data every day. But there’ll be people listening, I hope, who won’t be familiar with that kind of term.

Stuart Coleman: Well, I was thinking how you might try and make it relevant. I mean, you know, I talked about Driver’s Seat Coop – I mean, cooperative is a business model that a lot of people get.
You know, I think you can. I mean, we’ve obviously got a programme looking at what we call data stewardship. And if you look at that from a kind of personal data perspective, there are things, there are products and services. I mean, our own co-founder Sir Tim Berners-Lee, obviously inventor of the World Wide Web, he’s put a lot of energy and passion behind an open source project called Solid.
Which is enabling and empowering individuals to be able to take their data and put it in a place that they’re comfortable with, that they trust. And that’s a very early-stage project in some terms, but I think, you know, it’s up to government to help these types of organisations.
Cause this is an opportunity for growth as well, right? There’s a whole economy here. I mean, generally in history, when you’ve opened up monopolies, you create opportunity for the market to be a bit bigger.

Emma Thwaites: It’s really interesting that you raised the point about personal data stores, because I remember when we were having a different conversation earlier this week, Di, you mentioned that people might, at this point in time, have an exaggerated sense of what their data is actually worth. Do you want to say a little bit about that?

Di Mayze: Yeah, cause I think value exchange is extremely important, and a lot of people may think, well, okay, Google, Facebook – Stuart talked about market capitalisation.
You think ‘there’s money in them hills, my data’s worth a fortune’. And actually I think Facebook are arguably one of the most established companies at generating money from data. And I think, from the last article I read, they get about $50 per year per user.
So I think, on the principles of participatory data stewardship, we can win on that if we don’t make it a money thing. So $50 a year may not be enticing enough to get scale on that, but actually feeling like you’ve got control, respect, transparency – that you have more say in the value exchange. And individuals do understand value exchange. We know that from retail loyalty cards. I give you my data and I tell you how many bottles of wine I drink a week, and in exchange for that, you send me some vouchers that are relevant to me.
That is a really clear value exchange that people understand. Or, I give you my details and you give me free content that’s tailored to me. These models are well established, and if we focus on what’s important to the individuals as we build an open and transparent brand, then I think there’s a really exciting future possibly ahead, where people could have more control and stewardship of their own data.
Emma Thwaites: There’s something in what you just said about tangibility and making things tangible for people. I think part of the challenge with data is that it’s invisible a lot of the time.
I wonder how you look at that, and how you – you’ve given us a couple of examples of that kind of working better – how do you make data tangible to, I guess, both the people that you work with, but also the consumers of the products that your clients create?
Di Mayze: So I’m really lucky to work for WPP, who describe themselves as the creative transformation company. And it gives us unlimited headroom to think creatively about what we’re doing. And part of that is to remind people that data’s not just numbers. Look at pictures, voices, sounds, videos. And I surface stories about what some of our agencies are doing.
And for example, there’s one company that got a tweet saying ‘your packaging’s very sexist’. That’s a signal that a small company can pick up. If you are culturally aware enough to listen for these signals, you don’t need all the data that the walled gardens might have. You tune in. You’ve got a literacy, a cultural awareness, to listen and look at what’s going on around you.
And we had another one where Wunderman Thompson put a video camera on a camel and tracked where that camel was going, and got some amazing footage. Now, most people wouldn’t see that as data. I see that as data, and you can get some insights and learnings from that.

Emma Thwaites: There’s a data point in most things, isn’t there?
Emma Thwaites: Everything, and going back through millennia as well, right? I remember one of the stories that we were exchanging earlier in the week – I wrote a blog in the very early days of the ODI about cave paintings and how, you know, early humans recorded the sort of patterns in the natural environment on the walls of caves, and ultimately that’s data.
Di Mayze: Of course, the London Underground map is my favourite data visualisation in the world. How brilliant is it that people who aren’t data analysts can understand how to get from A to B? I think everything is data-
Stuart Coleman: Although it is very, very poorly scaled. So-
Di Mayze: Excuse me, do not criticise the London Underground map or we’re going to fall out.
Stuart Coleman: You could walk a lot of those journeys, but it misinforms people. So it does create a bit of unnecessary-

Emma Thwaites: Let’s not talk about Holborn to Leicester Square, shall we? Right, now-

Stuart Coleman: I would agree with Di that the key point is that it is accessible to memory. And actually, I think – sorry to interrupt – picking up on it, Di used the word literacy a minute ago.
I didn’t kind of expand on that earlier, but I was thinking about it when you asked me. Going back to the kind of asymmetry or inequity that exists in resources – between big companies and small companies, or between companies that have all the data and companies that don’t – part of the way you can redress that balance over time is by educating people.
And I think one of the challenges – and I know Di’s done this in WPP through a kind of cross-agency programme, but in society at large, and we at the ODI put out a policy paper on this for government a few months ago – is that the whole issue of data literacy is critical. I mean, if you just step back for a second, I was digging around the other day and saw some research that the RSA put out a couple of years ago.
And they were referencing how – just as we’re talking about a few organisations that generate disproportionately more value than others from data – we kind of take for granted that we trust certain institutions in society. We trust banks to look after our money – or, you know, perhaps we’ve had a few doubts since the global financial crisis, but we haven’t stopped banking. We trust them to look after money.
Who do we trust to do that with data? We don’t really have any institutions. We obviously have a programme at the ODI talking about data institutions, but I think that the importance of data as a resource and asset, or whatever you want to call it, means you can’t just compare it to other things.
And children growing up through school, who perhaps get weaned on systems that are in the school curricula or, you know, by platform companies, need to be aware that when they’re searching in a search engine, they are giving away information in return for finding answers. There is an exchange from the moment you tap information into a search engine.
That’s one form of literacy, as is some kind of basic – I mean, I don’t like the word consent, there are too many different connotations of that word – but I think just getting society, people at large, to understand these things a bit more will in itself get us to demand better.
That’s a bit kind of wishy-washy and wannabe. But I suppose what I’m trying to say is, I think if everyone had more understanding of thinking and working critically with data, and understanding how data in the economy works, that would take us a big step forward.

Di Mayze: And I think it would be awesome to see companies do two things for that.
And one is to make the terms and conditions more accessible. And I’ve seen some brands who actually explain: we are collecting this data and we’re gonna share it with the receipt company, for example, so we can tell you how much you’ve spent.
And I understand a company needs to have a legal conversation with their consumers. But I feel like there’s so much room to be more accessible about: ‘this is why we are collecting your data, this is what we are going to do with it, this is why you benefit, please trust us’. And the nirvana is, as a company, you want your consumers to want to give their data to you – because who doesn’t want a brand that they’re loyal to to know that they’re a brilliant customer, and for that to be part of the relationship?
And then the second thing I think companies should do is review their KPIs and check that they are taking responsible data into account. To go back to my earlier point about this taxi company and optimising revenue per passenger: efficiency and effectiveness are KPIs in many organisations that may not lend themselves to being friends with responsible data as a strategy.
And actually, if you think about sustainability and the role of your environmental goals or your sustainability goals, a resilient business should go hand-in-hand with a responsible data business. And I think we need to bring that to more teams.

Emma Thwaites: It’s really interesting, because some of what you’ve just described chimed for me very much with the point that Stuart made a while ago about principles.
If we’re not going to have rules or, you know, laws that govern this stuff, we perhaps need some standard principles or some standards. You know, what quality should the data be? What are the stipulations that it has to fulfil in order to be shared? What are the practices and the skills that need to exist within organisations in order to work with data?
I wonder, you know, what role do standards play in this whole area of responsible data use and sharing?
Stuart Coleman: I think they play a massive role. And if you go back to – I mean, I mentioned railroads earlier – the ODI, from an early time in its decade of existence, has evangelised the need for society to think about data as infrastructure.
And if you think about it as infrastructure, then – going back to your point on standards – well, we take for granted driving down a road, certainly in this country, that you drive on the left-hand side and you would stop, or you should stop, and you know you should stop, because we have standards around traffic lights, around junctions – things that we just take for granted. That is one example.
Data, to your point earlier, is invisible. So it’s perhaps less physically affecting us. But we should be thinking about it in the same way, because that allows us and enables us to be more responsible, if we are all adhering to some common codes and practices.
I mean, the comment earlier about terms and conditions reminded me of an early-stage startup from way back 10 years ago at the ODI, called TransportAPI. They still exist. They aggregate transport data – all of the data that’s freely published on the Underground, buses and all transport services across the UK. They bring that all together and they make it accessible and useful. I mean, other companies like Citymapper do the same thing, right? One of the great things they did, which I’ve seen occasionally – and you could kind of force this upon companies – was a plain English version of their terms and conditions.
Stuart Coleman: And it was a webpage: you could look at it, you could read it in five minutes. If you wanted to know how your data was gonna be used, if you wanted to know how they used any data you gave them or the data that they got – before you clicked ‘buy’, you could read the plain English, rather than a sort of 56-page monster that we all kind of probably scroll through on our Apple acceptance updates or whatever it might be.
Those sorts of things, you could enforce that. You know, it’s easy for me to say that, but, but I’m sure regulators could do more in this area.
Emma Thwaites: Di?

Di Mayze: Oh, I’m so into that. Let’s start a movement.
Di Mayze: Yeah, let’s make terms and conditions actually something that someone might want to read and consciously sign up to. Let’s do it.
Emma Thwaites: I mean, apart from anything else, you know, a lot of this work exists in this space of eCommerce and, you know, we have a global economy now; people are buying goods and services across the globe. If terms and conditions are very complicated, it makes it that much more difficult for people who don’t have English as a first language and are relying on translation services to work out what it is they are and are not signing up to.
So, I’m all for that as well. I can’t believe it – we’re nearly out of time, but we have to leave time for our final question. And it’s the one that I always look forward to, because I’m gonna ask you to look in your crystal balls.
I’m gonna come to you first, Di. What do you think the future holds for the responsible use of data? And I guess that brings into play data ethics as well. Where do you think we’re gonna go next with this?

Di Mayze: I think companies are gonna become much more creative. To go back to what Stuart said, data distribution and ownership is not equal. While the market corrects itself, I think companies and individuals will take more control back over what gets done with other data sources – what data they can control, what they can do with it, how you can make the world fairer and better.
I’m optimistic, despite all the data signals steering me in a very pessimistic direction. I think that, in the future, people will be extracting and using the data that is available to them and doing brilliant things with it.
Emma Thwaites: Potentially only if we raise data literacy?
Stuart Coleman: I would echo the point Di’s made, but, uh-
Stuart Coleman: Well, that’s difficult to do, Di. I think the-
Emma Thwaites: I’ll be the judge of that.
Stuart Coleman: I would hope that we will get more regulation that forces organisations to allocate resources to data ethics, and also that we will get simpler model services and resources to support those organisations who don’t have loads of resources to work responsibly in this area.
I’m actually positive: I believe the market has the opportunity to correct itself and create new markets and services, which, you know, have the potential to generate prosperity for everyone. I also think data can play a massive role in solving some of our biggest problems. You know, we talked about terms and conditions and trading.
I mean, we’re obviously in a slightly more polarised geopolitical world than we’ve ever been before. But actually data can play a role in bridging some of those polarised gaps, whether that’s enabling countries to trade more seamlessly or to share data to tackle climate change. I mean, I think there’s, there’s some good opportunities out there.
Uh, and obviously the UK is reviewing its position around general data protection. So let’s hope that it’s not all doom and gloom and that we try and do something to support innovation while protecting and furthering opportunities for privacy, uh, and for participation in data.
Emma Thwaites: I think that’s a lovely place to, uh, to wrap up for today.
I can’t thank you enough, both of you, for a really interesting conversation. I’m sure we could have carried on for much, much longer. But thanks for now to Di Mayze from WPP and to-
Di Mayze: Thanks for having me.

Emma Thwaites: You’re very welcome. It’s been a pleasure. And to Stuart Coleman, my colleague from the ODI. Thanks.
Stuart Coleman: Thanks Emma. Thanks Di. Been great to chat with you.
Emma Thwaites: So that’s all from this episode of Data Decade, and a really important one looking at the responsible use of data. Some fascinating thoughts as ever, and thanks again to our guests, Di Mayze and Stuart Coleman.
And if you want to find out more about anything you’ve heard in this episode, just head over to theodi.org where we continue the conversation around the last 10 years of data and what the next decade has in store. And there’s more details about the ODI Summit on there too.
And if you’ve enjoyed the podcast, please do subscribe for updates.
I’m Emma Thwaites, and this is Data Decade from the ODI.