Data-mining, privacy & personalisation: An interview with Matt Adams

Matt Adams is a co-founder of Blast Theory, an artists’ group based in Brighton in the UK making interactive work. Their latest project, Karen, is a smartphone app that uses psychological profiling in the background as you interact. Karen is a life coach who is happy to help you work through a few things in your life. However, as you chat with her, it becomes clear that Karen is slightly chaotic, with few boundaries between her personal and professional lives.

Sandra Gaudenzi caught up with Matt to find out more about the app’s development and where it sits within the data privacy debate.

If you want to hear more about the project and Blast Theory’s other work, come along to Matt’s public talk, 29/04 at the Watershed (Bristol) – more info and tickets here.

Sandra Gaudenzi: If you had to give me your elevator pitch, what’s the current story in one minute? How would you sell it to me?

Matt Adams: It does not have an elevator pitch but I’ll give you the one minute story.

It’s an app where you meet a life coach called Karen, who offers to help you with your life. She’s represented in full-screen video, so it’s somewhere between a video call and instant messaging. You chat with that character and she learns about your life, and as she does so, it becomes clear that the boundaries between her personal life and her professional life are pretty blurred. You start to see all aspects of Karen’s life, and your relationship with her rapidly moves from client to friend and confidant.

And in the background this whole project is driven by systems of psychological profiling, so we’re using existing techniques of psychological profiling and adapting the story to you based on that profiling.

SG: Hang on, let’s break these into several points. So the first thing that came to my mind was the movie Her, the operating system speaking to us. Are we going somewhere in the same line?

MA: It has some parallels but I think it will feel very different. It feels much more like a fictional world than the AI in Her, who was, of course, an operating system. In this story, Karen is a person, but yes, it absolutely has links all the way back to HAL in 2001, forward to films like Her and Ex Machina, which is just coming out, and so on. So it’s in that tradition but I think it’s resolutely non-science fiction in terms of how it operates.

SG: So does the parallel come from the fact that this is a topic that’s in the air – it’s part of the zeitgeist at the moment – or was it actually inspired by Her, or a movie like that?

MA: No – It’s actually a project that predates that. We’ve been developing it for a couple of years and what’s different about this project is we’re interested in what corporations and governments are doing already now. This is not a future scenario.

This is based on techniques and strategies that are already being used, and one of the key collaborators on this project has been Dr Kelly Page. Kelly is a researcher in this area who has previously worked for clients like DoubleClick, which Google bought to run all of its advertising. She also worked as a freelancer for Dunnhumby, who run the Tesco Clubcard database, so she is a gamekeeper turned poacher in that respect and has good experience of what is being done in corporate settings in this area.

SG: But interestingly enough, when you were mentioning the psychological profile, I was thinking also about some research that Cambridge University did recently on using the likes on Facebook to predict who you are and things like that. Is that related as well?

MA: Yes, absolutely. And of course, Facebook and others are hoovering up that research. We don’t know for certain that those kinds of companies are doing psychological profiling on their customers, but I would be willing to bet that they are.

SG: Yes. Okay, so let’s go back to Karen. Technically speaking, how are you using these APIs? Because I suspect you would have to use APIs that already exist – or have you developed whole new software? That would be pretty expensive.

MA: So we’ve developed our own software.

SG: Oh my God. Wow.

MA: We’re not using APIs and we are not scraping from social media platforms. Most of the profiling we’re doing is within the app. It uses some contextual information, if you grant us permission to use it – location, for example. Some of the psychological profiling that takes place is explicit, and some is woven into dialogues and conversations that you might have with Karen, where ostensibly it’s a casual conversation but in fact you’re still being, more or less, profiled as you chat with her.

“We’re interested in what corporations and governments are doing already now. This is not a future scenario.”

SG: So, since the idea is that you have to chat, then I understand why it is important that it sits on your mobile phone, and I suspect your iPad, yes? It’s mobile devices only. 

MA: It’s iPhone only and there will be an Android version that comes out a few weeks after the iOS launch.

SG: Okay. So who do you think your audience is? Who do you expect will engage with this dialogue?

MA: I think there are a number of different constituencies. It’s obviously an arts audience and a digital media audience; people who are interested in trends, in interactivity and participation. It will be people who are interested in the indie games sector, and although this is not, strictly speaking, a game, it very much uses similar languages to game experiences like The Walking Dead and other mobile games that are already out there with branching and personalisation.

SG: It would be early adopters, people who are interested in new tech. And it would be people who are interested in some of these issues around big data, data mining and so on.

MA: Yes. I think there are two big topics here at least. One is the data mining, the other one is personalisation and obviously the two are joined together.

SG: But what do you mean exactly by that term, because there are so many definitions and ways of seeing it. What’s your understanding of personalisation in storytelling?

MA: Yes. The way I think about it is that there are two levels. There is profiling, which aggregates users into groups based on certain properties – that might be behavioural profiling. So that, for example, is the idea that men who go in to buy nappies on their way home from work will be tempted to buy themselves a beer while they are at it, and therefore you put the beer near the nappies in supermarkets, and so on.

And there is personalisation, which is where you specifically, as an individual, make choices or frames for your experience. That means that it is tailored to you as an individual. So you say, “I don’t want to hear about the Middle East. I do want to hear about cuddly kittens,” and you then get an experience that’s appropriate.

There is, of course, personalisation that can be done unwittingly – personalisation that can be done to you. There’s just much less of that at this stage, although I think some companies, for example credit and financial monitoring companies like Experian, may well be doing house-by-house assessments of your creditworthiness without your knowledge.
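Matt’s two levels can be sketched in a few lines of code. This is a purely illustrative example, not Blast Theory’s software: `profile` buckets users into groups by shared properties (the supermarket example), while `personalise` tailors content to one individual’s stated choices (the kittens example). All names and fields here are invented.

```python
# Illustrative sketch of the profiling vs personalisation distinction.
# None of this is Blast Theory's actual code; the data is invented.

def profile(users):
    """Profiling: aggregate users into groups by shared properties."""
    groups = {}
    for user in users:
        key = (user["age_band"], user["shops_after_work"])
        groups.setdefault(key, []).append(user["name"])
    return groups

def personalise(stories, preferences):
    """Personalisation: tailor content to one individual's choices."""
    return [s for s in stories if s["topic"] in preferences]

users = [
    {"name": "Ann", "age_band": "30s", "shops_after_work": True},
    {"name": "Bob", "age_band": "30s", "shops_after_work": True},
    {"name": "Cat", "age_band": "20s", "shops_after_work": False},
]
stories = [{"topic": "kittens", "title": "Cuddly kittens"},
           {"topic": "middle_east", "title": "Regional news"}]

print(profile(users))                     # Ann and Bob land in the same group
print(personalise(stories, {"kittens"}))  # only the kitten story survives
```

The point of the distinction survives the toy data: profiling never needs to know who you are individually, only which bucket you fall into, whereas personalisation acts on your own explicit choices.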

“We’re in an age of interaction and participation, and storytellers at a whole number of different levels are looking at how to respond to that”

SG: Absolutely. Personalisation was a term that was first used a lot in advertising and then in ecommerce – if you liked this, you’ll probably like that, and we’re going to inform you about stuff that we believe you’d like.

And now with big data we’re getting to another level – the insurance companies, maybe even the health system – that is going to predict things about you and therefore can speak to you personally. 

Now, when we come to stories and storytelling and experiences, where do you see its value? Why does this interest you as a storyteller? Where is it going?

MA: We’re in an age of interaction and participation, and storytellers at a whole number of different levels are looking at how to respond to that, and we’re no different.

When we look 10 or 20 years into the future, it seems highly probable that stories will be increasingly adaptable, that they will be increasingly tailored to us, certainly a subset of stories will be. And so the question is, what are the ethics of that? What are the opportunities of that and how, as storytellers, can we respond to these social changes? How can we make stories that address some of these social changes?

Screenshot from Karen

SG: So interestingly, you pointed at ethics. It’s fairly intuitive to think that, in the same way as in advertising, we saw you were buying a beer and, what was it, the nappies, and therefore we’ll assume that next time you’ll buy powdered milk for your child, whatever. So we understand why personalisation might be sold as something positive for the user.

But in stories, in narrative, when you say that it’s a question of understanding what the ethics are, what do you mean? Because one could just say, well, it’s about understanding like in Netflix what sort of things you like. If you’re more into horror movies we’ll put a little bit more horror in your stories because we know that pleases you. Would that be unethical?

MA: We’re interested in highlighting the ethical conflict between our desire for experiences that are tailored to us and appropriate for us, and a new level of surveillance and monitoring beyond anything that has previously been explored. So we’re looking to make a project that situates itself right in the midst of that tension and plays on that boundary, so that the experience is, in some ways, disconcerting and, in other ways, completely satisfying. You’re moving between those two different responses: a character who seems to know you and respond to you, and a character who knows too much about you and perhaps anticipates you in ways that feel slightly troubling or unnerving.

“The experience is, in some ways, disconcerting and in other ways, completely satisfying”

SG: So, what is the sort of situation Karen might put me in that will make me face this ethical dilemma? I like it, but am I scared of it? I suspect this is where you’re bringing me…

MA: Yes. Karen talks to you frankly at times about your sexual history, and you have to decide whether to be open with her or not, or honest or not. And that’s a very difficult appraisal you need to make. At the same time, she’s sharing, frankly, aspects of her own sexual history. There’s a kind of superficial reciprocity, but of course this is reciprocity between a fictional character and a real person. So there is a constant tension at work between a fictional world and a real world, and around openness and honesty on these sorts of questions.

There are other situations where it’s more explicit, but I think that’s indicative of how we have approached making Karen.

SG: I see. No, that makes sense to me. Put in this context, obviously the piece will make me, as a potential user, reflect on the boundaries between privacy and non-privacy, how relationships in real life and digital life might differ, whether there is a difference at all, etc. So I see that level.

Do you also openly bring it back to the problem of surveillance and what corporations are doing with our data? 

MA: The way in which we’ve approached that is that the app will be free at launch, but when you’ve finished the experience, you will have the opportunity to buy your data report, and that data report will have three levels. It will show you what you did in the story. It will show you how we used what you did to affect the story. And it will show you the psychological profiling research on which those aspects of the story were based.

So we will, essentially, reveal which profiles we’ve been using, where they were developed, what their efficacy is, how they operate and so on. So you will have this opportunity to lift the hood and see how you’ve been profiled. And there will be some aspects of direct comparison between you and the group as a whole – where you may sit, in what percentile, on some of these metrics, for example – so you will have some feedback there.
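The percentile comparison Matt mentions can be illustrated with a simple rank calculation. This is a hypothetical sketch, not the app’s actual reporting code; the “openness” metric and the scores are invented.

```python
# Hypothetical illustration of placing one user on a group-wide metric.
# The app's real metrics and data are not public; everything here is invented.

def percentile_rank(value, population):
    """Percentage of the population scoring at or below `value`."""
    at_or_below = sum(1 for v in population if v <= value)
    return 100.0 * at_or_below / len(population)

# Invented "openness" scores for ten users.
all_scores = [12, 34, 45, 51, 58, 62, 70, 77, 85, 93]
your_score = 70

print(f"You sit in the {percentile_rank(your_score, all_scores):.0f}th percentile")
# → You sit in the 70th percentile
```

Even a comparison this crude shows how a data report can turn raw profiling into feedback the user can actually read.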

I think that’s the way we’ve chosen to approach it. Rather than being a particularly didactic work, it’s a work where you get to experience the sharp end of these processes.

“You will have this opportunity to lift the hood and see how you’ve been profiled”

It’s not intended to be a work with an activist pulse. Within our work, we’re always much more interested in putting you into a situation where you are ambivalent and there are mixed emotions than in trying to make a narrow point. If this were a work that simply told you how awful Facebook is and showed you, then I think it would be a fairly dull experience. So we’re much more interested in operating on that fine line between it being inappropriate and deeply enthralling.

SG: I thought it was very interesting that recently there was Sundance, and most of the stuff that I read was all about virtual reality, Oculus Rift, the future of storytelling being in virtual reality. That is obviously one possible direction in a new market that is opening, but I personally have the feeling that personalisation is a completely different direction in which interactive storytelling might go, and at the moment it’s quieter.

People don’t speak about it very much because, somehow, the difference is that VR is wow, but personalisation, potentially, can create meaningful emotional attachment to a story. And that might become, I was about to say, even bigger than VR. They are completely different things, but in terms of new trends, put it that way: where are we going? What’s next?

How do you see things? What’s next and is there any relationship between VR and personalisation?

MA: I would agree with your analysis, which is that Oculus Rift and VR are consistently exciting because we live in a very image-driven culture and a very tech- and innovation-centric culture. And so the idea that something new has been invented is always a source of breathless excitement.

It’s a little bit like people getting excited about stealth fighters while failing to understand that the military have moved on and are much more interested in other aspects than hardware innovation. Hardware innovation is only one tiny part of what’s going on.

And yes, if you think about Netflix as a company, and you think about what they are beginning to understand about how most people in developed countries consume movies: how they understand when you pause, when you rewind; as I understand it, they’re even tracking which movies you look at as you browse through and how long you look at them for. They’re capturing every single bit of that data. The idea that in 10, 20 or 30 years’ time they’re going to be using that incredibly powerful database to create stories that have particular properties is anything but far-fetched.

I think it seems an absolute given that that is one of the trends that will follow – not in any way to suggest there won’t be author-led, auteurist work in the future. There will be, but we’re going into a world that risks being increasingly solipsistic and self-regarding, and inward-looking. And personalisation would be one of the core things driving that trend.

“We’re going into a world that risks being increasingly solipsistic and self-regarding, and inward-looking. And personalisation would be one of the core things driving that trend.”

SG: Because there are two levels there. The first level of personalisation is just saying Netflix can basically do market research about how we consume stories and therefore will create stories that fit the tastes of the majority.

But the other idea, which goes a bit beyond that, is the idea that the story creates itself – interactivity embedded in the story – so that every bit and part of the story depends on who we are. So that there is not a story any more, but just stories that fit with who we are. Is that the second vision that you have, or the first one?

MA: If I understand it correctly, I think it is the second. I think it’s all the ways in which data about us can be brought to bear on our cultural experiences. It’s germane and needs enquiry. And our practice in Blast Theory is always to try to be looking at those trends, and to pose particular questions around the ramifications of those trends, whether that’s VR in Desert Rain in 1999 or mobile technology in some of the projects we’ve made since then. We’re always asking: what is the Pandora’s box that’s being opened there, and what are the things that we are paying less attention to?

SG: And having worked on this topic over the last months, where do you personally stand on this personalisation? Is it going towards manipulation in a scary way, or towards the ultimate enjoyment in terms of entertainment at its best, which is something that suits the way we want to be entertained?

MA: I think the way I approach it is as a new reality that will under no circumstances be undone or wound back. And so therefore the question is: how are we going to navigate this set of changes? You know, it’s like trying to hold back the Internet or the personal computer. This is a change that is coming, that we’re already in the midst of. We only have the slightest inkling really of what governments are doing, so it may well be 10 years further advanced than we even understand.

So then the question is: how do we arm ourselves in the face of that barrage? We live in a period where large corporations have escaped all democratic control through globalisation. There is no government that can contain those corporations – not the Americans, not the EU, not the Chinese; they are completely above and beyond, as we see through their tax policies. So one of our only sticks to beat them with is the understanding and sentiments of their users and their customers, and as we’ve seen with Myspace and others, if sentiment turns against you, or you’re usurped by an alternative, you can go from the hottest show in town to irrelevance in a very quick period of time.

“We as citizens and as consumers have a tremendous amount of power. The question is how we would wield that in any meaningful way”

So that capriciousness of how we perceive them is a critical issue, which is why I think they’ve taken the NSA revelations so seriously: they realise that this is actually an existential threat to companies like Google and Facebook – if they come to be seen as vanguards of the surveillance state, they can vanish very, very quickly.

So it’s not a fatalistic scenario. We as citizens and as consumers have a tremendous amount of power. The question is how we would wield that in any meaningful way, and it’s certainly not through resistance. It’s through a sophisticated understanding of the strengths and weaknesses of these different approaches.

SG: Absolutely. Thank you for that. The last question is more about the finances behind Karen. You did do a Kickstarter campaign – is anyone else behind it? You mentioned that you had an economic model – being able to buy your information – so how is the whole thing packaged?

MA: The work has been commissioned by the National Theatre of Wales.

John McGrath, the Artistic Director there, approached us a couple of years ago to develop a project, and this is what we asked him to support. And they’ve been a very loyal supporter as this project has taken shape. The project has also been supported by ‘The Space’, which is the BBC and Arts Council England collaboration. They have given us a significant amount of our investment.

As you mentioned, we have Kickstarter backers who supported us through a critical time, and we’ve been very much supported by our research partners: Dr Kelly Page in Chicago and the University of Nottingham, who have backed us with staff and expertise. So, as with all of our projects, it’s a real talent fest of different partners coming in and supporting us in different ways, and we’re extremely grateful to them, because this is a world first.

To our knowledge, no one has done anything like this. And as is often the case with our work, it’s pretty high risk as to whether this will work or this will resonate with our audience or not.

SG: So was the Kickstarter campaign done for the financial contribution, or more to create a community and as a sort of PR exercise?

MA: No, we were desperate for financial support at the time.

SG: It doesn’t look like it. It looks like you have everybody on board.

MA: Yes. You know, as it goes in these situations sometimes-

SG: Never enough?

MA: Well, no, it’s not that. It’s that things have changed quite significantly in the last eight weeks. ‘The Space’ confirmed their investment just before Christmas, and that changed this from a heavily loss-making project to a project that has a good chance of breaking even. We got turned down for some very major grants on this project in the last 12 months, and we were running on empty for a significant period, at a time when we were desperately committed to this project. We were determined it was going to happen, but we could not see a way to make it happen.

So that Kickstarter was a life or death moment and as is sometimes the case, then the really great support we had at that Kickstarter helped give other people confidence that this was a viable thing and that they should get on board, and suddenly we had a bit of momentum.

So you know, the 539 people who chipped in to our Kickstarter – every one of them has done us a massive favour.

SG: That’s fantastic. And I also suspect that this means that since you had to create your own software, the backend belongs to you. So is that something that then will allow you to go into another project? Is this the way you see it?

MA: I think we often talk like that when we get to the stage of a project where you’ve poured so much effort into creating new software and you think, yes, we can definitely build on this and take it further.

SG:  Or sell it, or do something with it.

MA: Yes. In reality that rarely happens, partly because sitting at the heart of this is our Lead Technologist, Alex Peckham. He is the sole developer of this software.

He’s been supported by various people taking on some small elements, but really, it’s essentially Alex building this thing from scratch. In order to do that, we have to be incredibly narrow in terms of what we build. We don’t build a platform that’s capable of doing a large number of things. We build a platform that does precisely what we need it to do and no more.

Having said that, we’ve never made an app before and so this is the first time where we are making a product, a piece of work that potentially can run and run, and can be in the App Store five years from now. Most of our projects have a strong performative quality and they run for a bit of time. Maybe there will be opportunities. Of course we’re always looking to do that but at the same time, as artists we’re always really interested in asking the next question and so typically we tend to zigzag around quite a bit as we’re finding new things that are inspiring in one way or another.

SG: Fantastic. Fair enough. Well, thank you for your time. I can’t wait to have a go!

Karen is out on the App Store now – download here or follow @blasttheory & #karenismylifecoach to keep updated.
Matt Adams will also be in Bristol at the end of April to talk in more detail about Karen and the work Blast Theory produce – find out more & book tickets!