[The opening minutes of the recording are garbled in transcription.] ... what motivates them to collect the information, and then also their appreciation of the data they are collecting. What kind of values do they have in relation to it, and what kind of values do they perceive other people to have in it? And how do they understand some of those dilemmas? So that was kind of my starting point, before I went anywhere near anybody who was doing QS.
I'm trying to get a handle on why people are doing some of those things. And then, okay, neoliberalism comes into it as well. The economic models around some of these things and how things are incentivised, how people do what they do, and also the kind of self-responsibilisation, particularly in relation to health situations. Whereas, for instance, diabetics, rather than going to a hospital and having themselves monitored, what they're doing is they're monitoring themselves at home, and that kind of thing. And then lastly, the objectification of data, where the data is actually the be-all and end-all, and that is new to me and thanks for that today. But that kind of idea where the emphasis is put on the data rather than the activity that's involved around it. So that was kind of flavouring my initial fall into QS and trying to get a handle on what was going on there. And as I said, what I'm interested in is the users, yeah? I quite like this, and you might recognise yourself in some of these different kinds of categories. But Sewell talks about the different types of users, QS-type users. And the guinea pigs are those that, you know, the top-end athletes, those people that want to shave 0.0001 of a second off their performance, the ones who are quite quick to take on board something new, technologies that are coming out. Then we have people doing it for health reasons, like, as we mentioned, people I came across suffering from cancer, and they're monitoring certain categories of their treatment, or of the response of their bodies to this treatment, and using it to help and better themselves in health ways. And then also the improvement ones, people who are training for a marathon, say, and keeping tabs on what they're doing, how fast they're going, how long they're going, how they can move forward with some of these things. And then lastly, which has cropped up earlier on, is the techno-driven.
Those people who, for instance, are sitting home at night, they've set themselves a task of 10,000 steps per day, they've only got to 8,000, and all of a sudden they think, oh God, I'd better go out and run around the block, or they may cheat and shake it till they get to that 10,000 mark. And then lastly, those ones who are curious, and these generally revolve around things like getting a Fitbit as a birthday present, using it for a while, experimenting with it, seeing how it goes. And interestingly, in some of the other work I was looking at, some of the other literature, it talks about failure in these things, and how these people take on these devices just for a short period and then leave them. But it argues that it's actually not a failure; even in using it for that short period of time, there's still a certain amount of success that goes with that. So that's kind of flavoured my approach to it. What I'm kind of interested in is some of the non-beneficial aspects of it, and some of the media stories that have been revolving around this, which have suggested that, you know, there's a slightly darker side to collecting some of this information. And these are some of the headlines that I dug up, and they revolve around, a lot of this is the US context, of course, in relation to health insurance and what have you there. And things like health insurance companies plugging into exactly what you're doing, monitoring the information that you're providing and offering premiums in relation to that. If you're very healthy and you haven't got any health issues, you get a slightly cheaper premium, and so on. Things as well like leakage. So you have your device, you've run your miles, you're sending it over to the platform, wherever it is. A certain amount of leakage goes on there, a certain amount of potential to hack into some of this information. And then also the kind of economic side of things.
So here we have the CEO of Jawbone, and he's arguing, quite plainly stating, that it's not really about the devices; that's not where the money is. The money is in the information that's being produced. And, you know, that's kind of key to some of the understandings that are going on around this. And then lastly, you know, information is being sold. It's plain and simple: companies are selling the information on. Some other headlines. Some businesses in the States are asking their staff to wear, for instance, Fitbits, to monitor their activities, to monitor what they're doing, also in relation to some of those health premiums and health insurance and what's going on there. Then in relation also to sex. Deborah Lupton has written a lot about this kind of stuff, and the promotion of this, and in particular, you know, the kind of gender differences there. So men are usually about performance, longevity, the number of partners they have, the number of positions they have; it goes online in a boastful kind of way. Women are probably more likely to record menstrual cycles or fertility cycles, and so on and so forth. And then lastly, there have now been a number of cases where some of this information has appeared in court. So there was a case in Canada where somebody was accused of murder. They said they were in a certain place, but in fact they were in another place, and the Fitbit information was able to prove that. It's also been used more recently in the UK with disability benefits: people who were claiming disability but in fact were training for a marathon, and the information was used to prove that. So these are some of the other aspects of QS and how that information is being used. Then privacy. So this is the kind of big area that I'm interested in, the privacy around this information. So this is the terms and conditions from Fitbit, you know, the small print. So I went through the small print.
It may have been updated, but this is from somewhere around last December, I think, when I looked it up. It starts off with three kind of core values. The third core value, and I know I've underlined these bits, the third core value is: we will never sell your data, they say. You keep going down, you get down a bit more, and they say we may sell your data, but it will be aggregated, it will be anonymous, your name won't appear anywhere. And then you scroll down a bit more: we will sell your PII, your personally identifiable information. Sorry, it can be sold. There is a caveat to this. They say that, well, if we sell it, we'll be selling it to another company, and we will ask that other company to comply with data protection laws and so on and so forth. But there's a bit of a mixed message going on there. And as Lucas was talking about earlier on with Strava, a lot of this stuff is observable. It's open, yeah? It's open knowledge. And what I've got here, to give an example, is Map My Run, yeah? And let me get this to go. So some of these things are password protected. This one isn't. So with this one, I was sitting in my office and I just said, how close can I get a map around here, yeah? So what I did is I simply went down to Map My Run, went in, had a look. Here we have, this is the first name that came up, Sarah. If I click on that, we get a little picture of Sarah. See who she is, yeah? It's quite open about what she's doing. And also, if I go back to the other one, down a bit, we get a little video of her run. Now to me, this is quite intimate detail. The time, the place, where she went; as this loads up, it will give you a very detailed view of the streets she ran, yeah? As it goes. And the kind of pace she was running. Did I get the right one? Let's try it again. Hold on. It's not running. But it does give quite a detailed look at the actual road she was on, and even what side of the road she was on as she was running. It does zoom in. Anyway, let's leave that.
But yeah, the point being how intimate some of this information is, how freely available it is, and how the potential to use it in various ways is there. So to make sense of all of this, what I wanted to try and do is use a thing called communication privacy management, and use that to make sense of how privacy is being understood by some of the people who are using this, in relation to the data that they're producing. And Petronio talks about a boundary metaphor. So the boundary metaphor relates to privacy. You have the boundary further out or further in, depending on the kind of context that's involved. And some of these contexts are things like culture. So, for instance, in some cultures it wouldn't be unusual for somebody just to walk in and sit at the kitchen table of a stranger's house and start chatting. In other cultures, you have to arrange exactly the time and the date you're going to appear. So there are discrepancies there in how people understand privacy. Also gender. There's been some recent work that's looked at Facebook, which suggests that women lie more on Facebook, whereas men talk about themselves less on Facebook. So there are some discrepancies in what's being disclosed, and once again, it's this privacy boundary, how far out or how close to the person it sits. And then motivations. So, for instance, in examples where a couple may be splitting up or divorcing, emotions are high, and what's being disclosed is often a little bit more than it normally would be, describing the intimate details of a certain person, and it pushes that boundary out. So that can sometimes affect how it plays out. Then there's also the contextual. So if you're in an environment that you feel safe in, you're inclined to reveal slightly more. So you're pushing that privacy boundary out a little bit more again.
So if you're in a password-protected environment, maybe you feel slightly safer there and you will push that boundary out a little bit. And then lastly there's the kind of risk-benefit ratio. And what they mean here is: I tell you something intimate, you tell me something intimate; I'll give you a bit of information, you give me information. And it strengthens the bonds of the relationship and moves it on to another level to some degree. And also, there are kind of rules and regulations around this. If you break them, there are consequences. You've disclosed personal information that I gave to you, and the relationship can all quite quickly fall apart. So that's kind of where I'm coming from, the theoretical grounding, I suppose, of what I'm trying to do here. What I did, a bit like yourselves, is I went to the QS meetings in London and attended a few, but most of my analysis was in relation to the videos that were being put online. And just like yours, there were three questions: what you did, how you did it, how you got on. People spoke for 10 minutes, there were 10 minutes of questions, and three people spoke per meeting. When I looked at it, it was 64 videos, all put up online. I sat through them all and narrowed it down to 10. I transcribed those 10 and had a look at some of the issues that were going on there. But none of them really spoke about privacy, and I wanted to push that a bit more, to see their feelings and understandings, the value of this information they're creating. So what I did is I interviewed five of these people that were in these videos, but I also wanted to look at a more amateurish side of things. So I looked at five people who had basically been QS-ing for at least six months, who had continuous data for six months. So they were amateurish, but they were relatively seasoned in what they were doing, in terms of the practice.
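As an aside, the boundary factors described earlier (culture, context, emotion, reciprocity) can be sketched as a toy model. This is not part of communication privacy management theory or of the talk; the factor names and weights are my own illustrative assumptions, just to show the "boundary further out, further in" idea as an additive score:

```python
# Toy sketch of a privacy boundary's "permeability" in the spirit of
# communication privacy management. The factor names and weights are
# illustrative assumptions, not part of the theory itself.

def boundary_permeability(culture_openness: float,
                          context_safety: float,
                          emotional_arousal: float,
                          reciprocity: float) -> float:
    """Return a 0..1 score: higher means the boundary is pushed
    further out (more willingness to disclose). Inputs are each 0..1."""
    weights = {  # arbitrary illustrative weights summing to 1
        "culture": 0.3,
        "context": 0.3,
        "emotion": 0.2,
        "reciprocity": 0.2,
    }
    score = (weights["culture"] * culture_openness
             + weights["context"] * context_safety
             + weights["emotion"] * emotional_arousal
             + weights["reciprocity"] * reciprocity)
    return round(score, 2)

# A safe, reciprocal setting pushes the boundary out...
open_setting = boundary_permeability(0.8, 0.9, 0.3, 0.9)
# ...while an unfamiliar, one-sided setting pulls it in.
closed_setting = boundary_permeability(0.2, 0.1, 0.1, 0.1)
assert open_setting > closed_setting
```

The point of the sketch is only that the same person's boundary sits in different places in different settings, which is what the interview material below keeps showing.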
I've also done a number of focus groups, but I'm not going to discuss them here today; these 10 interviews are what generates what I'm going to talk about. And the findings I'm going to split into two: a look at some of the motivations behind what people do, and a look at some of the privacy aspects of what people talk about in these interviews. So the first one, in relation to motivation. This person talks about how it instils better behaviours in her, and that's why she does it. Now this person's story is that she grew up in, as she described it, a chaotic household. There were no real rules there. So what QS does for her is almost like a disciplining strategy, a disciplining format. It allows her to make sense of what she's doing, it allows her to set goals for herself. So she sets tasks for herself, and if she hits four or five of those tasks, then she gives herself a little reward. It allows her to structure how she lives her life and how she makes sense of her life. This one is more that he had a problem, he wanted to solve that problem, whatever it was, and he kind of correlated some of his stuff. But interestingly, he starts off with "what gets measured gets managed". That's how he makes sense of it. In his case, he was trying to figure out what was happening when he was buying so much coffee, and what he was buying. As he said, it was making me feel shit. And that's what he wanted to discover. So he discovered he was drinking too much coffee, he was eating too much cake, and then he came to the realisation that that's why he felt shit, and that's why he stopped doing it. This guy, his is in relation to being asthmatic. And a bit like the neoliberal example, he was collecting the information at home, then bringing it into the hospital, and the hospital was quite quickly making decisions.
Okay, you're using your inhaler too often or too little. And he found it extremely helpful, kind of cutting to the chase when he went into the hospital: being able to produce the information there and then make decisions quite quickly, and in quite a detailed manner, about his use and what he could do to improve it, or lessen the amount of inhaler use. Also within the motivations, and it has been discussed a little bit here, but for me what is key to all of this is fun, the fun aspect of it. That's why a lot of people do this stuff: they find it quite fun. And in this case, this is somebody who works all around the world; she's in different places at different times. What she does is she maps her runs when she gets there, and she shares them so her friends can have a look online. There's a kind of cool element to it. Oh, this week I was in Vancouver, I ran around the Rockies, I did this, that and the other; her friend may have been in Venice, I ran in Venice, this is what I did. There's a slight competitive element to it, but there's also an element of showing off, or kind of bragging about what they were doing. That's what she enjoys about the whole aspect of it, but also there's an amount of exposure of what she's doing, which she was quite happy with. So jumping now to privacy, and how people make sense of what they do. And this person wasn't unusual in any way; most people say, I share everything. And in this case it's almost like a memory tool, it helps her remember what happened. Now this person is a blogger, she writes about food and she reviews restaurants and coffee shops and things like that.
So she finds it very helpful to keep a record of all these things, to keep it out in public knowledge, because people are having a look at her reviews. But it's also a trigger when she goes back: oh God, I've been in that coffee shop before, what did I think of it? She goes back in, has a look: okay, I was there January 25th, I had a coffee and I had a cake and I thought it was quite good. And that's where the value in the information is for her. In this one, he doesn't mind people seeing his data. But then when I say, well, what about the fact that we have evidence here that companies are actually selling the information, when he heard that he said, well, that would piss me off. So he was all right with information being seen, he was all right with giving information away, but when his information was being sold on, that's when he had an issue with it. Now he did say, you know, if it's anonymised he's all right with it, he doesn't mind so much, but still, there's a tension there in what he's suggesting. In this one, my data is private. He starts off with a very clear statement: my data is private, yeah, I'm in charge of my data. However, there's nothing compromising in it either, yeah, so I can let it go out there, I don't really mind, because it's of no real value. But my data is private, which seems quite odd. And there's also a kind of altruistic sense within this one. It's for the greater good; it'll go out there into the big bad world for the greater good, yeah, which kind of continues into the next one. So this one is a little bit more explicit about the altruistic nature of all this. My information could be used to plot the number of 41-to-50-year-old London guys living in a certain place, doing certain kinds of activities, and what their health benefits may be. However, with both of these, and anybody that raised this, I asked them about it: did you sign up for that?
They're like, oh no, no. So there's a presumption made that this will be used for the greater good. Nobody has stated anywhere, as far as they were aware, that their information will be used in a kind of big data sense, or in promotion of health benefits within a certain area. So they presume it will be; they presume their information will be used altruistically in some way. Equally, there's a certain kind of expectation that companies will sell the information on. Once again, a presumption, because when they're pressed on this, people don't say, oh yeah, well, I signed up to have my information sent away. Now, they probably ticked the box when you agree to the terms and conditions. But nobody I talked to had read that, or understood the value of their data and what is actually happening to it. So two interesting presumptions in what people are doing with their things. So to bring it back to communication privacy management, to try and make sense of some of these things. The cultural aspect of it: there certainly is a value in the immediacy. People run their miles, they come home, they have a quick look at their Garmin: okay, I've burned this amount of calories, I've run that far, I've climbed that high, whatever, this is my new personal best. So there's a huge value in that, and that's quite often why people use this and what prompts them to do some of these things. As I said before, there's also an expectation that companies will be making a profit off it, and an expectation that it will be used for big data, or in some altruistic way, which to me is really unusual. But there's also, particularly in a Western European culture, a predisposition to share, to share online, to get our information out there, to show we're alive, to show we're doing this, that and the other. This is just another form of that. For good or bad, there are benefits to these things, and there is the dark side of these things, if you want to call it that.
But definitely there is a predisposition there to do that, particularly in younger generations. The gender differences: I didn't find any. For what I was looking at, there were no differences really. The motivation with all of this is clearly betterment through numbers: knowing yourself slightly better, running further, knowing your intake slightly better, and so on and so forth. That's a clear reason why people do it. Also, going back to that fun element, the bragging, kind of as Quilman talks about, you can show where you've been, you can show off certain things. Then there's the contextual as well. You arrive somewhere, and there's the information that's on there, like the information I looked at on Map My Run. If I was new to Birmingham and I was thinking of going for a run, where would be a good run to go? I can go on there and get some sort of information out of that. But there are also limits in some of these contexts. Some of those I spoke to wanted to get hold of their own information, to analyse it in certain ways. But they would only be given the template that these organisations or companies provided; they wouldn't give them anything more. And they did push the companies on it and say, okay, well, I have my calories, I have my distances, I have my times, but actually I want to look at something different, I want to look at something in slightly more detail. The companies weren't willing to provide that. Of those I spoke to, there were probably three or four who asked for extra information and didn't get it; they just got the standard information back. So there are tensions around that as well. Then the risk-benefit side, the trade-offs involved here: people are quite happy for their information to be used in ways known and unknown.
They see some of the benefits, the immediacy of finding out what they have, and if it's being used, that's all they're really interested in. If it's being used for other means, so be it, even though some people may be pissed off that they're not getting money back for it, as the participants suggested. But it seems to be a relatively accepted way of doing these things. Then just some concluding thoughts on all of this. There's value within the information, and people don't seem to mind sharing it. That would be my conclusion from those I spoke to. Certainly from the 10 people I interviewed and some of the focus groups I've done, people are quite happy to let this information go, or trade it, or move it on. The boundaries still exist. When you ask anybody about the information they put online, health information and financial information are sacrosanct; those boundaries are kept very close. However, health does cross over into some of these things. Depending on the context, people do let that boundary go out a little bit more. In situations that may be highly emotive, say somebody had an asthma attack and they wanted to share that, the boundary comes out a little bit further. There is some movement within that boundary. Key to all of this is the sensitivity around some of these things. My background is kind of surveillance studies, looking at issues around surveillance. Quite often, things come to a head when somebody hits a brick wall, when some sort of access has been compromised; that's when the issue flares up. In these situations, with this information, particularly in the US context where it's going to limit premiums for health insurance and things like that, that's when this thing could flare up, and possibly is flaring up. I think in Western Europe we're kind of insulated from those contexts. But things may change.
Particularly in the UK, with the NHS; it's on a slippery slope at the moment, I think. But anyway, the value is in getting the information: how many miles you've run, how fit you are, what you've done. The tensions may also be around some of the issues with big data, as we talked about, that altruistic sense. Where is that coming from? Why do people think that? Why do people presume that? We have some evidence to suggest that it is going in the direction of companies selling the information on, but this one seems to be a presumption, which is strange. Making money on exclusion: those are some of the issues that are revolving around this. Also, forever present is the exposure of compromising data. I don't really want to go down that route, because I do think there's an element here of security fatigue, particularly in relation to some of these things. In that, if there's not an immediate effect, people get a bit sick of it: you must have a password, you must do this, you must do that. And if there's no instant repercussion, people become a bit blasé about these things. As one person said, there's nothing compromising in his data, and to some degree there isn't. So I don't think we need to get too caught up on all of this kind of thing. But, you know, this is something that's not going away. Now, I went with a conservative figure. I know you said 9.2 billion by 2020; I went with a very conservative one, 2 billion, in the source I looked at. But, you know, this is an industry that's here for the long term, I would imagine. And it will have to develop and move on, because quite quickly the market will be saturated with devices, or new devices will come on. But the emphasis will be on the data, on what's been collected there, and how that can be maximised. And I want to finish with one last example of all of this: last year's Tour de France.
Chris Froome was leading at the time, and there were accusations that he was doping. Now, this is kind of QS at the top end. But what they said they would do is release his data: you know, the energy he was generating on the bike, the number of revolutions the wheels were doing, and so on and so forth. But it was sensitive data, because if his opponents got their hands on that information, they could have very insightful knowledge of exactly what he was doing. So they edited the information, and they did release it, to prove that he wasn't doping at the time. But that just shows, at the top end, how sensitive some of this data can be. It can be equally sensitive when it's related to us, particularly when it compromises some of the decisions we may be making at future dates.
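As a footnote: the editing done to the power data, and the "aggregated and anonymous" promise in the Fitbit terms earlier, both come down to minimising data before release. A minimal sketch of that idea, where the data layout and the one-minute averaging window are my own assumptions rather than anything the team or the companies actually do:

```python
# Data-minimisation sketch: coarsen per-second power readings into
# one-minute averages and drop anything finer before release.
# The (second, watts) layout and 60-second window are illustrative.

def minimise(samples, window=60):
    """samples: list of (second, watts) tuples.
    Returns per-window average watts, with no timestamps finer
    than the window surviving into the released data."""
    out = []
    for start in range(0, len(samples), window):
        chunk = [w for _, w in samples[start:start + window]]
        if chunk:
            out.append(round(sum(chunk) / len(chunk), 1))
    return out

# 120 seconds of fake readings: a steady 250 W minute, then a 400 W surge.
readings = [(t, 250) for t in range(60)] + [(t, 400) for t in range(60, 120)]
print(minimise(readings))  # [250.0, 400.0]
```

The coarsened series still answers the public question (was the output humanly plausible?) while withholding the second-by-second detail an opponent could exploit, which is the same trade-off the interviewees were making with their own data.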