So yeah, really interesting papers, I really enjoyed reading those, Dorothy and Keith. I'll just say a few bits and then I have a couple of questions. What both of your papers made me think about, and what I really appreciated about them, is that you're bringing out something that's missing in a lot of quantified-self research, and practice as well, which is a willingness to talk about ontology. That's what you're doing. You're saying: let's look at what these practices say about how we can be, and let's ask those kinds of questions. I really like that, because a lot of the work stops at epistemology, or at very empirical work that doesn't ask these other kinds of questions. What your work really does is help us identify the contradictions that the emergence of these practices introduces, and then ask what that means for our role in the world. Lucretius, the Roman philosopher, talks about our sadness as a result of our inability to understand our place in the material world, and I think your papers highlight that in a way I haven't seen before, so I appreciated that. So when we set out, as supposed individuals, to get to know ourselves, the autonomic self, the self that is not apparently otherwise knowable at the physiological level, what you're saying is fascinating, because in the way the quantified self has been understood in the wider sense, there's an empowerment dimension: the idea that this is something that will provide something really positive in self-discovery. And what both your papers do very nicely is to say, well, it can be a painful experience; it can have other kinds of implications.
I think that really helps us move away from what I talk about in some of my own work to do with binaries, which may be part of this inability to understand the material aspect of daily life, and how, even for professionals doing cognitive work, we're not actually thinking about our bodies; we can talk about sedentary lifestyles and those kinds of things. In a lot of the literature we find binaries: self and other, technology and the body, interior world and exterior world, and then, in other contexts, citizen and state, patient and doctor, employee and employer. The questions you begin to introduce help us to think about the next stage, maybe something to do with our very near future, and you're willing to do that, which I appreciated. There's also the effect that bodies have on one another, so there are questions about the individual in a community. If the quantified self is about understanding oneself as an individual, what's often left out is what happens with the communities that are involved, and your projects both explore that. At the same time, of course, Spinoza talks about how the effect on bodies isn't necessarily always empowering or positive: there can be sadness, there can be a kind of repulsion, almost. So I think that's really fantastic, and there's a lot of potential to take the work you're doing further. One of my questions is how you're going to push some of those boundaries. Then another question: do you think that the movement individualizes? Because the other question is around privacy. Is the individual then a private being?
Because if we are starting to ask what the quantified self relies on an individual to be, is that a citizen who has autonomy? This links, of course, to some of the work that Tiaj has done on biometrics. Are we talking about a new kind of self, one that is more than the traditional understanding of someone who is personally private, a personally derived kind of person? And does it matter that the data that apparently brings us to that point of discovery can of course be commodified, and is being commodified? Keith really brings up those questions. So I have a bigger-level question: do we have the right to mind? Do we actually have that right? There's a bit of a societal shift, I think. Is the concept of privacy itself changing? There's a generational dimension. Are we at a point where we can no longer expect privacy? That links, of course, with questions around normalization, and does that matter? Okay, so there are three major things I wanted to ask you about, in the categories of the self, then method, and then this concept of discovery, just to see if you have anything more you wanted to say. I've already mentioned: does the quantified self rely on a citizen as a limited sort of subject? There are questions around the gendered self, which your paper also raises, Keith. And then there's Rose's concept of the self; can we go beyond that? Are we talking about new types of rights that can be understood and accepted?
And are we talking about more of an ethical subject in your work, for example the ethical subject that Foucault refers to? Okay, then the privacy question. Why do we involve ourselves in the quantified self? Why are we trying to discover our supposed true selves in the material world? What is the desire element, the passion, what drives us to do that? That's the other question. Then, in terms of the self and others, do you see a changing relationship, or a potential changing relationship, for example between patient and doctor? This will rely on what's available: where we have socialized medicine there's clearly a kind of patriarchal relationship, and could the quantified self bring us into a new world where we have a bit more autonomy in that sense, and what would that mean? And of course, we've already talked about employee and employer, self-management, those kinds of questions. My second set of comments from your papers has to do with your methods and methodology. What's fascinating, when we talk about the body as laboratory, as you've done, Dorothy, is this: are we then looking at a living laboratory? Are we talking about living labs in that sense? This is obviously one of the methods that's applied, and it brings in concepts of co-creation, of how we work together and how we can become involved at that level. But it also brings new questions about relationships with machines, the changing assemblage, these kinds of questions. Is that something you've thought about in terms of your own method and co-creation?
That in turn makes me think about the supposed individualizing restrictions I'm referring to when we talk about a supposed quantified self: this supposed autonomous being that can be taken out of a community, for example. The next question in terms of methods, and this brings me to some of the references in your work, Keith, is really about digitalized labour. Is there a possibility, in terms of the methods that are applied, and not just in our use of devices and quantifying the self, so to speak, for co-creation that is not exploitative? And where is that happening, if it is, in terms of how usage can be interpreted by companies who say, well, let's improve, because that is happening. My question would simply be: does it matter or not? You've mentioned that in interviews you've found people are kind of okay with some of their data being used in specific ways. So what I'm asking is, does that literature need to be updated a little to say, well, this is actually part of a process that's empowering, or is it a let-down? Just those kinds of questions. Then, in terms of discovery, my third category, seeing oneself in the data, as it were, which of course we're interested in as a bigger picture: does it matter that the discovery might not be positive? Do we think that products are perhaps letting people down by not giving them the lead-in to recognize that that might happen? Maybe it doesn't matter at the psychological level, possibly not. But then again, there is the tension between the consumer aspect of products and what happens in day-to-day experiences.
And then, really, does it matter that the discovery of yourself may mean that there's profit being made, whether you're a product or not? Those are the kinds of questions I had. I'll give you a chance to say more.

I'm still struggling with all those questions; are they just discussion points? Yeah, I think, with ideas around the self and the body, I like Lupton's work on all of this, I think it's great: the body, the disordered body, putting order on the body. Those kinds of arguments are very pertinent here. I like the idea of betterment, but whether betterment will create a sort of uniformity, where all bodies will look like this and all people will act in a certain way, that's where I have tensions between the two. I can see both sides of the argument, but where the crossover is, I'm unsure. But going back to some of the issues I'm concerned with, top of that list would be transparency. The idea of a kind of techno-assemblage, or whatever you want to call it, the social and the technical coming together, I think is a good thing. It's happened for almost a century at this stage, if not longer: going back to things like the car and the telephone, when they were introduced there were problems, fears and tensions around them. Those haven't entirely gone away, but we've learned to coexist, and I think this is just another example of one of those issues. However, how some of the fine details of these things are being used and promoted, I think, is something that needs working on.
Just as we're still working on something like the car, the pollution involved and where the oil comes from and all those things, I think we will have to learn to live with those. But transparency, for me, should be at the forefront of all of this, and how we can understand it.

Yes, I don't know really where to start, because there are three or four different discussions here that could easily fill half an hour. Firstly, on co-creation: that's an interesting topic. I'm currently involved in a project about biomarkers and the development of medicine, and pharmaceutical companies are really seeing a potential in giving patients apps. When they develop and test new medicine, they give patients an app and can test it immediately, so they get all sorts of extra data, and it's really like crowdsourcing. So there are a lot of possibilities, and of course a huge discussion: is this a kind of sophisticated capitalist machine where we just get better at exploiting our customers, getting inside them, or do we actually have a sort of democratic system where people are much more involved in the actual development of drugs, for instance? There are a lot of moral discussions about the whole role of co-creation in using apps in product development. Another thing that relates a bit to your presentation, something you didn't draw on but that I have heard a lot of my informants talk about, is data silos. When you track, your data is tied to that app, so when you want to correlate it with another app, or if you find another app that is better, your data is stuck in the first one. People feel that their data is trapped, and they really discuss this a lot: how can you make a system where your data is not trapped and tied to the actual app? In Denmark, at least, this is quite a big discussion.
It's also, of course, about ownership of the data: how can I make sense of it over a longer time perspective if my data is tied to a specific app that perhaps is not in fashion in a couple of years? So I think that's a really interesting discussion. The last point is about the changing doctor-patient relationship that you also mentioned. I didn't talk about that at all in my presentation because it's another discussion. In the QS community in Denmark there is a group of people working on developing apps for healthcare, especially on helping people actually work with the data to improve their lifestyle: for instance, seeing correlations between a bodily symptom, say an allergy, and what they do and what they eat, to start to detect some sort of pattern. The idea is that they can find something that triggers the disease; for instance, if you have an autoimmune disease, perhaps gluten could be the cause. There's a huge discussion about this in the Danish QS community, and I think it goes on in the U.S. as well; that's a topic in itself that is really, really interesting. And on the thing I talked about, people losing their interest: for people who track for health reasons, I haven't seen a similar process, because then the tracking has some sort of transcendent value. For them it's a completely different story. So there's really a huge separation between people who track just out of curiosity or to improve, and people who track because they have a serious health issue.

Talking about moving it forward, you struck a note with me. The biggest issue when I was talking to people was compatibility: they wanted to collect more information in a central place, and they had various ways of doing it, but one didn't speak to the other and they couldn't get it all on the one platform.
And I think that's probably the frontier, the next step of what people desire and where this could potentially go. Which would require corporations to work together; that's a tricky one.

Well, I think you're absolutely right, and it connects to the whole concept of bundling; it's these types of issues. The invention of the thermometer, for example, and Jeremy Greene writes about this, was seen as really radical, as taking away from the doctor's authority. And what I think is really interesting in Greene's work is that when the use of the thermometer was handed to nurses, who of course tended to be women, to use with patients, it just went off the agenda: oh, that's fine then, absolutely fine. Should we turn to questions from the rest of you?

Thank you very much. Really great talks, which resonate with a lot of what I'm interested in and the research I'm doing. First, to address your point about the data silo: there is already a method that can be used, and it was actually talked about this week in an Apple keynote. The method is called differential privacy. It's a complex mathematical approach that uses noise injection and a couple of other sophisticated algorithms to remove the personal information from the data. This is only possible because smartphones have become so powerful that we can move some of the processing onto the phone itself. A couple of years ago it wasn't really possible: the computations were too complex and had to be processed on the server side, at the company, but now the phone can perform them. So such a method exists, and Apple is, as far as I know, the first to promote it in this way.
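As a brief aside for readers, the noise-injection idea behind differential privacy that the speaker mentions can be sketched in a few lines. This is a minimal illustration of the classic Laplace mechanism for a counting query, not Apple's actual implementation; the step counts, the predicate, and the epsilon value are all invented for the example.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Inverse-CDF sampling from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def private_count(records, predicate, epsilon=0.5):
    # A counting query has sensitivity 1: adding or removing one
    # person changes the true count by at most 1, so Laplace noise
    # with scale 1/epsilon gives epsilon-differential privacy.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical daily step counts from five users; the analyst learns
# roughly how many hit 10,000 steps, without any exact individual link.
steps = [9500, 12000, 8000, 15000, 4000]
noisy = private_count(steps, lambda s: s >= 10000, epsilon=0.5)
```

Smaller epsilon means more noise and stronger privacy; the answer is useful in aggregate over many users even though each single query is deliberately imprecise.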
These methods have their dark sides too, but this is probably one of the next steps for a lot of the wearable manufacturers. My second point is about what you said about information with value that we don't seem to mind sharing. I think it maybe goes even deeper than that. The rapidity of the development of this technology that surrounds us, and that takes data from us, has surpassed our ability to control what happens to that data and how much it can tell about us in terms of health information or more private personal data. We are only now trying to catch up, and we have this really fascinating paradox: we pay for a Fitbit or any other wearable (I don't want to single them out, but Fitbit is one of the most popular), and then we don't really fully own the data. The data is processed and stored in raw format by Fitbit, who use it to optimize their algorithms and their product, to make it better. And if you actually want the raw data, you have to pay them for a premium subscription to get it. So there is also this paradox, that we don't really have full access to this information yet, and that might really be the next step. But yeah, fascinating issues. Thank you.

I totally agree with what you're saying. And with the information thing, I wonder: do people really realize what happens to it, or do they not care? That's what I'm still puzzled by; I don't really know. And I wonder, and this is my issue about transparency: if you said, your information is being used in this way, they are making money, we'll say, every five days on what you do, would some people suddenly prick up their ears and go, whoa, while other people go, do you know what, I'm still running my miles, I'm still keeping fit, it's a bit of fun, I'm showing my friends what I'm doing.
That's what I don't really know. There was the one quote where the guy did explicitly say, no, that wouldn't annoy me; but other people, well, ah. So across the board, I'm not really sure how people feel about it. I do think people need to be told explicitly, because nobody reads the terms and conditions, or very few people do.

These are questions around informed consent, exactly. And also whether people have the digital literacy when they go into those relationships, and they are relationships. And then there's the algorithmic dimension, which means it's done in a way that is seemingly very much outside of ourselves: to what extent do we have control over that, and does it matter whether we do or not? Actually, there's a group of us working in London on a digital bill of rights, and we're taking it to parliament to ask questions around these things. And there are changes coming; well, if Britain leaves the EU, but anyway, let's not go into that.

Okay, I just want to ask a question about the issue of privacy, and whether in any research you have come across differences in people's perception and treatment of privacy in relation to age or socio-economic background. It seems to me, at least from my own research, that what people call digital natives tend to be less interested in this issue of privacy, or less concerned about what happens to their data, because they were born into a highly digitized world. For them, it's taken for granted that your data is not always yours, and that people will have access to it, share it and do all sorts with it. Whereas digital migrants tend to be more reluctant about the use of their data. So have you come across any of these differences?

There are certainly issues around age and social demographics and how people approach these things.
One example I have is from a focus group. There was a girl, I suppose in her early 20s, who mapped her runs, but she also found the mapping a kind of security blanket: she said if she ever went missing on a run, they would know where her phone was and would be able to find her. And then, in contrast, there was a guy, probably late 40s or early 50s, who every time he went for a run didn't turn on his Garmin until he was at least half a kilometre from his house, because he thought that if he turned it on as he was leaving, it would pinpoint where his house was and when he wasn't in. Those are just some of the differences across ages that spring to mind, but the differences you would expect are there. Definitely the older generation are more reluctant, or more keyed into some of the privacy issues, where the younger digital natives just take it for granted: okay, I'll put my information up there, that's what goes there. And I feel like I'm doing them a disservice in saying they're not as questioning, because some of them are, but in general they do accept that when you tick that box, that's what happens: the Data Protection Act is going to cover all of that stuff, I'm going to be okay because the Data Protection Act covers me, people are good out there. There is a kind of perception around some of those issues, from what I found.

Yeah, I haven't really probed people on the privacy thing, so I haven't heard anyone talk critically about it. Some talked about sharing as a way of disciplining themselves. One referred to it as having installed an inner cop: because she now had to share everything she ate, she really had to eat healthily.
So it's a way of disciplining herself, because she was sharing everything she ate every day with other people; it has some sort of disciplinary effect. I haven't really heard anybody talk negatively about sharing, but I have heard a lot of people talk about how much potential there is in all this data that isn't used, for instance by the healthcare system. People talk very positively, as though they might be doing something good when they're tracking, but it's a potential that isn't used in any way, because there's really no system where you can upload it; there's no sharing with, for instance, the doctor, who doesn't have access to the data. This is a theme people talk about quite a lot, so I would say it's almost the contrary. But these are people who are really digitally literate, with very sophisticated skills.

Yes, I'm from a science and technology studies background, and my interest is really in the quality of the data that these devices generate, because we talked about people cheating, and about these devices being interpreted in different ways; your research, of course, points that way. What really is the quality of the data for the big companies? There seem to be a lot of user strategies, and people using the devices in different ways. Is there really an app for everything, as some of these guys claim? What is the quality of the data, and what can companies actually generate from all these different kinds of uses of it?

Yeah, at least the people I talked to have some sort of ambivalence towards the data. There was one talk at a QS meeting by someone who had tested three or four different Fitbits, and they showed completely different results, and that was treated as a kind of fun. So they all say, yeah, you cannot really rely on the data; these devices, of course, don't produce accurate data.
But on the other hand, it's something that is fun to play with and look at, so they sort of accept it and just trust the data, while still knowing that of course these devices are not completely accurate and that Fitbit, for instance, messes up in different ways, so you cannot rely on them.

Yeah, you're right, the information is limited, particularly in the templates people use and what they get; there are usually three or four different categories. And there's also the calibration: one particular device will say you've run 3.3 kilometres, another will say you've run 3.5, so there are discrepancies between them. But the information, I think, is particularly valuable in marketing strategies, in the development of new products within these organizations, and in selling off some of the bundles. And don't forget that the information that can be sold on isn't necessarily the information you're generating, things like your bank details; it's what's being used: when you go on the platform, the kinds of things you look at. Okay, this person is definitely interested in their weight, let's bombard them with marketing about weight issues. That's where some of the value is. The data it generates is relatively limited, but it still has potential to be developed and to be profitable for those organizations. Quite possibly it could, but I would say more likely you'll get flyers in your door saying join this club or join that club, something like that. That's at the moment; in future, probably, when you log on to your email account, the pop-up windows will all relate to that kind of thing. Yeah.
No, nothing to compare; they're having fun with it, and I reckon it takes a long time. But I do think, particularly in the groups I've seen, a lot of the QS people who speak identify almost a gap in the market, or some issue that's close to them that hasn't been covered, and that's what they start off with. Occasionally you'll get people, and they're not supposed to do this, anybody with a commercial interest is not supposed to be there, who come particularly on health-related things: we have this sort of information, do you think this would work? I've seen one or two talks along those lines. So there is, I suppose, an entrepreneurial spirit, as we saw earlier on, in using this information and developing it to make it bigger, stronger, better, all that kind of thing.

And the other way around too: companies use the information for completely different things. Like with Fitbit, there was an earthquake, wasn't there? In San Francisco, it was Fitbit or one of the activity trackers that released data showing how people woke up, by distance from the epicentre. It was in the nighttime, and they could say that people closer to the epicentre moved around a lot before going back to sleep, and the further people were from the epicentre, the faster they went back to sleep, et cetera. Sure, it's the same data, but it's being used for something very different from what it was collected for. And people seem to like that, right?
But a friend of mine who's a computer scientist, and quite clever, pointed out: Jill, if you use a Fitbit, which I'm not using now, they could actually tell whenever anyone, for instance, has sex. All they'd have to do is have a sample of people, maybe 20, who actually register when they do, or anything, really; he was just trying to get a rise out of me, I suppose. For any activity, you could have a sample of 20 or 100 people who log when they do that activity, and there's probably some pattern recognizable in the Fitbit data for it. Once you've got that, you can extrapolate it to your whole population, at which point you suddenly know things about people that they don't know you know about them, at which point you can, as you say, market things or whatever. We haven't heard much about that yet, but I'm sure it will come.

I think that's about to happen with the new data protection regulation, which actually requires companies to step up in terms of transparency. Then the question is, will that inhibit innovation? Companies are not necessarily excited about it, but it is being discussed. And this is very much about data repurposing and also function creep: something is created for one particular purpose, but then it gets directed to other purposes as well. What really puzzles me is that there are many examples, some of which you mentioned in your talk, Keith, of how data has been used and sometimes even abused in contexts people were not aware of, such as the courtroom, and sometimes even the workplace, or by insurance companies. And yet even that doesn't seem to have scared people enough to become more cautious about their data and to read those terms and conditions before they click the box.
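The extrapolation the speaker's friend describes is, in essence, supervised classification: learn a pattern from a small labelled sample, then apply it to everyone's unlabelled streams. A toy sketch, with entirely invented features (mean and variance of heart rate in a time window) and invented numbers, might look like this; a real system would use far richer sensor features and a proper model.

```python
def features(window):
    # Two crude features of a sensor window: mean and variance.
    mean = sum(window) / len(window)
    var = sum((x - mean) ** 2 for x in window) / len(window)
    return (mean, var)

def centroid(rows):
    # Average feature vector for one activity's labelled windows.
    n = len(rows)
    return tuple(sum(r[i] for r in rows) / n for i in range(2))

def classify(window, centroids):
    # Nearest-centroid rule: pick the activity whose average
    # feature vector is closest to this window's features.
    f = features(window)
    return min(centroids,
               key=lambda lab: sum((a - b) ** 2
                                   for a, b in zip(f, centroids[lab])))

# A small volunteer sample labels its windows (values are fabricated);
# everyone else's data then gets extrapolated labels.
labeled = {
    "resting": [[60, 62, 61, 59], [58, 60, 59, 61]],
    "running": [[150, 155, 160, 152], [148, 151, 158, 149]],
}
centroids = {lab: centroid([features(w) for w in ws])
             for lab, ws in labeled.items()}

print(classify([147, 150, 153, 149], centroids))  # prints "running"
```

The privacy point in the discussion follows directly: once the centroids exist, no further consent or labelling is needed to infer the activity behind any user's raw trace.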
I'm going to ask perhaps more of a normative question: what can we really do to heighten awareness of these issues of data sharing and data repurposing?

Yeah, that's what I finished with, the security-fatigue thing: I don't know what the answer is. I do think that scaring people only has an impact when there's a direct consequence people can see and experience. It's like: okay, I've done this, and all this marketing material has come in the door; that's a mundane example, but people can see the correlation between the two. When you say, this could possibly happen, and then nothing happens for a year, nothing happens for a second year, people get blasé about it. Scaremongering, I don't think, works, and I don't think it has worked. That's why I keep going back to transparency. Just having it in big bold letters: we are doing this with your information, this could potentially happen; or, in relation to the data rights act, you can do this and you can do that, information can be taken this way or that way. Then people may become more aware and maybe slightly more informed, because I think that's where the issue is at the moment: how that information is being translated to people. If it's translated in that scare-tactic way, I don't think it works; people just think, oh yeah, it's not going to happen to me, I've got nothing to hide, there's nothing compromising here. That's the standard response to a lot of this stuff. And it could be something further down the line: what you're doing on a Friday night when an earthquake goes off could all of a sudden be used. This information doesn't go away.
It's there, it's held, and it can be sensitive depending on the context in which it's used, is what I would say.

Magna Carta on these things. And of course, trade unions are also starting to respond. But the point that we've made as well, Lukas, is that technology often speeds ahead in ways that mean lawyers are trying to catch up, and companies are trying to work out what to do, as well as everything else. And I think also your work, Chris, looks very much at what happens with data, to what extent consent matters, and whether it matters that there's profit being made. But do you want to go ahead with other questions? Yeah, do you have any other questions?

I can ask one more question, perhaps, to Dorothy, about something you mentioned with regard to your interviewees: you noticed that it is not so much about the kind of neoliberal imperative of competition and productivity and this kind of extraction of value from certain practices, but more about quality of life and well-being. So I wondered how much of that is really about the geographical context. Because I must say, as an outsider and a foreigner who came to Denmark recently, one of the first things that really struck me in Denmark is that there is an absence of competition in the culture; everybody is interested in well-being in general, as opposed to maybe some other cultures, like the North American one. So do you think that the context plays out in how people perceive the self-quantifying practice and how they act on it and perform it?

Yeah, it's a really good question. And of course I would love to see ethnographic work on the quantified self in, for instance, the UK, because of course there are cultural differences, and it's very much presented as some kind of universal thing.
But I think there really must be big differences in how people relate to it and how they use it. Also, in the Danish context it's really difficult to find people who go to the meetings: it's the same ten people who have been presenting for like four years, and then we have people coming a couple of times, and one of them, she travels between New York and Denmark, and she says, oh, in New York there are hundreds of people, and people have been tracking for so many years, while here in Denmark people try and then give it up. So it's not really something that thrives very well in a Danish context, and I don't know why. In Germany, I've also heard, there's a big group. So it would be really interesting if somehow you could make a comparative study of the different contexts. Also, in a Danish context we talk very much about feeling our body, so we rely very much on our bodily experience, and they also say, of course, I believe in my body too, not just the data. And something very specific to a Danish context is this feeling in my stomach, or in my gut, of whether I can rely on the data. Yeah, but I would need some more comparison. I haven't really read many papers on the quantified self, just a couple from the US and your work, and I don't know why there hasn't been so much written about it; I'm curious, because I know that many researchers go to the meetings.

Yeah. At the Quantified Self expo in San Francisco last year, there were questions around cultural understandings of privacy, and there was a kind of conviction about the American sense of what we can consider as private, what's ours, and what's happening with our data, and how it's different from the European context. I think that actually would be quite a nice kind of project, but yeah. Although I feel, in the context of those symposiums managed by the quantified self community itself, that as well will turn into a value-extracting exercise. Yeah.

Any last question before we wrap up the event?
Okay, well, thank you very much to our panel. That was a fantastic discussion.