Well, good afternoon. It's time to convene our closing session for our Fall 2013 meeting. I'm glad to see so many of you stayed on. I'm also glad that so many of you got here in the first place. I know that the weather has been a bit interesting, as we discussed in the opening plenary. I would suggest, actually, that those of you who are still here probably made a good decision, because as I understand it, the snow has moved on and it all gets better from here. So I just have a couple of bits of housekeeping before I introduce our closing plenary speaker, which I'm just thrilled about. First off, I want to remind you that in your packet, and also on the inside front cover of the program, you have the schedule of the next few meetings. We will be in St. Louis at the end of March and beginning of April for our spring meeting; I hope to see many of you there. We're back here in December, and we just finalized arrangements a few weeks ago for a spring meeting in 2015 in Seattle. So if you want to note those dates in your calendar, that would be great. In terms of evaluations, we did away with paper evaluations a while ago, and you will be getting an email in a bit inviting your comments on the meeting and the sessions you were able to attend. The only other thing I'd like to do before introducing Eszter is to ask you to join me in two rounds of thanks. First, I'd like to thank our presenters, some of whom went through some fairly heroic efforts to get here, and some of whom were quite flexible in accommodating folks who couldn't get here but were part of their session. We actually had an amazing set of plenaries, at least based on what I was able to see. I'm delighted that we've captured as many as we have, and I think the broader community is really going to enjoy being able to see the record of some of those sessions.
I would urge you, if you think of it in the not too distant future, to pass along your PowerPoints if you haven't done so already, so that we can add them to the collection of materials on our website. So please join me in a round of thanks for our presenters, who really are the core of this meeting. Thank you. And I'd also like to ask for a round of applause for the CNI staff. This has been an exciting meeting for them too, with all of the uncertainties and the need for last-minute flexibility, and I think the smoothness with which the last couple of days have gone is a real tribute to their good work. So please join me in thanking them. And now let me get on to the reason why we're really here. I am so pleased to be able to introduce Eszter Hargittai as our closing plenary speaker. I've been looking for an opportunity to bring her to CNI for a couple of years. Eszter and I are colleagues on the Princeton IT Visiting Committee, and I've had an opportunity to sample some of her wisdom and insights about students and technology and how technology has been adopted over the last few years, which I've found very helpful. As you know, one of the longstanding themes that CNI has pursued is the question of how learners are changing, how they adopt and adapt to technology. And I think you'll also recognize, if you look back over some of the work we've done in this area, that we've tried to be fairly fact-driven and data-driven. This is an area that seems to invite extravagant rhetoric and generalization from anecdote to data, yet it's also an area that's tremendously important for us to understand. Eszter, who is the Delaney Family Professor in Communication Studies at Northwestern, and also, I would note, a fellow at the Berkman Center for Internet and Society based at Harvard, has a history and a body of work that really looks at these issues around technology adoption, society, literacy, and differential facility with technology.
And I think it really gets at a number of the issues that we're concerned with about the interaction of technology and generations of students and different learning types and that sort of thing. So I'm delighted to introduce her, invite her to the podium, and ask you to join me in welcoming her.

Thank you, Cliff. Delighted to be here. I speak to many very different types of audiences, but this is actually one of the most relevant ones, as you'll see, given the type of work that I do. So I thought I'd start with this slide. If you're not familiar with xkcd, I highly recommend it; it's a very insightful cartoon. Since you come from different areas and I come from a specific area, I thought I'd put my disciplinary background in context. So I'm the person on the left who likes to complicate things. That's how I put it in the context of this cartoon. So I will not give you a simple answer as to how things look. I prefer to look at the evidence and discuss how complicated things are. Okay, so there's tons of enthusiasm about the Internet and digital media and how they're going to help our lives, and specifically higher education; the different ways in which they can help us range anywhere from recruiting students, to helping students adapt to their environments once they get to campus, to helping them do well, and what not. At the same time, there's just about an equal amount of anxiety related to all these aspects or potential outcomes of information technology. And what's interesting is that many years into the Internet's mass diffusion, we still have a lot of rhetoric that goes to one side or the other, right? So either people are super enthusiastic that the Internet's going to solve all our problems, or we have to be super scared because digital media spell doom. And so I'm here to tell you that it's more complicated than that. You probably know that already, but I have some data to back that up.
Okay, so I'm going to first start with a couple of slides about what I'll be addressing more generally, to put my talk in context. Back in the day, there was this term, the digital divide, which those of us in higher education don't really talk about, because pretty much everyone who's on campus is on the right side of the divide, in the sense that everyone is a user. Okay, and especially in academia, we can then sort of check that off: okay, well, we've dealt with that problem because everyone's online. But what I argue, and what I find in my work, is that just because people are connected does not mean that they're necessarily effective and efficient users of technology. And so we have to be very careful in assuming that there are no longer any problems just because everyone's on the correct side of that divide, and instead encourage a more nuanced approach: not a binary divide but much more of a spectrum of inequality when it comes to using digital media. And then the other topic that I want to address, not so much centrally but to some extent, is this topic of, quote unquote, big data. I'm glad you like it. I just came up with that slide today. So as you'll see, I'm somewhat skeptical of what we can learn from big data. Obviously, there are incredible potential opportunities from big data, but we have to be very careful in how we approach it. And basically what I'll be doing is showing you issues around digital inequality, and then I'll have that speak to the whole question of big data and big data's potential. Okay, so I'll start with the overall framework I take when I think about users. So I start out with a user. I'm not interested in non-users, and those of us in higher education don't have to be too concerned; as I said, pretty much everyone on campus is a user of digital media.
But then I'm very conscious in recognizing that people come from very different socioeconomic backgrounds, and that race, ethnicity, gender, all these factors might influence how people use the internet. And then I also think about the context in which people use digital media, both the technical context and the social context, that is, people's support networks, things of that sort. And then overall, what I've really focused on over the past decade-plus is what I call skills, what some people refer to as digital literacy, competency, or ability, but basically the ability to use digital technologies effectively and efficiently to meet one's needs. And then what I argue is that all these different factors will influence how people actually use digital media and incorporate digital media into their lives, whether that has to do with more information-seeking types of activities or production types of activities. And then ultimately, what's really of interest is how skills and types of digital media uses influence various outcomes. Especially of interest to us in higher education would be things like academic achievement, but also other types of outcomes such as people's socioeconomic status, various life chances, general well-being, creativity, and ultimately whether, in our case, students are able to get jobs, for example, after they finish college or graduate school. So what do I mean by skill? I define it very broadly, and I think of very different dimensions of it. At a very core level, it's awareness and understanding, and I'll talk about this quite a bit; it's something that many people tend to take for granted in ways that I think are quite problematic and a disservice to both us and our users, students, staff, and faculty.
Skill relates to efficient information-seeking, which also has a very important component of credibility assessment, being able to carefully evaluate the content people find, and then also, increasingly in the past several years, questions having to do with privacy and security, and I suspect those are some of the major issues some of you deal with on campuses. And then it's important to recognize that these aspects refer both to content consumption and content production, and both of those are quite relevant. Okay, so overall, in terms of big questions, these are the ones I'll be addressing today: having to do with skill and how that might vary across users, whether skills and online uses are linked, and why we should care. And in fact, I'll start with this last question: why should we care? So, just a few data points from out there. Increasingly, how people get jobs may be influenced by what they're doing online. There are all sorts of horror stories. There's an entire blog that's dedicated to documenting the stories of people either not getting jobs or being fired from their jobs due to things they do on social media. And of course, at some basic level, part of it is just: don't do things that are stupid, right? So that's sort of the most basic thing. But then there are also cases where it's not that the thing the person did was necessarily that stupid, it's just that they shouldn't have broadcast it to the world. But then there are other ways in which digital media have potential implications. For example, in the domain of health, there's quite a movement now pushing back against certain things that the medical community takes for granted as important. So there's this whole anti-vaccination movement, and you might ask, well, who cares? Who sees this? Well, this particular website is actually liked by almost a million people out there on Facebook, which means that the messages they put out can potentially get to a lot of people.
So those are just a couple of examples. Now, in terms of studying young people's internet uses, which is a lot of my focus, one thing that's interesting, and that seems to be true across generations, is that everybody seems to believe that young people are savvy with digital technologies. And maybe some of you in this room, because you deal with this so much, don't actually believe that, but there's a lot of that myth out there. So here are some of the characteristics that we tend to hear about young adults and teens when it comes to digital media. But what I'll be showing you is that not all of these are true. So it is true that young adults have certainly been using digital media for many years, and most of the people who show up on our campuses have, in fact, lots of experience with digital media. But it's not necessarily true that everybody spends a ton of time online. And it's not true that they all engage in varied activities. And it is absolutely not true that they're all savvy with the internet. And so these are the things that I'll be addressing in the data that I discuss today. So I'm going to spend a few slides telling you about the methods of the work I do, for a couple of reasons. One, I don't believe that you should believe what I tell you without knowing a little bit about how I actually collected the evidence that I share. Two, I think it's important to think about how we can collect evidence about these questions that I discuss. It's not a straightforward enterprise; it's complicated. So, two of the ways in which the literature on digital skills has collected information: one involves in-person observations, that's one big type of research literature, and the other concerns surveys. So this is where I want to take a step back and link to the big data conversations, although it might not be completely obvious, and I'll come back to it later in the talk. So, data on internet users are actually quite limited and very basic.
And so you might be thinking, what is she talking about? She already told us she knows about big data. How is this possible? Well, I want to draw a distinction between internet uses and internet users. And when I say internet users, I mean the average internet user. And we actually don't necessarily have that much systematic data about the average internet user. We have tons of data about certain internet uses of certain people. So, tons of data there. The problem is that those are not necessarily generalizable, which is where I'm going to be going. But when it comes to average internet users, we just don't have great ways to measure the average internet user, for these various reasons. In terms of uses, yes, we can grab lots of log data from the uses of certain sites and services. There are some great details in them, and then other details are not there that we might need. But the problem overall is that these data tend to overrepresent, quite a bit, the really active and engaged users, and those are not the average users. So this is what I refer to as the digital data paradox. We have all these data, but they're actually not necessarily useful data in every way, depending on our questions. So the data that I rely on today, that I draw on for the types of findings I discuss, come from lots of studies that I've conducted over the years. I've been working on this for over a decade now. And specifically today, I will focus on data I collected at the University of Illinois at Chicago. So you might be wondering why. I've never actually been affiliated with that university. I'm currently at Northwestern. I got my PhD at Princeton. Before that, I was at Smith College, so absolutely no affiliation with UIC. So why would I do this? They're actually not that close to Northwestern, and we did the data collection in person in the dead of winter. It's Chicago, so it's not convenient. That's not it.
What really inspired the selection of this campus was that they have an incredibly diverse student body, and as a sociologist, I'm interested in collecting information on diverse populations. They also have a course that's actually required of all students, which, it turns out, not every university has. And they were willing to work with me on this project, which was very nice of them. So over the years, several times, we went into UIC classrooms. That may sound very easy: it's one course, but it's 90 different sections. And we used paper-and-pencil surveys, not because I'm a masochist, but because you can't... So the easy way would be, of course, collecting data online. But if you're collecting data online, and you're interested in skill differences and privacy differences and how much time people spend online, then you can't collect your data online, because you're automatically biasing against the people who are less skilled, who spend less time online, who are more guarded about their privacy online, etc. So that's why we go through this very elaborate, expensive, complicated, complex, time-consuming process. So we collected data back in 2009 on over a thousand students; went back a year later, not into classrooms but by mail, and contacted them with another paper-and-pencil survey; and again by mail in 2012, we re-contacted the same people one more time. So we have data on the same young adults over time, which is extremely rare when it comes to studying internet uses. It's actually generally rare when it comes to social science, but it's very rare in this particular field. Now, depending on the school you represent, you might be thinking, well, UIC students are probably not that relevant to us. But that's actually unlikely to be true. It's a diverse sample, and I'll show you how the findings tend to be replicated in other populations.
And I'll also say, as someone who teaches students at Northwestern, which is a different university, that anecdotally, through my personal experiences, I certainly observe very similar patterns. And in fact, some of my studies have been based on some of our students. Okay, so just one more thing, because some people are very skeptical about surveys: how can you know the quality of the data? So we actually implement a data quality check, where we make sure that people are actually paying attention to the survey. In the 2012 survey we had two such questions, and anyone who didn't answer both of them correctly we just pitched from the data set. So I stand behind the quality of the data; I think we have very good data here. Oh, and by the way, that was about 4% of respondents that we had to pitch. Okay, so this is the sample that I'll be talking about today. And in some ways they're actually quite representative of Americans. For example, by 2012, if they had finished in four years, they would have graduated; 48% had done so, which is actually very similar to four-year graduation rates across the US. So in those terms, they're quite representative. My hope is to go back, and I actually plan to do this next summer, which would be six years from their start in college, and see how many had graduated by then. So I collect lots of data about their internet uses, in addition to background factors. You can see from these data that they are in fact quite diverse. That said, they're still a particular population, right? So we need to be careful about generalizations. But if anything, if we find differences here in a fairly homogeneous age group, then were you to change things like age and education level, we'd probably find even larger variations. Okay, so at a very core level: some of the data I present are from 2009 and some are from 2012, and I'll always indicate that on the slide.
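As an aside, for readers curious what that attention-check screening step looks like in practice, here is a minimal sketch of dropping inattentive respondents from survey data. The item names, answer codes, and respondents are all made up for illustration; they are not from the actual UIC instrument.

```python
# Hedged sketch of attention-check screening: keep only respondents who
# answered every check item correctly. All names and answers are hypothetical.
def drop_inattentive(respondents, checks):
    """Return respondents who answered all attention-check items correctly."""
    return [
        r for r in respondents
        if all(r.get(item) == correct for item, correct in checks.items())
    ]

survey = [
    {"id": 1, "attn_1": "b", "attn_2": "d"},
    {"id": 2, "attn_1": "b", "attn_2": "c"},  # failed the second check
    {"id": 3, "attn_1": "a", "attn_2": "d"},  # failed the first check
    {"id": 4, "attn_1": "b", "attn_2": "d"},
]

clean = drop_inattentive(survey, {"attn_1": "b", "attn_2": "d"})
print([r["id"] for r in clean])  # [1, 4]
```

Requiring all checks to pass, as described in the talk, is the strictest rule; in a real pipeline one would also record how many respondents were dropped (here, 2 of 4; about 4% in the study).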
So, as I started saying, students in fact really do have a lot of experience with digital media when they come in. So we can, in fact, take for granted that they have been online. One of the things people have wondered about is: do they still use email? Turns out they do. I actually checked these figures at the bottom for the 2012 data, and it went up to about 92% that check email daily. And, for example, we pay young adults, because not all of them are students anymore, for their survey participation; in 2012 we paid them $25, and we would email them to verify their address, and we heard back very quickly. So if it comes to something they can benefit from, you bet they're checking email. Okay, so: awareness and understanding. I developed an instrument to study skill which is made up of 27 internet-related terms, and you can ask me questions later about how I know that this reflects anything that has to do with actual skills; that was actually what I did my dissertation on, and I have several published articles about the methodology behind this that I don't have time to talk about today, but I'm happy to address later. So there are these terms, some of which I call kind of basic internet-related terms, although by today some of them are kind of outdated, but I keep them consistent over time just to be able to compare. And we can see already here that not every term is something people are familiar with. Now, you might be thinking, okay, well, who cares if people actually know what BCC means? Does it mean they know what the letters stand for, or do they understand the meaning? Well, to check some of these, I also gave them multiple-choice questions. So here was the one for BCC. I think you'll agree that it's not particularly tricky; if you know what BCC is, I think you can pick from this list what BCC is. Yes? No? Okay, well, it turns out that a third of them could not.
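To make the instrument concrete: a composite skill score from a term-familiarity measure like this is typically just the average of the ratings across the terms. A small sketch, using a hypothetical subset of terms and a hypothetical 1-to-5 familiarity scale (the actual 27-item instrument and its scale are documented in her methodology papers):

```python
import statistics

# Hypothetical subset of internet-related terms, each rated by a respondent
# on a 1 (no understanding) to 5 (full understanding) familiarity scale.
ratings = {
    "bcc": 4, "pdf": 5, "spyware": 3,
    "rss": 1, "phishing": 2, "tagging": 3,
}

# Composite skill score: mean familiarity across all rated terms.
skill_score = statistics.mean(ratings.values())
print(round(skill_score, 2))  # 3.0
```

Because the items are self-reports, the composite is a self-perceived measure, which is why the multiple-choice knowledge questions (like the BCC one above) serve as a validity check.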
Which is worth mentioning, because there are all sorts of embarrassing scenarios that can come from people not understanding BCC. So this is partly important not just for universities but also, frankly, for employers, and even for you as employers of staff: for the incoming people from new generations, what do you assume about their knowledge and use of IT? Okay, then when it comes to what I would consider somewhat more advanced internet-related terms, though it doesn't really matter what we call them, knowledge drops quite a bit, and there's quite a bit of variation. I like to point at RSS, which, across very different data sets, because other people have now implemented similar measures, has been very consistent: pretty much no one knows what RSS is, other than people who actually deal with this stuff. And it's not used that much anymore, but there was a four- or five-year window when it was on every web page and everyone wanted you to subscribe, and the reality is that people just don't know what that is. So it's important to stay in touch with what the average user out there understands and knows. So here's another one, where you might be thinking, well, who cares if they know the term phishing? All you really care about is whether they recognize a URL and whether they know if it is something they should click on or not. Okay, so here's another multiple-choice test, revising the title of the slide. Okay, so I will give you a moment. Again, not super tricky; I think those of you in this room can probably get this right. Well, let's see if the students could get this right in 2012. So that's pretty bad, right? Because Russia is actually one of the sources of a lot of websites people might not want to click on. Not so awesome. And then, not so much with that one either.
So 12% of these young adults could actually recognize what is likely the website of a bank. And if you're thinking, where did you come up with that crazy first line? At the time when I constructed the survey, that was actually where you logged into that bank. Just saying. So, something to think about. Okay, so now that I've shown you that there's definitely variation in people's skills, and hopefully some of those data points were convincing, let's see whether these skills vary across the population. So one of the things that I find consistently is that when it comes to these measures, which undoubtedly are to some extent self-perceived measures, right, women always score lower. And this is persistent over time; women aren't really gaining on men. That said, gender is a super complicated question, and I don't want to focus the whole talk on that today, but I'm very happy to address it in Q&A. It's complicated; I have several papers that focus only on gender, even though when I started studying this entire area I didn't have any particular interest in pursuing a gender angle. It just kind of jumped out at me and kept yelling, look at me, look at me. So there's a huge gender story that I'm happy to elaborate on later. Okay, how about the relationship of skill to socioeconomic status, which I measure in this data set by parental education as a proxy, as is often done in social science? What we have is that socioeconomic status is related to people's skills. So those from less privileged backgrounds are less skilled than those from more privileged backgrounds. And again, remember, we're looking at people who are all in college, so in that sense they're actually already more privileged than lots of others out there. But even those who all made it to college, made it to this campus, are quite different when it comes to skill, related to their socioeconomic background.
So one of the questions I used to get, or actually still get quite a bit, is: well, can't people catch up? This is just one point in time. So this is why it's great that I've been able to study these people over time. And the following was really remarkable when I looked at the numbers: it's the same measure on the same people three years later. So yes, people's skills have gone up to some extent, but what's remarkable is that it's pretty much at the same levels across the different social backgrounds. So no, people are not in fact catching up. For those interested in a more general population: it's very hard to have nationally representative data, but fortunately the FCC, in a survey they did in '09, replicated a shorter version of some of my skill measures, and they found that both income and education were positively related to skill. This is interesting because, obviously, in my data set education doesn't vary. So again, socioeconomic status overall across the U.S. also relates to skill. However, it's important to note that under 50, age does not. So over 50 there's a relationship between age and skill, in that people who are older than that are less skilled, but at 50 and under there is no relationship between skill and age. That's important to remember when we have these assumptions about how young people are somehow savvier than everyone else. Okay, so let's move on to discussing joining communities. You might be thinking, who cares about 2007? Well, I want to show you this because back in 2007 there were all sorts of campus initiatives to incorporate various services in terms of reaching students, whether that was for courses or for recruiting prospective students, and it's important to recognize that different people hang out on different sites, and this actually continues today. What was also interesting is that in '07, some of the reactions I got were, well, who cares, this will change over time.
Well, it did change, but interestingly, the patterns again did not necessarily change, right? So everyone's use of MySpace went down, but when it comes to its relationship to socioeconomic status, that stayed consistent across those two years. Facebook did gain many more users across this spectrum, that's true. Just quickly: it's interesting that when I published the '07 paper, I got covered by the Chronicle, but when I tried to pitch it to more national publications for coverage, they said, who cares about a bunch of Chicago students? What's interesting is that two years later, Nielsen came out with data finding pretty much the exact same thing. So I show you this just to suggest that the data are in fact more representative than they may seem: Nielsen had nationally representative data, and they also found a relationship between socioeconomic status and what sites people are using. Okay, how about Twitter, which is another one of those sites that's been hyped to no end? So from '09 to '10, having heard of the site went up in this group; use, not so much. I mean, it went up quite a bit, but still, less than a fifth were using it in 2010. To put this in context, by 2010 Oprah had joined Twitter, and a lot of celebrities were on it by then; it was really getting quite public. I mention this because with internet services, people tend to have this idea that everything changed in the last two months, but some things change and some things really don't change that much.
Okay, so then, again, what's exciting: first, through other data, we did some focus group studies back in spring 2010 on Northwestern students, and I thought these quotes were helpful in again showing that just because they're young doesn't mean that they embrace every service and understand the use of every service. In fact, right now I teach a course on online reputation management to Northwestern students, and some of them have a lot of trouble understanding Twitter and why they should care, or they use it in ways that are in fact quite problematic: they've sort of moved off of Facebook in terms of their private information, because now their parents are on Facebook, but they use Twitter as though it were private, which it isn't; most of them don't protect their accounts. So in some ways it's actually even worse than what they were doing on Facebook, because it's completely public. That's more anecdotal, but I suspect we'd find it more generally. So, one of the things we found with the Twitter use data was that there was quite a bit of variation by race and ethnicity, and these findings were actually completely consistent with data that the Pew Internet Project had found nationally. What was exciting in our case is that, again, we had data for two points in time for the same people, so we could control for data from 2009 to see what predicted Twitter adoption by 2010, which is helpful because now we can actually talk about causality, since it's two different points in time. So, not surprisingly, based on what I just told you, being black predicted that you might be a Twitter user; and this is a graphical representation of regression results.
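For readers who want the shape of that analysis, here is a toy sketch of a lagged-predictor logistic regression: the outcome is measured in 2010, the predictors in 2009, which is what makes the causal-ordering argument possible. The data, variable names, and fitting procedure below are fabricated for illustration; this is not her actual model, sample, or results.

```python
import math

# Toy longitudinal setup: predict Twitter adoption by 2010 from covariates
# measured in 2009 (lagged predictors). All values here are fabricated.
# Features per respondent: [intercept, skill_2009, celebrity_interest_2009]
X = [
    [1.0, 2.0, 0.0], [1.0, 4.5, 1.0], [1.0, 3.0, 1.0], [1.0, 1.5, 0.0],
    [1.0, 4.0, 1.0], [1.0, 2.5, 0.0], [1.0, 3.5, 1.0], [1.0, 2.0, 1.0],
]
y = [0, 1, 1, 0, 1, 0, 1, 1]  # adopted Twitter by 2010?

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Plain stochastic-gradient fit of a logistic regression (no library needed).
w = [0.0, 0.0, 0.0]
for _ in range(5000):
    for xi, yi in zip(X, y):
        p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)))
        for j in range(3):
            w[j] += 0.1 * (yi - p) * xi[j]

# A positive weight on a 2009 covariate means it predicts 2010 adoption.
print(w[2] > 0)  # True: in this toy data, 2009 celebrity interest predicts adoption
```

Because the predictors temporally precede the outcome, a positive coefficient here supports the "interest explains adoption" reading rather than the reverse, which is the logic the talk describes.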
People had hypothesized, so there were big questions, like: why is it that African Americans are more likely to use Twitter? What's going on there? People were very curious to know, and it's a very interesting empirical question. So some people suggested, well, they tend to text more often, or have web access on their phones more often, which is true. We were able to control for this, and it turns out neither mattered in explaining why African Americans are more likely to rely on Twitter. One of the things that did matter was that those who had higher skills in '09 were more likely to adopt Twitter a year later. And the other thing that we collected data on in '09 was interest in various topics. So basically, that arrow from African American disappeared: what this means is that it's not so much that African Americans per se tend to be more likely to use Twitter; rather, there's a lot of entertainment and celebrity news on Twitter, and it's interest in the latter that actually explains adopting Twitter. It turns out some other things that people had hoped might be related to Twitter are not relevant. That doesn't mean that if you're on Twitter, you might not increase your interest in those topics, but being interested in those topics didn't predict joining Twitter. More recently, for 2012 data, in terms of socioeconomic status: in all cases, those from less privileged backgrounds are less likely to use these services. Why should you care? Well, for example, if you're using these services to reach prospective students, you should recognize that you're more likely to reach a certain type of prospective student, the more privileged types, than others. So that's something to consider if we strive for diversity, which I think most of us do. Moving on to the topic of contributing content: just quickly, I have studied people's experiences with writing reviews, editing Wikipedia, all sorts of different types of contributory activities. This is interesting because, again, the
opportunity of the web is to give more diverse people a voice online, but it turns out that they're not taking advantage of that that much; many people aren't doing that much. Again, we see a gender difference: women tend to engage in fewer of these activities. And there is some relationship, again, with socioeconomic status, in that people from the least privileged backgrounds tend to contribute the least. And then there is, perhaps not shockingly, a relationship between skill and contribution: those who know more are contributing more. And this is one of those cases where you could say, yeah, but if you're contributing more, your skill probably goes up. True. What's nice is that here I'm able to control for skill from a prior year, right? So your skill in 2009 is very much related to whether you're contributing three years later. So let's go back to big data for a moment. After sharing these results about how people's backgrounds relate to what they do online, let's take a hypothetical question that might be of interest to us, for example: how does digital media use relate to academic achievement? That's a question that many of us can rightly be interested in. Well, let's say we're researching this topic, and we collect data over time, and we show that indeed certain types of digital media uses improve academic achievement. That's great. But based on what I've shown you, and other research I've done, it's usually people from more privileged backgrounds who are doing more online, which means that most of our data are likely about those people, and it may well be that people from less privileged backgrounds are not improving their academic achievement. Of course, you could say, well, this is kind of a negative, dire approach; let's give a little bit of a slope to the lower-SES people as well. But the story might still be discouraging, because if we're looking at benefits comparatively, what we're seeing is that even if everybody's academic
achievement is going up thanks to some type of digital media use, those from more privileged backgrounds may be gaining more than those from less, and the gaps between them may actually be increasing. A lot of my work has shown that this isn't just true in certain domains but in lots of other domains as well: we mostly tend to have data on people who do a lot, because, again, big data grabs data about people who are leaving traces on the sites from which we grab the data. We have less data on those who are doing fewer things on these sites, and we have absolutely no data on people who are not showing up on those sites at all. These are all services that have served as the basis for various published studies about people's characteristics, but it's important to remember that none of those studies would include data on people who are not using these services. So when you're grabbing big data, just remember that you're excluding certain types of populations by design, and because use of many sites and services is biased toward the more privileged, those are the people who are more represented in your data. Whatever findings you have apply to those types of people, but not necessarily to other types of people.
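The sampling problem just described is easy to see in a toy simulation (a sketch with invented numbers, not her data): if the chance of leaving traces on a site rises with socioeconomic status, then an "average benefit" estimated only from the users the platform can see will overstate the benefit in the full population, no matter how large the data set grows.

```python
import random
import statistics

random.seed(42)

# Hypothetical population: SES drawn uniformly; the "benefit" of digital
# media use (say, an achievement gain) grows with SES. All numbers invented.
population = []
for _ in range(100_000):
    ses = random.random()                      # 0 = least, 1 = most privileged
    benefit = 1.0 + 2.0 * ses + random.gauss(0, 0.5)
    # Higher-SES people do more online, so they are more likely to leave
    # traces on the sites from which "big data" is collected.
    observed = random.random() < 0.2 + 0.6 * ses
    population.append((benefit, observed))

true_mean = statistics.mean(b for b, _ in population)
observed_mean = statistics.mean(b for b, o in population if o)

print(f"true average benefit: {true_mean:.2f}")
print(f"'big data' estimate:  {observed_mean:.2f}")  # biased upward
```

Collecting ten times more traces from the same platform shrinks the noise but not the gap between the two numbers: the people who rarely or never show up on the site are missing by design.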
There are other challenges to using automatically generated data beyond the first point I've shown you today. I've done other work on this, but in the interest of time I won't get into it right now, because I wanted to move on to the question of credibility assessment, another very important domain. I already showed you how easily students are fooled when it comes to URLs; how about other issues? They have an immense amount of trust in search engines, and to be sure, this generalizes to the population at large: we've also done studies on adults of all ages where this is also true. So in some ways, how people got to a website matters more than what's on the website. Because it was a top search result on Google, and almost everyone is using Google these days, that's basically what matters; it's not so much what's on the website. There are examples of very problematic outcomes of this, because sometimes not-so-great websites make it fairly high on Google, partly because Google doesn't necessarily distinguish whether a site is being linked to for negative reasons, like as an example of a really bad site; it just sees that the site is being linked to, so it must be great.

We did a study specifically focusing on Wikipedia. One of the strengths of this study is that the study itself was not a Wikipedia study; we just collected data about Wikipedia use in addition to lots of other things. This was through observations and interviews, and while some people understand how Wikipedia works, many don't. This is also reflected in some multiple-choice survey types of questions, so there's the potential for concern there in some cases; in other cases, not necessarily. Here's a specific example of the kind of hypothetical question we might give someone in one of these in-person observation studies, where we sit people down at a computer and give them hypothetical questions. What we found here was that a third could not figure out that you could go to a pharmacy and get emergency contraception over the counter, which is pretty problematic. In case you're wondering why I picked South Bend, Indiana: it's because it doesn't have a Planned Parenthood, which made it a more complicated case.

Okay, when it comes to privacy and security issues, just one example. As I showed, and as you've probably heard through the media, employment prospects can very much be influenced by what people do online. So we asked how much people are thinking about an employer type of audience when they're managing their privacy, and it turns out some of them aren't paying much attention to that at all, or any at all. Interestingly, here the gender story is reversed, in that women are more likely to be conscious of their audience. Our hypothesis is that it's probably because of all the media attention to how women especially have to fear strangers online, so they're raised with that concern in mind. That's an interesting finding. But here again, your skill does relate to how much attention you're paying to whether you should be managing your privacy settings.

To sum up a large body of work, and I have dozens of publications at this point: very consistent across the various studies is that skill is very important to what people do online, whether that has to do with managing your privacy, how you figure out the credibility of content you see, or what types of sites you're visiting. Why is it helpful to focus on skill? Well, while I've shown that gender and socioeconomic status matter when it comes to what people do online, they're not things that are easy to change. People's gender we probably don't want to change, and socioeconomic status is not something that is easy to change. I'm not saying skill is easy to change, but the
potential for intervention there is much larger than with the other factors. There are very few studies that have tried that type of intervention. I had a small study where it turns out it did matter: even a one-shot approach to teaching people some things about how to think about URLs and the site they're looking at can make quite a bit of difference in their future assessment of content. So there is potential for that, as long as we don't assume that everyone who comes to our campus is already very skilled and very savvy when it comes to digital media.

So, the things to take away: students really are very different when it comes to their internet skills. Some are very savvy, but many are not. The more privileged tend to be more skilled, and since skill relates to what people do online, it's the more privileged who tend to do more things from which they may benefit. To sum it up, remember I started with that slide where I had all the positive potential things, woohoo, and then all the anxiety-producing things about the internet. So here are the positive things, and here are all the negative things, and conveniently a smiley and a frowny make a Venn diagram. The message to take away from this is that it's somewhere in between, and it very much depends on the context and the user. We shouldn't jump to conclusions in either the positive or the negative direction when it comes to outcomes, but recognize the context in which technologies are used and who's using them, and recognize that potential outcomes are related to that. So now, just to acknowledge the funders of my past work and my current work, and also the many students who participated in these research projects. Thank you for your attention, and I'm happy to answer some questions.

Hi Esther, thanks for showing some really nice data.
We get a lot of comments from scientists and graduate students where people are saying, oh, you shouldn't talk about your research work online, because you never know what a potential employer might say. And yet that's a practice we really do want to encourage for other reasons: not only is it going to be an important part of the way they work in the future, but there's also that fear. So did you learn anything in your studies that might suggest a more informed way to respond when you hear this kind of FUD about talking about research online?

Okay, so I haven't done research on that, but I am happy to speak to it nonetheless. I actually write a column on Inside Higher Ed called Ph.Do, which has to do with professionalization types of topics, and this seems to be that type of a question. Are you thinking more of graduate students, since you're talking about talking about research?

Yeah, more than graduate students.

Okay. And did you have an example of why a potential employer might consider the research problematic?

No, I don't have a good example. Usually when I see this being discussed, it's presented as the thing not to do; there usually aren't any examples given, just a vague sense of why it's risky.

Sure. So one example I could come up with off the top of my head would be: research went poorly, you had some bad things happen, and you might be complaining about it on Twitter or something. Actually, I have another example. When I was hiring for a postdoc, one of the people who applied had just tweeted that she was excited to start data collection soon, and I thought, okay, if you're just starting data collection, you're not going to be ready to take this job in two months. In that case the person pretty much cut her chances of getting that postdoc. I was glad that I saw this, to be sure, but she wasn't helping herself.
So I think, in that sense, people need to be savvy about what they're sharing. What I tell my graduate students when it comes to talking about their research: on the one hand, I think it is important to communicate to people about research; on the other hand, especially with graduate students, who aren't yet known for many things, their dissertation is their one thing. I encourage some type of publication, at minimum a conference item on their CV, before they start talking about things publicly, so that they can own that topic. So I do think you have to be careful talking about research at an early stage when you're a student. You have to be careful as faculty as well, though I think the risks are a little bit lower. Otherwise, I just try to be savvy about it. I've actually edited a volume that talks about all the bad things that happen during research, but I don't think graduate students are necessarily the ones who should be talking about that too publicly, too often. Even recognizing that it happens, you don't want to be known only for all the problems you're having in your research.

In your research, have you done any measures of satisfaction level with using the internet, in relation to skill?

Do you mean just asking how satisfied they are with what?

Like, whether they achieved what they were actually trying to do.

Oh, no, I have not measured that. It's more of a psychological variable, and I don't have much of a psychology background, so I haven't really gone in that direction. I definitely think there's research out there that looks at that. There's also research that looks at people's motivations to do certain things. Not shockingly, I think that relates to what people do, but I think it's important to recognize that what people are motivated to do is probably also related to background.
Did you have thoughts on why? I'd love to hear why you think that, what part of the model it would fit into, or what we'd learn from it.

I was wondering if there's a point at which people with a high level of skill become less satisfied, because they tend to get caught finding 17 ways to do something and not being able to choose which way to go.

Yes, that's a really good point. I can address that from more of a methodological perspective, because in that sense I have thought about it. Back in '01–'02, when I collected the data for my dissertation, the way I measured actual skill was how long people took to find different types of content. Back then I think time to completion was a good measure. Today I don't think it is, because today, if you're really skilled, you're actually going to take more time, because you're looking to verify information. So it's not the motivation aspect, but I think it's a similar idea: those who are more skilled sometimes end up doing things that may seem counterintuitive for someone more skilled, precisely because they are more skilled. In another study we've done more recently, on adults of all ages looking for answers to complex health questions online, what we found was that people who are less skilled completely dismiss ads, for example, whereas people who are more skilled recognize that those are ads on the page but also recognize that ads could be helpful. So there is a more nuanced understanding if you're more skilled.

Someone there had their hand up, but I don't know if they wanted to walk up to the microphone.

I've recently seen some research that students think their skills are better than they actually are, and that creates a challenge in getting students into programs where the library in particular wants to help them improve their skills.
Do you have any findings related to how you get past that barrier and get the students in, to have them realize they could use help?

Yeah, you're absolutely right. As I showed, I had a slide that said everybody thinks they're better. In the focus group study we did, it was really interesting, because I showed you those two quotes where they had no idea why they would use Twitter, right? And then a few minutes later I asked, oh, so do you mention on your CV the kinds of social media you're familiar with or use? And they said, why would we, everyone knows we know all of them well. They said this two minutes after saying they don't know what Twitter is for. So there's complete cognitive dissonance; they don't understand what they don't know, so I completely get this. It's very hard to get people in. I'm not sure how this would work, but it can't be optional; it has to be something much more required, and I know it's very hard to get required things into orientation programs. I'm completely familiar with that. Maybe it doesn't have to be the first week of classes. It's a huge challenge to know where to introduce all this into curricula, a huge challenge. But it cannot be just the people who think they need it, for sure. I think you're absolutely right. I think it means working with programs, or faculty, or advising, or maybe residential colleges, who knows, so that it becomes more of a required type of activity. Brian?

Well, it's maybe related to the question that was just asked. You started early on in your talk with the digital divide, and that suggests an ethical issue, I think, in terms of support for diversity, support for fairness, and so on. So the question becomes, well, what are the interventions? Maybe what you're suggesting, or just suggested, is that the interventions are among, let's say, college and university staff, educators. Are there other options?

I'm sorry, what was the last sentence?
Are there other options? Or do we need to get educators more aware of this, because the educators are not very aware of it?

Yes, that's actually a very good point. Part of the challenge is in fact that most faculty think their students are so savvy anyway. So it's not just the students; it's the faculty as well who think the students are savvy. They would never say in their syllabus, go to the library, go take part in this session, because they just assume the students know more. And it may be that the students know more than a particular faculty member, but that doesn't mean they know enough. So yes, I think that's a good point, Brian. Part of it is educating the faculty themselves that students don't know everything. That could be an important aspect of it. I also wanted to mention, for those of you at schools that are maybe less diverse, with more students who come from more privileged backgrounds: I don't think your takeaway can be, oh, we're good, because we have all these privileged students. You probably still have students who come from less privileged backgrounds, and on some campuses those students, because their numbers are smaller, probably feel even more marginalized in some ways, and might feel they have fewer resources to catch up. So you really need to be careful that they have those opportunities to catch up. Yes?

Do you have a hypothesis as to why students with higher socioeconomic status have what you call higher skills? And a follow-up question: are the skills really what you're measuring, or do the scales measure the acquisition of something else through socioeconomic status differentials?

Yes, good questions. The first one I think I can hypothesize about, and some of it we've actually studied.
We have a paper that we're wrapping up where we look at parental support with technology use while these students, the same sample, were growing up, and we find that positive support from parents, positive engagement with the student regarding technology while they were growing up, is actually a predictor of skill. A lot of the focus is usually on parents monitoring and restricting, but in this case our questions have to do with whether your parents sat down with you, or a parent asked you to help with something, basically supportive types of things parents do. The students from higher-SES backgrounds are more likely to have parents who did that, so that may be one mechanism. Other hypotheses that we haven't tested have to do with those parents knowing more themselves: because they are more educated, they might use these technologies more on their jobs, and so are actually able to pass along knowledge about some of this technology use. I don't do research on kids, but others who do have found that kids in fact learn a lot from their parents, even when it comes to digital media, so that could be another mechanism.

And then the other question: am I really measuring skill, or what am I measuring exactly? Good question. My dissertation, over a decade ago, was specifically about measuring actual skill and then coming up with survey measures that were the best proxies for it. At that time, the actual skill I was measuring had to do with information seeking, both in terms of effectiveness and efficiency: could they find certain types of content, and how efficient were they at doing so? I tried all sorts of survey measures to see what correlated well with the actual skills, and this type of measure, asking about terms, was the best proxy compared to others.
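The validation step she describes, correlating candidate survey items with observed skill and keeping the item that tracks it best, can be sketched roughly like this (the respondents, items, and noise levels are all invented for illustration):

```python
import random
import statistics

random.seed(0)

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical respondents: observed (tested) skill plus three candidate
# survey proxies, each tracking true skill with different amounts of noise.
n = 1_000
skill = [random.gauss(0, 1) for _ in range(n)]
proxies = {
    "terms_known":  [s + random.gauss(0, 0.5) for s in skill],  # familiarity with terms
    "self_rating":  [s + random.gauss(0, 1.5) for s in skill],  # noisy self-assessment
    "hours_online": [random.gauss(0, 1) for _ in range(n)],     # unrelated to skill
}

# Keep whichever item correlates best with the observed-skill benchmark.
for name, values in proxies.items():
    print(f"{name:12s} r = {pearson(skill, values):.2f}")
```

In this toy version the term-familiarity item wins because it carries the least noise; in her actual studies the benchmark was performance on real information-seeking tasks, not a simulated variable.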
Now, obviously the internet terrain has changed a lot over the last decade, so there's a need to update some of the instrument, an incredibly labor-intensive process, so there's not a ton of work on it, but we're working on it. We also try other types of multiple-choice questions, things like that. That said, I'll take this opportunity to touch upon the gender question I mentioned. One of the papers I ended up writing from the dissertation data, where I had data on actual skill, showed that regardless of actual skill, women rate their skills lower than men, and it turns out this is completely consistent with measuring self-assessed skill in any domain in the literature. We don't know if it's women underestimating their skill, men overestimating theirs, or both, but that's what happens. That said, I have a paper showing that regardless of whether this is actual or perceived skill, it still influences what people then do online. Once you control for this perceived-skill measure, it explains away differences in who's contributing, for example. So whatever it may be measuring, even if it's just in your head, it seems to influence what you end up doing, so in that sense it's still a valuable instrument.

It's an important question, because if you are going to do an intervention, is it a skill-based intervention, or is it some other sort of qualitative or educational intervention?

Yes, that's a very good point. And I don't want the takeaway to be, oh, you walk in there and teach them these ten terms and suddenly they'll be better; that's not the point. Those are really just proxy measures for skill.

I have a question, maybe related, which is: I wonder if there's a possible correlation between disposable time, if you like, time to play, and socioeconomic status, because that would have an implication for useful interventions.
Yes, it's a very good point. The ways in which I've been able to control for that in the statistical analyses suggest that it's not necessarily about that. For example, in yet another study that my colleague Aaron Shaw and I have been working on, from the 2012 data set but controlling for skill from '09, we look at contributions to Wikipedia, which we know are very uneven across genders, and we control for whether they work, for example, which would be one of those discretionary-time issues, and it's in no way significant when controlling for a whole bunch of other things. It's a little bit hard to know about discretionary time with this population, but I have always collected data on whether they have a work-study job or another type of job, and it's never been significant. So I try to control for it. By the way, on the Wikipedia question, just so it's not left hanging in the air, what was the result: yes, there is a gender difference, but it turns out skill is a huge part of the puzzle. If you take a man and a woman, neither of whom has much skill, neither is editing Wikipedia; it's the really highly skilled men who are doing a lot of the Wikipedia editing, basically. Yes?

How do you store your data?

How do I store my data? Well, I have lots of backups, which, it turns out from the data I've collected, lots of people don't, but I have lots of backups; that's one of the ways I store my data. But of course, as per IRB compliance, it's on a password-protected secure server on campus, only accessible through VPN connections. Anyone else? Yes, looks like there's someone else.

Thank you for this; it's great to see real data. But can you tell me more anecdotes? You had great examples of how not being able to decipher a Russian URL might have deleterious consequences, and I think we can all imagine or have experienced bcc or Facebook privacy-setting issues.
Do you have any examples or data that show the positive side of internet fluency, what that might lead to in terms of better jobs, better connectivity, more success?

Sure, anecdotally. Actually, my colleague Brayden King and I are writing a book now on managing your online reputation, which will come out with Princeton University Press. In that book, our whole point is that a lot of the focus has been on hiding yourself, because who knows what might happen, and our point is actually, no, put yourself out there, just be really careful about how you're putting yourself out there. We want to get people to think carefully about the positive ways in which they might put themselves out there. I've been teaching a class on this topic, and I have collected tons of anecdotes of how young adults, and not only young adults, have benefited from things they've done online, whether that's showcasing their various creative talents, or, certainly for people in the computer science and engineering fields, showing a future employer what they can do. So yes, there are definitely examples of that; it's much more anecdotal, though. I am doing some studies now on other types of potential benefits, like economic benefits, but partly that needs to be longitudinal, so it just takes longer to do, so I don't have results in terms of systematic studies.

That's a great teaser to have you come back. Thank you so much, and that is a great excuse to invite you back as you progress with those studies, and I suspect I speak for a number of people in looking forward to seeing that book as well. That was just an amazing talk, so much in it to think about, and just a wonderful way to end our conference. Please join me in thanking Esther one more time. And with that, we are adjourned. I wish you safe travels, I wish you good holidays, I wish you all the best for 2014, and I look forward to seeing you all in various places in the near future. Thank you.