All right, good morning and welcome to this week's edition of Encompass Live. I'm your host, Krista Porter, here at the Nebraska Library Commission. Encompass Live is the Commission's weekly webinar series where we cover a variety of topics that may be of interest to libraries. We broadcast the show live every Wednesday morning at 10 a.m. Central time. But if you're unable to join us on Wednesdays, that's fine. We do record the show every week, and it is posted to our website for you to watch at your convenience. And I will show you at the end of today's show where you can find those archives. Both the live show and the archived recordings are free and open to anyone to watch. So please do share with your friends, family, neighbors, colleagues, anyone who you think might be interested in any of the topics we have on the show, both our upcoming shows and anything in our archives. We do a mixture of things here on Encompass Live: book reviews, interviews, mini training sessions, demos of services and products, anything that we think may be of interest to libraries. Here at the Nebraska Library Commission, we are the state agency for all libraries in the state, all types. So we run the gamut of topics for different kinds of libraries. We have things that are for publics, for academics, university and college, K-12 schools, museums, correctional facilities. Anything that's a library, that's really our only criterion. We're very broad in that. We sometimes have guest speakers that come in from outside of the Library Commission, from outside of Nebraska as well, from all over the country. But we also sometimes have Nebraska Library Commission staff that come on and do presentations for us, as we have this morning. With me today is Amanda Sweet, who is our technology innovation librarian here at the Nebraska Library Commission. Good morning, Amanda. Good morning. And she does all of our, as her title indicates, techy stuff, which is kind of broad.
And today's show, actually, Ethics and Technology, is the second part of a two-part series she is doing. Back exactly a month ago, on February 13th, she did a session on What in the World Is Emerging Technology, and the archive for that is available now. So that was the first part of this, and then today we're going to talk about the ethics of that emerging technology. So if you haven't watched the previous one, don't go now, watch today's first, but go back afterward and you can get the basis, the background on what we're talking about today. Kind of an overview of what the tech is. Yeah. So today's show and that one will go together as a two-part deal for us. So I'm just going to hand it over to you, Amanda, to tell us all about what we should know about the good, the bad, and the ugly of ethics and technology. So again, I'm Amanda Sweet, and probably the first question you should ask me is, why am I qualified to talk about ethics and technology? And that's probably something that you'll want to ask yourself about anyone that you're learning technology from. And the thing is that in libraries, our goal is for everyone to become qualified to talk about ethics and technology. We all use it every day. We use it, and we use machine learning and things like that, in ways that we probably don't even realize or understand all the time. And so one of my main goals in doing these kinds of presentations is to redefine what an expert in technology is. Because I studied the developer manuals for machine learning, for ARCore, for augmented reality, and I studied the development of virtual reality. So I learned more about how it works, and knowing how it works is one of the key components to understanding whether or not you should use it. I don't expect everyone to read the entire developer manual for all of this tech. It's a lot.
And I also did background research into ethics in technology and human-computer interaction. And I did a lot of self-study on Coursera and through a computer science intro course through Harvard. So basically, I did some background research. The reason I did that research is because I started looking more into technology and felt like I was drowning. And I think that's a very common feeling for all of us. I was just trying to figure it out, not even for work or for a library, just personally. Yeah. So when you feel like you're drowning in tech, which I'm sure everyone is... I mean, it's hitting us at work, it's in our personal lives, it's everywhere we go. So when you feel overwhelmed, learn more. And that's what we're going to do today. So I'm going to give you a very, very, very basic intro into ethics. Ethics is basically the concept of, is what I'm doing right or wrong? It's the moral code that we use to make our decisions. And every ethics person you ask is going to have a different definition of ethics, a different approach to ethics, and they can make it as complicated or as simple as they want. Today, I'm going to make it simple because we have some other stuff to cover. So I'm just going to give you a few techie things to keep in mind, and I'm going to give you three main concepts to watch out for that are going to recur in several different areas of your life. So instead of focusing on just social media, or just machine learning, or just Google search engines, I'm going to break it down into larger concepts, because those concepts are going to be applied to multiple different things in your life, and in the future in ways that we don't know yet. That's very useful, yeah. And the things to keep in mind are also going to tie into how technology affects us, because there are some questions that we should start asking ourselves as we're choosing new technology and using the technology we have now.
And then this presentation, I'm in a library, it's for librarians. So I'm going to do my best to gear this tech talk toward librarians. I can't guarantee I'll be successful in narrowing it down that way all the time, but I'll do my best. So first off, ethics. This is the most simplified version of ethics that you're ever going to get. It's the variables that you put into a decision. It's understanding what goes into the search engine, understanding what goes into the techy stuff. Technology is so broad that there's no one way to apply ethics to every single aspect of it. You just start formulating your own questions and looking at it from different angles. So my goal today is to start getting you to look at this from a different perspective. To look at it not just as, my friends did this, so I'm going to do it too because it got more likes. And that's also the basis of digital literacy, which is what we're going to get to a little later on. This is my favorite goal for ethics: just do good things. It's looking at the world around us, seeing what the world is right now, and then thinking about what you want the world to be. And then, how can you use technology to make that happen? And instead of just finding a problem, looking at your list of apps, finding a keyword that matches your problem in the title of an app, and throwing an app at it, try it a different way. Try looking at it and saying, this is my problem. These are the issues that are contributing to this problem. This is the technology that will solve the issues that will eventually solve the problem. This leads me to: is tech good or bad? It's all in how you use it. It's not black and white, and it's not a yes or no, yeah. Blockchain started with cryptocurrency, but now it's starting to be used to help the homeless. So many amazing things coming out of that. You keep seeing something new every day.
IBM is starting to do more research into using tech for social good, and Google is starting to do AI research and funnel the use of AI toward social good, because artificial intelligence is all in how you look at it too. If you go to Google right now and search, how will AI save the world, you're going to get a list of stuff. And if you go to Google and search, how will AI kill us all, you're going to get a list of stuff. So again, ethics is all in your mindset when you start out at the beginning. If you go looking for bad things, you will find them. If you go looking for good things, you'll find them. And if you try to be unbiased, good luck. So the first thing I want people to start keeping in mind is how technology shapes individual identity. I'm going to touch on this a little bit, and in the handout that I have, I'm going to give you more information about where this concept comes from. It's rooted in psychology, in ethnography, in anthropology. It's the study of humans: how we work and why we work, what makes our identity, what makes a person a person. And that seems like a really, really large concept for just studying technology and figuring out what you want to use, but it's part of it. I mean, look at social media. It's changing the way we think about things. Yeah. Just try to start thinking about who you want to be as a person, how other people affect you, how technology affects you, how social media has started changing the decision-making process. And I'm going to use a specific example of how social media has started changing the way even business leaders make decisions. Oh, yeah. It's inevitable. Yeah. And this one is one of my favorites: when your self-esteem is dependent on how many likes you get. Should we really be tied to a thumbs-up on a screen? It's, how do we want to think about ourselves? It's nice, but don't base your entire life and all of your self-esteem on that.
There's so many other things. Yeah. And that's part of digital literacy: a lot of it is who do you want to be and what defines you. And that's a hard thing for teenagers that are forming an identity to do. It's always going to be hard at that age. Yeah, it's hard at any age. Yeah, it was when we were kids, and we didn't have all of this. Yeah. And now it's just another venue they have to deal with. Who am I, and how am I going to present myself to everyone else? Yeah. And this will tie a lot into human-computer interaction too. Oh, yeah. I mean, it all will. And then, how technology shapes society and the world as a whole. So technology is hitting us at an individual level and at a community level. It can either bring a community together or it can redefine it. It's expanded our definition of community to not just the people that are in our immediate vicinity, but the people that are in our interest groups nationwide or globally. And when you have options to go to a nationwide group or a global group, you might have less of an emphasis on your immediate community. And that community can either falter or flourish based on that. And again, that's going to depend on how you get involved in it and how you think about technology, and whether the major organizations in your individual community are leveraging that technology in a good way to bring people in. I'll touch on that a little bit later, but that's almost another presentation. We can do that another time. And then the dual identity of Facebook, and social media in general, and companies, actually. We actually teach this to kids in digital literacy: to filter what they put on social media. We say, cast yourself in a positive light. Your future employers are going to look at this. And this is how people are going to define you in their heads. So we are teaching people to create a dual identity.
And so now the question is, who are we online? Who are we in real life? And is it healthy to have this dual identity? There's actually a lot of research going on right now, a lot of human-computer interaction research, about the psychological benefits and detriments of doing this. And unfortunately, we won't really have an answer until we have more data. It's only been too short a period of time to see the real long-term effects. Facebook didn't even go public on a larger scale until about 2006. It started out in just isolated universities, and then it expanded out into schools, and then it expanded out into the general public. So Facebook has evolved and social media has evolved, but have our privacy policies and our usage policies evolved with it? We won't know the full ramifications of this until years from now. It's unfortunate, but a lot of this technology is that way. And that's why it is really important to start emphasizing the humanities in this. Psychology, anthropology, the study of humans, and start incorporating that into how technology is made. And personally, that's one thing that I would want to put more into STEAM. Because right now, we don't really put a lot of that in. The A in STEAM is silent. There are people that want to add it, people that want it to be, I don't know, some kind of huge long acronym. I think it all just needs to be connected. I mean, this is interesting, thinking about how you present yourself on Facebook or anywhere on social media compared to who you really are. I've been on Facebook since, I believe, 2012. I thought it was like 2007, but I've been online with things for so long. It's just a new place to do the same thing you always have.
I mean, in real life, you don't always say everything that's going on with you to somebody. There are certain things you do tell your boss about your personal life or how you're feeling, and certain things you don't. And same thing with your friends and your family. There are always these levels of, this is who I present myself as to you in person, and this is what's really going on in my head. Here's how I present myself to my mom when I call her to cry about whatever happened today, which is different from how I keep myself as, you know, a strong person, a strong woman here at work, and hold it all in. So we've been doing this already. It's just that now there's this other venue, and lots more people can see it, and for some reason people think it's something totally new. The venue is new, but what you've been doing isn't. You've always chosen what you'll say to someone, how you'll present yourself to someone. And I do see the people on Facebook for whom everything is always perfect, whose kids are angels and never do anything wrong. And I know that's not true, because I'm a human being and I understand kids are sometimes bad. It's okay to just share the fun things. And then there are the people that are the complete opposite, where every couple of hours it's a drama. Yeah. I just don't know why I go on anymore, or now I've got this bill that's coming in, and then tomorrow it's something else. I mean, is it really that extreme? Yeah. And you wonder, is that something you would really just say to your neighbor or your coworker? Maybe not. And you've already been doing this. Yeah. I mean, we're all filtering too. Yeah, we always are. Yeah. I don't tell everybody everything. Yeah. I mean, practically no one does. No, you need to have that. Yeah.
But you're going to find that. And I wonder if the social media, the online location, is because you're sitting at a computer, potentially all by yourself. Yeah. You don't see another person's real face. And that makes it easier to say the things you normally wouldn't, and that's maybe where some people need to learn to pull back, because maybe it's not the right thing to put out there, given who may see it. But there are some ways that that is good too, though. Having that digital distance can also make people open up in ways that they never would have before. Yes. And if you have some groups that you interact with, like a therapy group or a group of friends in a little private room or location, it's, I couldn't say this to anyone else, but you guys will listen to me. Or, I know the people who I am friends with on Facebook will help me through this and either support me in whatever is going on or smack me upside the head, which is what I needed, and tell me, stop it, don't do that, that's not right, go and do the right thing. Maybe that is easier for some people, not to say to a person in person, but to say online and get that interaction instead. So it can be helpful. Yeah. You've just got to be careful about who you're doing it with. Yeah. And this right here, it's an ethics discussion. Exactly. Yes. Is this right or is this wrong? And it's going to depend on the application, the situation, who you're dealing with. Yeah. It's heavily nuanced, and it's going to be different for each person and each group, and whether you're using social media for your personal life or for your business. Exactly. A lot of people have merged them, because it kind of evolved that way.
And a lot of people have totally separated it. Yeah. There are people, whether it's allowed in the terms and conditions or not, who have two Facebook profiles. Yeah. One is just for my family and friends, and one is for my professional face. Yeah. And then that's almost a triple identity, if you can keep track of that. Yeah. So let's go on here. These are the three main concepts that I'm going to talk about today, and I'm going to illustrate the three concepts using two main examples: Google and Facebook, because those are the two main things that librarians use in everyday life. In the previous session, I talked more about blockchain and artificial intelligence as a whole, and I dove a lot into machine learning. I'm not going to touch blockchain as much today, because it's more unknown, and it's not something that librarians are going to use on an everyday basis right now. It's coming. But libraries are starting to investigate it. Yeah. Maybe for a future presentation. But possibilities. Anyway, user experience design is people who study other people to decide what makes them tick, what will make them click on your website, what will make them use your product, what will make them feel good about spending a lot of time on your site. And you'll notice that I talk about this a lot in a digital sense, in a website sense, but it's used a lot in product design. It's used in pretty much everything. There's another link that I'll put in the handout that'll give you a bunch more information about what that career field is and what that concept is. And that's a whole other field of study. Oh, yeah. It's used in library science a lot, and that's the example that I'll use to describe it: when we do collection development and we're trying to decide which books we want in the library.
We do a community needs assessment, and we figure out which different professions are in the community, what the interests of the community members are, and what people actually need. And then we try to gear our library collection toward that, and we also try to add in a few things that could be helpful information for those people to have. So we do that already. And when we build a library website, we track which pages people go to. Sometimes we track how long they spent on there. We can track a lot of stuff. And it depends on what you track and how you use it, and that'll feed into statistical analysis. And that's a lot of what Facebook uses; it's like their lifeblood. But machine learning, the basis of it is stats. It's stats. It's math. It's trying to identify a problem that can be solved in a quantifiable way. And then you're trying to apply that to a human problem. An example that I like to use that is rather controversial... Oh, cool. Isn't it? Yeah. They started trying to apply this to court cases: whether a person goes to jail to sit and wait for their court date, or whether they're able to go home. So they tried to figure out the different factors that go into a judge's decision about a case like this, and then they tried to build a machine learning model that replicates those factors. And it's having mixed results. I'm just wondering, is it accurate? And that's where you get into biased algorithms. Because if you're African-American going into this model, will you have a higher chance of going to jail to wait out your time? And they also ran into biased algorithms with Amazon. I'll finish this example before I use the Amazon example, just to keep them separate. But there is a huge controversy over whether that machine learning process should be applied to court cases, because that is an extremely complex human decision. It is. It's very... Yeah.
It's going to be very subjective, each person's personal view of it. Yeah. And how can you quantify that? Is it something that is clear-cut enough that stats can solve the problem? Yeah. And that example gets more complicated depending on how deep you want to dive into it, but that gets you started thinking about it. They're also using it in New York to start identifying where crime happens and do crime stopping. For instance, they were starting to get clogged drains in different areas because restaurants were dumping grease down into the sewers. I don't even do that in my own kitchen. Yeah. So they did a stats analysis to find restaurants that used a large enough quantity of grease to have clogged up the drain, and then they used a location analysis to determine which restaurants would have been nearby enough to have been the cause. And based on where the water was going and how the street flows, they determined the most likely restaurants to be the culprit, and they had a 90 percent success rate of identifying the restaurant. Wow. Okay. And so it kind of works. Yeah, for something that's less subjective. I mean, it's all just water flow and logic: well, if this, then this. Yeah. The thing is, when we involve people... Right. That's the difference. Yeah. And people are nuanced. They can be an unknown entity. And that's why, if you look at, I forgot to put this in the handout, but if you look at the Wolfram language... Wolfram language is one of the languages that goes into the machine learning algorithms that are used in Siri and in natural language processing. But the only aspect of Wolfram language they use is the stats and math, because that's more clear-cut. Yeah.
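The kind of analysis behind that New York grease example can be sketched in a few lines. This is a hypothetical illustration, not the city's actual model: all the restaurant names, numbers, and weights below are invented, and the real analysis would use real usage records and street-level water flow data.

```python
# Hypothetical sketch: combine how much grease a restaurant uses with how
# close it is to a clogged drain, then rank the likely culprits.

def rank_suspects(restaurants, drain_location, max_distance=0.5):
    """Return restaurant names sorted from most to least likely culprit."""
    def distance(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    scored = []
    for name, grease_gallons, location in restaurants:
        d = distance(location, drain_location)
        if d > max_distance:  # too far away to be the cause
            continue
        # more grease and less distance -> higher suspicion score
        scored.append((grease_gallons / (d + 0.01), name))
    return [name for score, name in sorted(scored, reverse=True)]

suspects = rank_suspects(
    [("Diner A", 120, (0.1, 0.1)),
     ("Cafe B", 30, (0.05, 0.05)),
     ("Grill C", 200, (0.9, 0.9))],  # Grill C is outside the radius
    drain_location=(0.0, 0.0),
)
print(suspects)  # ['Diner A', 'Cafe B']
```

The point of the example is the one Amanda makes: this works because water flow and grease volume are quantifiable. Swap the inputs for human factors and the same ranking logic becomes much harder to justify.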
But Wolfram language also dives into more psychology analysis, and Wolfram is trying to pre-build a lot of this decision process into the original algorithm. So is that going to be helpful in creating the algorithm? Possibly. Or will it further obscure the factors that went into the model? It's a toss-up. Yeah. Anyway, I go on tangents about machine learning, but we only have so much time. Oh, a question about that: the Wolfram language, what was that? Oh, W-O-L-F-R-A-M. Yeah. It is a paid program, but if you get a Raspberry Pi, it is free on there. If you get a Raspberry Pi loaded and go to the Linux homepage, you'll see Mathematica. Mathematica is one of the languages that's tied in with Wolfram, and Wolfram is a whole other entity. Do you want to add that to your handout? Yeah, I can add it on there. Yeah. And she's mentioning the handout: yes, these slides, plus a separate handout with the details of what she's talking about, will be available to you afterwards. Our previous session last month had the same thing. The slides were posted afterwards, along with a more detailed handout that has links to all the different concepts and ideas and useful things that she's talking about today. And Wolfram itself is really overcomplicated, so the one that I'm going to send you to is the Wolfram Demonstrations. The Wolfram Demonstrations is a more user-friendly kind of intro to it. Someone there applied psychology principles to rock, paper, scissors. So that's kind of a cool way to learn how AI works and how you can build a data set just by playing rock, paper, scissors digitally.
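The idea behind that rock, paper, scissors demonstration can be sketched very simply. This is a hedged illustration of the general technique, not the Wolfram Demonstrations code: the program builds a data set of the human's past moves and keeps picking the move that beats the human's favorite.

```python
# Minimal sketch of a move-frequency learner for rock, paper, scissors:
# count which move the human plays most often, then play its counter.

from collections import Counter

BEATS = {"rock": "paper", "paper": "scissors", "scissors": "rock"}

class RPSLearner:
    def __init__(self):
        self.history = Counter()  # the "data set" built from play

    def observe(self, human_move):
        """Record one human move into the history."""
        self.history[human_move] += 1

    def predict_counter(self):
        """Pick the move that beats the human's most frequent move so far."""
        if not self.history:
            return "rock"  # arbitrary opening move
        favorite = self.history.most_common(1)[0][0]
        return BEATS[favorite]

ai = RPSLearner()
for move in ["rock", "rock", "paper", "rock"]:  # this human favors rock
    ai.observe(move)
print(ai.predict_counter())  # paper
```

A real demonstration would track richer patterns, like which move follows which, but the principle is the same: the more rounds you feed it, the better its predictions, which is why the machine should be beating you after enough games.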
So the computer learns the decisions that you would make, and by the 50th time you play, the machine should be beating you. And you can go through a bunch of those different demonstrations and learn more about how linguistics goes into AI, and different things like that. So it's cool. And big data you'll find everywhere. Big data is basically defined by the sheer volume of data that is available. So when you go on to social media, when you go on to Facebook, Facebook is collecting a lot of stuff. Yes. And that data has to go somewhere, and then it has to be analyzed in different ways, and it can be sliced and diced and interpreted in every way under the sun. So that's just my little sample slide of what we're going to go over. I'm going to skip ahead, because I spent way too much time on that last slide. I apologize for that. Sorry, I got chatting. So I'm going to demonstrate user experience in just two quick slides here. So this is Facebook: Facebook helps you connect and share with the people in your life. This is ethics at its greatest, according to them. Is Facebook addictive? It really depends on how you look at it. And when you look at different websites, start looking at how they present themselves. And if you look at how a company presents themselves, look again at the decisions they make, because we pay attention to what makes the headlines. We don't always pay attention to how they make their ethical decisions internally. How would we know? And so this is an example that I like to use. Think about how Facebook is using our data. Cutest little baby you'll ever see. But let's take a look here. Facebook has been more or less studying us. They look at different keywords, and they look at people who have had kids. This is in a section called Facebook Insights, on Facebook's marketing page. And this is what businesses use when they're trying to get us as users to click on their ads.
So Facebook has sliced and diced our data to show marketers how to better target us as individuals. One section specifically looks at whether you have children or don't, and it shows how priorities change. It shows that early parents go online more; they're trying to find out what in the world they're doing. Sure. And so we as users are online and we're posting baby pictures. I've never had a kid, so I don't really know, but I have a bunch of friends who have kids, and it's interesting. And that you mentioned, is that early parents? So you're talking, like, with their first kid they go online and do this more than when they have a later one? They track the behavioral changes. I can see that. I've got friends who have one and then a second, and I remember, with the first baby it was all over the place, and then with the second they're like, yeah, this is what she's doing today. Yeah. And I'm like, wow, I almost forgot that there was a second one. They're just a little bit more laid back about it. With the first, they're trying to figure it out; by the second, they've figured it out. So they don't need to be on as much. And so Facebook collects all this, and they give their potential marketers pretty much everything they need to know about user behavior so that they can target their product to us as individuals. So is it genius on their part? Yeah. Is it a little creepy to the people who just had babies? A little. And if you look at their series and reports in their Insights section, you'll see a lot of these different sections, and there are a lot of different experiments that Facebook has been found to do on people. Facebook occasionally changes the way they put their feed out to people, and they've actually done a mood experiment to find out if other people's statuses affect your personal mood.
So the way they did this was, they chose a sample size, a little over 600,000 people, chosen based on keywords, and they would feed only good statuses from their friends to some of those people. And they found out that the people who got only good statuses started putting out positive statuses of their own. Yeah. Sure. And the people that got negative statuses either did not post, or they put out negative statuses. And so Facebook has been doing these little mini experiments, and some are like, oh yeah, that makes sense from a marketing standpoint, and others are like, why in the world would you do that? And some of them make the headlines, some don't. And the thing here, too: all of these information reports, this is all public. This is all public. What we're looking at here, this isn't something special that we only have access to for some reason. Anybody can go and see that they've been collecting this and making it into a presentable, here's how things are affecting people. They tell you; we signed all this away when we signed up, but we just don't read it. And if you have a Facebook page, I know, because I run a few, I run one for Encompass Live, they're always pushing you to check your Insights. Here's where you, for your own particular page, can get data on what people are doing and how they're interacting with your page. Yeah. And what we can do with that... I don't know if I have the experience or the knowledge to do as much as what they're doing, and I don't have as big a sample set, of course. I don't have 600,000 people who like Encompass Live, unfortunately. But we do have those insights out there. Yeah.
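The keyword-based filtering in that mood experiment can be sketched in a few lines. This is a hypothetical illustration of the general technique, not Facebook's actual system: the word lists and statuses below are invented, and the real experiment classified posts at a much larger scale.

```python
# Hedged sketch: classify statuses as positive or negative using simple
# keyword lists, then build a feed containing only one kind of status.

POSITIVE_WORDS = {"great", "happy", "love", "awesome"}
NEGATIVE_WORDS = {"sad", "terrible", "angry", "awful"}

def classify(status):
    """Label a status by which invented word list it matches."""
    words = set(status.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def filtered_feed(statuses, keep="positive"):
    """Return only the statuses whose classification matches `keep`."""
    return [s for s in statuses if classify(s) == keep]

feed = filtered_feed(
    ["What a great day", "I feel awful today", "Lunch was awesome"],
    keep="positive",
)
print(feed)  # ['What a great day', 'Lunch was awesome']
```

The mechanics are trivial; the ethics question is the interesting part. The same few lines that power a harmless content filter also power an experiment on 600,000 people's moods.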
Yeah. And I'm not saying this to vilify Facebook in any way. Google does it to an extent; libraries do it. I mean, oh yeah, we do community needs assessments. You are studying your community, or even asking them voluntarily to do surveys or focus groups, which is a voluntary thing for that purpose. But you do watch, you know, get statistics on what is being checked out, what isn't being checked out, what's going on in the community that I know I need to respond to at the library as a service, or where we need to expand our collection because something has changed in our community. And you're doing that sometimes without talking to the people directly, just noticing what's going on out there. It's the same thing that Facebook is doing, but at a more analog level, I guess is the way of describing it. But it's the same concept. Yeah. And also, libraries could do it through the Internet of Things, by starting to track the different sections that library patrons frequent at different parts of the day. Oh, yes. And there is actually work being done on that. I forget which library or set of libraries it is, but they're doing it. So all this data can help with user experience design. It can help us learn more about our brand. It can help us find out what people click on, what people like, how to tailor our product to what people need. And we can do that in an awesome way or a nefarious way, depending on how you think about it. So, what do you look for in a company that makes you say, I trust this company? And do we want Facebook to be a centralized source for all of this data, and are they treating it well? They just had that Cambridge Analytica thing not that long ago that compromised a lot of our data, and that made people look at Facebook in a more negative light. But maybe they'll do something with more social good and start changing their branding a little bit, and maybe we'll start to look at their website and like them more. Mm-hmm. But
But are the underlying decisions, the processes, actually on paper, or do the user experience people just know what we want? The last thing I want to do is vilify these companies, because a lot of them are doing awesome things and people are benefiting from them. But it is something to think about: the ethics not only of what you're doing when you put stuff out there, but of you as a user as well. Yeah, and that's something I've had discussions about over the years as more of this social media stuff has happened, the creep factor. Say they collected all this information about parents and their babies, and then pushed out to some of those parents particular ads for things they might actually need. It may be creepy: hey, how did they know I needed a new double stroller to replace my old one? But then again, let me check, because this ad that popped up was totally relevant to me. Relevance, that's the good part about it. But some people still have this ethical conundrum of, "how did they know?" They knew because you gave them the info, whether you did it consciously or not. Is giving up that privacy worth what you get back as a user? And as a library, is collecting that information, I don't want to use the word "invade," but taking a bit of patrons' privacy, worth it to help give them more of what they want? We also get free access to Facebook and Google, and ads are generating the revenue that gives us free access to those sites. I should put an ad blocker on. But that's the thing to balance. Some people are all about "I'm not going to go on Facebook, I think it's evil, I don't want them to have my information." You have to decide: am I going to give up a certain amount of that to get the benefits?
I put certain things out there, and then I get the ads. I'm getting lots of ads for things I've been searching for outside of Facebook, and I see them coming in, and it works. I've accidentally, not on purpose, clicked on something like, "oh, that's interesting," and then more of that same topic comes up. Oh yeah, that's because I clicked on it a couple of times two weeks ago. And we also have choices about whether we click. That's true. So I know why it's happening, and we have control over the privacy settings on our accounts. Exactly, and that's the thing I think is important: we as libraries are gathering information, but the individual really has the controls. Either they don't realize it, or they're so up in arms about it that they just want to rant and stamp their feet and yell about it. But you can control what's being taken from you, what they're collecting, what they're sending you. You actually have that control, if you go in there and find the spot to do it, which is a whole different discussion; we're here to talk about libraries and what we do. There's actually a link in Facebook's privacy policy here about how you can change it, how to manage and delete info about you. Facebook actually had to change their data policy, and they adapted a lot after the Cambridge Analytica thing. It used to look different, and now the format looks really similar to Google's format; I don't think I pulled that link, though. And the other thing is, Google calls it a privacy policy, while Facebook calls it a data policy. So Facebook's emphasis is less on privacy and more on data collection, while Google's user experience designers decided to emphasize privacy. Again, that's just branding. Then Google also started doing more on applying AI to
the world's biggest challenges, and this is actually one reason I like Google. Is Google putting this together just because people were freaked out about Cambridge Analytica? Maybe, but they still did it, and it's a good thing; a good thing came out of a bad thing, I will say. And a lot of companies working in social media, AI, machine learning, and big data learn from other companies' mistakes, and that is one of the biggest goods. One of the biggest things with emerging tech is that it did not exist until not long ago; we don't have the information yet to know whether it does or does not work. We as users have almost a responsibility to learn more about how it works, and we as librarians have a responsibility to tell people about it, and to show people that part of digital literacy is knowing you have access to privacy policies, and that there is a benefit to taking the time to at least take a gander at them. You can also apply a lot of the principles that Facebook lays out in their Insights section to your own community and your own personal life, because the way Facebook lays out their user experience design, and how they analyze their data, more or less tells you the factors that go into it. If you read more about it, you can start applying this, and the information from Google, to your own community problems and your own personal problems. So it's very useful. And this is why I say change the way you think about it. You can think Facebook is evil and doing all these horrible things and we should never use it. I put this slide together because it vilifies Facebook; that is user experience design. I want you to think Facebook is a horrible thing. Does it work? Do you think for yourself, or do you let me tell you what to do? This slide says that I like Google. This link will send you over to
Google's data policy. "Our data helps build a better future": that is the quote Google uses. According to this slide, Facebook is horrible and Google is awesome. Is it? You could totally swap those quotes between the two services, however you want to present it and think about it. And notice that the link is not titled "Facebook data policy"; it just says "data policy." It could be anything until you click on it. That's digital literacy. We have some comments here about this, too. Someone says, "I give an online privacy and security course, and I always tell them to at least look at the privacy policies and know they're there." Yes, know that they're there; that's part of using these services. Especially if you're attending an online privacy course, you're obviously already concerned. And if you're here, you're probably already concerned about what you should be doing or paying attention to. So go ahead and look at the policies and see what they say; that will help inform you about what each of these services is doing, and then what you as a library can be doing from your side. And open up a discussion: start asking people questions, get people thinking about this kind of stuff and asking themselves questions internally. One single one-hour session in a library is not necessarily going to change a habit; it's going to be ongoing. Neuroplasticity research says that it takes a really long time, and I'm not going to talk about neuroplasticity because we only have, how long, a few more minutes? Five minutes, but we started a little late, so that's okay. As someone says, "the first step to protection is information." Yeah, that goes for anything libraries do. That's great. And a funny comment from earlier about your data being out there and getting things in response: this person says, "I get coupons for chocolate close to a certain time of the
month, and I use that example all the time. For me it's worth it, so I always say to look at both sides of the coin." So something out there knows, from whatever this person has put online, that this is the time of the month you might be wanting some chocolate in your life, and here you go; they've tracked that somehow. You're not wrong, thank you. You have a lot of control over the information you give online. And there is one quick example I do want to go over: thinking about how you make decisions based on Facebook. In their Insights, they segmented out their different target audiences, and there are actually decent examples in all of these. They've changed their People Insights since then, so they no longer feature this one, but I'll summarize it. They did, more or less, a study on the media and business executives who are most likely to be decision-makers in their companies, based on the self-reported titles that go into your Facebook account. Then they studied the duration and times people spend on Facebook and discovered that a lot of people are using Facebook during work, and that they also post things related to some of the decisions they're trying to make. A lot of them probably aren't asking in so many words, but they are trying to gather input from their audience. So Facebook started using this information to encourage businesses to market their products during the day, specifically products likely applicable to a business executive's decision-making, so that while someone is already in the mindset of making a decision and gathering input from other people, businesses can creatively place ads that may sway or change that decision. So Facebook is aware that they can change people's behavior. Oh yeah. And if Facebook ever puts out a headline that says, "oh, we had no idea," they know. So just
stay informed. And a comment while we're on this screen, which is interesting, about that middle one right there, "Why Stories": "Stories is a format that can help marketers promote brands." This is your Facebook Stories, and of course it's marketing-speak, but it's interesting: Stories help businesses promote brands, and that works for libraries too. Yeah, that's one thing that comes up all the time with advocacy for your library: what is the story you want to tell? What stories are you gathering and then sharing to promote your library? Facebook Stories is a format to help with that. And when you're building an online presence, what story do you want to tell about yourself, about your company, about your library? Half of marketing is your story; go libraries. I'm going to skip ahead through a few of these, because I already covered part of them and we're rather low on time. That's okay; we'll go as long as it takes to get through the basics we need here. So basically the whole thing is about who we should trust and who is playing on fear to get what they want, and having the right information can help us determine that. It's also about the mindset we bring to looking for that information, because remember I said: if you Google "how AI will save the world," you will get a different information set than if you Google "how AI will kill us all." So when you send library patrons out on a research spree to start thinking about this stuff, think about how you phrase it too. If someone walks up to your reference desk and says, "I'm looking for more information about how AI can be harmful," start asking more questions and say, "We have some information about how AI can be harmful, but have you thought about how AI can help too? For what purpose are you looking for that particular viewpoint?" Then you might be able to create a
more unbiased view. Anyone who's ever been in collection development knows that being unbiased is virtually impossible, but we try. We try to get as much into the collection as we can so that it shows both valid sides. We may not like it as individuals, but we still make the information available; it's in the library code of ethics. It's in the handout, and the American Library Association also started putting out a social media code of ethics; they dropped it in early January, so it's pretty recent, and I only stumbled upon it a few days ago, so I don't know how many people have really dug into it. I've talked about a lot of this already, and digital literacy has come up intermittently: that's just keeping people informed about how things really work and getting them to think for themselves. Digital literacy has become a monster of a thing. It's everywhere, it's heavily nuanced, everyone defines it in a different way, and everyone's adopting it in a different way. So I'm just going to simplify it here: learn the main concepts and think for yourself. Think about what you want yourself to be, what you want your community to be, and what you want the world to be, and then start acting and choosing technology in a way that will help you do that. Easier said than done, but it's a starting point. And makerspace ethics I put in here because we are giving people the power to create their own things. We're telling people: we have a bunch of stuff, you can come into the library and make whatever you want. We are empowering people to create, and that is amazing. But are we empowering people to create with purpose? When people come in and start choosing projects and spitballing ideas back and forth, do we have anything out there that says: create a project for good? Look at what the problems in your community are; look at what problems your friends are having;
look at your own problems. How can you solve them in a productive way, and how can we position the tools in the library to make change for good, instead of just saying, "here's a free-for-all, good luck, have fun, make something cute"? Make a coaster for your grandmother, that's great, but once you learn how to do those things, what more can you do? Go beyond just "I made a cute coaster for my grandmother." And instead of just building a website, build a website for good. If you're going to teach people HTML, CSS, and JavaScript, that's also a building block for building apps, and building apps is a building block for augmented reality. What we're doing in makerspaces could progress to a lot more, but in order to do that, we have to encourage people to go through a really heavily nuanced learning process. The reason I personally hadn't gone down a lot of these learning paths is that there was no direct reason I needed to as a librarian. The question is: what is the most productive way I could use an app, and what is the most productive way I could use augmented reality to do good, for my personal life and for librarianship? Granted, after doing this research project, I found about 20 or 40 different ways I could do that; once you look for it, there it is. But before I started researching this, probably a couple of years ago, I had no idea, and so I had no motivation. I could say more about this, but that is almost another presentation in itself. And big data is going to be appearing in more and more libraries; to some extent it already does. Libraries could also use the big data being collected by social media, and some probably already are. User experience is becoming more popular in every industry, and it is becoming more popular in librarianship too. So the information is out there,
but what do we want to use, what do we have the budgetary resources to use, and how much time do we have to do it? A lot of what exists out there right now can help libraries and nonprofits, but there is a cost gap, and one of the biggest things we as librarians want to help with is reducing that digital gap, that digital divide. Right now we're focusing on robotics and computer science along with everything else, and I'm sure some libraries out there are focusing on this in some way, shape, or form as well. So I guess to leave you on a good note: just learn more, and think about what you can do for good. Welcome to ethics. We mentioned earlier measuring how people are using your library building, and there are people doing that: not just what patrons are checking out, but how they are physically using the space, and how to track that. The ethics of that, of course, is a thing: you know where they're going. Jason Griffey is a researcher in libraries, and his project Measure the Future, you can just look it up at measurethefuture.net, is a product where they track how people are using your library: the number of visits, what they browse, which parts of the library are busy during which parts of the day, physically, using open hardware sensors to collect that data and then get it to the librarians. There's a lot of information on the page about the privacy of it and how they use it, so that's another thing being used. Really cool pictures here: the little dots show where the people were, and it's all anonymized.
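The privacy-preserving idea behind sensor projects like Measure the Future can be sketched in a few lines. The sensor emits only a zone name and an hour, never an identity, and the library keeps aggregate counts. This is a hypothetical illustration of that design principle, not Measure the Future's actual code or data model; the zone names and event format are invented for the example.

```python
from collections import defaultdict

# Hypothetical anonymized sensor events: (zone, hour of day). No camera
# frames, device IDs, or patron identities are ever stored -- only counts.
events = [
    ("children", 10), ("children", 10), ("reference", 10),
    ("children", 14), ("makerspace", 14), ("makerspace", 14),
]

def busy_zones(evts):
    """Aggregate foot traffic into per-(zone, hour) counts."""
    counts = defaultdict(int)
    for zone, hour in evts:
        counts[(zone, hour)] += 1
    return dict(counts)

tallies = busy_zones(events)
# Which zone was busiest at 10 a.m.?
print(max((z for z, h in tallies if h == 10),
          key=lambda z: tallies[(z, 10)]))  # → children
```

Because the raw events are already identity-free, even a data breach would only leak how busy the building was, which is the kind of design discussion the ethics conversation above is pointing at.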
Their goal is to enable libraries and librarians to make the tools that measure the future of the library as a physical space, and it's open source: the tools, the software, the tutorials, everything is in here. I'm sure they had a lot of discussion about the ethics of it: we are tracking who's physically coming into the building and then putting it together, with Google Analytics-type resources, to figure out "well, they keep going over here, let's figure out why," or "we need more resources there," whatever you need to rearrange. So that's something good, I think, to take a look at. And building a smart community can do a lot of good, but there's a lot of controversy about collecting the data needed to do it. Yeah, exactly. But do we have any questions? Do you have any other questions, anything you want to ask? Type it into the question section and we'll answer any of your questions, library-related or anything; go ahead and type it in. We've had good questions and comments throughout the show, so that was good. Yeah. And like I said, this is a two-part series we did; maybe there'll be something else in the future, but for now this was our emerging technology set: last month's session on February 13th, as I said, and then today's. When the archive for today is ready, we'll have links connecting the two to each other, so you can watch them together. I do want you to get more information about emerging technology in general, which was the first session, and then today's ethics discussion. The slides will be available, the handout will be available as well, and I'll add that additional information to it. Yep. All right, it doesn't look like anybody's typing any more questions now, so I think we can probably wrap it up. Cool, works for me. All right.
Thank you, Amanda, for being here with me this morning, and thank you everyone for attending. I'm going to hop over to our, oh, there are a lot of thank-yous; of course, as soon as I say something, I see the little notifications come up, people typing things. "Thank you for the presentation." So thank you very much, everybody. Let's get you over to our regular website. We're just going to type in "NCompass Live." You can use your search engine of choice, and so far NCompass Live is the only thing called that on the Internet, so as long as nobody names something else that, we come up as the only search result. So to wrap up the show, today's archive will be on our page right here; the scroll wheel is not really working for me. Right underneath our upcoming shows is our archived shows link, with the most recent ones at the top of the list. Today's show will be there, probably by the end of the day, as long as GoToWebinar and YouTube cooperate with me. You'll see here we have the previous one, "What in the World Is Emerging Technology," with the same kind of thing: the recording, the presentation, and the handout Amanda put together. Everyone who attended and registered for today's show will get an email from me letting you know when it's available. We'll also push it out on our social media; as we've been talking about, we have Twitter, and NCompass Live has a Facebook page. While I'm here, I'll also show you our archives for the whole history of NCompass Live, going back to the beginning; NCompass Live premiered in January 2009. So we do have archives going back to the very beginning, and you'll see we have a search feature here, so you can search just the most recent 12 months' worth of info, or all archives.
So just pay attention, when you're going through the recordings here, to the date; everything is dated with when it was originally broadcast, and you will find old information here, outdated information, potentially links that no longer exist and services that no longer exist. But we're librarians, and this is what we do: we archive things and save them for posterity, so they will always be on here. And as I said, we do have a Facebook page for the show, and I've got links here to it, so if you are big on Facebook, give us a like over there. We post announcements of when our recordings are available and when shows are coming up; Facebook is being very slow here; there's our reminder to log in for today's show. So if you use Facebook a lot, give us a like and you'll be notified over there. That will wrap it up for today's show. Next week, I hope you join us; we'll be talking about reading diversely. Specifically, the Nebraska Library Association has a diversity committee, whose members will be talking about some titles they have been reading recently on more diverse topics.
So Alyssa and Angela and Anika and, who else is coming, oh, Laura, are all going to be here with us next week to talk about some new diverse titles. If you're looking for some interesting diverse reading to get into, join us next week for those books, and then for any of our other shows coming up; we'll talk about health education OER in May. And you may have noticed, if you are a library person in Nebraska, that our new state poet has been announced: Matt Mason, for the next five-year period. We will have him on the show May 15th to talk about libraries, so please join us for that. Other than that, that does wrap it up for today's show. We did have a question here about, oh, continuing education credits for attending NCompass Live. Yes: if you are at a Nebraska library, we automatically submit that you attended today's show. If you are not at a Nebraska library, you will receive an email from us confirming that you logged in today, a little certificate that says just that you showed up. You then present that to your state's continuing education people, and they can give you the credit for attending. We in Nebraska cannot issue continuing education credits for other states, but we will give you a notice that you can use to apply for credit in your own location. And that goes for our live online shows; for our archives, you've got to figure that out yourself, because we can't track who's actually accessing our YouTube. Talking about ethics and tracking today, that's interesting: all we do in YouTube is see how many people watch a recording; we don't know who they are or anything, so we can't do anything. There we go, full circle. All right, that wraps it up for today's show. Thank you, everyone, for being here; thank you for being here with me, Amanda; and hopefully we'll see you next time on NCompass Live. Bye-bye.