Greetings, everyone. I'm going to reorganize who's appearing here for just a second. So, as I'm sure everybody is tired of hearing by now, I'm Nate Angel from Hypothesis, and I want to welcome you to this very special edition of what we're calling Featured Educator Office Hours, where we invite some folks from around the community who have experience with social annotation to come have a casual chat and conversation with whoever shows up about social annotation and how they might use it in their practice: questions of pedagogy, what it looks like in the real world, or other topics as well. We can really talk about anything we want here. So, without further ado, I want to give our featured educators a chance to introduce themselves, and I'm going to ask each one of you a specific question to kick things off. This is really the only structured part of the session; the rest will be completely driven by the conversation. That first question, and I'll go to Dana first if you don't mind, Dana, is: could you tell us a little bit about what your role is, what you do in your day to day, and then how you came across social annotation and how it relates to your practice? And then we'll follow up with that same question to the other folks after Dana has had a chance at it.

Absolutely. My name is Dana Conard. I work with a unit called Online Education at UC Santa Cruz here in California, so it's still morning for me. My role is instructional technology support specialist, so I specifically help instructors bring instructional technology like Hypothesis into their courses. Our LMS is Canvas, and we also have a video hosting platform, so my day to day is helping instructors deliver their well-designed instructional content to their students. And I first learned about Hypothesis from the instructors.
We had a lot of requests from instructors who had heard about Hypothesis and wanted to use it in their courses. So our journey was to implement it in our instance of Canvas and to onboard instructors, and we've been using it officially since fall 2020; that was our pilot. We've only been using it for a few quarters so far, but the reaction has been very positive, both from instructors and from students.

Awesome, that's great. And I think one thing that is clear here is that we have a special focus today on folks who are a little more behind the scenes, enabling social annotation at their institutions, as opposed to what you might call front and center at the head of the classroom. So, Kyle, the same question to you: what do you do with your days? And I should ask, what are you reading, as Rami in Ontario just asked in the chat? Maybe we'll get to that in a minute. And then, how did you come across social annotation, and how does it fit into your practice?

Yeah, so my name is Kyle Denlinger. I'm the Digital Pedagogy and Open Education Librarian at Wake Forest in North Carolina. In my role, I'm on a team in the library called Digital Initiatives and Scholarly Communications, so we deal a lot with digital humanities projects and anything that touches on copyright and access related to scholarship, and, with my addition to the team, more digital pedagogy things such as OER. I also teach fully online; I've been teaching a fully online information literacy course since 2015. In my own looking around, being very much online, I found Hypothesis, probably through some online discussions, and wound up using it in my own fully online course before we had an integration into Canvas. My role has shifted a little since I moved onto this new team.
I still teach online, but now I am much more faculty-outreach oriented, supporting faculty development of online courses and specific digital pedagogy projects. The big thing that happened with me at Wake Forest was that, with the transition to COVID, I really wanted to integrate Hypothesis into Canvas and get Hypothesis in front of as many people as I could before we went live in fall 2020. So I can speak a little to our faculty support model in the summer of 2020, which I think was really successful at getting Hypothesis to be well known on campus and extremely widely used in fall 2020.

Really, really cool. And I want to give Shauna a chance to jump in too, but we know she might be having a little trouble connecting. I love hearing the very different perspectives that you two have already brought to the table. What I'm already thinking about is the institutional context in which you're working. I don't know Wake Forest that well, I'll admit. I have spent a little time at what may be the most beautiful campus on earth at UC Santa Cruz. Go Banana Slugs! What a fantastic location that is; it seems almost unfair that anybody even gets to go to school there. But I'm curious, and maybe we can start with Kyle this time: I know that social annotation has taken off at Wake Forest in a pretty big way. Do you think there's something specific about the culture or context at Wake Forest that has made that possible?

That's a good question. When we piloted Hypothesis in fall 2020, it was kind of a perfect storm of things. I think a lot of it had to do, well, partially with our heavy emphasis on small classes. Wake Forest is a private university, so we have very small class sizes.
And it's quite a luxury, right? And it's very liberal arts oriented, so there's a lot of reading and writing, even in the more science-oriented disciplines. I think there's also a really healthy culture of teachers doing really experimental, innovative things and feeling protected in doing so. So when people were exposed to Hypothesis in the summer of 2020, in this faculty development program that we ran, they saw a lot of applications for it in many different disciplines, and they just kind of ran with it. So I think it has a lot to do with the teaching culture, but more than anything it had to do with the way we did faculty support in the summer.

That makes sense. And what you say about there being a perfect storm is sort of true of the pandemic as a whole. At Hypothesis, we've seen social annotation usage skyrocket, and that's kind of a disturbing silver lining inside the pandemic. It's not that I'm happy the pandemic happened, but if there is some good to take away from it, one little piece of good, maybe that's it. So over to you then, Dana. I know that at Santa Cruz there's also been quite a bit of rapid adoption; I don't know if it looks that way from your perspective, but that's what I've been seeing. Do you think there's something special about the culture there at Santa Cruz that makes that possible?

I would say absolutely. One of the leading departments that wanted to get Hypothesis was our writing department; they have smaller classes and they wanted their students to engage more. I think a lot of the driving factor of Hypothesis, as fantastic as the platform is, has been the social aspect of these annotations.
Because, you know, there are only so many Zoom boxes students can take. I think Hypothesis came at a perfect time, because it was a way for students to interact with each other in a way that wasn't just those little boxes. And the popularity of Hypothesis really was propagated by the instructors; they have had wide support for it and they really enjoy the tool. Also, our onboarding was fantastic. We did adopt it pretty quickly, in my experience, for a digital tool, but I will shout out Aaron from Hypothesis: she was fantastic with helping us through that transition. We had maybe three workshops right before fall, where she held our hand, walked us through it, and answered faculty questions. It was exceptional, so Hypothesis really helped us with that transition as well, and now that we've been using it for three quarters, we're seeing a lot of adoption.

Well, I'm so glad that you were able to come on; we can see and hear you perfectly. I want to give you a chance to kick things off the way that Kyle and Dana did by giving us an introduction: what do you do in your days, just so we can understand your context a little bit, and then how did you come across social annotation, and how does it enter into your practice?

Yeah, my apologies for the technical troubles. I think I share a lot with what I heard from Kyle and Dana. I'm an academic technologist; I work in the College of Liberal Arts at the University of Minnesota, Twin Cities campus. Like Dana, I support faculty and I work with Canvas. My personal area of passion is working with geographic information systems, but I have to say annotation is very quickly climbing the ladder in terms of the technology that I feel has the most impact with students. The way we got involved was that a former colleague of mine had worked quite a bit with Jeremy, I believe. And when the pandemic hit...
We were already exploring the integration with Canvas, and when the pandemic hit, the college said: we need something, another tool, another way for instructors to interact with students. So for me the pandemic, in this case, was good. We did a pilot first as well, and then we did a contract last semester just within the College of Liberal Arts, and we kept it pretty limited for the first year of use. We are going full contract for my college next year, so I'm very excited about that.

That's great. And another thing came up that I think might be really interesting to hear from you about. You're at another institution near and dear to my heart, because who doesn't love the Twin Cities? The weather sometimes leaves something to be desired, but what an awesome place; I nearly went to grad school there, I was this close. At any rate, I'm wondering: would you say there's something special about the culture at your school that made it particularly ripe for the adoption of social annotation?

That's a really interesting question. I think there was something special about the College of Liberal Arts, as opposed to some of the other colleges. CLA was the only one that agreed to do the pilot; a lot of the others had more concerns, and I think we were like, let's try it and see what happens. And the faculty who jumped on board, I didn't do any advertising of this, it just went by word of mouth. It was something very new to them, and they were willing to take a chance and try it. And maybe that's liberal arts, I don't know; I'm not sure. I will say we did some student evaluations and faculty evaluations, and I'd love to talk at some point about the reactions and responses that we've gotten.
All right, typing in the chat and talking at the same time is not something I've mastered, as you can see from the typo. And I'm actually curious; I might bounce off this chat conversation that started around other social annotation tools. Because, first of all, this I Annotate conference isn't really just about Hypothesis, although I know it may seem like it is sometimes. It really is supposed to be organized around the idea of open annotation, and we define that fairly broadly, thinking especially about tools that have openness in other ways: they may have open technologies with APIs, they may be open source tools. Just being free on the web doesn't necessarily count as open in everybody's book, because of course being free on the web has other things riding on it. But I'm kind of curious: we've talked about Hypothesis at your institutions, but do you know of other social annotation tools that are in use on your campuses, and do you have any experience with those? I don't know who would answer first; I'll go to the Brady Bunch of you. Go for it.

Well, I have a couple of examples, maybe. And maybe I'll go a little bit deeper into the summer support model that I talked about earlier. It's going to be hard to explain, but we have about 900 faculty in our undergraduate college at Wake, sorry, undergraduate and our graduate school, excluding our professional schools and our med school. Our summer support model put everyone into a kind of peer learning community. We had eight faculty members, sorry, eight faculty development people, of which I was one. Each of us created a peer learning community. I think there was a group of 64 faculty that went through our program in the summer, and then those 64 faculty turned around and created peer learning communities of their own. And the key piece of it was that we demonstrated social annotation in that initial cohort of 64 faculty members.
And of course that was through our first integration of Hypothesis, but we also showed examples of social annotation through, say, comments in a Google Doc, just showing people that it's possible to have a document that's more than just static reading, that it can be engaging and dynamic. Showing people that that is a possibility got them to integrate that technology into their own peer learning communities that they offered to faculty in their disciplines. And then many of those faculty, having been exposed to it, used that technology in their classrooms in the fall. So yeah, I can talk more about that later, but those are two examples. And then, speaking of the University of Minnesota, we recently started a pilot with Manifold. It's an open source publishing platform from Minnesota, so it has its own built-in annotation tools. We're looking at using that for more digital scholarship projects, scholarly publishing kinds of things. And you can use it in a course group context: if there's a set of texts, you can create a course group and students can annotate that text within the platform. So it's not an LMS integration or anything, but it's another kind of exciting example.

Really interesting. Dana and Shauna, are there examples of... and I really appreciate Kyle also addressing that sort of exponential model you had for publicizing. I like how it seemed like everything was working in powers of two, you know, eight people doing this and then 64 doing that. Were you purposely using that kind of binary exponential?

I think we figured out how small we wanted the resulting groups to be in the end and just worked back from there. But from that program, I think we reached something like 90% of all the faculty that taught in the fall. Not all of them were exposed to social annotation, but they were part of those peer learning communities.
Right, wow, that's amazing. Exposing 90% of your faculty to anything seems pretty dramatic. I see some nods from Dana and Shauna. So what about Dana and Shauna? Are there other social annotation tools in use at your schools?

Yes, I think Google Docs has definitely been a tool that some programs have used; I know that's been actively used. We do have some people using Perusall, and Manifold is something that I don't actually know a whole lot about, but I know it's out there. We have a couple of people using a handful of other tools, but I think it's very hit and miss. What we've done with the Hypothesis pilot is really work on pedagogy and on how you use these tools well. And what you said about peer learning is really exciting to me; that's something we've seen, that faculty really want to connect with others who are doing this and get ideas and share around Hypothesis. But yeah, Google Docs I think would be the most common one that I see used.

Yeah, it does seem like Google Docs is easy, right? Because if you can cut and paste something into a Google Doc, frankly, the social annotation in a Google Doc is pretty well done. It's pretty powerful; it can be a little too flexible sometimes, but it does work pretty well. I think the trade-off there, and one of the very different models from the one Hypothesis uses (Perusall uses the same model as Google Docs), is that you have to bring the text to their tool in order to annotate it, whereas Hypothesis works the opposite way: you bring the annotation to the text, wherever it happens to be living. So there are those two different models, going back to what the differences are between the tools in use. Dana, what about you? Do you have other social annotation tools going on at Santa Cruz?
Yeah, I would echo the same thing as the previous two. Of course we use Google Docs for collaborative work especially, though I personally haven't seen it used too much as a social annotation tool. And, I promise this isn't a plug, but there really doesn't seem to be a good substitute for Hypothesis, in the way that it integrates with our LMS, Canvas, and just the culture of Hypothesis. I haven't used many other social annotation tools; for instance, I hadn't heard of this Diigo. "Dee-go"? I don't know if that's right, but it sounds right to me, so that's how I'll say it. But I know that when I have a question about Hypothesis, the Hypothesis team is going to email me back and work through the problem with me. I don't know about these other tools. The fact that Hypothesis is also free and open source is an important factor for us using it, or at least for me personally, and the other thing I would say is I really appreciate that Hypothesis has a very public-facing roadmap. An instructor had asked about Perusall because they were interested in video annotation, and I emailed Hypothesis, like, hey, is this on your roadmap? And, I don't know if I'm allowed to say this, but they did have a proof of concept. Am I allowed to? OK: DocDrop. And the proof of concept was fantastic, so maybe when it becomes more available; I don't know where it sits between proof of concept and official release. But Hypothesis is really receptive to additional features, other ways to engage in social annotation beyond how we've been using it, which is digital text.

Thank you so much for bringing that up, mentioning DocDrop, and also the support that you've all described working with Hypothesis.
So thank you for that, and I noticed there are quite a few of my Hypothesis colleagues here in the crowd who have come to bask in your praise, I guess. But just to clarify a little on the cost thing: a lot of tools offer free web experiences, and Hypothesis maintains a free and open web-hosted version for individual use, although many institutions use it in an institutional way. But it's not fully supported for them when they do that, just like any tool you grab for free off the web isn't really supported for you. So institutions like Wake Forest, Santa Cruz, and the College of Liberal Arts at the Twin Cities campus of the University of Minnesota are all using Hypothesis at a scale where they really want that fully supported relationship. That's the relationship between Hypothesis and these schools: money is changing hands, and that money goes to pay for all the great support you receive and the development of further features. Every software offering solves the sustainability problem in a different way; there are different models for that, and this just happens to be the one Hypothesis uses. And on the note about DocDrop, I'm actually wondering, looking at who is here in the crowd from Hypothesis... We were just talking about DocDrop, which I put a link to in the chat. DocDrop is a bit of an experimental area that our CEO Dan Whaley started in order to try out some new things; it's like a little lab area.
And what Dana's describing is that it has the ability to drop in any YouTube URL, and then you end up with an annotatable experience where the transcript of the video becomes an annotatable document, but linked to the video in a way that you can work with and see both together. We could even play around with it here if we felt like it. The reason I'm dwelling on it ad infinitum is that we were just talking about it internally this morning, actually, and the degree to which it can already play in the LMS integration. I don't have all the details on this, which is why I was hoping that somebody else from Hypothesis might be able to pop in and talk about it. But my understanding is that there is now the ability to use DocDrop-hosted material in the LMS, which didn't used to be true. One of the issues is that YouTube videos won't come through in that experience, for a whole variety of actual security reasons, so it won't yet pop in as the complete linked video-and-transcript experience that you get outside the LMS, but that is hopefully in the near future. I don't know the full story yet, but you can feel it creeping toward something, and that would be amazing, because Hypothesis has long been a primarily text-focused annotation tool, and granted, this is still on the text of the transcript. But once you start to be able to interact with other kinds of content forms, I think it starts to become really powerful. And that raises another question for me, although I'm happy if any of you wants to riff on any of that.

Maybe I'll chime in really quickly. On the conversation around the differences in the tools: you know, maybe Perusall has this feature, or maybe Diigo or some other tool lets you annotate in a certain way.
One thing I've really appreciated about Hypothesis is all of the different conversations I'm able to have now with faculty. Why is an open source tool preferable to other tools that are supported by advertising, or that have really close integrations with corporate textbook publishers, where Hypothesis doesn't have a formal relationship like that? And what does that mean? So I'm able to have so many conversations with faculty about ed tech that are more philosophical in nature, and I think they walk away from those conversations with a deeper appreciation for the choices that we've made at Wake Forest: why we went with Hypothesis in the first place as opposed to any other tool, and really valuing that we are taking this thing seriously. Student privacy and things of that nature are incredibly important and incredibly hard to wrap your brain around these days, and I think a relationship with Hypothesis is kind of a mark in the sand that we're on the side of students and that we're thinking about those things more long term. And also, I don't want to bash any competitors, but certain tools are organized more around the transactional nature of education, where everything is built around a gradebook. I could even criticize the LMS for that, but some other annotation tools have really deep, rich grading capabilities at their core. What I've loved about Hypothesis is that there are really deep, rich grading capabilities built in with SpeedGrader in Canvas, but it's just as easy to leave an annotation assignment ungraded, and we've been having a lot of conversations on campus about the nature of grades and the emerging ungrading movement as a result of the use of this tool. So it's allowing us to have more pedagogical conversations too. Those are a couple of things that I really appreciate.
I have to weigh in on this: I couldn't agree with you more. You know, faculty come to us saying they need this particular functionality, or, I need that little thing that that tool does. And when we can walk them back and look at what it is they're really trying to do, Hypothesis has been such a great example of exactly what you said: talking about student privacy and why that's important, talking about open source and why that's important, all the things you just said. And you see eyes opening, like, oh, there is more to what you all do than just make my life miserable by making me learn a new learning management system. And the pedagogy conversations I've had around annotation: I don't know that I've had those around anything else, any other approach, and I'm really excited as we roll out of our pilot this year. I'm a little terrified for what's coming for me in August when we do this training, because I don't want people using this just because they think it looks cool or looks great; I want them using it well, so students have a positive experience. Anyway, that takes off in a different direction, but yes, I agree: the conversations that we've been able to have around annotation have been wonderful, and I really appreciate it.

That's really cool stuff. I'm slowly moving the figures around, moving us back to Brady Bunch; maybe we'll just stay here for a while and see if there are some talking heads. I feel like our audience is a little quiet, and I'm wondering why nobody wants to come up on stage with us and at least get me to shut up and talk to you with a different voice.
I was really thinking a lot about this topic that Kyle brought up, around the different kinds of conversations you've been having, and I don't think we should just say thanks to Hypothesis, because it seems like you already have a really considered practice for how to have conversations with faculty about pedagogy; you can hardly blame it on, or credit it to, Hypothesis. But I'm curious about this: it seems to me there's always a tension between tools and practice, and I feel like a lot of times in the ed tech world there's an over-reliance on the idea that the adoption of a tool will generate either a set of practices or a set of results, like student success, and that that's why the tool should be adopted. I like the idea of reversing that, where we're focused more on the practices we want to enable, and then on which tools will best support those. I see some nodding heads; I'm wondering if that resonates with you at all, thinking about it in that reverse way.

Yeah, absolutely, and I'll shut up in a second because I want to give Dana and Shauna a chance to chime in. But I think Hypothesis is just a really great introduction to a practice, and it's one of the best tools for this practice of annotation. I don't know a better way to put it: it is a tool, but at its core it is a practice. And until you're able to experience that firsthand, it's really tough to understand what social annotation actually is and why it's useful in your teaching. Other tools do it, but I think those tools are tools first, practice second, because they come bundled with textbooks or something like that. So anyway, I'll shut up now. I see Dana unmuted.

Yeah, I'll second that. I'll say that all of the instructional designers at my campus are far more focused on finding a tool that works for you, as opposed to making the tool work for you. I think the experimenting is also good, too.
We had a recent One Change Challenge that we asked our instructors to do, and it included a lot of things, but one of those things might be trying out Hypothesis, trying out Flick, or trying out some other new tool. And I think it's OK if it doesn't work for you in your course; that's totally OK. A lot of these tools are really dialectical, you know: you can have a relationship with a tool and come back to it. But I think a lot of Hypothesis is just trying it out and using it, and all of the instructors that I've worked with on Hypothesis so far have continued to use the tool each quarter. And, for all the pandemic has done, for good and for bad, one of the things we've seen is that a lot of instructors want to continue using Canvas in their courses, want to continue using these tools that they've had to use to make bridges and connections for their students. I think Hypothesis is going to be on that list as well, even if they've returned to in person. There's just something about annotating a text socially, digitally, and making reading visible that I think instructors are going to want to continue to do with their students.

Yeah, I totally agree. One of the things that came out, and this happened pretty early in the pandemic, was a couple of faculty came to me and talked about how surprised they were by some students: they were physically in class for a few months and then they went online, and in the differences between how students behaved physically in class and once they were online, some of the students who were very quiet in class were actually quite eloquent in the asynchronous spaces. And that's one of the pieces we really want to carry forward.
And when I ask faculty whether they want to use, and this is not to diss any other tool, the Canvas discussion, the ones who have used both Canvas discussions and Hypothesis look at me like they don't even think the two are the same thing. They all go for reading with Hypothesis, because it takes students so much closer to the content and to the reading, and I'm hoping they're all going to keep using it even in an in-person space, because it does allow that asynchronous reflection that's very different for students.

Yeah, I'm kind of wondering about the future: what are the fall plans at your institutions? Are you all headed back? Like, Shauna, are you headed back to a more face-to-face environment at Twin Cities?

Yes. What did I hear, 80% in person? I think I heard we had 120 courses out of 2,000 or something that will be remote or online, so it's a very small percentage. There's a whole lot of other conversations we could have about that, but yeah, it's mostly in person. But I do think all the faculty I know who used Hypothesis this last year have been emailing me: do you know anything yet about next year? Do you know anything? So they're very anxious to keep moving ahead.

How about at Santa Cruz, Dana? I know you spoke about it a little bit, but you're headed back to more face-to-face, right?

Yeah, we're headed back to more face-to-face in fall, but we'll still have some mixed modalities as well, and some sessions will still be remote, much to the chagrin of some instructors.

You sound like you're a little sad about going back to face-to-face.
I'm not; it's definitely mixed feelings. Again, the pandemic has done a lot of bad but also a lot of good, and it's interesting to hear the comments from instructors about what they want to keep, since they've had to use a lot of these technologies like Canvas. So it's going to be an interesting fall, to say the least.

Yeah, if nothing else, the pandemic has highlighted so many things, emphasized many things that are good and many, many things that are bad also, and it's given us a chance to be more intentional, maybe, about how we think about some things. I know we had to do it in a bit of a rush, but just the idea of remote delivery: everybody has had to grapple with it, maybe in a way that a lot of folks didn't have to before, and you guys have of course been on the front lines of that. So I should shut up now. What about at Wake Forest, Kyle? Are you headed back to more face-to-face too?

Yeah, knock on wood, I think they're hoping to have fall be as quote-unquote normal as possible, and in a normal semester we would have very, very few online or even blended courses at all. I don't think they're going to revoke faculty preference if someone wants to teach fully online, I don't think they're going to disallow that, but departments are pushing hard for as much face-to-face as possible. So yeah, it'll be different for sure, and I'm anxious to see what that's going to do to our Hypothesis usage once we're back. Like I said earlier, we have very small classes, and that seminar-style discussion is one of the things Hypothesis allows you to do in an online course; but when we're back in a seminar room, are they going to continue?
Well, an interesting thing came up in, I think it was yesterday's Featured Educator Office Hours, and Sean, I think you were there too, so maybe you remember. We were talking about the degree to which students pre-reading and pre-discussing in Hypothesis before the synchronous sessions, which might be face-to-face, was actually doing a couple of different things. One was making the discussion that finally happened, when it was synchronous, better: enabling the teacher to focus on things that had come out in the discussion, but also meaning the students had gotten some of the preliminaries out of the way and were ready to dive more deeply into particular parts. And the other part, which was new to me and interesting, was how social annotation was also making grading take less time. I'd be interested to hear your take on that, because the instructor was saying she felt that because students were more engaged in the reading, thanks to the social annotation assignments, it improved their overall success in all the assignments, like the writing assignments. So she felt she spent less time grading because she wasn't focused on the basics; it was more like the students were grappling with the real ideas she wanted them to grapple with. So my fingers are crossed that we can bridge this gap back to face-to-face without losing the powers we discovered when we were remote. I guess that was more of a statement than a question.

I'd love to talk about student perspectives, and there's a question in here about students suggesting Hypothesis. I haven't seen that, but we did some evaluation: I did a survey, we got almost 200 student responses, and then focus groups with faculty.
And the interesting piece was that the faculty saw much more of an impact of Hypothesis on things like discussion and papers, while the students weren't quite as enthusiastic. We'd ask students, did you read closer, and they're like, nah, we did it the same. And then I would ask faculty the same thing and they're like, oh yeah, they were reading closer. So I just thought that was a fascinating disconnect between what the students thought the impact was and what faculty were saying. Different points of view.

Dana, I interrupted; you were going to say something. Oh no, by all means. No, it's an interesting point, because we don't have a lot of student feedback on the tool yet, more general comments, but it is interesting hearing what instructors think they're interpreting from their students. And I don't know about making grading easier, although it doesn't surprise me, but one of our strongest users of Hypothesis said it's been more interesting for him to see how the annotations change throughout the quarter. Our quarter is 10 weeks and he had a reading assigned every week, so seeing how annotations changed from week one compared to weeks nine and ten has been really transformative in his methods of teaching as well.

I heard that too. Same thing. The student results have been fascinating to me, to see the things they liked best, and I should have pulled them up before I came. The thing they liked best was knowing what other students thought. And there's real value in that: oh, someone else has that question, so I don't feel so stupid asking it. Or they would say, hey, I know the answer to that, and they would jump in. I think it was very empowering for students. At least that's what I'm reading in the data.
That was the number one thing, along with being able to reference their notes during discussion, but knowing what other students thought during the readings was the biggest takeaway that students gave us.

Yeah, and I think we hear that from other folks too. I don't know if you remember back to I Annotate 2019, vintage I Annotate, before the pandemic, when Juan Pablo Alperin gave a talk, and I could find the recording for folks who are interested, about some early quantitative and qualitative data they'd collected at his institution about student perceptions of what social annotation ended up doing for them. A lot of students did focus on that: being able to see others read alongside them was a power. And one thing that has come up in some of these other office hours is this idea of how social annotation can, in a way, lower the stakes of reading at the same time that it's making reading more intentional, because it enables people to say things in the margins like, "I didn't quite understand this." And as soon as you feel free to exhibit that wonder and curiosity openly with your classmates, that can unlock a whole flood of other possibilities for peer learning and different kinds of discussions that could unfold from there. So do you all have plans to do student data gathering, or are you already underway with some of that? Kyle, you may have done some of that at Wake Forest.

Yeah, we had a fall 2020 survey for both faculty and students, and by the time we did it, everyone was so surveyed out that we wanted to make them really concise. So I think we asked some general usage questions about Hypothesis on both surveys, but it didn't shed a lot of light on what they thought about it, or their attitudes, or how it helped their learning.
So it was me trying to elbow my way into a survey, but it didn't really yield a lot of useful results. I would love to know more about what they think of the tool. Anecdotally, what I've heard from faculty echoes what everyone else has said: it improves student discussions and it improves their reading. And I've even heard from a few students, ones I've spoken with anecdotally, or my own students in the course that I teach. You know, "do you prefer this over something like a Canvas discussion?" And they're like, oh yeah, this is so much better.

Yeah, that makes sense. Dana, I know that Hypothesis itself offers a student and faculty survey that you may have all participated in, I can't remember exactly, but have you surfaced any student reactions to using the tool?

We haven't yet, and I don't know if we currently have plans for doing that. We did have the Hypothesis survey that Aaron helped coordinate. That was interesting, but it was really more anecdotal evidence. I wish we had more; it'd be great to see. Mostly my view is from the Hypothesis dashboards, which are fantastic and helpful, but of course that's just the analytics of student interaction and use, so it would be helpful to have the students' opinions too. I'd be interested to know.

I echo the admiration for the dashboards. For the couple of faculty who have gotten to use them, they've been a game changer; it's really made a big difference. In my previous position I did program evaluation all the time, so it's just a knee-jerk reaction for me to do evaluation, so we did do the student surveys and the faculty focus groups, and I think it was very instrumental in getting adoption, because we had that student voice and that student input. I hope we can continue it. I've been working with Dr.
Bodong Chen, who led the research panel yesterday, and I'm hoping we can continue doing some of that research. I would love to coordinate with Hypothesis on the surveys that go out this semester so we're not double-asking; it would be great to coordinate.

Yeah, I know we've had some tension over that, and thank you for your goodwill with it. Oh, no tension, just let's coordinate. And we have found that we do the surveys, and then we did focus groups with faculty as a way to get some deeper-dive qualitative data, and I would like to be able to do that with students too. If I just put a call out to students, even if we get eight or ten students in a focus group, that would be something. It's always hard to get students to do anything when they're so busy. I know faculty are also hard to recruit, but yeah.

Well, that's the thing: we had so many faculty willing to do the focus groups that we had to turn some people away. Well, that tells me something. Yeah, we did. I think we offered three focus groups and we kept them small, on Zoom, but we ended up with I think 14 faculty who did the focus groups, which I was really surprised at. Yeah, especially in this time when we've all had too many Zoom things, right? And it sounds like you experienced this too: they were so excited to talk to each other about how they used it, and they learned so much from each other. And I asked them point blank, do you want us to do a community of practice around this, and they all agreed. So we'll see. I really like that idea, especially since, you know, my job as an instructional technology specialist is like, look at all these fun tools you can use in your course, but I think sometimes it comes off as, I'm just trying to get you to use tools because that's my job, to work these tools.
It's interesting to get faculty buy-in, because faculty talk to each other, and sometimes they're a lot more convinced if another faculty member has had such a great experience that they want to start using it too. So focus groups are a good idea. In the focus groups we did have questions, but I would ask a question, then we would shut up and sit back and let them talk to each other. And that's how it's all spread: word of mouth. They brought it up with other people, and for the training I'm going to do in August I'm going to use faculty more than me. I'll also be reaching out to Becky, but I'm definitely going to pull in faculty to do that.

That's all great stuff, and I should learn from your practice and just shut up more and let you do the talking. You mentioned the dashboards, and we probably can't show one off here because of student privacy issues, but folks in the audience may not know that one of the primary people behind the dashboards, my colleague Jon Udell, is here in the crowd, so big kudos to Jon for having moved that work along. We'll get to a question from Curtis in just a second, but I also wanted to post in the chat a link to a study going on at Indiana University that is really going to dive deeply, both quantitatively and qualitatively. It's a multi-year study into the connections between student reading, writing, success, and social annotation. That's a really powerful set of data that has just finished its first term of collection and will continue over the next couple of years, so we'll start to see some scholarly output from it, maybe even informally, starting this summer and then moving on. But going back to those dashboards for a minute: the reason not everybody has experienced them is, again, sort of like with the DocDrop thing.
We developed them in a bit of a laboratory environment and have been working with folks like these folks here to figure out what kind of data is going to be most useful, both to you who are stewarding institutional usage of the tool, but also at the instructor level and even at the student level. We've been drawing some really good lessons from that, and I think we've gotten to a place where we've been able to produce a lot of data, maybe too much in some cases. But in other cases some really interesting insights have come out of it. We've been noticing things like: in certain classes there might be a high number of threaded annotations, in the sense that there's a root annotation and then a conversation that happens out of that root annotation, while in other classes it's the opposite, just a series of standalone annotations without much conversation attached to them. We don't know why yet, but there are already these interesting patterns of usage arising in different disciplines and contexts. And I think there's a really great future in exploring that, with the caveat that Remi and Antero were guiding us toward in their keynote this morning: whose data is it? Not trying to turn it into surveillance of students' reading, but instead asking what we can learn as educators from the practices that already exist and what we can do to make them better. So I'm curious whether you have interacted with the data that's come out of the dashboards in any deliberate way yet, or is it still just exploration? Yeah, go ahead. We haven't really. Okay. Yeah, for us it's fairly new. Yeah, for us it's mostly been satisfying some curiosity and poking around a little bit; it's really neat to see those numbers.
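As an aside for readers curious how the threaded-versus-standalone distinction mentioned above could be computed: replies in annotation data typically reference their ancestors. The sketch below is a hypothetical illustration only; it assumes each annotation is a dict with an `id` and a `references` list naming its ancestor annotations (similar in spirit to the Hypothesis API's reply model, though the field names and `thread_stats` helper here are made up for this example).

```python
# Hypothetical sketch: classify root annotations as "threaded" (they received
# at least one reply) or "standalone". Assumes each annotation dict has an
# "id" and a "references" list of ancestor ids; replies list their thread
# root as the first entry in "references".

def thread_stats(annotations):
    """Count root annotations with replies vs. roots that stand alone."""
    replied_to = set()
    for a in annotations:
        refs = a.get("references") or []
        if refs:
            replied_to.add(refs[0])  # first ancestor is the thread root

    threaded = standalone = 0
    for a in annotations:
        if a.get("references"):
            continue  # this is a reply, not a root
        if a["id"] in replied_to:
            threaded += 1
        else:
            standalone += 1
    return {"threaded": threaded, "standalone": standalone}

sample = [
    {"id": "a1", "references": []},      # root that gets a reply
    {"id": "a2", "references": ["a1"]},  # reply to a1
    {"id": "a3", "references": []},      # standalone root
]
print(thread_stats(sample))  # → {'threaded': 1, 'standalone': 1}
```

A per-class ratio of threaded to standalone roots is one simple way the disciplinary differences described above might be quantified.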
But we don't quite know what questions to ask of it yet, I think. I would agree; it's been very exploratory for us as well. I will say that the surprise, confusion, or lack-of-understanding metric, I don't know if that's the right phrase, has been the most interesting one for me to see.

Yeah, and maybe Jon could pop up here and explain it better, but I believe it's a pretty simplistic thing that just looks for annotations that have certain keywords in them that might express surprise, lack of understanding, confusion, whatever, and tries to surface that, in a maybe simplistic way. But it's already interesting, right? And there's kind of a whole digital humanities possibility standing behind this, because now we have a whole set of data about people interacting with text that we didn't really have before, and I think that could lead to some interesting things.

So, Jon, just to wrap up this conversation on the dashboards: institutions that are working with Hypothesis formally, in pilots or subscriptions, have access to these dashboard environments. I'm trying to remember Fresno State's relationship; I'm sorry, I typically have these things memorized, but I'm not sure what stage you're at with us. I'll follow up on the back end and reach out to my colleagues who are interacting directly with Fresno State, but the basic idea, and this is for institutions that are using Hypothesis integrated into their LMS, is that there's a way to surface an externalized dashboard of data that can be read. So depending on your relationship with Hypothesis and how far along it is, we can help hook you up with that. It's something else that comes with a supported pilot and subscription with Hypothesis.
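For readers wondering what the keyword matching just described might look like, here is a minimal hypothetical sketch. The cue-phrase list and the `flags_confusion` function name are illustrative inventions, not the actual dashboard's vocabulary or logic:

```python
# Hypothetical sketch of simple keyword-based flagging: mark annotations
# whose text contains a phrase suggesting confusion or surprise.
CONFUSION_CUES = [
    "confused", "don't understand", "not sure", "unclear",
    "what does this mean", "surprised", "didn't expect",
]

def flags_confusion(text):
    """Return True if the annotation text contains any confusion/surprise cue."""
    lowered = text.lower()
    return any(cue in lowered for cue in CONFUSION_CUES)

notes = [
    "I'm confused by the second paragraph.",
    "Great point about annotation as conversation.",
    "What does this mean in a STEM context?",
]
flagged = [n for n in notes if flags_confusion(n)]
print(len(flagged))  # → 2 (the first and third notes are flagged)
```

As the speakers note, this kind of matching is deliberately simplistic; its value is in surfacing candidate annotations for a human to look at, not in classifying them definitively.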
I did notice that Jon has raised his hand, which could be really cool: we could bring him up on stage to talk more about the dashboards. And maybe while we're doing that, I'm going to throw Curtis's question up on the stage and you can address it while Jon is joining. How about that?

Yeah, I've been thinking about Curtis's question here, so maybe I'll just dive in. As I said earlier, my team also handles copyright questions for faculty. We liaise with our course reserves department and help faculty better understand what they're able to do with texts in online environments. And this is another one of those areas where Hypothesis has opened up opportunities for conversation. I'm not a lawyer, so none of this is legal advice, but where we've settled on this question is that if there's no other way to access a text, and the pedagogy relies on social annotation with Hypothesis, and the only way to get that text is through something like an OCR'd PDF, then yes, we're going with a fair use argument for that. It's not neat and tidy, but I think empowering faculty to teach the way they need to teach, without feeling constrained by copyright concerns, is really important. Now, if this were happening in an open environment outside the LMS, we'd probably have a different attitude about it, and of course we're telling faculty, you know, don't go wild and scan your entire textbook. Let's have a conversation about that; maybe you can scan a couple of pages that are really central to the text and have your students annotate those. But yeah, we go with the four-factor fair use analysis, and if you're not familiar with that, you can Google it. The quantity of the text is, I think, the most important factor here: if they're reading a short article, then they need to read the whole article and annotate that.
But if you're scanning entire chapters of textbooks, then that's going to jeopardize the market for the book. So yeah, as with all things fair use, it's a case-by-case basis, but we want to empower faculty. I know, it's a balance, right?

Hey, welcome, Jon Udell, you're here. I am. And we can hear you. Excellent.

So I just wanted to mention a couple of things real quick. First of all, we don't know what questions need to be asked and answered either. What you see on the dashboard right now is essentially what everyone's asked for so far. But we've had relatively little feedback at this point, mostly from administrators, and especially not enough from teachers. And I think the questions we need to be asking are ultimately going to come from teachers, so I'm looking forward to more feedback. The stuff you see on the dashboard is the tip of the iceberg, and the real work that's gone into it is making it quick and easy to ask and answer new questions. So bring one, and we will continue to iterate on this stuff.

Jon, that's exciting. I have a couple of faculty who I know would love to talk to you at some point about what they want. If I put this idea in their heads, I'll ask them what they want. Love to connect with you on that. Great, thanks.

And thanks for coming up, Jon. It's weird; for some reason I seem to have lost my ability to move us back to the Brady Bunch view. I don't know what the problem is. Maybe it's because we're out of time, but I did notice that it is the top of the hour, so maybe we should bring it to a close there, even though I'm sure everybody could talk all day about this really cool stuff. I really want to thank Kyle, Dana, and Shayna for coming today. You were just awesome. It was a really great conversation. I enjoyed it immensely, and I think that our folks did too.
We did record it, so we can also share it in an ongoing way, as long as you're all okay with that. And I want to point out that there are a couple of other things happening here today and tomorrow at the conference that might be of interest to folks, including sessions on world languages and STEM with social annotation. And then in about minus two minutes, Jeremy is going to give an update on the Hypothesis roadmap. So I'm thinking we should adjourn. I will say goodbye and thank you, and people can go attend these other events if they so wish. Thanks so much. Thank you.