Hi, I'm Franny from Hypothesis. I'm on the marketing team here, and I am so happy to be here and love this conference and love open annotation and just the amazing people who are taking part in it. And I'm going to turn it over in a minute to one of our amazing people, our moderator, Bodong Chen, but I want to introduce our guests today as well. We have Xinran Zhu, and Yeonji Jung, and Chris Andrews, and like I said, I'm super excited about this, and we are recording it, so if anybody ends up coming in late, they can watch the beginning after the fact. So with that, I'm going to be quiet and turn it over to Bodong. Thank you so much for the introduction, Franny. Hi everybody, hope you are doing well. I know this is an international event attracting colleagues and friends from around the globe, so I'm really happy to be here. This is my second I Annotate conference. My first attendance was three years ago, actually in 2018 in San Francisco, so I'm really happy to be back in this community and to see the growth of work related to Hypothesis and annotation. My name is Bodong Chen. I am an associate professor at the University of Minnesota, and I see myself as a learning scientist who does research on learning processes and also designs technologies and pedagogical practices for learning. So I'm really happy to see three interesting projects represented in this panel, and I will let our panelists introduce themselves, of course, and their projects a bit later. Thanks for putting up the slides. So our panelists, as Franny mentioned, are Yeonji Jung from NYU, Xinran Zhu from the University of Minnesota, and Chris Andrews from Indiana University. And, by the way, all of them are PhD students actively working on cutting-edge research in the learning sciences and also in the field of learning analytics.
And I'm actually very excited to listen to their talks and hear from them about what they're doing, what they are designing, what they are researching, and so on. Next slide, please. This is a tentative agenda for this panel. The structure of this panel will be for each panelist to spend 15 minutes to present their work and then have a really quick five-minute Q&A after their talk. And after every panelist presents their work, we will have a panel discussion. We invite everybody to pose questions along the way, so that those questions can be either addressed after the talks or added to the final discussion. So I believe we'll have plenty of time to really listen to each other and also to have a conversation during the panel. So without further ado, our first presentation will be from Xinran. Take it from here, Xinran. Hello, everyone. Thank you, Bodong, for the great introduction. My name is Xinran Zhu, a PhD student from the University of Minnesota. It's really a great honor to be here to talk about collaborative annotation. In general, it's really a fun and great conference; I heard lots of inspiring voices from people from all different areas. So, as for my interests, what do I do? I see myself as a learning sciences researcher and learning experience designer. I'm interested in learning analytics and design-based research to develop educational innovations and implement learning theories in the real world. Those have also been my guide for the study I'd like to share in a minute. So, about the study: as we all know, Hypothesis can support note-taking in many different areas. Today, I'd like to share how it has been used in educational settings. This study is based on our collaboration with college-level instructors who were piloting the Hypothesis tool.
So we started this design-based research to support their integration and study how collaborative annotation works in their classrooms. So here's the study: designing support for productive social interaction and knowledge co-construction in collaborative annotation. I also would like to introduce our team here: Hong Shui, who is also a PhD student at the University of Minnesota (she might be here today as well), and Bodong, who is the moderator here, as you already know. They all contributed a lot of wonderful ideas and effort to this project. And Shana, whom Bodong just mentioned in the chat, has been supporting our study from the beginning. So let's get started with the study. As you may see, a key element of the study is the design. So what is the design? As a preview, we designed two scaffolding strategies to support teachers' teaching and students' learning. The first strategy is called the dynamic grouping strategy. The purpose of this first strategy is to create or design a learning community as the first step. For example, when a class is really big, like 100 students, the instructors may want to divide students into smaller groups, like 10 or 15 students per group. This avoids students being overwhelmed by 100 annotations per reading. Another example could be that the instructors may want to assign different readings to different groups of students, so they can later share their understanding of the different readings and achieve a higher understanding of the topics. In general, this strategy helps the instructors create a learning community in the first place, one that fits their teaching goals, the context, and the community itself. And the second strategy is called the participation role strategy, which is also today's focus.
For this strategy, we want to provide an opportunity for students to take more responsibility for their own learning, to support the process of their learning, and to improve social interaction and knowledge co-construction. I'll dive into more details about this strategy later. So why did we design this study? It was initially conducted at the University of Minnesota in fall 2020, when the campus shut down due to COVID-19. Many instructors pivoted to online instruction and were looking for solutions. Based on our observation, there were generally two directions. First, given the limited time, some instructors may want to use technologies or tools to replicate their face-to-face instruction. Another direction would be to take this opportunity to transform the student-teacher relationship. This is where our collaboration started, because we shared the same understanding that the effective usage of technology requires consideration of both technology and pedagogy. So we think there is a need to design meaningful scaffolding when using technologies, sometimes even a redesign of the curriculum, instead of just throwing the tool directly at students. And by meaningful pedagogy here, I mean pedagogy drawn from both theory and practice, such as computer-supported collaborative learning theory and some other learning theories. Based on that, we want, through the design, to let students take more responsibility in learning, and we want to transform the dynamics between students and teachers, enabled by the technology. And then we want to facilitate a natural space for social interaction and also engage students in knowledge co-construction in online learning. So, about the study design: we want to support collaborative web annotation in college classrooms by designing scripted participation roles.
The method we used is co-design between researchers and instructors to design scaffolding roles and support their implementation with course-specific customization. We used Hypothesis as the social annotation tool, and in the current study, Hypothesis was integrated into UMN's LMS, Canvas. The participants were from three fully online undergraduate classes in the liberal arts. The focus of the current study is a dance history class; it had 13 students, one teaching assistant, and one instructor. So what is the design? What is the participation role strategy? In general, it's a generic scaffolding framework comprising three scripted participation roles based on the computer-supported collaborative learning literature: a facilitator, a synthesizer, and a summarizer. The facilitator is responsible for stimulating conversations by finding connections, seeking collaborations, and encouraging their peers to consistently tag their annotations for an entire week. The synthesizers synthesize the initial ideas, highlight agreement or disagreement, and suggest directions for further discussion in the middle of the week. For example, in the dance history class, in the middle of the week there is a synchronous class discussion via Zoom. So the synthesizers were required to submit a paragraph or two, or some nice bullet points, as their synthesis before the class discussion. During the class discussion, students could have a further discussion based on the synthesis and other peers' annotations. The summarizers summarize group conversations at the end of the week, based on both the class discussion and all the annotations for the whole class. Also, at the end of the week for this class, every student was required to do individual reflective writing. All those roles and timeframes can be adjusted by the instructor accordingly, based on their own teaching goals and context.
In order to study whether or not the design worked, we had two research questions. The first one is: how did the activity design facilitate social annotation? The second is: how did the design facilitate knowledge co-construction? First, we conducted a social network analysis to study the social interaction, such as the participation patterns in the collaborative annotation activity. Then we conducted a content analysis to study the knowledge co-construction levels. The coding scheme we used is a revised interaction analysis model for collaborative annotation. We developed this coding scheme based on Gunawardena and colleagues' Interaction Analysis Model (IAM) and on Onrubia and Engel's model of collaborative knowledge construction. We identified four levels of knowledge construction. The first one is called initiation, where students start to share initial understandings, ask questions, and share resources without too much elaboration. Level two is called exploration. In this phase, students start to elaborate on the text or connect personal experiences with some critical reasoning. Level three is negotiation, where students start to ask questions through critical reasoning, negotiate disagreement, or connect readings with critical reasoning. And the highest level, level four, is called co-construction. At this level, students start to reach a consensus on the previous questions, and they apply the knowledge or way of thinking. They can also make metacognitive statements illustrating their learning outcomes. So did the design work? In general, the answer is yes. For social interaction, as you can see from the table, the in-degree and out-degree mean how many replies or annotations they received or sent out. The facilitators in general sent out more replies, they reached out to more peers, and they also received more replies. And in the table, betweenness, constraint, and dominance are all centrality measures.
Higher scores in betweenness and dominance and lower scores in constraint mean those students are in the center of this community. So the facilitators were always influential in the collaborative annotation activities, which means they were always in the center position. The social interaction pattern also varied across facilitators in different weeks. The synthesizers participated more than non-role-takers in terms of the number of posts they sent out, but not as much as facilitators did, since they tended to focus more on synthesizing the readings and annotations on their own. And the summarizers participated about the same as non-role-takers, which is also expected, since their responsibility was to write the weekly summary on their own. For knowledge construction, the facilitators generally asked questions or provided answers with elaboration, examples, and critical reasoning to start and push the discussion. For example, here's an example thread facilitated by one facilitator. As you can see, the facilitator tried to connect two other students' annotations when replying to student A, and the facilitator also proposed some follow-up questions to encourage deeper thinking. The knowledge construction level also varied across facilitators in different weeks. The synthesizers' posts were also mostly classified into level two and level three in terms of knowledge co-construction. The summarizers, on average, contributed far fewer annotations at all levels; most of their posts were at level two. These results were in line with the scripted roles in the scaffolding framework: for example, they focused on the class discussion during Zoom meetings and composed a summary that connected the Zoom discussion with the annotations. So in general, the results indicated that, to a great extent, the designed activity was enacted by students properly.
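For readers curious how centrality measures like the ones just described are computed in practice, here is a minimal sketch using the networkx library. The reply network and role labels below are invented for illustration only; they are not the study's actual data, and the study may have used different tooling.

```python
# Sketch of the kind of social network analysis described in the talk,
# using networkx. The reply network below is made-up illustrative data.
import networkx as nx

# Directed graph: an edge A -> B means student A replied to student B.
replies = [
    ("facilitator", "s1"), ("facilitator", "s2"), ("facilitator", "s3"),
    ("s1", "facilitator"), ("s2", "facilitator"),
    ("synthesizer", "s1"), ("s3", "synthesizer"),
    ("s1", "s2"),
]
G = nx.DiGraph(replies)

out_degree = dict(G.out_degree())           # replies sent by each student
in_degree = dict(G.in_degree())             # replies received by each student
betweenness = nx.betweenness_centrality(G)  # brokerage / central position
constraint = nx.constraint(G)               # Burt's constraint (lower = more central)

print(out_degree["facilitator"], in_degree["facilitator"])
print(betweenness["facilitator"], betweenness["s3"])
```

In this toy network the facilitator sends and receives the most replies and sits on the most shortest paths between peers, which is the pattern the talk reports for students in the facilitator role.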
The role assignment was associated with students' social interaction patterns to some extent, and different role-takers may have had different strategies when playing their roles. Most importantly, in weeks where role-takers posted more higher-level posts, the knowledge construction level of non-role-takers tended to be high too. As for implications of the design: first, we proposed a scaffolding framework for collaborative annotation, which is applicable to many college-level classes, including both the dynamic grouping strategy and the participation role strategy. Second, we developed a revised interaction analysis model for collaborative annotation that is more appropriate for the analysis of students' discussion anchored in web documents. This can also support teaching, as a reference for the evaluation of annotations. Finally, the results of the data analysis have shown the promise of the designed scaffolding framework for facilitating productive collaborative annotation in the study context. In particular, the facilitators and the synthesizers played roles in deepening collaborative annotation. As for some final words, just to reiterate the purpose of our design: we believe effective usage of technology includes consideration of both technology and pedagogy. On the pedagogy side, students are not always natural collaborators, and we need to make intentional efforts to help them become better collaborators. And instructors need to provide careful scaffolding and detailed guidelines for students to take on the various roles. The technology also needs to connect to students' and teachers' needs, to provide a natural and effective environment for collaboration. There's a plus sign here, and I just want to say that this plus sign doesn't mean the relationship between technology and pedagogy is linear. Actually, they impact each other at every moment. The pedagogy can impact the technology design and development.
And in return, the technology can impact the pedagogy, in terms of, for example, curriculum design and class evaluation. So we need to go back to the fundamental questions to rethink the relationship between pedagogy and technology. For example, when we call Hypothesis a note-taking tool and discuss how to make better notes or how to support note-taking in a better way, we also want to go back to the questions of why students are taking notes and how they are going to use them in the future. Then we can think about how we, as researchers, designers, and teachers, can do a better job of supporting the process. And as a final note, I would like to invite everyone to rethink the relationship between technology and pedagogy, and also between students and teachers, and what can be done by researchers, designers, and teachers. So that's all for my presentation. Thank you very much for listening. Please let me know if you have any questions or suggestions. Thank you. Thank you so much for the presentation, Xinran. I already see two questions in the Q&A section. The first question is from Michael Welker. Michael was asking: on the assigned roles, were they rotated across multiple assignments? Also, how much orientation was needed to define those roles for the students? Yeah, that's really a good question. So first, yes, the roles rotated. Each student had an opportunity to try each of the different roles. And each week, there was one facilitator, one synthesizer, and one summarizer for each reading. As for orientation, at the beginning of the semester the instructors actually spent a lot of time explaining how the different roles differ from each other. And we spent two or three weeks to try out and adjust the roles, to make it a co-design not only between researchers and instructors but also with the students, because during the class meetings the students also shared their understanding of the roles.
And then the students and the teachers adjusted the roles together. I think starting from week four, all the roles had been settled and the activity went more smoothly. And yeah, that's my answer to your question. All right, Michael says awesome. Okay, great. Thank you. Any other questions? A quick question for you, Xinran. I see another question coming in from Shana. Would you recommend using the framework, levels one to five, as a rubric for grading and/or as scaffolding for students? I think by framework, Shana means the coding framework you used to analyze the co-construction. Yeah, I'll go back to that slide. So this coding scheme has four levels. I think it can work as a reference for grading or evaluation of students' annotations. But sometimes I'm hesitant about using a framework or coding scheme to grade the quality of students' annotations, because it's really hard to identify whether an annotation is high quality or low quality. Even when some students are at level one or level two, it doesn't mean they're not learning. Maybe they're in a certain phase of learning, or it's just their learning style. But in general, I think it could be a helpful reference for teachers to see how students are learning in a specific week and what other scaffolding or support needs to be provided to the students. Great. Thanks. Thanks for the questions, and thanks for your responses, Xinran. I think we will have more opportunities later, after the presentations, to engage with those questions again. At this point, I want to thank you, Xinran, for your presentation, and then invite Chris to share his screen and give us his presentation. All right, let's see if I can get this to work here. Okay, so I have a somewhat weak connection, for those that are listening in. I'm going to try and turn on my video for a minute. I had to drive down the road to find a better spot for my phone to be able to hotspot in. So we'll see.
That's why I'm in a car right now. It's also raining. But I'm happy to be here, and I'm excited to present with these others. Xinran, that was awesome. I love what you've been doing; I've already seen the scaffolding framework that you had posted before, so I think that's awesome. What I want to present on today is kind of part of what I'm trying to do for my dissertation, which is thinking about instructors and how they use social annotation, particularly within these undergraduate reading and composition courses, which I'll talk about in just a second. Yeah, go ahead and go to the next slide. So these are some things that we kind of already know about social annotation, and Xinran talked about this: with the pandemic, many people moved online and needed something to, you know, kind of mimic what they were already doing in their classes, or just something to help them with their courses that moved online, and many turned to social annotation, because it's this flexible tool that we can use even in a hybrid situation or even face to face. And I love this quote from Remi Kalir and Antero Garcia, in their new book on annotation, where they mention that social annotation is seldom an end in and of itself; rather, it most frequently complements a repertoire of other educational practices. So we need to focus on, you know, not just social annotation and what's happening in social annotation, but also how it interacts with other things, particularly within a course. And so that's part of why I'm thinking about this in terms of instructors, and then also what happens to these annotations once they're created.
This also comes, again, from Bodong and Xinran, whom I'm quoting here from their most recent article in the Information and Learning Sciences journal: we need more in-depth qualitative inquiries into how instructors are using social annotation and into the interaction of social annotation activities with other course activities. Go ahead and go to the next slide. So these are some of the questions that I'm thinking about as I'm, you know, trying to understand, analyze, and research: how do instructors implement social annotation in their online undergraduate reading and composition courses? Again, we'll get to the context here in just a second. What are instructors actually doing with the annotations? And then, how do these social annotations impact or align with other course activities and course goals? Next slide. Some of you may already be familiar with this, but I'm at Indiana University, and Hypothesis and the Department of English at Indiana University have partnered on a project, and what I'm doing is kind of a small part of this long-term project at IU, particularly with the English W131 classes; that's their freshman composition course, Reading, Writing, and Inquiry. But part of the data that I'm looking at has to do with a couple of other courses as well, the Introduction to Fiction and Introduction to Poetry courses. And I've got the nice little COVID graphic there, because basically what happened is, again, as we all know, everything was forced online because of the pandemic. They had been using Hypothesis in some of their courses, and when they started making this course shell for, you know, all of the instructors to use for English W131, the freshman composition course, they decided they wanted to embed Hypothesis and social annotation activities as part of it.
So across all of these, there are more than 50 sections of English W131 that are going to be using, or that have used, social annotation with Hypothesis over this last academic year. So there's a lot of data that we're going to be sifting through, and I'm just kind of piecing off a little bit of this. No, that was actually perfect; you, like, read my mind. So, the spring 2021 semester: this is where the data that I'm looking at comes from, and there are several data sources. The semester started and, you know, we were collecting the course activities that were happening, including the social annotation assignments occurring throughout the semester, as well as other assignments that they were working on. We had two instructor questionnaires, one at or near the beginning of the semester and one at the end, to kind of get an idea of what instructors were doing with the annotations. And then I also facilitated what we called an instructor inquiry team. It was really kind of a mix between a focus group and a professional learning community, where basically we just talked once a month-ish about the decisions that they were making about their social annotation use in the classroom. What kinds of intentional decisions were they making? Why were they making those choices in terms of how they were using social annotation, or what they were doing with the annotations, and that sort of thing. And so you can see that we've got multiple time points across the semester of kind of trying to understand what's going on. You can go to the next slide. So I kind of mentioned this already, I guess: we had the two instructor questionnaires and the instructor focus groups. There were actually six instructors that participated, whom we split into two groups where we were doing these deeper dives. And then there are the course artifacts.
We were collecting these artifacts that the instructors created, though some of them were artifacts that students contributed to even though they were instructor-created, and I'll mention that in just a minute. So we go to the next slide. So, I don't know how many of you are college football or basketball fans. Right as soon as the season ends, they come out with rankings for the next year, even though nobody knows anything about what's going to happen, and they typically call those their way-too-early rankings. So these are mine, because I have not dived into the analysis yet. This is just kind of my thoughts from having participated in some of this data collection. So I haven't really done the analysis yet; these are my way-too-early thoughts and analysis. So we can keep going. I think I have these down one by one, so I'll just tell you to go to the next piece. Or you go to the next one, actually. Perfect. So, instructors' design of annotation activities. What I mean by this is how instructors were thinking about using annotations. And I have an "or" between these, but I don't necessarily mean them to be either-or; it might be somewhere along a continuum of some kind. Some of the instructors were thinking about the annotation space as a student-only space, not really a space that they participate in. They might provide some private feedback, for example, within Canvas, the LMS, because we were using Hypothesis within the learning management system. So some of the instructors were thinking about it as a student-only space, a space where the students didn't have to worry about the instructor making any sorts of judgments or comments in the public space; with just their peers there, it's really kind of more of a space for peer-to-peer engagement. Other instructors were thinking about it as a kind of joint space where they could make meaning together.
And so again, I love how Xinran put this idea of, you know, technology plus pedagogy. These are some intentional, pedagogical decisions that these instructors are making as they're thinking about how they want to build this space and why they want to build the space that way. So that was just one of the interesting things that came out of it. And then there's this idea of individual engagement with the text versus threaded conversations. What is the goal of these social annotation activities? In these reading and composition courses, for example, is our focus really on analysis of the text and really kind of having that conversation with the text? Or are we trying to promote threaded conversation, you know, focusing more on the interpersonal or the community-building ideas within that? And again, these are not mutually exclusive; you can do both. But as the teachers are thinking about their pedagogical approach and pedagogical decisions, they're thinking about some of these things. Sometimes they're focused more on "I need them to understand the text," and sometimes they're focused more on "I want them to build community or share understandings with each other." Go ahead and go to the next one. And then there's this idea of accountability and exposure to the text, which is sort of similar to individual engagement with the text. Some of the instructors were thinking about this idea of accountability: you know, social annotation is maybe an easy way to find out whether students read the text or not, and maybe that's all you need it to do. Obviously, I think many of us here in this session, and many of us that have used social annotation, see other rich and wonderful uses for it. But for some instructors, all they really need is to make sure that the students have read the text and have been exposed to the text.
And so some of them aren't thinking about this deeper power of social annotation. Another part of this, and I see Jeremy in the chat has mentioned these tensions, so these could be portrayed that way, as tensions: can we use these social annotations as resources for future activities? In particular, I'm thinking of one of the instructors who was having students use tags pretty heavily as they were reading, particularly relating the text to the conventions of a particular genre of writing. And then they would go into a shared Google Doc, where students could go back to the annotations and search for these tags, and then they could add some information, say about a character or about a particular convention of a genre, and start to build this out. I don't want to say a catalog; I need a better word for this, but they were kind of building out this new resource. So they were using the social annotations as a resource to build a new resource for when they went to write their essays, where they could self-organize all of their annotations in new ways. So again, there are some of these interesting tensions, with some of these instructors thinking about how we use social annotation, and how we use annotations in ways that maybe we haven't thought about before. And then, going to instructors' use of annotations, you can go to the next one. I think most of us totally get this idea of preparing for class. You know, many of them were using the social annotations that way: they'd go read their students' annotations to prepare for their next class. The annotations would be due 24 hours beforehand or something like that, and the instructors would read through them, maybe to see misconceptions or ideas from the text that the students missed.
A lot of times they could tell by what was annotated and what wasn't annotated; you know, if the students didn't annotate a whole section of the text that the instructor felt was really important, then the instructor would bring that up in their conversations. Or, as one of the instructors mentioned, they used it for warm calling on students, in the sense that they knew a student had already made an annotation, had already said something about this, and they would reference that annotation and maybe ask the student to expound on it. Or maybe there weren't replies to that particular annotation in the social space, and so in their synchronous session they would warm call on that student and use that annotation as kind of a jump start for further discussion. Obviously there's another tension here, where we don't want to rehash a conversation that has already been had in the social annotation space. You know, we want to be careful about not having a redundant conversation, and so we're trying to think about what that might mean, but also about how we can still use the annotations in this other synchronous space as well. And then there's this idea of public versus private feedback. I kind of mentioned that already in terms of the student-only space versus the joint student-instructor space. But it's thinking about, okay, when do I make public comments and when do I make private comments to students? Just thinking about that dynamic and trying to identify when, at what point, or, you know, what is happening in the annotations such that I'm going to make a public comment versus a private comment: whether I'm going to go into the annotation space where all the other students can see it, or I'm just going to do it within the Canvas LMS and, you know, give them some private feedback on the assignment, for example.
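As an aside, the tag-based reuse of annotations described a moment ago (students tagging annotations, then gathering them into a shared resource) could in principle be automated with the public Hypothesis search API. Here is a hypothetical sketch; the tag name and group ID are placeholders, and this is not something the instructors in the study necessarily did.

```python
# Hypothetical sketch: pulling a group's annotations by tag through the
# public Hypothesis search API, so they could be compiled into a shared
# resource document. "genre-convention" and "group123" are placeholders.
import json
import urllib.parse
import urllib.request

API = "https://api.hypothes.is/api/search"

def search_url(tag, group, limit=200):
    """Build a Hypothesis search URL filtered by tag and group ID."""
    params = urllib.parse.urlencode({"tag": tag, "group": group, "limit": limit})
    return f"{API}?{params}"

def fetch_tagged_annotations(tag, group, api_token):
    """Fetch matching annotations; a developer token is needed for private groups."""
    req = urllib.request.Request(
        search_url(tag, group),
        headers={"Authorization": f"Bearer {api_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        # The API returns JSON with a "rows" list; each row includes
        # fields such as "text", "tags", and "uri".
        return json.load(resp)["rows"]

print(search_url("genre-convention", "group123"))
```

In practice the instructor's manual Google Doc workflow accomplishes the same thing; this just shows that the annotations remain queryable data after the activity ends.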
So again, I'll reiterate that these are very early thoughts and analysis. I'm looking forward to diving in more this summer and having findings that are more specific and better backed up by data, quotes, and other evidence I can link to, to support the trustworthiness of what I'm saying. For now, this is just off the top of my head, based on my participation in collecting the data. But I think some of these things are interesting, and hopefully the analysis will bear out more of them. And I think that was my last slide, so I'm done. All right, this is fascinating, Chris. I already see some really interesting conversation going on in the chat, and I see a question popping up. Yanji asked a question; do you want to ask Chris directly? Yeah, sure. Chris, thanks for a wonderful presentation, I really enjoyed it. I have a question about your personal thoughts on the potential impact of different content areas. Your classes were in English, but what if we are designing for STEM courses such as physics or statistics, where the reading materials are really different from English or the social sciences? Do you have any thoughts on that? Yeah, let me think. I have a brother-in-law who teaches physics, and he's used another social annotation platform that I won't mention, because it's not Hypothesis and this is a Hypothesis event. No, I'm just kidding; I know Hypothesis is all for any kind of social annotation. But he talked about this, and we know from other places, like the website where you can see experts' annotations on science content, Science in the Classroom I think it's called, that there are some really great possibilities here. I can see annotation being different in terms of the disciplinary side of it.
For example, I was talking about tags for genre conventions within English as a specific practice, but think about what kinds of tags you might use in a science or STEM situation to help students organize some of the knowledge they're co-creating or recognizing. The other thing I was thinking about is that English is primarily text-based, right? Maybe that's an overgeneralization; Jeremy, who I know has an English background, might take some issue with that. But in STEM I'm also thinking about graphs and other kinds of figures. Annotation on those kinds of representations could be quite different from trying to make sense of a paragraph and connecting to a paragraph: there's this element of data analysis of figures and the like. Those are just some of the things that come to mind. Did I address your question? I feel like I went off on a tangent a little bit there. Yeah, I feel like you addressed the potential considerations instructors might weigh, in the sense that annotations and interaction can be mediated not only by English text but also by graphs or formulas or other representations in STEM courses. Yeah. And again, one of the key things, and this gets a little bit to that quote I shared earlier, is this idea that social annotation is seldom an end in and of itself.
Just getting students to socially annotate might be fun and interesting for them, but it falls short unless it's tied to some course goal or helping us achieve some other objective. You mentioned this idea of mediation: we want to think not only about how social annotation mediates the accomplishment of those goals, but also about how those social annotations then become mediators for other activities in the course. That's part of what I'm thinking about: we need to start with those goals. What goals are the instructors trying to accomplish? How can social annotations impact that? Then we can start to think about how to get students to create the kinds of social annotations that will be most useful for accomplishing it. It could be an essay, but it might also be some other really interesting project they're working on. Those are just some of the things I'm thinking about. Thank you. Thank you both for the conversation. I see another question from Shana, but I will save it for the later panel discussion, because I think it's relevant to many of the projects we've already seen. And now I want to invite Yanji to share their screen and their work with us. Okay. Hi, everyone. Can you see the screen? I'm Yanji, and I'm a PhD student at NYU. I think it's a nice transition from Jinran's and Chris's presentations to mine, because mine is more about the ways we could support better, higher-quality social annotation activities by using another intervention. Social annotation itself can be a learning intervention, but I want to add one more layer to that, so let me introduce the idea.
Today's presentation is mostly about the idea and conceptualization of what I mean by data-informed feedback, which can be used to support better participation in social annotation activities. I think everyone here is very interested in social annotation and understands why engagement in it is critical, but narrowing this to the higher education context, especially course activities, I want to stress that quality learning engagement matters a lot for student success. At NYU, several courses apply social annotation tools to course activities to help students succeed and better understand course concepts. So what kind of effective learning can occur with social annotation? We can think about three dimensions. First, through social annotation, students can interact with peers, which adds another avenue to their learning activities, exposes them to diverse views, and broadens their perspectives. Second, by participating in social annotation, students can increase their participation in course activities and put more effort into completing their learning tasks, which we can call behavioral engagement. And third, by reading the materials and other students' ideas about the same text, students can deepen their understanding of course content and formulate new inquiries or argumentation, increasing their conceptual and cognitive engagement in the course.
But instructors who use social annotation see a lot of low-quality annotations, and students show very diverse levels of engagement; this is commonly reported in the literature as well. So my question is: how can we design learning support so that students engage effectively with social annotation, which we expect to help them? One approach is to provide a kind of learning analytics dashboard, though I hesitate to use the term dashboard because it sounds very sophisticated. The main idea is: what if we provide feedback, a weekly summary report, that summarizes students' engagement progress in social annotation? When students participate in Hypothesis, they produce many different data sources, such as the annotation contents themselves, clickstream data about when and by whom annotations were posted, and other metadata. If we make use of these data sources to create data-informed feedback that helps students take actions to improve their engagement in social annotation activities, or change their mindset to engage better, that could be valuable. So my general questions are: how could we make use of social annotation data, and how could we create a good feedback design?
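[Editorial illustration] The kind of weekly summary report described here could be sketched in a few lines of Python. This is a minimal, hypothetical example: the record fields (user, created, text, tags, references) mirror the kinds of annotation data mentioned above, but the sample data and the metric choices are assumptions, not part of any actual Hypothesis or course tooling.

```python
from collections import Counter, defaultdict

# Hypothetical annotation records with fields like those an annotation
# platform exports: author, timestamp, body text, tags, and reply links.
annotations = [
    {"user": "alice", "created": "2021-04-05T10:00:00",
     "text": "Why does the author claim X?", "tags": ["question"], "references": []},
    {"user": "bob", "created": "2021-04-05T11:30:00",
     "text": "I think because of Y.", "tags": [], "references": ["ann-1"]},
    {"user": "alice", "created": "2021-04-06T09:15:00",
     "text": "Good point, see also section 2.", "tags": ["genre"], "references": ["ann-2"]},
]

def weekly_summary(annotations):
    """Summarize per-student engagement: annotation count, replies, tags, length."""
    summary = defaultdict(lambda: {"annotations": 0, "replies": 0,
                                   "tags": Counter(), "total_words": 0})
    for a in annotations:
        s = summary[a["user"]]
        s["annotations"] += 1
        if a["references"]:          # a non-empty references list marks a reply in a thread
            s["replies"] += 1
        s["tags"].update(a["tags"])
        s["total_words"] += len(a["text"].split())
    # Derive an average-length metric per student for the report.
    return {user: {**s, "avg_words": s["total_words"] / s["annotations"]}
            for user, s in summary.items()}

report = weekly_summary(annotations)
for user, stats in sorted(report.items()):
    print(f"{user}: {stats['annotations']} annotations, "
          f"{stats['replies']} replies, avg {stats['avg_words']:.1f} words")
```

A real version would pull records from the annotation platform's API and add the instructor-facing comparisons across readings discussed below, but the summarization step itself can stay this simple.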
The expected outcome is that with this data-informed feedback, not only students but also instructors get a better sense of what students are doing and how they are engaging in social annotation. For example, comparing reading material A with reading material B to see which gets the most attention: this kind of quick diagnosis becomes available. Based on it, instructors can intervene with particular students who seem to need help, change the order of the reading materials, remove a specific material, or make other modifications to instruction. But going back to students: students are not only the primary source of this learning data, they can also be the main target of the feedback. Information generated by students themselves can be fed back to them, to help them take actions for better social annotation engagement or to support attitudinal and motivational change. The main reason we focus on action-taking and motivational change is that I have been participating in several graduate-level courses at NYU that use Hypothesis and other social annotation tools such as Perusall, and students sometimes simply dislike the use of social annotation; they perceive these activities as additional burdens on their coursework. Meanwhile, instructors see high value in social annotation for helping students engage conceptually with course topics and for increasing interaction between students, not only offline but also in online spaces. So what kinds of action-taking are possible based on this feedback? Based on the literature, there are three kinds of actions. The first is awareness, which can be part of an action or a prerequisite for action.
By looking at what the feedback shows about their progress in social annotation, students can become more aware of their learning status and reflect on what they are doing well and where they had problems in the previous week's activities. If we provide peer references, such as the class average or class-level data, students can also monitor their own and their class's progress together and be better motivated to keep engaging. Second, by looking at the feedback, students can increase their attention to the course and the materials, become more responsible in later social annotation activities, and more actively listen and speak in annotations. They can also change the way they participate, for example by developing arguments, changing their positioning in threads, or developing and revising questions. The most-cited actions, though, are help- and resource-seeking behaviors: after looking at the feedback, students can go to instructors to ask what additional resources they could consult, ask for help understanding the reading materials they struggled with in that week's social annotation, or manage and plan their participation in a more strategic way. However, we all know that simple exposure to data does not always lead to meaningful awareness and actionable insights.
When students get data and feedback, they can struggle to figure out what to do with it, and they are often reluctant to make decisions or take actions to improve their engagement, either because they don't know what to do or because they are wary of using it. According to the literature, this disconnect stems from students' mistrust of the data: even when the data shows their progress, some students say it's not their data, it's out of sync, or it doesn't show their true progress. Students also usually have little experience looking at their own learning data, and that deficit of experience reduces their confidence in making sense of what the data says about them or what to do with it. Moreover, feedback tools, especially dashboards, usually focus on awareness and monitoring rather than prompting students to do something after looking at the feedback. So my other question is: we know the exciting opportunities, but we also know the challenges, so how do we design actionable data-informed feedback? My position is that most problems with previously developed feedback come from designs driven by the researcher or developer side, without including students or teachers in the design process. So I want to suggest a co-design conceptualization for developing these feedback tools, one that also involves teachers. What I mean by co-design with students is to design and characterize the values, analytic metrics, and tool features that can help students take actions to improve their social annotation activities.
We think this approach can be helpful because it addresses several ethical aspects of learning analytics, or LA, from the students' perspective. By involving students in the co-design process, students can develop awareness of what kinds of data can be collected from social annotation, increase their agency over privacy controls, and increase their trust in data use. We also want ideas about how to strike a balance between telling students we are going to collect their data and the gaming behaviors that might then arise. Our broad plan is a series of iterative co-design sessions. First, we start from a needs and problem analysis of students' experiences, especially with social annotation engagement. We need students' real experiences, and by listening to them we can help students develop and formulate a set of values they consider critical for effective engagement with social annotation. By values I mean, for example: if a student thinks posing questions is very important in social annotation activities, then questioning can be one value; if constant interaction with peers is very important, that can be another. Having established the values, we can then think about what analytic metrics students actually want to see. For example, it could be a graph showing progress in the quality of their question development, or it could just be a line of text saying you are doing well this week. We want students' real perspectives on this. As the final stage, we want to help students develop and prototype the final tool, which we expect to present as a kind of weekly report, or an HTML page that visualizes a set of metrics with explanations.
In addition, we think it is very important to involve instructors in designing the course activities that use this data-informed feedback as part of social annotation; we see this as another step in the design. The purpose is to integrate students' use of the data-informed feedback into core course activities, so that students can increase their awareness and action-taking in ways that help them participate better in social annotation. Integrating the feedback is critical not only for getting students to revisit the tool but also for making the feedback serve their social annotation engagement. We suggest four principles from the literature. First, start from a needs and problem analysis for implementation. Second, integration: how to incorporate the use of this feedback into core course activities tied to course goals, and how to introduce the feedback to students. Third, agency: how to implement the feedback so that it increases students' agency in their own learning, rather than just introducing the tool, letting them use it, and observing what they do; we really want to see how students make use of the feedback in their own learning. The fourth principle is dialogue. The literature argues that it is critical to be transparent when implementing this kind of analytic feedback or learning analytics dashboard: in what ways do we inform students that their data is being collected, how do we get their agreement, and how do we inform them of the existence of the feedback? Being transparent really matters, and so does facilitating dialogue about social annotation engagement between students and instructors.
What I also expect to see is how instructors and students carry on a dialogue and conversation mediated by this feedback. The next thing to think about is: given all these imagined ideas, how could we gauge how effective this design would be, analyze its impact, and understand the ways students make use of it? I have several questions here. The fundamental question about data-informed feedback is: during social annotation activities, do students really benefit from looking at their own data, and how do they feel about it? As an audience member, you might think this is a good idea but not fully buy it, because it could also be an additional burden for students. So I think it's important to check and confirm that students benefit from looking at their own data, and to ask how we could improve the tool to benefit them more. Another point is the path from learning awareness to actions for improvement in social annotation activities: how do students make sense of what the feedback says about their learning, and whether, how, and in what ways do they translate what they find in the feedback into actions for further social annotation engagement? And since I introduced several sets of specific actions, I also want to see the specific affordances: how does this experience help students take actions, such as awareness, adaptation, or attention, that facilitate their engagement in annotation? What kinds of actions do they take, and how does the experience ultimately affect their learning attitude in the course and their motivation for social annotation?
The reason I am presenting this conceptual content is that I am not conducting the study yet; we are working on implementing this design in the upcoming term, in two courses that use Hypothesis or other social annotation tools. We really want to implement the design and see what happens, and why some things do or do not happen. My general plan is first to implement the social annotation activities, having students participate in the readings using Hypothesis; then to see in what ways they receive the data-informed feedback and how they make use of it by interacting with it; and then to look at what impacts occur during the term and at its end. So this is the general model I have in mind for designing and supporting actionable social annotation engagement in courses. That's the end of the presentation, so feel free to ask me questions, and if you have questions later, feel free to reach out to me using my email here. Thanks. Thank you so much, Yanji. I see virtual applause and hearts for your talk; this is great. I invite colleagues attending to post questions in the Q&A, or raise your hand if you want to ask Yanji a question directly. I already see a question from Jeremy: is the implementation (whether that's a typo or not, I don't know) or information, the data-informed feedback, the personalized report in your study, going to be produced by you or by the social annotation tool? Yep, I think that's a good clarification question, and the answer is actually both.
First, I will summarize a set of data sources that can be measured or collected by the Hypothesis tool or other annotation tools, provide students with these potential data sources, and have them produce designs and ideas: which data sources would be interesting to them for improving their social annotation, and which analytic metrics they want to see in the feedback report. After that, I want them to develop ideas about what the report would look like, and as a final step, I will produce the report together with the instructors. So it's definitely not made just by the social annotation tool. That said, other social annotation tools such as Perusall have some automatically produced analytics, but those are not for students; they are usually for teachers, and for grading. Perusall supports automatic grading that instructors can use, but students cannot see it at all. Did I answer your question? Okay, I think so; I think that's a great response. Any other quick clarification questions for Yanji before we move on to the panel discussion? I don't see more questions in the Q&A or the chat at this point, so I want to thank all three panelists for your wonderful presentations, and especially for sharing your ongoing dissertation research. It is very generous of you to share your ongoing thinking, which is really inspiring to me. I see a lot of connections across the projects and a shared investment in a few principles, such as co-design, centering what we know about learning and teaching, and researcher-instructor partnerships, which I see across all three projects. I have some questions I want to invite you to think through with us, and I also encourage the audience to post questions to the panel; you can direct a question to everybody on the panel, or you could direct
your question to a specific panelist as we launch this panel discussion. My first question, thinking about the design element that cuts across the different projects: because you work directly with instructors or with students, I want to learn a little more about how you see the instructors contributing to the design, or to the design product in your project, which could be a framework, a student dashboard, or a tool, and about any tips you can share with the audience for engaging instructors as active co-designers or contributors. Feel free to jump in when you have a thought to share. I can go first, if that's okay. I would say instructors are really a key element in our collaboration, because they are experts in teaching and curriculum design. As we collaborated, we all shifted our identities a little bit: we're not just researchers, and they're not just educators; we all became research-informed practitioners, trying to turn research into practice by co-designing the social annotation activities. During the design work we held co-design meetings where they shared their course objectives, insights, and teaching strategies, and based on that we introduced scaffolding activities and collaborative learning strategies from the research literature. I also really appreciate the strong rapport we established during our collaboration; I think that's really important. Throughout the semester we kept routine design meetings as a chance to share both positive and negative updates and solve problems together. Our co-design has always been an ongoing process; it's not that we designed an activity at the beginning and never talked again. Great ideas kept coming out of the design meetings throughout the semester, and the instructors have
inspired us a lot as well. As for tips, I think the first step is to build the partnership, and to respect and listen to their needs, insights, and confusions, because we are here to solve a problem together. Those are my thoughts. Thanks, Jinran. Anything to add, Yanji and Chris? Yep, I can add something. Jinran shared great insight about the importance of involving instructors in the whole process of designing and implementing a specific intervention for social annotation activities, and I want to add another layer: the importance of involving instructors or teachers not only in designing the intervention itself but also in the work of integrating it into the course. In my case, I not only want students to participate in social annotation activities; I also want to run the additional experiment of introducing data feedback to students. Usually, instructors have not been involved in that kind of implementation; they are involved in designing the tool, the report or feedback, but they have not led the role of introducing that data to the class or the students. In my pilot study, which involved general discussion activities rather than social annotation, students really wanted to see instructors emphasize why we are using social annotation, or why we are using discussion posts, and how these activities really impact their grades or future careers. So I think it's really important for us to help instructors continuously participate and intervene over the course of the semester, by explaining the purpose and importance of using social annotation tools, and in my case, of using data feedback for social annotation as well. Yeah, and I loved that Yanji and Jinran both emphasized in their presentations this co-design idea with
instructors, because I'm thinking about a couple of things. One, coming from what we know about learning and the learning sciences: context matters, and the instructor is typically the one who understands the context best. When we're trying to design or understand what's going on, obviously students, or whoever is in that context, are going to understand it better than an outside researcher typically does. There are other approaches, such as research-practice partnerships or implementation research, that involve instructors' voices more in the research, but I think it is so important to make sure we're getting instructors' voices in, because a lot of the time they're left out. We focus so much on students, because we want to make sure students are learning, but instructors can give us so much insight into what is going on in the classroom, including little things we might not think about that are really important for that particular context. I'm also probably quite biased: I come from a practitioner background, I taught high school for seven years before starting my PhD, and I really love teaching, so there's a personal element to it as well. The other thing I want to mention about my work, and I touched on this earlier, is the instructor inquiry team, the focus groups we held, which I intentionally designed to mimic a professional learning community with the instructors. I didn't want it to be me coming in, asking questions, and having them just answer; we started each session with questions like: what's going on with annotations in your class? What are some challenges with annotations that you want to talk through with the group? Because we had other instructors
who had been using annotations, I wanted that space to also serve as a learning community for them. There were two specific instances that made me feel, okay, I don't know if I accomplished that 100%, but at least some of the instructors saw it as a great opportunity. One of them said: a lot of my students are just doing one-off annotations, and there aren't many threaded conversations happening; what are other people doing to foster threaded conversations? That was also one of the potential research questions I had wanted them to discuss, but instead of me posing it, they already had the question, and I let them lead that discussion. In another instance, I said, here's the general topic we're going to talk about in this session, and the first thing one of the instructors said was: I'm so glad you posed this question, because I've never thought about this before, and I want to talk to everybody about it. Those were a couple of moments that made me realize that, for the research part, that's great, I want to make sure I'm getting good data for my dissertation, but I also want to make sure this is useful to the instructors. Getting their perspective and having them talk through some of this brings really rich data, because they're invested in it now; they want to have these conversations, and because they're invested, it makes for much more interesting data, and it also benefits them. I wanted to make sure that part of this work is benefiting these instructors. Thank you. That's a really good strategy, thinking about the community, Chris. I really love it, and we'll try to find a way to incorporate it into our future work
as well. I'm looking at the time, and I want to invite the audience, if you have any questions for the panel, to ask them in the chat or raise your hand directly. I see some really interesting conversation going on in the chat thread, which is great. Since there are no more questions, I'm going to take advantage of being a speaker on the panel, because I've been thinking about this question myself and want to hear from the panelists. We use the term social annotation quite often in different contexts, right? As a researcher from the learning sciences, I emphasize the social, cultural, and political dimensions of learning alongside the cognitive ones; learning is a multi-dimensional phenomenon. So I want to ask all the panelists: what do you think about when you hear or use the term social annotation? What do you mean by social, or how would you interpret the word social in social annotation? I'm really curious about your thoughts on this. I can go first. When I think about the term social, and when I use the term social annotation, I also wonder what I mean by social, because social annotation is widely described in other ways too, such as online collaborative annotation or online collaborative discussion of text; there are many different framings. My answer is that it can be divided into three factors, according to what the collaboration is mediated by. The first is learner-to-learner: of course, being social means interaction between peers, between the learners. But social annotation is also an individual activity at some level, as an audience member said before, so I think it's also about learner-to-content interaction, so
where learners can interact with the concepts and materials themselves, the text or the topics, in a more conceptual way. That's the second factor. The third is learner-to-instructor: I think it depends on how much instructors want to be involved in the annotation, because some instructors do not want to participate in annotation that much. But if they do, then getting scaffolding from instructors, to better understand the concepts in the materials or to get exposed to new ideas from the instructors, is another thing I see as social.

Thanks a lot, Yanji. Jinran, you can go next.

Okay, so I really like how Yanji emphasized the connections between students and teachers in a context, and I have really similar ideas. First, "social" of course does not just mean students talking to each other; we're discussing things all the time, but in the social annotation context, or the learning context, the social needs to support learning. It's not just two people working socially; their minds, their cognition associated with the learning context, are communicating socially. Sometimes I like to call it social-cognitive learning, which means we expect there will be cognitive achievement, like understanding of the topics or application of the knowledge, facilitated by the social activity. Also, in my understanding of social learning or social annotation, students don't just learn for themselves; they also take responsibility to contribute to the community by sharing their voices and helping others understand the topics. But as I said, students are not natural collaborators, and they're not natural social-cognitive beings either, so when we talk about social annotation we need to keep in mind that there needs to be a sophisticated design, or support, to help them achieve that goal. In conclusion, I think "social" is something like an ecosystem: in social annotation, all
students and teachers take their shared responsibility to contribute to the community's knowledge co-construction. That's my understanding of "social."

Yeah, I don't know that I would add much more; I thought those were both really lovely. I really like this idea that we don't want to lose the learning aspect in the work that we're doing in particular. And I like the idea of thinking about the co-construction of knowledge in this space, which comes from the intertextual, intersubjective connections we're making as we annotate. So I don't think I have anything else to add.

Yeah, I concur; those are really great points, adding a lot of richness to how we think about social annotation. And when I think about the future work in front of us as a community, if I can use that term to describe our social annotation research community, there's a lot of work to be done. I really appreciate you sharing your work, and I hope to keep learning as your projects move forward. With that, we are five minutes over, and I want to thank all the panelists for your contributions, and thank the attendees for participating in this panel. Is there anything you want to let us know, any housekeeping things to share with the audience?

Hold on, I'm going to make it so that I'm on the screen... there we go, I have the power. No, that was great, thank you so much. I just want to encourage people to follow the incredible work you all are doing, and again, thanks to everyone who's here. We should probably wrap this up so that people can get to the next sessions. Check the schedule, hang out in the lounge, meet people, talk to people, and don't forget about the Annotate This party later today; it's the penultimate one, and it's going to be great. I've got something kind of big planned for that, but that's another topic. Thank
you again, and yeah, we'll wrap this up.

All right, sounds good. Take care, everyone; it's so great to see you.