Hi, this is Kenneth Gleason from HighWire. We've got a very exciting webinar going today. We're going to look at how the American Diabetes Association has chosen to work with Hypothesis, and that is to provide more information throughout the year for their clinical practice guidelines. We're also going to do a tour of how Hypothesis works and how you can expand the information on the page through annotations, as well as how integrating with Hypothesis lets you have annotations on your own journal and article pages. We'll take a look at annotations in action and see how they can surface more information and allow groups to interact with each other on articles. And then we're going to have a Q&A session so that we can answer your questions about how you can leverage Hypothesis on your journals. So I'm happy to kick things off.

Hi, I'm Heather Norton, the Director of Scholarly Journals for the American Diabetes Association. Can everyone hear me? ADA publishes four journals: Diabetes, Diabetes Care, Clinical Diabetes, and Diabetes Spectrum. ADA's clinical research journal, Diabetes Care, publishes annual clinical practice guidelines as a supplemental issue to the journal, called the Standards of Medical Care in Diabetes. It's a comprehensive resource on all aspects of care for treating patients with diabetes. The Standards of Care supplement is funded by ADA's general revenue; it does not receive industry support, and its contents are reviewed and approved by ADA's Professional Practice Committee, known as the PPC. The PPC is a multidisciplinary expert committee comprised of physicians, diabetes educators, registered dietitians, and others who have expertise in a range of areas of diabetes care. With the quickly growing amount of information about diabetes available online, ADA wanted the ability to share important updates to the Standards of Care more frequently than traditional publication models allow.
We wanted to be able to include new information as the evidence becomes available, for instance, if a new diabetes drug is approved by the FDA, which just happened this spring. Change the slide, please. To that end, beginning with the 2018 Standards of Medical Care, ADA began including this type of information in the form of annotations, using Hypothesis installed on our HighWire-hosted journal site. Readers of the online version will see yellow highlighted text. Clicking on it opens the annotation pane to the right. As each annotation is viewed, the corresponding text turns from yellow to blue. Each ADA annotation contains a PPC approval date, a publication date, a suggested citation, and options for sharing. There's a button for downloading a PDF containing the annotations, and users may also create private notes. The dropdown box at the top of the annotation pane, change the slide, please, links out to a running list of ADA annotations. Here, users can access the annotations in context through links, search annotations using tags, and create private groups for discussion. ADA hasn't actually made use of tags yet, but perhaps we will when we publish more annotations. Next slide.

The process for updating the Standards of Medical Care with annotations is pretty straightforward. ADA receives requests for updates through a web form on the ADA professional website, from readers, staff, and board members. Those requests are reviewed by the PPC, and those receiving a greater than 50% majority are sent to ADA publication staff for inclusion in the Standards of Care in the form of an annotation. The process for adding the annotation is really simple: we log in with our publisher credentials, highlight the text, select the annotate button, and add the note. ADA's communications department simultaneously issues a press release and social media posts describing the update. Next slide. There are some important points to note about this initiative.
ADA takes on a huge responsibility when publishing information used to treat patients. The guidelines in the Standards of Care have been researched and vetted by experts in various fields of diabetes research and practice. Hypothesis, meanwhile, is a nonprofit organization with a mission to bring annotation to all content on the web. Viewers of the Standards of Care may therefore sign up for or log in to Hypothesis and add public annotations on the public layer, which are viewable by others signed up for or logged in to Hypothesis. There is no way to hide the sign up or login links on publisher-selected issues or articles, even those containing official ADA recommendations. There's also no way to verify the identities or qualifications of public commenters, or to moderate annotations added by the public to the public layer, although the public does have the ability to flag potentially unwanted annotations for ADA to view, edit, or delete. This continues to pose a challenge for ADA, because public commenters could have a vested interest in diabetes-related products or potentially contribute false information. Also noteworthy: ADA has heavily promoted the currently published official ADA annotations, and that's resulted in a very large boost in traffic, which we've been very happy about. However, we've not actually received any public comments to date. Next slide. As so well said by ADA's Chief Scientific, Medical and Mission Officer, this initiative represents a paradigm shift that directly reflects ADA's mission to improve the lives of all people living with diabetes. Thank you.

Thanks, Heather. I'm Heather Staines, the Director of Partnerships for Hypothesis, and I'll tell you a little bit about the underlying technology and how it's being used. If you're not familiar with Hypothesis, we are a mission-driven nonprofit and an open-source technology company. Some of our funders are represented here at the bottom of the screen.
We're particularly dedicated to the notion that we are supported by the community, and it's to that community that we are responsible, rather than to shareholders. You won't wake up tomorrow and find that we have been acquired. Hypothesis accounts are free for researchers and anyone who comes to our website, and they can be used on content across the web. Hypothesis worked with the W3C to have annotation approved as a web standard, and we're excited to say that that happened about 18 months ago. What does that mean? Well, in future versions of browsers, just like you indicate what your preferred search engine is, you'll be able to tell your browser which annotation client you use. And if multiple annotation clients adhere to the standard, we should be able to have annotations that interact with each other even though we might be using different clients, in the same way that we can email each other today even though we use different email tools. We're at 3.4 million annotations and counting, which is very exciting. We find that about 20% of annotations are made completely public and about 20% are made completely private. The majority, therefore, happen in collaboration groups: researchers working together, students annotating in the context of a class, other types of workflow teams, et cetera.

A little bit about how Hypothesis works. Some people, when they hear annotation, immediately think of comments, and annotations are actually quite different from comments, so I like to clear up some of those misconceptions from the start. With comments, each comment is stranded on the page with the article to which it was a response. Annotations, by contrast, are available through a variety of mechanisms that I'll show you, so they're usable as a corpus of data outside the tool itself, through discovery via Google and Crossref Event Data. Annotations are about collaboration. I mentioned that about 20% of our annotations are completely public.
But even private annotations and group annotations are an indication of engagement, and we've seen more than 22,000 collaboration groups created. Annotations are also persistent: each annotation has a unique, persistent web address, which makes things really interesting when we think about linked data. And annotations are portable: public annotations, as well as organizational annotations, are available through the API. We think of annotation as layers on the version of record, so one document may have numerous conversations happening on it all at the same time. Some annotations might be open to the public, some might be for classroom purposes in a private group, some may be from experts invited by the publisher, or even notes that the publisher has added pointing to additional items. All of those things are happening at the same time.

On Hypothesis architecture: you can annotate with a client on any web page, whether PDF, HTML, EPUB, or data. The general rule is that if you can view it in the browser, you can annotate it. The piece of Hypothesis that appears on the side of the page is the client, and those annotations are saved to the server and re-anchored when the content is opened again. We think it's really critical that conversations be enabled on the version of record, which is a message that publishers have been giving us: as publishers, we're looking for mechanisms to keep readers on the content, on the version of record, and to keep them coming back. This is an engagement metric, so even if those annotations are not in the public layer, they are providing a valuable service to researchers who are using them either on an individual basis or as part of groups. We wanted to enable publishers to control their own publisher groups and enable both publisher branding and moderation, as Heather mentioned. And we wanted to actually break down these content silos and increase discovery through the activity pages.
I'll show you an example of one of those. Publishers are completely in control over who can see the annotations and who can create them, and we do have robust APIs for repurposing the data if you're interested in text and data mining, or perhaps in displaying annotations on other parts of your site. So what's it like to integrate Hypothesis? Well, publisher groups akin to the ADA group, as I mentioned, are branded, and they can be configured to meet specific publisher needs. You can have annotations that are created by a limited group but publicly visible, akin to the ADA example. Or you can have annotation layers that anyone can join: world readable and world writable. Publishers can moderate, and you can control visibility options on your page.

So here's an example of an individual page. This is my page, so I can see all of the annotations that I've made. Every individual user and every group gets a page. You can also browse and discover all of the publicly visible annotations through this type of mechanism. You can see on the right that I utilize tags quite heavily. Annotations can be filtered by tag, by user, and by group; it's a great way to interact with them. For the group configurations, as I mentioned before, we've got a couple of different options that are designed to meet your particular annotation needs, and you can even have a variety of each type of group. Many publishers are interested in enabling one open group for general discussion and then one or more dedicated layers, perhaps one limited to authors and one limited to editors or invited commentary. And we can make certain configurations to fit in better with your UI. As for implementation, it is an assisted integration: we work with you and with HighWire to precisely meet your needs. We utilize document-based pricing, which depends upon how broadly you want to deploy annotation across your site.
We use the number of documents you add per year as a proxy for publisher size, but you can add annotation back to volume one if you want; that's included. Included as well are branded and moderated publisher groups and that UI customization that I mentioned. And one of the things that we think is particularly important: installing the tech is just part of the equation. To ensure a successful rollout, we provide training and outreach as well as annotation best practices to make sure that you're achieving your annotation goals. We provide full customer support as well as open-source maintenance; it is a community project, after all. In terms of analytics, robust analytics are provided. If you have existing comments on your site, we can port them over. We can also tell you of public and private annotations that have already been made on your site, if you're interested. And there's an account called SciBot which, for reproducibility purposes, creates annotations for research resource identifiers; I've got a slide coming up on that. If you have any of those annotations on your site, we can let you know that as well. After you launch, you'll continue to see annotations that are made in the public channel and privately, as well as anything that is part of your dedicated publisher group. And again, we differentiate between things that are public and things that are private, as well as any automated annotations. You can see the number of monthly annotations and cumulative annotations, as well as the number of active annotators, annotators active on a monthly basis, and total registrants; it's very robust.

So how is it being used? Heather gave an example of a restricted group, and we have a number of publishers experimenting with open groups. Here's a screenshot from an eLife page. eLife was a publisher we worked with at the beginning of development to enable these publisher-specific features to be rolled out more broadly.
You can see that the annotations match the font on the publisher site, and they've made some slight changes to the client to fit in better. You can see they're branded there at the top, and if you look closely on the annotation pane, you can see their flag for moderation as well. In the center of the page there's a little button that shows the number of annotations that are actually on that page, even when the annotation pane is closed. One of the interesting use cases that we've seen from eLife is editorial folks using annotations for updates and corrections to the content. They can go in and add information, as well as links and even citations. Heather talked about the restricted group, so I'm not going to go into detail on that, but I wanted these slides to be in here for anyone who's reviewing them later. Another example of a restricted group is a project that we're doing around citation in the social sciences. It's called Annotation for Transparent Inquiry, and it's designed to bring to citations the kind of transparency that data and information have gained in the hard sciences. It enables the authors of social science works to add additional context around citations. So again, it's world readable for anyone who comes to the page, but the annotations can only be created by the authors of the work. Much like the ADA example, there are lots of things that can be done with these annotations, particularly around the citations. They are treated as first-class research objects: they have source excerpts, translations, links back to the original data, as well as full citations. Our annotations are also being used in peer review. This is a screenshot from an integration done by eJournalPress. You can see it's the usual Hypothesis annotation client there, on top of the PDF.
I know it's tiny, but the different colored flags and panes that you see there are tags that have been integrated to indicate certain types of reviewer information, like major or minor revision or a problem with a figure. I wanted to give you a couple of examples of how annotation is being used in publisher workflows. In late 2017, Springer Publishing was getting ready to launch their new website. They were loading a very large number of books onto that new website, and they wanted to do QA on top of the XML on that site. So they created a group, they invited their proofreaders and typesetters together, and in two months they made more than 10,000 annotations on top of that content. We're working with them now to improve the flow for the next batch of titles that they add. This is just an example from the private group page, which shows that Rachel Chappelle here has made a note to make a change on the page. You can see along the right other members of the group and some of the tags that they were using. And here's an example of what it looks like on top of their website.

Another example, which I found particularly interesting, was the American Society for Microbiology. They were in the midst of migrating 15 journals within HighWire to JCore, and they knew that there were going to be a number of informational and advance pages that were going to change as a result. So all of those pages were added to their internal staging environment, they created a group of editorial folks, and they annotated those pages. They particularly liked the fact that you can sort annotations from oldest to newest, so while they were working on the project, they could see exactly what was new. They added information and links to other resources, and for the next phase of the project, they're considering how they might use tags. This is just a quick screenshot, a little bit blurry, sorry, of their group page listing some of the pages that they ended up changing.
Just a couple of other quick examples. I mentioned the research resource identifiers, for reproducibility purposes. These are used in the event that a particular stem cell line or a particular reagent was used in the methods. There's a group out of the University of California, San Diego that created the SciBot account, which looks for these RRIDs that are typed into papers and displays the information on the page in the form of an annotation card along the side. It's a lot of great information. Each RRID gets its own tag, so if you find one that's particularly useful to you, you can click on it, and it will take you to an activity page where you can see other articles that have used that same RRID. Just a few other quick use cases. We worked this spring with ISMTE, who used annotation to review poster proposals for their annual meeting. They got together and created a group of their reviewers, and once the review was complete, the author was invited to join the group. I thought it was a pretty cool use case. Some other things that have been suggested to me in conversations with publishers: sales reps who are going out for campus visits, rather than lugging the heavy hardbound copy of a textbook with the little sticky notes for all of the examples that they want to show the faculty member, can now create digital examples using annotation, which they can demo for the professor but also send out through virtual office visits or webinars. Another thing that I thought was particularly interesting, and that I'd love to have someone try: some publishers do an annual best-article contest for their journals, and one of the ways that the staff or the invited judges might run that process is by creating a private group. That way everyone can see each other's annotations, and it might make the review process a little bit more straightforward. So again, two great use cases. And with that, I think we will take questions.
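A quick aside for readers following along technically: the activity pages and RRID tag lookups just described sit on top of the public Hypothesis search API at api.hypothes.is. Here's a minimal sketch of scripting such a tag lookup; the endpoint and its `tag` and `limit` parameters come from the public API documentation, while the helper names and the example RRID are just illustrative.

```python
import json
import urllib.request
from urllib.parse import urlencode

API_SEARCH = "https://api.hypothes.is/api/search"

def build_tag_query(tag, limit=20):
    """Build a search URL for public annotations carrying a given tag,
    e.g. an RRID tag like the ones SciBot adds."""
    return f"{API_SEARCH}?{urlencode({'tag': tag, 'limit': limit})}"

def search_by_tag(tag, limit=20):
    """Fetch matching annotations; the response JSON carries a 'total'
    count and a 'rows' list of annotation records."""
    with urllib.request.urlopen(build_tag_query(tag, limit)) as resp:
        return json.load(resp)["rows"]

# Usage (requires network access):
# for ann in search_by_tag("RRID:AB_2314866", limit=5):
#     print(ann["uri"], ann["tags"])
print(build_tag_query("RRID:AB_2314866"))
```

Each row includes the annotated page's URI, so a script like this can reproduce the "other articles that used this RRID" view from an activity page.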
Yes, and so I invite all the participants to feel free to use the chat or Q&A tools to ask their questions. We have one question already, and this might be something that Heather Norton wants to maybe even demonstrate, or maybe Heather Staines could do it. They're interested in how one actually accesses, and if you could show directly, how one can get to annotations on the ADA site, especially if there are annotations that reference any of the scientific information or sources behind some of the material on the ADA site. Is that possible? Yeah, actually the default is for the annotations to always be visible to any user. So upon arriving at the Standards of Care, you'll note highlighted text within the articles, and just clicking on that highlight will open up the annotation window to the right. Also, the title of each article that has annotations contains an annotation itself, to alert the reader upon landing on the article that there are one or more annotations below. Great. I'm waiting to see if other people have put in more questions. Don't be shy, folks. And I can show the group activity page that Heather mentioned. This has entries for all of the annotations that have been made across the ADA content. This is the article that we were on, but you can see the other parts of the issue that have been annotated. You can click back to see an annotation in context, or even share it out. It's also possible to share this entire page, so you can share the group by clicking on the clipboard. If you send someone this link, they'll come to this page and they'll see everything here; whereas if you want to share the annotated issue page, you can share that from here in the client as well. Another question, sort of a follow-up to the same one, is from Tufankar: are you planning to release a corpora for these annotations? I'm not sure if that would be a question better for Heather Norton. I'm not sure. I don't know.
I'm not sure what that is. Tufankar, perhaps you could explain a little bit more what you mean by corpora in the chat. Meanwhile, Susan Wildner has a great question, probably again for Heather Norton: has ADA received any feedback from readers about the annotated content, or the updated content that's come through via the annotations? All the feedback we've received, which hasn't been a lot, has been very positive. Most of the feedback we've actually received has been from ADA board members and staff. They're probably keeping a closer eye on it than some other folks. True. And so, Susan follows up: the board's been happy with it and sees it as successful? Absolutely. As a means for relaying the important information that they wish to publish immediately, they think it's very successful. But there are some concerns about the potential for public comments, as I mentioned. Great. And just as a little bit more follow-up on that corpora question, Tufankar says that by a corpora, he means a collection of scholarly articles with their annotations, identifying the source of scientific claims. I mean, that's your whole website, right? Yeah, I think that's kind of what we have. Is it not? I'm sorry, maybe I don't understand the question. Yeah, maybe they're looking for a printed version or something. Well, I should mention that we are publishing in Diabetes Care, not a full-length article, but more of sort of an announcement that annotations have been made to the Standards of Care, explaining what they mean. So, once we publish a few annotations online, we'll gather them together and publish them in the journal, almost like an in-case-you-missed-it type of page. Got it. And here's another question, from Helen Liedem, that might be better for Heather Staines.
So, in reference to the eJournalPress example for peer review, can you talk a little bit more about how Hypothesis annotation is being used in peer review generally? Sure. If you've ever participated in or received a peer review on your work, you know what it's like to get a long document that says, page five, paragraph two, line one: I disagree. And then you have to go off on a treasure hunt to find out what it was that you actually said there and what perhaps they were disagreeing with. So the ability for reviewers to put in their feedback and tie it to specific pieces of the text when relevant, and for the editors as well as the authors to be able to look at that in context, can streamline things tremendously. For the project with eJournalPress, as we are open source, they integrated Hypothesis into their manuscript submission system and made some customizations, which they could do, to fit it to their dashboard. They wanted the ability, for example, to obscure the identity of reviewers, and for those reviewers to indicate if they wanted something to be kept confidential and not shared with the author. So, some things that were a little bit unique to the peer review process. That said, we do have a number of journals who are interested in experimenting with Hypothesis for open peer review. Actually, just a few weeks ago, we launched on a journal in the humanities called Memorations, and they use an open review process. They created restricted groups, similar to what we see here on the ADA example, where the peer reviewers, the author, and the editor were all part of the group. Again, it's open peer review, so they didn't need or want to obscure anyone's identity. We're really eager to see how that plays out, and we're in conversations with all of the other submission systems, including BenchPress, to see what type of an integration might make sense. So, if you're interested in that, we'd love to hear.
Great, and I was just gonna say, Heather, did you also want to talk about the Center for Open Science example? Sure. I don't have an example in my slides on preprints; I need to add that. I'm sharing a link to the recent blog post about it. Many of you probably know more about preprints than I do. I'm originally a historian, and historians are kind of slow to these types of things, but submitting your paper to a preprint server for feedback prior to publication is just a natural use case for collaboration through annotation, whether it's public annotations or some type of a preprint journal club taking place in a private group. So, we were really excited to get a group of preprint servers together earlier this year, including the Center for Open Science, which has a huge range of servers, including in the social sciences and humanities, and for that capability to go live on many of those preprint servers just earlier this summer. We're excited to see how that plays out. When you think about how a preprint version might flow into peer review, there's interest in whether annotations made on a preprint can be valuable to the editor or the reviewers who are considering the manuscript for publication. And there's also interest in whether the final published version of the article might want to point back to the DOI for the preprint, so that readers of the published version might be aware of some of the conversations that took place on an earlier version. So, it's a great sort of circular example of how annotations can be part of the workflow. That's great. And I've shared in the chat links to a couple of posts on the Center for Open Science's Open Science Framework preprints annotation work that have been just recently posted.
Another question on a different subject, from Tufankar, and I think, Heather Staines, you and I could talk about this one and maybe delve a little bit into SciBot as an example of it. They're asking: could we be using Hypothesis annotation as a means of incorporating artificial intelligence into the peer review system, a kind of decision support system for peer review? That's probably a really imaginative stretch, but I was thinking that the SciBot example, of using a third-party tool to have machines produce annotations that add value to a document, is an interesting example that leads in that direction. Yeah, the examples that we focused on today are mainly human-generated annotation, and SciBot is an example of sort of machine-generated, human-curated annotation. Europe PubMed Central's platform has entity annotation, which is really exciting if you want to look at it; it will identify certain species or elements and the like. And so it's interesting to think about how the two might be combined. We'd be here all day if we talked about whether any kind of automated entity would replace peer review, but I think there may be ways that the work of the humans who are involved in peer review might be streamlined: relations between articles, connections between entities, that type of thing. Yeah, there's certainly been a lot of talk recently about how the process of open peer review that Heather was describing before might be something that could really advance the practice, maybe even more than bringing machines into the equation. We don't have any more questions queued up. There's one thing, Nate, if I could just elaborate a little bit on when Heather mentioned the group layer and the public layer, since we do hear from publishers who have concerns about this. With the initial collaboration with eLife to sort of build out the publisher tools, the idea was that the public layer would be suppressed for users who are not logged in.
So I would imagine, and I don't know if you have data on this, Heather, that the majority of the folks who read the article and look at these annotations are probably not Hypothesis users at this point. So they see the American Diabetes Association layer by default, and you cannot make an annotation in this layer unless you are part of the ADA account or someone who's been specifically added by ADA. I'm actually logged in here, so rather than seeing sign up or log in, I see the regular Hypothesis tools, but I too see the authoritative publisher layer by default when I come to this page. The difference being, I can actually switch into the public layer if I want to make an annotation and the like. ADA is responsible for moderating annotations that are made in their layer. I would imagine that's not that relevant, because they're made by the team, but if you were to add other annotators, that might come into play. Hypothesis actually moderates the public layer, so if there are, across your content, any annotations made that perhaps you feel violate community guidelines, those can be flagged, and they're reviewed by our support team. Heather mentioned in her slide that there haven't been public annotations made across ADA content. That's something that we've monitored for them since the group went live, but it is technically possible to make annotations there if you want to. Thanks for clearing that up. Yeah, and one thing to remember is the wide use of Hypothesis by researchers and by students in classes: even if we were somehow able to disable the public layer, that would keep a lot of people from using the content as part of their work, for which they depend on private notes and collaboration groups in the public channel. So it is quite an interesting discussion, and we're happy to have it with anyone who might have concerns.
Even in the Hypothesis public channel, for as long as it's been available, we've only seen a handful of instances where an annotation has had to be removed. My best reference when it comes to open groups and moderation is the fellow who does the moderation for the eLife group, and he said he's happy to be a reference for anyone who's interested in how much time it takes him to review those. In addition to looking at the annotations that may have been flagged, he also uses the Hypothesis API to review every annotation that's made across the platform. And for publishers who have open groups, those annotations populate to that activity page immediately, so they can be reviewed there. There shouldn't be any surprises for you on your site. Great, well, we don't have any more questions lined up, so maybe we want to just wrap up and say goodbye. Ken? Sorry, I was muted. I thank everyone for joining; it's been great to walk through a number of use cases that Hypothesis can provide for your journals and articles, and there are lots of varieties of ways that it can be used. I thank Heather Norton for presenting an exciting use case for us here at HighWire, and Heather Staines for walking through all the examples that are available. Both Heathers and I are available for discussions around Hypothesis and annotations, so feel free to reach out to any of us with questions you may have. We hope to hear from you. Thank you again for joining.
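A closing technical footnote on the moderation workflow described above: reviewing a group's annotations newest-first can be scripted against the same public Hypothesis search API. This is a minimal sketch, not the eLife moderator's actual tooling; the `group`, `sort`, `order`, and `limit` parameters are documented API parameters, the helper names are mine, and an API token is only needed for non-public groups.

```python
import json
import urllib.request
from urllib.parse import urlencode

def build_review_request(group_id, limit=25, api_token=None):
    """Build a request for a group's annotations, newest first,
    as a moderator might for a periodic review pass."""
    query = urlencode({
        "group": group_id,   # "__world__" is the public layer's group id
        "sort": "created",   # order by creation time...
        "order": "desc",     # ...newest first
        "limit": limit,
    })
    req = urllib.request.Request(f"https://api.hypothes.is/api/search?{query}")
    if api_token:  # needed only for private or restricted groups
        req.add_header("Authorization", f"Bearer {api_token}")
    return req

def newest_annotations(group_id, limit=25, api_token=None):
    """Fetch and decode the annotation rows for the group."""
    request = build_review_request(group_id, limit, api_token)
    with urllib.request.urlopen(request) as resp:
        return json.load(resp)["rows"]

# Usage (requires network access):
# for ann in newest_annotations("__world__", limit=5):
#     print(ann["created"], ann["user"], ann["uri"])
```

Polling like this is what makes the "no surprises on your site" promise practical: a moderator can scan every new annotation in a layer without waiting for reader flags.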