Hi everyone. I want to welcome you, and we'll just go through these slides really quickly so that we can get on to our webinar. Remember, Connecting to Collections Care has a lot of different aspects. One is the online community forum, and if you are a registered member (which doesn't cost anything, it just takes a few minutes) you can post questions and get answers quickly on the online forum, so take advantage of that. And you can see all of the past webinars by looking in the archives. Also, in the link library, you can put in a topic and it will give you not only resources but also links to the webinars too. And you can always contact me. This is my email address. I'm happy to hear from you if you have problems, or you want to say something nice, or say something bad. It's okay. You can contact me there. We also have a community website, and in addition to that we have Facebook and Twitter, and we also have a listserv. Coming up in August, we're going to have a collections care training for small museums. And I want to point out that on the 31st, we're going to do something different that we usually don't do: a joint webinar with the New England Museum Association in their program called Lunch with NEMA. So that will be a different time, at lunchtime Eastern time. If you're like me in Western time, you'll have to start at 10, so pay attention to that. And here we go. I'm going to introduce Meghan Ferriter, and we're going to have the new webinar. So enjoy it, and I'll catch any questions so that they get answered; just put them in the chat box. You should also know that on the website, when the ad slide on the homepage disappears, that means the webinar recording has been loaded into the archives. So that's the key. So Meghan, go ahead. Hi, everyone. I hope everyone can hear me. I can see my microphone is blinking, so hopefully we're good to go with that.
I just wanted to say thank you very much to Susan and Mike for inviting me to share my experiences with the Smithsonian Transcription Center in engagement and crowdsourcing, and also for their support in preparing for today. There's my face in the lower right-hand corner. I wanted to share a couple of things that motivate me in my role as project coordinator for the Transcription Center, which includes a combination of working with staff and preparing collections, though I'm not a collection manager myself. I also do community management. And the volunteers, like this amazing group from our event in March, and the reminders on my desk keep me on track: they remind me to be grateful, encourage me to think and be open to learning, and to think through particular ways of approaching the public as recommended by the Smithsonian. Today I'm going to talk about crowdsourcing and engagement in transcription: the ways that I selected the approaches that I've used so far, the ways I continue to learn, the ways that I apply those approaches, some of the results, and some final tips. And then hopefully we'll also have plenty of time for questions, and I can speak to more specific examples when answering some of those. So what exactly is crowdsourcing? The definition of crowdsourcing that I like to use is by Dr. Mia Ridge, who is a digital curator at the British Library. I particularly like this definition because I feel it identifies collaboration and rewarding experiences for everyone. It acknowledges others in a sort of relationship in crowdsourcing, aims toward exchange, and supports notions of respect. So: a form of engagement with cultural heritage that contributes toward a shared, significant goal or research area by asking the public to undertake tasks that cannot be done automatically, but ultimately in a space where those tasks or goals are rewarding for everyone. What else is crowdsourcing?
So crowdsourcing, as I approach it within our model at the Smithsonian Transcription Center, is also an opportunity to bring in further elements of exchange and collaboration. I think these things are underlined by multidirectional benefits; they're not merely bounded by the tasks at hand. So we do more than just focus on the goal. We also acknowledge the work of staff. We create data that requires careful management. These are not activities that replace staff, but rather build opportunities to connect to work that's already occurring within the institution. And it also pushes us a little bit further: we are encouraged to be flexible and creative, and to iterate with the types of activities we're doing with our collections, enriching our collections. I'll explain a little bit more about that as we move forward. Some examples of crowdsourcing in the cultural heritage space and library collections include projects such as Flickr tagging with the Biodiversity Heritage Library, geo-referencing menus and maps with the New York Public Library and the British Library, or even uploading and tagging newspapers with projects like the U.S. Holocaust Memorial Museum's History Unfolded project. And just yesterday, a brand new crowdsourcing project launched at UC Davis Library, which is tagging wine labels. I'm speaking a little bit more specifically about enriching collections data in this space, but there are also many other examples of crowdsourcing, and I have some references and resources where you'd be able to look at other examples. I'd also welcome any questions that you have about crowdsourcing. So when thinking about preparing a crowdsourcing project, and the kinds of engagement that you would want to undertake in one, one of the first useful things is thinking about the need at hand and then defining the goals.
With the Smithsonian Transcription Center, we knew that we had a lot of digitization occurring. Collections were being made available and access was being created to these collections, but each of those collections needed, or could benefit from, additional enrichment. This was activity beyond the scope of staff work, but a clear need: folder-level description, creating metadata, and generally enriching collections for search and discoverability. So thinking through the needs of discovery within those digitized collections, enriching those item-level records, in some cases fleshing out those records, and also unlocking the treasures, creating greater access to Smithsonian collections, the primary or preliminary goal became creating indexed, searchable text. We wanted to create text that would support more robust and specific search results and also allow greater flexibility to match the rise of digitization within the institution. When I came on board in January 2014, the Transcription Center had been launched in June 2013, and some of the primary collections considered in the beginning, and included within the Transcription Center, were items such as diaries and field notes, manuscripts, and correspondence, where the digitized item was cataloged at the item level, but creating transcribed text would allow those items to become even more searchable. So you would be able to find the exact page on which something happened, rather than merely finding an item and having to read through it. Over time, these goals, our initial goal of creating indexed searchable text, plus supporting goals of showcasing collections that may be heavily requested for research but are not displayed in other ways, and creating metadata for new records, allowed us to move forward.
We also moved our objectives forward into engaging audiences, learning with audiences, and creating opportunities for learning. All of these goals inform both the flexible design of our Transcription Center site and the engagement approaches. So, how has transcription worked with the Transcription Center? The Smithsonian Transcription Center is a website, a program of engagement, and a service for Smithsonian museums, archives, libraries, and galleries here within the institution: that cohort of 19 galleries, libraries, archives, and museums, nine research centers, and the Zoo. Currently we have 16 of those organizations represented within the Transcription Center. Our Transcription Center works within a peer review process, so anybody can transcribe. We've made the barrier to entry very low on purpose, but we wanted to make sure that there was a mechanism in place for approving those transcriptions. So we have a review step, and volunteers must register to review. And then finally, Smithsonian staff make a final pass of approval over those items. One of the additional design components for creating the searchable text is that we wanted to make sure the text is indexed quickly and is useful. So we have made it possible to index that text very rapidly, within about an hour of a project being finished. And the text that's transcribed and associated with those items can always be improved. Pages don't end up being firmly locked; rather, if someone comes across a page that has an error in it, they can contact us and we can open the page for further transcription, or Smithsonian staff can improve upon those transcriptions as well. This is the process that we demonstrate visually on the site: transcribe, review, and approve. And frequently we see pages moving back and forth between steps one and two, being transcribed and pushed into review and then opened back up for transcription.
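The transcribe, review, and approve process described above can be sketched as a simple state machine. This is an illustrative model only, not the Transcription Center's actual implementation; the class and method names are invented for this example.

```python
# Illustrative sketch (not the Transcription Center's real code) of the
# page workflow: transcribe -> review -> approve, where a reviewer can
# send a page back to transcription, and staff can reopen approved pages.
from enum import Enum

class PageState(Enum):
    TRANSCRIBING = 1
    IN_REVIEW = 2
    APPROVED = 3

class Page:
    def __init__(self):
        self.state = PageState.TRANSCRIBING
        self.text = ""

    def submit_transcription(self, text):
        # A volunteer saves a transcription, pushing the page into review.
        assert self.state is PageState.TRANSCRIBING
        self.text = text
        self.state = PageState.IN_REVIEW

    def review(self, accurate: bool):
        # A registered reviewer either accepts the page or reopens it,
        # sending it back to step one for further transcription.
        assert self.state is PageState.IN_REVIEW
        self.state = PageState.APPROVED if accurate else PageState.TRANSCRIBING

    def reopen(self):
        # Pages are never firmly locked: if an error is found later,
        # the page can be reopened for further transcription.
        self.state = PageState.TRANSCRIBING

page = Page()
page.submit_transcription("Field notes, 12 June ...")
page.review(accurate=False)   # back and forth between steps one and two
page.submit_transcription("Field notes, 12 June, Winchester, VA")
page.review(accurate=True)
```

The back-and-forth between steps one and two that volunteers see on the site corresponds to the `review(accurate=False)` transition here.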
We frequently see the same question, and yes, anyone can transcribe; no login is required. But again, creating an account allows the volunteers to track their work, pursue particular projects, and make connections across projects. This is a top-level view, pulling back just a little bit, of the Transcription Center. The Transcription Center, again, serves multiple units, multiple archives and museums at the Smithsonian. The material types vary, and in general we need a few things that match up our transcribed text with existing Smithsonian systems. I won't spend a lot of time talking about this graphic, but I'm happy to answer specific questions; I just wanted to give an overview in case someone had a question about that. So for the Smithsonian Transcription Center, again, we knew that we had many different material types. We have diaries, manuscripts, correspondence. We have biodiversity collections where we're creating individual records, or fleshing out individual records. We have had photo albums, also ledgers and logbooks, and I'll show some examples of those later. Therefore, with all of these different material types, we know there may be different tasks, a combination of tasks depending on the material being addressed at hand. The main task, of course, is transcribing, but we also review. We do record creation, specifically adding information into fields, which can then populate records at the item level. We also do geolocation, and we'll soon be launching a new template that includes an opportunity to translate locations into geocoordinates. We communicate with peers and with staff. We also do some translation and interpretation; we don't translate full items, but there are some moments of interpretation occurring. And then there are also many, many moments of self-guided learning.
Those are some of the elements that I support specifically with engagement activities. If we know that these combinations of tasks can come together in various ways, that also means the engagement strategies need to be flexible, and the design of the site needs to be flexible as well. As the Smithsonian Institution Archives reminds us, every collection tells a story. So we want to find ways to connect the stories to these particular tasks and make it a rewarding and engaging space for everyone who's participating, whether they're coming to work on one page or they want to maintain activity with us and become a digital volunteer for many years. Furthermore, transcription projects finish very quickly. So some of the things that I do are finding ways to sustain engagement, or to make sure that every interaction is engaging in and of itself. Again, our main goals are to create transcribed text, and knowing what material types we have, there are many approaches and many ways to think through how engagement could occur within the Transcription Center and outside of it. We use a combination of communication built into the site, as well as instructions and inline help text, and then we also do a combination of communicating in social media spaces and direct campaign and communication emails. Along the way, these are some guiding ideas, guiding perspectives, or requirements in my mind of what engagement needs to be. That includes communicating in a collective way; maintaining enthusiasm, but also acknowledging the gravity and importance of collections; being genuinely curious both about participants and about the context of collections as we are transcribing them, and their contents as well; and offering empathy and support for volunteers who are learning how to use the system, or even volunteers who've been using the system for a long time and are encountering new challenges.
Being open to problem solving, and openly problem solving: inviting other voices or other ideas, whether that's staff members, participants within the Transcription Center, or even the wider cultural heritage space; taking feedback and listening carefully to people's opinions about the site, and then trying to collaboratively solve for it. And then finally, being authentic about the ways that we engage. So if I don't know the answer to something, I'm going to say I don't know the answer and try to find the solution. Again, I mentioned very briefly a moment ago some of the different tools, thinking through those requirements. These are the tools that I use with engagement. Some of these tools are used daily; some are used just once. For example, a welcome registration email occurs one time. But we want to make sure that that email makes it clear that we're welcoming someone's registration on the site, and that we want to know more about them, not in a box-ticking kind of way, but in a genuine way. How can we find opportunities and search for or surface new types of material to bring forward for transcription, or, in my case, cultivate relationships with members of staff who want to share more about collections? And if I know that we're getting a set of new volunteers who are really interested in transcribing artists' diaries, I can reach out to people at the Archives of American Art to see if they have collections in need of transcription that would benefit from that, and also whether they would be willing to share some behind-the-scenes insights about those items. Other things are tended to, as I said, in a weekly or daily manner. I answer feedback emails very regularly and try to make connections and help volunteers, whether they're having problems registering or they want further information; then I'm able to share pieces of information like that in social media.
I use Twitter on a daily basis, and usually Facebook and Tumblr as well, and I try to use Instagram several times a week. We also share videos, specifically from our Google Hangouts, but also other videos on YouTube. We don't have any vlogging or podcasts yet, but that would be a pretty interesting space to engage with right now. This communication and engagement activity is handled predominantly by me; representatives from our participating museums and archives will also engage with volunteers in the social media space, and there could be some future opportunities for even greater collaboration there. We also do, as a pretty regular part of our package or program of engagement, Google Hangouts, which give volunteers opportunities for behind-the-scenes access. So, let's see. These are some of the material types that I mentioned; thinking about material types as well as the task at hand influences and informs some of the engagement strategies. We have, again, field notes, diaries, manuscripts, correspondence. This is a view of our transcription page. We can see a combination of things happening on this page. One, people have transcribed it, but they also have given further information in this "notes on transcribing this page" field. We don't have discussion boards on the site, and we had a commenting tool, Disqus, on the site, which we have recently taken down. It was not being used poorly; it just was not being used frequently. I think this "notes on transcribing this page" space was fulfilling volunteers' needs, and we had some feedback about that. So we've taken that down for now, and we'll probably re-engage with some other opportunities for communicating. We also have projects where we're creating metadata, where we're fleshing out collection records. That includes our 44,000 bumblebees that we've transcribed, including this beautiful specimen from Winchester, Virginia.
We also have transcribed botanical specimen sheets and labels, and we work with the National Numismatic Collection, creating individual currency proof records, which will allow greater access and research within the National Numismatic Collection, which is part of the American History collections. And then we also have a combination of material types: logbooks, lists, and ledgers. With these material types, this is often an opportunity to bring out further context about the collections and the ways in which recordings and observations were being made. We had one great example with logbooks. This example in particular is from the Harvard-Smithsonian Center for Astrophysics: ledgers that were keeping track of glass plate images of the night sky. So we had the opportunity not only to describe them but also to hear from the DASCH team (Digital Access to a Sky Century @ Harvard) about this work, the ways that women were involved in capturing these observations and making computations, as well as the way that these historical items continue to inform scientific research today. We've also had the opportunity to transcribe Apollo stowage lists from the Air and Space Museum Archives. And again, although these types of materials sound very similar, the engagement approach can be modified or changed in relation to the collection at hand. So when we think about selecting different approaches, this graphic just demonstrates the many different ways in which approaches are implemented, sometimes simultaneously, sometimes sequentially, and always building and learning and being applied iteratively. One thing that we want to think about, even if we were starting at the beginning of a project, if we were designing a project, is that we would know our project goals or ideal goals and have envisioned a pathway forward.
Toward those goals, we probably know a little bit about our project design, and we might even be able to envision a mutually beneficial space within a crowdsourcing project. But we're much more likely to succeed if we also purposefully think through some of the reasons why people would participate or volunteer their time in a project such as ours. So we want to think through: why do people contribute their time, and why should we consider their needs in designing crowdsourcing projects and engagement approaches? For me, it's very important to address this question up front. It's important because these participants are sharing their time. We want to make sure that they feel valued, because they are valued, and that we don't become too closely focused on our own goals; we continue to always advocate for the participants who are joining us in transcribing. Some things that have come forward from our volunteers relate to their motivations in participating in this project. This little grid that I made represents the feedback our volunteers have given regarding their motivations, and this general framework is also seen in volunteering, in crowdsourcing spaces, and in citizen science spaces as well. The intrinsic box is the most darkly colored box because that's the primary motivation that our volunteers report, followed by altruistic motivations and also principlism. With intrinsic motivations, our volunteers report having and pursuing personal goals and interests through the collection. Rather than already being subject matter experts, they want to learn. They want to explore collections. They take delight and pleasure in looking at collections and also in contributing. They feel that they're learning and having the opportunity to expand skills, whether that's subject matter skills, or perhaps increasing digital literacy, or finding community or communication as part of their participation.
They also report altruistic motivations: the goal of increasing the welfare of others, or contributing to the greater good, through their activity. Principlism is also reported; sometimes that's the goal of upholding principles dear to one's heart. So some of our volunteers are very interested in making sure that collections are accessible, both pure access as well as increasing the legibility and readability of items, and connecting to wider goals of open access and moving and connecting collections across many spaces. Our volunteers also report, through their various means of communicating, these sets of needs. This is a set of needs that I've identified over the last two years or so from the ways that volunteers indicate they have needs that can be met by an engagement approach. Some of those things include requesting communication and updates, and specific types of support, both technical support as well as shaping support, where their activity is being guided; they can make a decision, but they might want input from me as program coordinator. They also report being more motivated when they have a clear mission or objective that they can pursue. They may also need a reason to conform: if they are offered a set of guidelines, that allows them to tack onto something and follow that set of rules. They also appreciate guidance as well as freedom: freedom to explore and to not be roped into one particular project. In the Transcription Center's design, it's very easy to contribute to an individual page and then change to a different project. That's not always the design configuration of a crowdsourcing space; other crowdsourcing spaces may, because of their specific goals, want volunteers or participants to stay very focused on certain sets of material.
Our design tends to facilitate a little bit of a choose-your-own-adventure kind of experience. We also know that volunteers need time to grow and time to learn within their experiences of participation. Those needs also translate into certain sets of behaviors or activities. When our volunteers are in action, they are operating on their own schedule. A difference between digital volunteers and in-person volunteers is that a lot of this activity is occurring on their own schedule. This is an opportunity to contribute, to give their time, and to learn, but it's usually around their own schedule rather than, say, a building's operating hours. They also frequently are ready for a challenge; they enjoy some of our engagement activities, which I'll describe in a few moments. They like to move between projects. They are very active in the middle of the week; Tuesdays and Wednesdays seem to be our busiest days, and not usually the weekend. They are remote. They might be people who, if they lived a little bit closer, could contribute during those hours of building operations, but they participate remotely because they are actually remote. And then they are also very curious and want to help. When I say they're very curious, that means not only are they contributing information by transcribing items, but they frequently surface connections between materials. They'll ask further questions about individuals or events that occur within the materials, and that's an opportunity for us to engage with staff members to find further answers. And they frequently want to help. As you could see in that previous example, when I mentioned the "notes on transcribing this page" field, that portion of the page was originally designed for communicating with staff. What's happened over time is that volunteers contribute information and support one another; they'll give feedback to one another in that space.
They'll share where they found an information resource that might be useful in transcribing an item, which has been a real pleasure to see, the way that those volunteers have contributed and supported one another. So, just very briefly, our volunteer life cycle is mapped out in this graphic. You can see that about 33% of our volunteers work on one to four transcription activities, and close to 50% of our volunteers do 10 transcription activities or fewer. When I say transcription activity: there's a challenge with capturing the exact extent of activity, so for us that is anything from an individual word being transcribed to a save on a page. Any time a page is saved, we count that as a transcription activity. So if a good 50% of people are doing 10 transcription activities or fewer, that means we need to make sure that the experience is engaging from the very first moment, and also that we are creating reasons, positive motivations, for continuing, and matching that experience with other sets of expectations that a volunteer may have. So, with those sets of information in mind, the combination of motivations, everything we've talked about before, our project goals, our volunteer needs, and our volunteers' standard activities, my approach has been informed by all of that as well as this set of principles. I carry these ideas forward every time I'm engaging with volunteers, and at different times I may have to emphasize different points of these principles. I try to make sure that any time I'm engaging with volunteers, I'm giving care and careful listening, that I'm purposefully engaging, that I'm flexible in my communication and in the ways that I'm receiving information, and that I'm constantly iterating and testing that information. I won't say that I'm perfect.
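The activity metric described here (every page save counts as one "transcription activity") can be sketched as follows. The data and the volunteer names are invented for illustration; this is not the Transcription Center's actual reporting code, just a model of how such lifecycle buckets could be computed from a save log.

```python
# Hedged sketch: count each page save as one "transcription activity,"
# then bucket volunteers by total activity to examine the lifecycle
# distribution (e.g. the 1-4 and 10-or-fewer groups mentioned above).
from collections import Counter

# Hypothetical log of page-save events: one volunteer ID per save.
save_events = ["ana", "ben", "ana", "cho", "ana", "ben", "ana", "ana"]

activity_per_volunteer = Counter(save_events)

def bucket(count: int) -> str:
    # Buckets mirroring the talk's figures.
    if count <= 4:
        return "1-4 activities"
    if count <= 10:
        return "5-10 activities"
    return "11+ activities"

distribution = Counter(bucket(c) for c in activity_per_volunteer.values())
print(distribution)
```

With real save logs in place of `save_events`, the `distribution` counts would show how many volunteers fall into each lifecycle segment.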
I'm sure I've made mistakes in communicating with volunteers, but I hope that over time I can show, both through direct communication and through feedback from volunteers, that we are giving access to collections, and we make sure that our completed projects are publicly available to be downloaded for free by volunteers, so that they're able to use the work that they've created and share it with other individuals if they want to. I also hope that our volunteers know that we are having fun, but we are also very serious about our goals of improving access to collections. This little graphic again reinforces some of my approaches, which have to do with making sure that the experience is maybe a bit like hosting a party: keeping up with volunteers and kind of juggling a combination of motivations. We do a combination of helping and directing, pointing them to projects, and connecting them with other opportunities. Sometimes those opportunities may not be with our project. Maybe someone has an interest in a particular set of materials, and we've transcribed something of that content or collection and there's no longer an opportunity to transcribe it; so we connect them with another project, mostly because I think that if these types of projects improve and support and respect volunteers, then it raises everyone together at the same time. I also clearly converse with volunteers, again in multiple spaces, whether that's direct email communication, social media spaces, or, broadly speaking, campaign communication emails. I also do a little bit of governing, making a kind of decision or getting feedback and information from staff about how they might want an object transcribed. I also try to guide volunteers by modeling communication approaches, and then also guard them from unpleasant, or maybe thoughtless, experiences.
So I'm reminding everyone that everyone takes a little bit of time to learn on the site, that there are always opportunities to improve, and that two individuals may approach a page, see an item, and transcribe it in different ways. They might look at the same letter and have different interpretations. It's not that either is wrong; how can we come to consensus over those items? That's another aspect of the approach. Also, within the ways that we engage, I try to keep a careful blend and balance of representing authority, in the sense of the way the instructions have been written, with some flexibility, being authentic in all of my approaches. Again, being honest if I haven't considered something before, if we have tried something and it hasn't worked, or if we can improve upon our approaches; I think that leads very easily into accountability. And then I also try to remain approachable, having a perspective as well as a voice that always appears welcoming of feedback, whether that feedback is positive or negative. Some of the elements in a crowdsourcing project that are important to consider include the risks of a project. Maintaining the success and activity around a project can be informed by some of these elements. It is one thing to change or elaborate your mission or objective, but it's probably best to do that as portions or phases of a project are completed. Frequently changing or being unclear about the mission or objective is a risk, and something that can occur very easily, particularly once you start working on a particular item, in our case an item that's being transcribed. You may find that the result is not quite the format or style that was envisioned, so there may be revisions. Frequently we update or have to revise some of our metadata generation projects, such as our bees or botany or the numismatics collection.
Also, making sure the workflow works: we've revised our workflow in the Transcription Center several times to improve the review and approval process, and also to make it easier for staff to participate and for volunteers to share information with one another, again in that feedback field that we discussed earlier. It is very challenging when you're managing a crowdsourcing project to give timely updates, but that's also something to be carefully planned into communication and into the workflow for a project. It's also extremely important, and this comes up across the board with crowdsourcing projects, whether in the cultural heritage space or the citizen science space: failing to clearly communicate, or to give clear plans on how data will be used, is a serious risk. Again, within the Smithsonian Transcription Center, our main goal is to use the transcribed text to improve search and access. We also know that some of our objects that are being transcribed go through an additional verification process, so the approval process takes a little bit longer. Some of those items include our bumblebees, our botany, and our National Numismatic Collection: as the objects are transcribed, the text has to be matched to particular fields. If you are managing collections information, you will know the similar process of matching content to particular fields; that requires additional cleaning, and so the delay on that data can be a little bit longer. So we want to always make sure that we're giving our volunteers updates on what is happening with the text that they've contributed. We also try to acknowledge, even within the collection record, that volunteers have contributed, by saying that the item has been transcribed and crowdsourced by digital volunteers.
Another challenge that I've seen over time with crowdsourcing projects is a lack of communication, either not very carefully planned or inadvertent, around scaling down or concluding a project. Sometimes projects have limited information to transcribe. For example, we transcribed the Apollo stowage lists from the Air and Space Museum, and those were transcribed very quickly; there are no other stowage lists, it's a finite collection, so that set of objects has been completed. If we weren't careful about describing the ways in which that information is going to be used and our plans for future projects, we might have inadvertently broken trust with our public, and that of course is something we not only don't want to do, but want to make sure we're actively thinking through from the beginning. One final thing: failing to share results, or shrouding results, and even failing to share failures can be a challenge and a risk to manage in building or designing engagement with a crowdsourced collection. Those risks can also cascade into failure. Some of the examples I've seen, from smaller projects and other experiences, are listed here, including growing too quickly and failing to scale, not being flexible in the way you design your project or engage with the public, and inadequate resourcing. I would argue that it's very easy to budget for the design of a technology or tool; it's more difficult to budget for communication and continued engagement with the public, so that's something really important to consider when designing your crowdsourcing project. And the final bullet point has been brought forward by some of our volunteers: the experience of having their contributions go behind a paywall, which violated their feelings and their expectations of trust in working through a project.
So with all that in mind, designing approaches and thinking about risks and potential failures, the goals of our project and the other tasks at hand, as well as the variety and scope of materials, these are some of the ways that I've applied approaches for engaging volunteers with transcription. Four of these examples are engagement approaches, campaigns that use SMART principles: they're specific, measurable, achievable, realistic, and time-bound. I mentioned previously that I've seen some projects move forward without careful consideration of communication and support for the engagement. So one main consideration you may have in mind is sustainability, or the capacity of your staff to engage in a crowdsourcing project. That certainly plays into my decision making when I'm building my engagement approaches, because I'm just one person handling most of this communication. So I'm thinking through capacity, and this is hopefully a space, as I talk through some of these approaches and give some examples, where I can connect some of the practices and experiences of the Transcription Center with your capacity. If you're contemplating engagement and crowdsourcing, some of these may be useful models for you. Again, these different types of SMART approaches are applied to seven-day review challenges, coordinated social media campaigns, and working with different parts of the institution, so working together with multiple libraries, archives, or museums. And then we've also had one experience so far, with a citizen science project called WeDigBio, of a set of transcription platforms at different institutions coming together for a brief period of time to emphasize transcribing, and therefore liberating, biodiversity data.
We also do contribute-and-connect projects. To give some examples of the ways these particular campaigns meet SMART requirements: in the seven-day review challenge, the time-bound element is immediately apparent in the title. Seven-day review challenges ask our volunteers to focus a little more tightly on review. Our volunteers love transcribing, and they enjoy reviewing, but they may move forward in transcribing before they go back and completely review projects. The review step is very important in transcription, not only for validation or verification, but actually to get that text indexed, because a project is not indexed until it's completed. So our seven-day review challenges use specific focus by asking volunteers to do one sort of task within the site, as well as specific projects on site, each of those seven days. We measure our success and benchmark against previous seven-day review challenges. We suggest achievable goals: we're not asking them to transcribe everything in the Transcription Center, but to review and chip away at particular projects, and I think they are realistic challenges as well. By contrast, a four-day approach last year asked our volunteers to focus specifically on bumblebees and go bee by bee through those collections. We may have aimed a little high; we wanted to finish as many of the bees as possible, and we finished 4,000, so that was a success. Contribute-and-connect approaches may have a varying time range: we may ask volunteers to focus on transcribing one or two particular items and then give them a reward at the end, such as behind-the-scenes access or a special curator talk.
Some other engagement campaigns I've used include Transcribe Tuesday, Friday Final Lines, and My TC Discovery. I write "weekly-ish" because this is a space where I've had some success and also some failure in communicating or connecting, and have had to think about my capacity and rethink the style of communication. I originally ran these, as it sounds, weekly, on Tuesdays and Fridays, and My TC Discovery once a month. Over time, when there have been other demands on my time and I've needed to manage my community-management time in other ways, for example answering specific email questions or dealing with a bug on the site, this is something that has moved to a lower priority. Also, if we're measuring engagement and success, sometimes these communication strategies have limited engagement, or they occur only in social media spaces, so their reach is not as far. But they are an opportunity to iterate and test assumptions about the frequency of communication, or the frequency of certain types of communication, and an opportunity to build engagement or to adjust engagement if and when it's successful. So again, I can see some questions coming in, and I'm looking forward to addressing those specifically and giving more specific examples as we talk through them. But this is some top-level information that I wanted to touch on. Over time with the Transcription Center, we've seen steady activity and release of projects.
One thing that became very apparent toward the beginning of my time in this role, in January 2014, was that slowly building a queue and creating a steady release of projects created a pace of activity and a set of expectations that were sustainable and that created returning participation. Rather than releasing everything at one time and seeing what gets transcribed, we release a few projects each week, or once the previous project has reached a threshold. If it's a set of projects within a wider goal, let's say there are 40 bumblebee projects, releasing one when the previous one has reached 75% complete creates a very steady queue of material and an expectation that there's going to be an opportunity to transcribe material that people are enjoying working on. Also, over time, campaigns and showcasing collections in various ways have developed their own rhythm, so we do a campaign or showcase materials in a particular way about every six to eight weeks. I think one of the other elements of cumulative growth is being there, listening, receiving feedback, and carefully integrating that feedback into the design and execution of projects, as well as into the design and execution of engagement and the general features or capacity of the Transcription Center. The chart on the left-hand side shows the growth of active volunteers; we count an active volunteer as someone who has done transcription activity. We've actually had about 15,000 more people sign up for transcription accounts but never transcribe, and that's fine if they don't want to transcribe. We send communication emails only to the people who have actually transcribed, to ask them to continue, and we welcome anyone, whether they're going to transcribe just one word or have sustained engagement with us. This covers April 2015 to about the same time period in 2016, and the graph continues to rise in this way.
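The steady-release rule described above (open the next project in a themed batch once the current one is 75% complete) can be sketched roughly as follows. This is an illustrative sketch only; the function name, data shapes, and threshold handling are hypothetical, not the Transcription Center's actual implementation:

```python
# Hedged sketch of a steady release queue: open the next project
# from a batch once the active one passes a completion threshold.
# All names and structures here are hypothetical examples.

RELEASE_THRESHOLD = 0.75  # fraction of pages transcribed

def next_release(queue, active):
    """Return the next queued project to open, or None if it's too early.

    queue  -- list of unreleased projects (dicts), in release order
    active -- the currently released project from this batch
    """
    done = active["pages_completed"] / active["pages_total"]
    if done >= RELEASE_THRESHOLD and queue:
        return queue.pop(0)
    return None

# Example: a batch of 40 bumblebee projects, released one at a time.
bees = [{"name": f"Bumblebee box {i}", "pages_completed": 0,
         "pages_total": 100} for i in range(2, 41)]
active = {"name": "Bumblebee box 1", "pages_completed": 80,
          "pages_total": 100}

upcoming = next_release(bees, active)
print(upcoming["name"])  # prints: Bumblebee box 2
```

The point of the rule is pacing: volunteers always see a small amount of fresh material without the whole corpus being dumped at once.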
We also see an increase in completed projects and a steady climb in the number of pages completed; this is cumulative reporting. The general numbers are about 110 to 120 active volunteers on site each week, and somewhere between 2,000 and 3,000 pages transcribed each week has been about our average over the last year and a half. Depending on the material types, the balance of material on site, and the engagement campaigns occurring at a particular time, these numbers may change. As for our cumulative numbers: since launch in June 2013, the Transcription Center has grown from eight archives, museums, and libraries to 15; we've completed nearly 1,500 projects; and we've completed 186,000 pages. And our volunteers continue to grow. Most of our volunteers hail from predominantly English-speaking countries, but visits come from around the world, and of course we welcome activity from around the world. I also wanted to give a brief overview of outcomes. I mentioned that we are transcribing text to make the discovery of pages and content more effective. But in addition, these particular projects have created new collection records, and I think this is a frankly incredible element of the Transcription Center. The volunteers are committed to accuracy and quality, and they have the opportunity to see the transcriptions they've created within the Collections Search Center, to pull up individual pages, and to have ultimately contributed to over 127,000 new collection records since June 2013. So very briefly, I'll wrap up with a couple of recommendations about places to start if you're considering a crowdsourcing project and engaging with the public, and then a few tips.
One thing we've learned over time, and one thing I've learned in implementing engagement strategies, is that it's useful to start small, build slowly, and consider the scale at hand. Some of our projects in the Transcription Center have been pilot projects, and some of my engagement activities have been experiments; we may have had success with those, but it might not be feasible to roll them into a full program of engagement. That is a valuable learning point and something to consider if you are thinking about building a crowdsourcing project. I think it's always incredible that I can say with a straight face that I learn something new every day, and I think our volunteers learn something new every day as well. That comes through thinking about the use and the condition, as well as the content and the context, of the collections we're bringing forward for transcription. Our participating museums and archives make their own decisions about what is going to be transcribed, and a lot of that comes down to their digitization priorities, as well as the condition of a collection and what's already known, valuable, or important about it. I think it's really important to be prepared to assess, to set goals, and to set metrics that can be measured. But be selective, because measuring everything is actually more disruptive to your approaches. Accepting and integrating participants into the design process, the design of the site as well as the design of your engagement, really sets the ground for trust to be established, and it honors and recognizes their participation as well as staff contributions. And I think involving staff as early as possible in the process is very important; it's useful to think through the capacity and the opportunity that staff members may have to contribute.
Again, in the context of the Smithsonian Transcription Center, we have representatives from 15 different museums, archives, and libraries. They have a number of other tasks on their plate every single day, so the opportunity to engage may need to be paced carefully or matched closely to other engagement activities. I can also say very clearly that testing assumptions and adjusting has been one key to success, and at the same time, actively critiquing successes, asking what other conditions may have grounded the success of a particular project, can help inform and prepare for future engagement, and also help you prepare to pivot from failure. For my final thoughts and tips in this space (there are a few extra slides in the slideshow with some additional resources), in general I would recommend that when you're building engagement, you be intentional and reflect on your practice: taking the time to think through feedback, to make observations, and maybe to connect your approaches with other organizations or other people in the same space as you, in my case other project coordinators. My work also meshes with a lot of engagement, communication, and education strategies within the museum and cultural heritage spaces, and it's been an amazingly rewarding opportunity to connect with professionals who are constantly sharing their practices within the museum, library, and cultural heritage spaces. It seems like a given to say advocate for your participants rather than treating them as a crowd, but it can be too easy to be swept along in the flow and to forget that you may be the only advocate for your participants. So carefully thinking through their needs and bringing their needs forward, both in your engagement strategy and in the design of your project, is a clear key to success.
I think celebrating successes regularly, while also highlighting individual achievements, is not only necessary but really cultivates activity. I should also take the opportunity to say thank you to our development team. We've had a number of people contribute to the design of the Transcription Center, and specifically I want to acknowledge Rich Brassel, Andrew Gunther, Mike Shaw, and Johnny Bilzma, who continue to improve the site over time, and our program manager, Ching Shen Long, who has been instrumental in connecting the opportunities for transcription with our existing Smithsonian systems that support collections. One final thing would be to combine opportunities to learn with taking time: some self-care, pausing, being patient. I'm probably not the best at being patient; I want to learn more and share as rapidly as possible. And then take engagement and crowdsourcing one step at a time. Those are some closing ideas and tips on building crowdsourcing engagement. Thank you for spending time with me today; I look forward to answering some of the questions I've seen come in, as well as other questions that may come forward now. So, scrolling up through some of these questions. Do you want me to read them? Sure. Okay, there was a question early on from Jacqueline Spoon: do you have any concern that campaign emails sent to patrons are an invasion of privacy and against standard library policy and culture? Well, I do have concerns about campaign emails. Part of the sign-up for the Transcription Center indicates that we'll be communicating with them via email. There is always the opportunity for our participants, our volunteers, to elect to no longer receive those, to unsubscribe. And we very carefully maintain and protect the privacy of volunteers: we capture only our volunteers' email addresses, an avatar, and whatever username they would like to use to represent themselves.
I'm very actively considering, every single day, privacy and respect and trust with our volunteers, as well as making sure that we are honoring and thanking them for their contributions with rewards such as behind-the-scenes access, and really carefully guarding against exploitation of their privacy as well as exploitation of their labor. Okay: I've looked at a lot of crowdsourcing projects like yours at the Smithsonian and the New York Public Library, etc., and I struggle with finding a basic starter article on the technical program aspects that can create volunteer-friendly platforms. Does your download point to something like this? Yes, I think so. Some colleagues and I gave a presentation in March at South by Southwest, and one of those colleagues was one of our Transcription Center volunteers. We pulled together some resources and some examples of platforms. But off the top of my head, I'd like to make some recommendations of additional platforms to explore. Some of those include Omeka with Scripto as a plugin for transcription, which is out of the George Mason University Center for History and New Media. We also acknowledged Ben Brumfield's FromThePage, which supports transcription as well as, I think, tagging and other activities. And then there are a couple of other sets of resources within the downloadable file, including a link to a Google Doc, and I will put some other resources for finding starting points in that Google Doc. Yeah, or you can give them to me and I'll post them with the recording. Fantastic. Thank you, Susan. Okay: in addition to the comments above, I work for local government and we are very sensitive to security in online applications. How have you addressed this? I'm not sure if the question is about the software, the technology as a web application, or if the concern is about email addresses.
So, I'll take the technology. It's a web-based application, and the platforms that we use go through review within the Office of the Chief Information Officer; the systems that are built go through review for security. This is a Drupal-based module, and we have several other systems in place that speak to the stability and levels of security within the site, as well as within its various features and components. I can try to ask for more information about the review process; I'm not a developer and I'm not developing the tools and technologies for this, so I'm not sure I can give a sufficient answer to that particular question. Okay, Ken Bicknell gave a couple of resources that he posted, and then we'll move down here. How do you choose the collections that will get transcribed, and what's the benefit of having these transcribed, any increased usage? Sure. Again, I mentioned very briefly earlier that our museums and archives teams make their own selections of materials that they would like to see transcribed, based on other sets of priorities within their offices. That may come down to a set of items being prioritized for digitization; obviously, in order for us to have something transcribed, it has to have a digital facsimile, so if something has high priority for digitization, it may be a candidate. Most of those museums and archives also prioritize frequently accessed research collections: if collections are already being requested frequently, they may be candidates for transcription. We know that we've had an increase in downloads of these materials, but in some ways we know less about the use of the specifically transcribed items, because they go back into the ecosystem of wider Smithsonian collections by being indexed in that system, and they are also made freely available for PDF download.
So there could be people discovering them through general web searches and accessing them that way. Okay, and I'll add Ken Bicknell's resources to the resource page before I post the recording, so don't worry about writing those down. How do you find support or funding to image your collections for crowdsourcing projects when the immediate gains are not apparent? One of the challenges I've encountered is convincing people, even colleagues, that imaging specimens for a downstream crowdsourcing project is important and worth putting money into. I think that's a really great question, because even at a larger institution, many departments experience similar situations: questions over the value, truly a question mark over the value or utility, both of the digitization process and of the validity or quality of the data created through transcription or other crowdsourcing activity. My answer, from my position in this particular role, is that it's very clear that both access and engagement with collections through crowdsourcing and transcription are valuable and a positive experience. I don't have answers that will be convincing across the board. My best piece of advice would be to start very small and to create empirical evidence for the success of crowdsourcing. On funding, there are a number of funders who are interested in supporting this type of activity; I think in particular Hidden Collections is a great, very valuable, and important program. Collaboration with other organizations also seems to be a pathway to success: related collections from multiple institutions, connected digitally through crowdsourcing or other projects.
Also, if you look on the Connecting to Collections Care website in the link library, there are quite a few resources on fundraising and supporting collections care projects, which this would fall into, so check that out too. On our university campus, we have student volunteers and interns image items for transcription; if you're not directly connected to a college campus, you might be able to contact one nearby to make it known that you have imaging projects available for unpaid volunteers. So that's one suggestion. Yes, and actually there are programs at other institutions. For example, the Australian Museum and the Atlas of Living Australia's DigiVol, a transcription platform for biodiversity materials, have a really wonderful program of in-person volunteer digitization. I would also recommend that you connect with the Innovation Hub at the National Archives; Deena Herbert is always excited to share her experiences and best practices in having public volunteers digitize collections at their scanning center. And how do you turn transcription into usable data, for instance, feed transcription data into specimen records? A number of different ways; it's a great question. Our transcribed text from items such as diaries and field notes is indexed. Let me take one step back and share that these types of transcription relate to the ways that we have designed, in essence, the text-capturing space on the page. For diaries, manuscripts, and correspondence, we use an open text field (it looks like a little box); each image being transcribed is given its own row within the database, and the transcribed text is stored against that row, so it's indexed at the asset level, the asset being the image. For our entomology, botany, and National Numismatic collections, the fielded data is exported as a CSV file, with columns corresponding to the database in which the information is being stored.
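As a rough illustration of that fielded export, here is a hedged sketch of reading a per-asset CSV and renaming its columns to match a collection system's import fields. The column names, field mapping, and cleaning step are hypothetical examples, not the actual Transcription Center schema:

```python
import csv
import io

# Hypothetical CSV export: one row per transcribed asset (image),
# with columns matching the fields volunteers filled in.
export = """asset_id,collector,date_collected,locality
NMNH-001,M. Smith,1921-06-04,Virginia
NMNH-002,J. Doe,1921-06-05,Maryland
"""

# Hypothetical mapping from export columns to the collection
# information system's import field names.
FIELD_MAP = {
    "collector": "CollectorName",
    "date_collected": "CollectionDate",
    "locality": "Locality",
}

def to_import_rows(csv_text, field_map):
    """Translate export rows into the target system's field names."""
    rows = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        mapped = {"AssetID": row["asset_id"]}
        for src, dst in field_map.items():
            mapped[dst] = row[src].strip()  # basic cleaning step
        rows.append(mapped)
    return rows

records = to_import_rows(export, FIELD_MAP)
print(records[0]["CollectorName"])  # prints: M. Smith
```

In practice the cleaning stage is where most of the work happens, since transcribed values have to be validated before they can safely land in a specimen record.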
The collection information systems have particular fields, and collection managers are able to export the data from the Transcription Center, clean it, match it to their particular import pathways or their standard data import, and then bring the items into the record. I hope that makes a little bit of sense; I'd be happy to answer those questions in more detail. Megan, don't you have some additional slides that show that? Some additional slides, let's see. I think these are the description of OCR. I have a slide earlier in the presentation that describes the top-level connection of materials and the standard pieces of information we need from the collection-holding museum or archive in order to bring a project into the Transcription Center. Okay. All right. So I think that's all the questions, unless somebody else has any. But know that I'll post this soon; as soon as you don't see the announcement slide, the recording will be posted. We'll make additions to the resources and I'll post them with the recording, along with the PowerPoint slides. Take a look in the Connecting to Collections resources, too; there have actually been a couple of webinars before on crowdsourcing, so take a look at those. We'll see you toward the end of August. Have a wonderful summer, and thank you, Megan. Thank you, Mike. I think that's it; I'm going to say goodbye. Thank you. Yeah. And also, please feel free to get in touch if you have questions. I really tried to give a very top-level overview of engagement, and I have many, many specific examples that I've shared in other spaces and am happy to share again. Thank you so much, and we'll see you all in the middle of August. Bye bye. And also, just to give you a heads up: in September and early October, we're going to do a course on collections management issues, so keep an eye out for that.
It will probably be posted in the middle of August. So thanks. Bye bye.