Good afternoon. I'm Shanika Morris of the Association of Research Libraries, and I'd like to welcome you to this afternoon's webinar, IPEDS Academic Libraries Definition Changes for 2016-17. There are a few announcements before we begin. All participants have entered the conference in listen-only mode. We invite you to join the conversation by typing questions in the chat box in the lower left corner of your screen. Robert Dugan will answer short questions, or questions that likely have a yes-or-no response, while the conference is in progress. Longer questions will be submitted to the presenters, and the speakers will answer them at the very end of the presentation. This webinar is being recorded on Thursday, July 28, 2016, and ARL will share the presentation slides and a link to the recording in the next week. Today, we will hear from IPEDS program staff, as well as members of a joint ARL/ACRL Task Force on Aligning IPEDS Academic Libraries Component Definitions with Established Practices. Our presenters are Mary Jane Petrowski, Associate Director of the Association of College and Research Libraries; Martha Kyrillidou, Consultant for the Association of Research Libraries; Robert Dugan, Chair of the ACRL Academic Library Trends and Statistics Survey Editorial Board, University of West Florida; Chris Cody, National Center for Education Statistics, Integrated Postsecondary Education Data System; Oliver Pesch, Chief Product Strategist, EBSCO Information Services; and Kristen Martin, Electronic Resources Management Librarian, University of Chicago. A warm welcome to our presenters and all of our attendees. And now, to begin the presentation, I would like to turn the floor over to Martha Kyrillidou. Martha, please go ahead.

Hello. This slide shows the Joint Task Force members; Mary Jane Petrowski and I are representing ACRL and ARL on this call. And Mary Jane Petrowski, would you like to go ahead?
Thank you, Martha. On behalf of the Association of College and Research Libraries statistics program, I would just like to remind all the participants that our annual Trends and Statistics survey is now going to include the IPEDS Academic Libraries component that we're going to be talking about in great detail today. So I just wanted to let everyone know that we include that as part of our annual survey, so that you can download your data and transfer it to your IPEDS keyholder. We've gotten a few questions about when our next survey will open, and that will be very soon, on September 1, and it will close on February 15. We are trying to align our survey with the IPEDS survey period. And we intend and hope to make our 2016 data available by late spring of next year as part of our ACRL Metrics statistical product. And back to Martha.

Thank you. A few words about the ARL Statistics mailing. There are no survey changes for the 2015-16 survey cycle. The forms and the mailing are now available, and the due date is October 15, 2016. Shanika is expecting your results and data, and the data are available to the contributing members as soon as they are submitted. Many of the IPEDS Academic Libraries data elements are, of course, part of the ARL Statistics. So with this brief introduction on the association-specific surveys: we are trying to align them as much as possible with the IPEDS framework, and that alignment is the main work of this task force. I'm glad to invite Robert Dugan, co-chair of the Joint Task Force, to describe the more specific work of our task force. Bob?

Thank you, Martha. Hi, everyone. Just a couple of things about the Joint Task Force. It's a joint task force of ACRL-, ALA-, and ARL-appointed members.
The task force was in place last year, during which the membership provided recommendations and suggestions to IPEDS concerning the definitions and the instructions for the Academic Libraries component, the AL component, for the 2015-16 year. The suggestions from the task force were accepted, and IPEDS made some changes to the survey in hopes of improving it. This year we reconvened, and we have provided suggestions and recommendations to IPEDS again, and those changes are what we're going to talk about today. The task force is made up of academic librarians from the various Carnegie classifications, it's under the oversight of ARL and ACRL, and Martha facilitates it. Most of our work is done through conference calls. We also share Google documents, and we have had, in the past two years, face-to-face meetings at ARL in Washington, D.C. So we have been working with the IPEDS survey directors on improving the instrument and making it a little bit easier for the libraries to work with. Today we're going to be discussing specifically the changes coming with the 2016-17 Academic Libraries component. Most of you are familiar with IPEDS; it's a term we use a lot in higher education, but this slide shows the definition, the actual mission, of IPEDS. And with that, I'm going to turn it over to Dr. Christopher Cody, who is the Academic Libraries component survey director.

Thank you, Bob. Hello everyone. As Bob said, I'm Chris Cody. I'm a senior researcher for the American Institutes for Research. One of the projects I direct is with IPEDS, where I serve as the survey director for the Academic Libraries survey, as well as the Fall Enrollment and 12-Month Enrollment surveys. Today I'm going to go over the upcoming changes to the 2016-17 collection for the Academic Libraries survey. Prior to going into the changes, let me first go over how these changes have come about.
IPEDS collects basic data from approximately 7,500 postsecondary institutions in the United States and the other jurisdictions that are eligible to participate in the Title IV federal financial aid programs. Title IV institutions are required to respond to IPEDS. IPEDS also allows other non-Title IV institutions to participate on a voluntary basis. The National Center for Education Statistics, also known as NCES, recently sought authorization from the Office of Management and Budget to continue the IPEDS data collection. Our current authorization expires December 31, 2016, and these authorizations usually last three collection cycles. So IPEDS has recently requested a new clearance for the 2016-17, 2017-18, and 2018-19 data collections to enable us to provide consistency in our collection of postsecondary data over the next three years. It is this window between our OMB clearances that allows us to change the IPEDS surveys substantially, based on several factors that we'll get into in the next slide. With the OMB package, the Department of Education, in accordance with the Paperwork Reduction Act of 1995, provides the general public and federal agencies with opportunities to comment on proposed, revised, and continued collections of information. This helps the department assess the impact of its information collection requirements and minimize the public reporting burden. It also helps the public understand the department's information collection requirements and provide the requested data in the desired format. The Department of Education solicited comments on the proposed information collection request for IPEDS.
The Department of Education was especially interested in public comments addressing whether the collection is necessary and timely, the accuracy of any estimates of burden, how the department might enhance the quality, utility, and clarity of the information collected, and how the department might minimize the burden of the collection on the respondents. There were recently two comment periods for our OMB package, a 60-day period which closed in April, and a 30-day period which recently closed on July 25th. During these periods, people and organizations were able to comment on the proposed changes provided by IPEDS, and IPEDS responds to the comments, oftentimes resulting in additional changes to the survey, with clarified instructions, frequently asked questions, or tips being added to the survey. So while the OMB package is now closed for comment, it's up on the website provided to you on the slide, and the comments can be viewed using the steps presented on the slide. If you go to the website, you can see what the proposed changes for the upcoming collection cycle were, as well as what comments people had about the proposed changes. So how did these proposed changes to the IPEDS survey come about? There are several different avenues through which a survey for IPEDS can actually change. Many changes occur based on what we call our technical review panels, also known as TRPs. Meetings of the IPEDS TRPs are conducted by RTI International to solicit expert discussion and suggestions on a broad range of issues related to postsecondary education and the conduct of IPEDS. A TRP is designed to allow the public to advise and work with RTI to improve the IPEDS data collection and data products' quality and user-friendliness. Changes to the IPEDS surveys during the OMB process can also occur through quality control of reported data.
This is where, based on the reported data, we've identified areas where we need to clarify questions or re-analyze how we are presenting questions and instructions, and then those changes are made. Changes to the IPEDS surveys during the OMB clearance can also occur through the work of interested organizations. One such example is this Joint Task Force, which provided substantial feedback during the OMB comment period, as well as prior to presenting changes through the OMB process, and that feedback has helped shape the next collection of the survey. Also, feedback from institutions, or comments during the OMB package, will allow us to change items on the AL survey. Today, when we discuss the proposed changes to the Academic Libraries survey for the upcoming collection, most have actually come about due to the hard work of this task force, and we'd like to thank them very much for their work. So before I move into the overall changes to the AL survey for the upcoming collection, here's a graphic of all the surveys and the time periods in which they open and close for collection. As you can see, the Academic Libraries survey opens for collection this upcoming cycle on December 14, 2016, and closes for institutions' keyholders on April 12, 2017, and for IPEDS coordinators on April 26, 2017. Preliminary data are then released to the public for the Academic Libraries survey in mid-fall of 2017. Okay, so now on to the changes, which I'm sure many people are here to listen to. We're making some question changes to the survey in this upcoming collection cycle, as well as realigning and updating our instructions to meet academic library industry standards on how things should be collected, which was largely based on the work of this task force. First off, who reports to the AL survey? Postsecondary degree-granting institutions that either have access to a library collection and/or have library expenses greater than $0 are asked to report to the Academic Libraries survey.
This results in over 4,000 institutions reporting to the AL survey each year. The reporting period for the AL survey for this upcoming collection is fiscal year 2016, defined as the most recent 12-month period that ends before October 1, 2016, or that corresponds to the institution's fiscal year. There are two sections to the Academic Libraries survey. Section one is the Library Collections, Circulations, and Interlibrary Loan Services section. Last year, this was known as just the Library Collections and Circulations section. I'll first go over the changes we are implementing for this upcoming collection dealing with section one. The first and major change we are implementing is that we'll begin collecting information on physical serials and digital/electronic serials for the collection, and we'll also begin to collect information on physical serial circulation. I'll get more into this in a future slide. We also, as I stated previously, are moving the interlibrary loan services questions from section two, which deals with expenses, to section one of the Academic Libraries survey, since they relate more to access. We're also adding a screening question for this section that will ask whether you have interlibrary loan services. If you do not, then you will not be asked to answer additional questions in that area. We're also updating and changing some of the instructions to reflect the new addition of serials and to respond to recommendations from the Task Force and OMB comments. So I'll go over the updates to the instructions briefly. I'm going to skip physical books for now, as I've devoted an entire slide to them shortly. In regard to physical media, while we're still asking in the instructions that institutions report the number of titles of media materials, we are explaining what to include differently.
We now ask that you include audiovisual materials, cartographic materials, graphic materials, and three-dimensional artifacts and realia. In regard to physical circulation and our update of the instructions, we are now asking that you report the total number of times physical items are checked out from the general and reserve collections. Reserve is new this upcoming cycle. We are still asking that you include only initial checkouts, not renewals, and exclude interlibrary loan lending and borrowing. We also ask that you include books and media materials in physical circulation, and that you do not include transactions for equipment or computers. However, circulation of electronic reading devices can be included if the device is preloaded with e-books. In regard to digital/electronic books for the upcoming collection, everything has stayed the same in terms of instructions, except we have added to the instructions to include open access titles if the individual titles are searchable through the library catalog or discovery system. However, we are asking that you do not count e-book titles from HathiTrust, the Center for Research Libraries, the Internet Archive, and similar collections unless the library owns the digitized item and it is accessible under current copyright law. Finally, for the section one instruction updates, for digital/electronic circulation or usage, we ask that you report usage of digital/electronic titles whether viewed, downloaded, or streamed, but we're asking that you do not include e-serials and institutional repository documents; we ask that you include e-book and e-media titles only. So to reiterate, we'll be collecting physical serials collection and circulation, but we'll only be collecting e-serials collection; we will not be collecting e-serials circulation. Now let's move on to a little more detail on physical books, and then we'll talk about serials directly.
First, this year we are going to change how we collect information on physical books, based on a recommendation from this task force. We will now ask that you report physical books by titles, not volumes, as seen in the definition and instruction changes on the screen. For 2016-17, we are asking that you report physical book titles owned or leased by the library if the individual titles are cataloged and/or searchable through the library catalog or discovery system. We ask that you exclude serials, microforms, maps, non-print materials, and uncataloged items. We ask that you include music scores if searchable by title through the library catalog or discovery system. Also include government documents that are accessible through the library's catalog regardless of whether they're separately classified and/or shelved. "Cataloged" includes documents for which records are provided by the library or downloaded from other sources into the library's card or online catalogs. So just to reiterate, the main change here is that we are moving from collecting physical books by volumes to collecting physical books by titles. In terms of serials, these are new instructions, since we have not collected serials before. For physical serials, we ask that you report the number of physical serial titles that are accessible through the library's catalog or discovery system. A serial is a publication in any medium issued in successive parts, bearing numerical or chronological designations, and intended to be continued indefinitely. This definition includes, in any physical format, periodicals, newspapers, and annuals, as well as the journals, memoirs, proceedings, transactions, etc. of societies, and numbered monographic series. Report serial titles, not subscriptions. If possible, report the count of only deduplicated or otherwise unique serial titles searchable through the library's catalog or discovery system. If possible, do not include earlier title changes.
However, do not worry about removing them if that is not possible or feasible. Then, in terms of reporting digital/electronic serials, we ask that you report the number of e-serial titles that are accessible through the library's catalog or discovery system. An e-serial is a serial publication that is published in digital form and displayed on a screen. While it might be a little difficult to see, this is a screenshot of what the survey screens will look like for the 2016-17 collection. One thing you'll want to note is the addition of serials, as well as the moving of the interlibrary loan services to section one. Also, when filling out the survey, remember that we provide the previous year's totals in the upcoming collection. These may look very different, given the instructions for 2015-16 versus what we are now requesting you report for 2016-17. For example, for books, the prior year will show numbers in volumes, where the current year is asking for numbers based on titles. Moving on to section two of the Academic Libraries survey: if an institution has expenses greater than $100,000, it is asked to report in section two of the AL survey. The changes to section two include relocating interlibrary loan services to section one, as I previously stated, and removing the question "Does your library support virtual reference services?" This was a recommendation by the Task Force; through data analysis, this question was no longer needed, as most institutions reported supporting virtual reference services. Finally, we updated the following instructions based on the library community's feedback. For materials and services expenses, we have changed a few of the instructions, one being regarding how you report one-time purchases of books, serials, backfiles, and other materials. We have removed the line that says do not include current subscriptions to serials.
This change was just for clarification of our instructions, to eliminate some confusion. For reporting ongoing commitments to subscriptions, we have added to the instructions that you also include serials and any other items committed to annually, as well as annual e-platform or access fees. We have also stated that serials are publications issued in successive parts, usually at regular intervals, and, as a rule, intended to be continued indefinitely. The change here is wording clarification on how to report serials and any other items committed to annually, as well as the annual e-platform or access fees. In regard to other materials and services costs within materials and services expenses, we have updated the instructions: we have added one line asking that you include costs associated with pay-per-view journal article transactions, and we have removed a line asking that you report expenses such as those for cartographic materials and manuscripts. Finally, in the instructions for all other operations and maintenance expenses, we have added to the items that are supposed to be included in this section: you should also include interlibrary loan fees paid to bibliographic utilities if you cannot separate them out. That is, if interlibrary loan costs are combined with the library's other expenses for the bibliographic utility, include them here, but only in the situation where they cannot be separated out. Here are the updated section two screens for the AL survey for the upcoming collection. As you can see, we have removed the virtual reference services question, and we have also removed the interlibrary loan services questions and moved them to section one. Finally, when filling out the survey, please remember to print the new survey materials with the updated instructions, FAQs, and glossaries to ensure you are providing data of the most recent set for the upcoming collection, as these items have changed.
So for any frequently asked questions or instructions you've kept from previous years of reporting, it would probably be best to disregard those and print out the new materials once they are available on the IPEDS website, as those reflect the instructions and FAQs we are now aligning to. I'll now pass the discussion to Oliver to discuss COUNTER and SUSHI.

Thanks, Chris. I'm Oliver Pesch, and I work as a product strategist at EBSCO Information Services, where my focus tends to be on our knowledge base and librarian productivity tools, including some of our usage consolidation products. But more relevant to this webinar is probably my activity with COUNTER and NISO. I'm on the board of directors of both organizations, and within COUNTER I chair the working group that's looking into the technical details related to Release 5 of the COUNTER Code of Practice, and I'm also co-chair of the NISO SUSHI Standing Committee. So my goal today is to provide some background on COUNTER and SUSHI and how these standards impact and facilitate the IPEDS reporting, and I'm also available to answer questions that may come up with regard to usage or COUNTER or SUSHI. First, for those who may want it, a real quick refresher on COUNTER and SUSHI. COUNTER is a code of practice that, when followed, results in a content provider offering consistent, credible, and comparable usage statistics for its scholarly online information. The code of practice lists the reports to include, specifies how those reports should be formatted, the metrics to include in those reports, how transactions should be processed, and how reports are to be delivered. And credibility is enforced through the required annual audit that each content provider must go through. SUSHI, which stands for the Standardized Usage Statistics Harvesting Initiative, resulted from a need to address scalability problems with COUNTER reporting.
So with multiple reports per content provider, and usage being found at dozens of provider sites, the effort of retrieving the hundreds of reports can be overwhelming. SUSHI solves this by enabling automated harvesting of COUNTER usage via web services. SUSHI is an expected feature of an ERM or usage consolidation product, so once you have it configured, it should eliminate the need to manually retrieve the COUNTER reports. Here's a quick historical timeline of COUNTER and SUSHI, which I've included to put these standards in perspective. COUNTER has been around for more than 15 years and was the result of collaboration among members of a publishers-and-libraries working group who were seeking consistency around usage reporting of the growing online collections; this was the late '90s and early 2000s. COUNTER was formed in 2001. The first code of practice was released in 2003. SUSHI came along in 2007 as a means of automating the loading of usage into ERM systems. Currently we're at Release 4 of the COUNTER Code of Practice, and work is underway on Release 5; we'll have more on that later. And the other event on the timeline to highlight, I think, is the introduction of Usus in 2005. Usus is a community website focused on issues related to usage. It's sponsored by COUNTER but run by members of the community. You can access it at usus.org.uk. If you haven't been to the site, have a look. You can submit questions, and if you have problems with a vendor, submit the issue there and they will basically take care of finding a resolution for it. So it's a great site. Okay, so turning back to the IPEDS survey: the form in question is here, and we've circled in red the library circulation for digital/electronic resources. This is where the COUNTER reports should prove helpful in providing a count for this entry.
So IPEDS specifies, as Chris said, that it's about reporting use of titles, whether viewed, downloaded, or streamed, and that maps nicely to COUNTER's Full-Text Requests metric. As Chris mentioned, IPEDS limits the count to circulation of e-books and media and excludes the use of e-serials content and institutional repositories. So the COUNTER reports where you'd find the information you need to report would be Book Report 1, Book Report 2, and Multimedia Report 1. The multimedia report would come from organizations like Alexander Street Press, who offer collections of multimedia audiovisual resources. In the bottom half of the screen we see a sample report, and in red is the number that you need: your reporting-period total for that report. So basically you would run your Book Reports 1 and 2 and Multimedia Report 1 for all of your providers, add up these totals, and enter the number on the IPEDS form. And if you happen to be using an ERM or usage consolidation product, you'll be able to save some time by pulling a single report that consolidates usage across all of your platforms. But that's basically where you get that information. So I'm sure a few of you have experienced some challenges or questions with the book reports, so let's talk about that just briefly here. Book Report 1 is provided by e-book hosts that deliver the complete book as a single file, a single PDF. Book Report 2 is offered by e-book sites delivering books in sections, sections like chapters, or pages, or entries in a reference book or encyclopedia. A given e-book host will typically provide one or the other but not both; they're mutually exclusive reports. So a quick example of the problem. Let's say we have two identical users, each accessing a book on one of two sites. The first site delivers the book as a single PDF, and the second site delivers the book by chapter. Our identical users each read five chapters of the book in a sitting.
For site 1, that activity will show up in its Book Report 1 as a single download. However, site 2 will show five downloads for the same book in its Book Report 2. So the problem, as you can see, is that the usage is not really comparable between the hosts and between the reports. Now, you're adding up Book Report 1 and Book Report 2 stats for the IPEDS report, so the result is going to be a somewhat inflated number when compared to print circulation, which tends to record things at the title level. It's just something to be aware of, and we'll talk a little more about the solution that's coming with COUNTER on the next slide. So now, looking ahead, several initiatives are underway which should increase the number of publishers offering COUNTER reports, improve the quality of what they offer, and make it easier to access and use the COUNTER reports. In development now is the COUNTER report validation tool. This is a free tool that will be made available to librarians, content providers, and auditors, and it will allow them to check whether a COUNTER report or a SUSHI implementation is indeed valid. Once it's in place, we hope to see a significant decrease in reports being released that are not quite compliant; I'm sure all of us who have used COUNTER have experienced some of these. The NISO SUSHI team is working on SUSHI Lite, which, as it sounds, is a simpler version of SUSHI, one that's quicker to implement, allowing content providers to become compliant with less effort and hopefully fewer compliance issues. It will also allow other applications to take advantage of SUSHI and automatically include the COUNTER statistics without requiring separate loading of reports. Release 5 of the COUNTER Code of Practice, as I mentioned earlier, is in the works, and the focus is on improved clarity and consistency, trying to make it easier both to comply with the Code of Practice and to use COUNTER reports.
I'm not going to go into great detail here beyond what I've just said, but you'll hear a lot more about this in the coming months. Then, getting back to our challenge of reporting on e-book usage, a new report, Book Report 7, is expected to become a requirement with Release 5. It's currently available as an optional report if you want to pursue that with your content providers, but what it does is introduce the notion of a unique book view within a session, and it effectively removes the comparability problems that we see today between Book Report 1 and Book Report 2, and between various vendors offering Book Report 2. The unique book view count for a given book would increment only once per session in which that book was accessed, regardless of how many pages or chapters or sections were viewed, so it would no longer matter how that book was divided up and presented to the user. So I'm going to wrap up with this slide. When you're considering IPEDS usage reporting in the larger context of COUNTER, some suggestions come to mind; I'll bring up a couple here, some of which you probably have already been discussing at length. The first which comes to mind is the inclusion of e-serials usage in the IPEDS circulation reporting. As I was preparing for this, I took a look at the usage from a few of our usage consolidation customers, and the usage represented in their journal reports is typically five times greater than the total usage found in their book reports. So ignoring journal usage means that an important part of the collection is currently not being reflected in the IPEDS stats. Another set of metrics to consider, I think, is the use of abstracting and indexing databases and discovery indexes. Providing users with discovery tools is an important part of getting users to the information they need.
So having use of these tools reflected in a survey could help provide a measure of the overall effectiveness of the online information environment a given institution is providing its users. I'd like to conclude by saying that COUNTER reports offer most of the statistics needed to report on the usage of online collections, and thus COUNTER does make a good platform for expanding the usage reporting within the IPEDS survey. And speaking for COUNTER, we're here to help, so please feel free to reach out with questions or suggestions; you can email me directly or contact Lorraine Estelle on the COUNTER side, and we'll be happy to help. And now I'm going to turn it over to Kristen Martin, who will talk more about how you go about actually collecting this information. Thank you.

Hi folks. I'm going to try to tie my portion of this presentation back to both Chris's talk on collecting title counts and to Oliver's talk about COUNTER. If you are already familiar with COUNTER and have been using it, some of this may be pretty rudimentary, but if you haven't looked at your COUNTER reports in a while, or you're just exploring this area anew, I'm hoping to give you some practical, hands-on tips for getting the reports that you need for your survey. The first part I'm going to talk about is getting title counts for that first section of the IPEDS survey, and a great tool that you may be able to use is your knowledge base and your link resolver, particularly if it's integrated with your catalog or discovery tool; this may really be where you're providing access to your electronic resources. For most people, this is going to be most helpful for serials, which are going to be part of the survey for the 2016-17 collection period, and it may also be helpful for books and multimedia if your link resolver provides a full range of those.
If you, say, don't have all of your e-books in your link resolver, you may then need to rely on title counts from another means, but I'm going to walk through two examples of some common knowledge bases and how you can get title counts from them. The first example, which comes courtesy of Oliver, is for getting a count from the EBSCO Holdings Management tool, and the example I'm going to walk through here is collecting the book title count. There are instructions in that link at the bottom, and as you probably heard, we will be sharing these slides, so you don't have to worry about writing that down, but those are instructions for getting all sorts of media out of your knowledge base if you use EBSCO. So here we're going to walk through the steps, and thanks again to Oliver for providing this. For anybody who has been to EBSCOadmin, this might look familiar: there is Holdings Management up at number one, and when you go there you can get to your holdings. You have an option to download reports, which is over at number two, and you can go down to number three, where you can choose your resource type; in this particular example we are choosing Book. Basically, name your report something that is useful to you and click the big blue button at the bottom to get your file. Then over here you will be able to go to a Download tab that EBSCOadmin provides, and they also have the handy link right here, the title list summary, which goes to that smaller spot right there where you can get a total count of unique titles. So this came from an example of someone who uses Holdings Management.
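Once the holdings file is downloaded, the unique-title count can also be taken with a few lines of script instead of opening a spreadsheet. This is only a sketch: the file name, the delimiter, and the "KBID" identifier column are assumptions, so substitute whatever title-level identifier your export actually provides.

```python
import csv

def count_unique_titles(path, id_column):
    """Count distinct values of a title-level identifier column
    in a delimited holdings export (header row assumed)."""
    with open(path, newline="", encoding="utf-8") as fh:
        reader = csv.DictReader(fh)
        # Collect identifiers into a set so duplicates collapse.
        ids = {row[id_column] for row in reader if row.get(id_column)}
    return len(ids)

# Hypothetical usage:
#   count_unique_titles("holdings.csv", "KBID")
```

The set-based approach gives the same result as a spreadsheet's remove-duplicates step, and it's easy to rerun each survey cycle.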
My second example is getting your serial title count out of SFX, and this may look familiar to people who are using SFX for their knowledge base. Basically, you log into your SFX admin, then from the main screen shown here you go to your knowledge base tools, or KB Tools, then to Export Tools, and that brings up the shot we see here, where you have this advanced export query. You can select that you want the text file output, and in this case we are getting serials, so we are going to select Serials. SFX doesn't actually provide any multimedia, but if you have all your books in here you can do a Monographs export too. And we are going to limit our active portfolios to ones that are full text. You do that, and it will send you a text file, instantly or in a little while depending on how big your knowledge base is, and then you can use Excel to do a little more manipulation to get things de-duplicated. This is the screen you may have seen if you have used Excel to bring in text and convert it to a spreadsheet; it works pretty well for the SFX output. A couple of quirks I noticed: there are no actual column headings, and if you have a long number that is an identifier, like the SFX Object ID, you need to make sure that you have selected Text for that column format, because when Excel hits a really huge number it converts it to scientific notation, and it only keeps significance up to 15 digits. So if you have identifiers over 15 digits, it is going to round them, and then they are not unique anymore, which is truly a horrible thing to happen. Everything else seems to work pretty well with the defaults, although if you have a lot of non-Roman characters you may want to choose UTF-8 just so that it looks cleaner.
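The same import-and-de-duplicate process can be scripted, which sidesteps the scientific-notation problem entirely because the identifier is never treated as a number. A minimal sketch, assuming a tab-delimited SFX-style export with no header row and the object identifier in column E (index 4); the file name and column position are assumptions to adjust for your own export.

```python
import csv

def count_unique_serials(path, id_index=4):
    """Count distinct object identifiers in a tab-delimited export.

    Identifiers are kept as plain strings, so long IDs are never
    rounded the way Excel rounds numbers past 15 digits.
    """
    seen = set()
    with open(path, newline="", encoding="utf-8") as fh:
        for row in csv.reader(fh, delimiter="\t"):
            if len(row) > id_index and row[id_index]:
                seen.add(row[id_index])   # duplicates collapse in the set
    return len(seen)

# Hypothetical usage:
#   count_unique_serials("sfx_export.txt")
```

A title sitting in ten portfolios contributes ten rows but only one identifier, so the result matches what the Remove Duplicates step in Excel would give you.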
Once you are in Excel, going to the Data ribbon, you can actually use the Remove Duplicates function. Column E is the SFX Object ID in this particular case, and you can see we have Publishers Weekly in ten different portfolios, so we don't want to count that ten times. We get this little Remove Duplicates box to come up, and we are going to select on the SFX Object ID, so we are relying on SFX's knowledge base to help us with this. Excel deletes the duplicates, and the count we have left, in this case 130,068 e-journals, is the count that you need for IPEDS for your electronic serials. Let's go over to the second part of what I would like to talk about today, which is obtaining your COUNTER reports for reporting that digital usage slash circulation. If you are really starting at this from scratch and you haven't been collecting these before, I am going to look at a really basic way of collecting your usage statistics without going through an ERM or needing much of anything besides a spreadsheet. The first thing that you have to think about is where you are going to get your statistics from: what are the platforms and providers that you have? There are three categories here that you want to think about. You have publishers that are hosting their own content; if you have journals on ScienceDirect, those are all going to be Elsevier journals. You also have publishers that use a third-party platform; the Royal Society titles, for example, are available on HighWire, as are a number of other science and medicine titles. And then you have aggregators, which license content from a wide variety of publishers and generally offer it as a package through a database, like Academic Search Complete on EBSCOhost or ABI/INFORM on ProQuest. So you want to think about where you are getting that content from, and you may already have some information about this, because what you are going to need is the usernames
and passwords to be able to get into these providers' administrative websites to collect that data. If you already have those, because you have had to customize things or because you have been collecting your statistics, then you have a great start on this. You do have to do a lot of work up front, but then it gets easier year by year. If you are really not sure, and you know you are getting some content through a consortium, you might be able to talk to people at your consortium to figure out where the administrative sites are, and you can talk to your reps, who will probably help you out. So this walks through some really basic steps if you are going to be doing these things manually. Basically, you have your sheet of providers; then you identify the administrative URLs and the login information that you need for collecting your statistics, and the different reports that you are going to want to collect for each provider. I would say that although the JR1, the serials report, is not necessary for IPEDS 2016-2017, it is something that is up for discussion, and it may be something that you just want to collect to find out how your subscriptions are being used. We have the two book reports, the BR1 and the BR2, and then the multimedia report, the MR1; those are the reports that you really need for IPEDS. You probably also want to include on this list the providers where you can't get statistics, so you don't have to go through that process again trying to figure it out, and then you can put in any notes if there is something that is particularly tricky about how to get the statistics. As you work your way through the list, you can be recording that reporting period total in your spreadsheet, like the one Oliver showed you earlier. So I am going to walk through three examples of collecting statistics on three different sites, the first one being getting the Book Report 1 from
EBSCOhost. This is the EBSCOadmin module, which actually looks pretty similar to what we saw with Holdings Management, the same setup. You have your institutional credentials and log in; in this case we are going to the tab called Reports & Statistics, and then you have options to select COUNTER Reports and COUNTER R4, that is, Release 4 of COUNTER. Following that, you get a drop-down box where you can choose the reports that you want, in this case Book Report 1, and you can choose your time period based on your fiscal year; in this case we have a fiscal year that ended June 30, so we have July 2015 through June 2016. Click that big yellow button, and EBSCO will place the report on a separate tab for downloaded reports, and you can go there and download it into a spreadsheet. So, opening it up in Excel, you have that reporting period total number; you record that in your master list, and then you can move on to your next platform. Let's say that you also have e-books from SpringerLink. Now, SpringerLink offers the BR2, and unlike EBSCO they don't have a separate admin site per se; you actually go to the regular site with an administratively privileged login, and this is fairly common with some publishers. Your login then gives you special options that your typical library patron would not have. Here I have signed in, this is actually Robert Fletcher, who is our electronic resources manager at the University of Chicago, and we have this option for the admin dashboard, which gives us library-specific information, including usage. Navigating to the usage statistics, we see a very similar drop-down with the option of which report we want. You can see that Springer does not offer the BR1, so even though BR1 is the preferred book usage counting tool for IPEDS, we don't have that as a choice; we're going to take the BR2, which will give you the book chapter downloads, as that is how Springer presents it on their site. Then, once again, you get this
great Excel spreadsheet; you have your reporting period total, and you will note that because it is BR2, your number of uses per title will seem a lot higher than what you would get off of, say, EBSCOhost, which had the BR1. My last example here is from Alexander Street Press, for the multimedia report, and for brevity I've simplified some of the steps that I'm showing in the slide, because they're very similar to the other ones. Like EBSCO, there is an administrative website for Alexander Street Press, and there is a section right there for usage statistics, with very similar drop-down options. In this case we've chosen the MR1 report, and once again we're looking at the fiscal year. What Alexander Street has done is actually provide you the report directly in HTML, so if you don't feel like you want to save the report for any other reason, you can just grab that reporting period total number for your main spreadsheet, and if you do want to keep the report, you can also download it to Excel and have it available for other purposes. So that's the manual way of collecting reports. It can become onerous if you have a lot of different platforms; that doesn't mean you can't do it. I work with a lot of platforms here, and at this point we are still working under pretty much this process, so it's certainly feasible, but it's not necessarily the easiest way to do it. There are some other options you can look at as well: there are a number of tools out there that can help you collect and manage your statistics, and they can also help you integrate them with acquisitions and other data so you can calculate things like cost per use. They come from some different areas of library tools. Your subscription agents, like EBSCO and Harrassowitz, can help you bring in some of your journal acquisitions information; for people using systems like Alma or OCLC WorldShare, those have statistics support; and they also may be integrated with your
link resolver or with an ERM system, like Serials Solutions or SFX, and the open-source system CORAL has a usage module. These systems can also provide a SUSHI client, or you can get an open-source SUSHI client, and that can automate the harvesting. Things have become a lot more standardized with COUNTER, so if you looked at COUNTER reports or at SUSHI a couple of years ago and it seemed like you were having more exceptions than things that were actually running through, it may be worth a look again to see if SUSHI could work better for you. The last slide that I'm going to share here lists some resources for usage data, and I know there were some questions that came up about this. We do have a link to the Project COUNTER website here, which lists COUNTER-compliant publishers. There's also a link to Usus, which Oliver mentioned is the community-run website focused on the usage of electronic resources; it's a great place for asking questions and getting information about issues or problems with reports. I've also included some columns that Oliver wrote in The Serials Librarian on implementing SUSHI and COUNTER, as well as an introduction to Usus. And the final resource here, where we are actually still waiting for the link, it didn't quite get up in time, is a longer presentation that Oliver gave to the Texas Library Association about COUNTER and SUSHI; it gives more information on the database reports, which we're not really covering here since they're not applicable to IPEDS at this point, and information on SUSHI and how you can get that set up as well. So before we go to the Q&A, I'm going to turn this back over to Bob for some final comments about future issues related to the IPEDS survey. Thank you. Thank you, Kristen. A couple of things: the next task force group that convenes, hopefully by the end of the year or so, is going to be looking at a couple of questions that have come up, specifically one from IPEDS and one
from the library community. So, for future data collections, we're going to look at what a shared library collection is, and when you start thinking about that, there are numerous examples and instances of different types of shared library collections. IPEDS is interested in collecting this, and the task force membership will help IPEDS look at how to approach that particular question. A second one is e-serials usage. The 2016-2017 Academic Libraries component will not collect information on the usage of e-serials; however, that's probably one of the larger data compilations we have in libraries, our e-serials usage. So the task force is going to be working with IPEDS on how that can be collected and defined, the part that comes down to the JR1 and that type of thing. ACRL's annual survey is already collecting information on e-serials usage, and we can share our experience with IPEDS on how that was handled. The other thing that we would like to work on is the Human Resources component of IPEDS. Chris Cody is not responsible for Human Resources, but in the Academic Libraries component what is collected is expenses for library staff, while the classification of library staff actually sits in the HR component. We have brought this up; the task force has had a conversation with the HR component survey director at IPEDS about some of the issues we're seeing with this, and we will hopefully continue that discussion. But in the meantime, for everybody who's here listening: the HR component information, which includes the library staffing classification information, is oftentimes provided by your office of human resources, and it may not be collected and reported in the manner in which you would find agreeable. So you may want to talk to your OHR, your office of human resources, and/or your IPEDS keyholder to find out how the information is being reported and by whom. But we're going to be
hopefully working on that next. So the next thing we want to do is go to the questions and answers, and I'm going to turn that over to Martha. Thank you, Bob. The one question that I think deserves to be heard again with its answer is: why haven't we included e-serials usage in the 2016-17 IPEDS, and can you give us some background on that? Bob Dugan? I didn't hear all of that, but the reason why is that we're still trying to figure out how to collect it. There are the JR1s, the COUNTER JR1s, which would probably be the most consistent and most usable; however, there are a lot of nuances to the collection of e-serials usage that we just haven't been able to deal with yet. One of the things it comes down to, and I think there was a question about usage of electronic serials other than journals, is that we're still looking at this, and I'm not convinced yet that we know all of the ways in which libraries collect e-serials usage information. So that's why we did not push to have it included in the IPEDS 2016-2017 survey: we thought it might confuse people more than be clear, and we think we need a little more time, and the experience we're picking up from the ACRL annual survey, in order to make sure that the instructions are clear when we start looking at e-serials usage. Chris, I'm going to ask you to chime in on that too. So, with the recommendation from the task force not to include e-serials usage: something we look at at IPEDS, which I mentioned specifically when I was discussing how we get our survey changes cleared through the OMB, is the burden on the institution, and specifically the burden of this particular question. In our discussions with the task force, there was uncertainty about collecting e-serials usage, compared to a little bit more clarity on how we can collect physical serials circulation.
We felt that we didn't have enough of an answer on burden, we didn't have enough of an answer on how institutions could acquire this information, and we weren't comfortable asking it. So it's something that we decided not to include in this collection cycle, based on the recommendation from the task force, as well as not having an idea of what the burden would be for the institution and whether it would be too great to ask the question as of now. That's where we're hoping future discussions with the task force will help, and ACRL has been kind enough that they're going to include it on their survey, so we might have a better idea looking at theirs, seeing if it is something that's easier to collect, and having a little more understanding of what the burden might look like before, as a federal entity, we start requiring that it be collected as part of the Academic Libraries component. Just to follow up on that: we really do want to collect the e-serials usage information; it's just a matter of making sure that when we collect it, it has validity and reliability and is easily explained. I'm done. This is Oliver. I just wanted to quickly mention that as you're doing your deliberations for the next survey, please feel free to engage with COUNTER, because the journal report is a bit of a misnomer today; journal reports are really for anything that's serial, not just technically journals. It would be good for COUNTER, ACRL, ARL, IPEDS, etc., to be aligned on the definitions of things, and that could help things immensely, particularly with publishers knowing what to put in which reports. Yes, thank you, and yes, there is some experience in the collection of usage statistics; the ARL Statistics has been including an item on serials usage for a number of years, but it is a moving target as new products appear and also new technologies for dealing with usage statistics appear.
We do have a question related to this issue from Scott Pajel: what if patrons are accessing online data that traditionally would have been in books or serials, but that is disaggregated in a database? For example, a patron would traditionally have accessed a case in a reporter volume, but in LexisNexis and Westlaw they are accessing the case directly, and one doesn't really consider that a count of an online serial, so the use is not counted at all in this case. So he is basically talking about database use. Yes, this is Oliver; I can perhaps offer some thoughts on that. In the case where content is disaggregated into something like an EBSCOhost database, and we have a lot of experience with that, then even if the content came from print, that item would be designated as coming from a monograph or coming from a serial, and usage would show up in the corresponding book report or journal report; so in that case usage would be provided. In the case of LexisNexis and Westlaw, I guess it would depend on whether they offer COUNTER reports or not as to whether a COUNTER report would show it, but I would think the same thing would apply. Anyone else want to add any thoughts on this question?
Yeah, this is Kristen, and I think there are some cases that do challenge where usage can be counted. If you have a reference work that really gets changed into a database, sometimes that provider is not giving you a COUNTER book report, for example, even if it had been volumes in a book; they're giving you a database report, or maybe something that's non-standard. The IPEDS survey is not requiring you to use COUNTER, so you could probably use some interpretation and, if you do have a non-standard report from a publisher, try to determine what would most accurately represent a use, something as close to comparable as you can come up with. And then there may also be cases where it is so radically different in how it's being presented that it is hard to figure out how to count it, and I think that's something the task force is still wrestling with a little bit, which probably speaks to some of the reason why we haven't yet included the serials count. Yes, thank you, Kristen, and clearly, yes, it's a moving target; the landscape and ecosystem of electronic resources and their usage is shifting. There are a number of other questions that have been answered in the chat box, and I don't want to take too much time with them, but I do want to ask one of them. Erika Linke asked: where does one report expenses for data sets and for access to, or purchasing of, text files for text mining? I believe this was answered already: the expenditures will be reported either under ongoing or as one-time purchases, depending on how the expense occurred. Just to highlight, we have received some questions related to other issues; ARL will publish the questions that are in the chat box and their answers, so I'm not going to repeat them now. But we can move into our closing section, and I'm going to invite Bob Dugan to offer us the closing.
Hi, everybody. The task force really appreciates your taking the time to join us today. We hope you found it useful and informative. One of the things is that we try to keep the academic library community aware of what's going on through press releases, usually from ACRL and from ARL, and through discussion lists; every once in a while we pop up on the ARL-ASSESS list. So we try to keep getting the word out. I did an FAQ last year when we were doing the ACRL survey, and I had 4,000 hits on the FAQ. It's one of those things in which we try to keep people aware of what's going on, so just keep your radar up for information coming. If you have comments or suggestions for the task force, or if you want to know what's going on, contact Martha or Mary Jane Petrowski, shown on this particular slide. I'm going to turn it over now to Mary Jane. Mary Jane? Thank you, Martha. I had to unmute myself. I was just thanking everyone for coming, and I wanted to reassure you that we will be posting the link to the webcast widely; we will be incorporating it into the FAQ that Bob just mentioned, and we will be making that readily available. I hope you will share it with your colleagues who weren't able to attend today. Just a closing from me: thank you to all of you who attended. A reminder that we did work with task force number one, which addressed a number of issues, and the webinar from last year is actually available for those who may have forgotten that part of the history of the work of the joint ARL and ACRL task force that works with IPEDS.
Today's webinar captures the advice and changes we have been forwarding to IPEDS for the upcoming data collection launch this fall, and as you heard from a number of speakers, there are a few issues still on the table, which we expect future collaboration between ACRL and ARL to resolve, providing guidance and advice to IPEDS and to everybody in the community. On that note, I would like to thank everybody who has attended this webcast, and to thank again the joint task force members that you see listed on this slide. They represent all types of academic libraries and federal agencies, and we look forward to continuing their good work into the future. Thank you. I would like to thank all of our presenters, Mary Jane Petrowski, Martha Kirillidou, Robert Dugan, Chris Cody, Oliver Pesh, and Kristen Martin, and thanks to all of our participants for attending this afternoon's webinar. The recording will be available on ARL's YouTube channel next week, and I'm going to post the link in the chat box for everyone. We will send the recording to all registered participants as well. This concludes today's webinar. Thank you all so much for attending, and have a wonderful afternoon. Bye-bye.