I'll break it with a wing. Yes, and we're going to be late starting. Okay, right. Actually, it's not distracting you, though. So, are we ready for the third session? Anybody thirsty? If not, your sessions are just getting a lot longer. That's good. I always go to London anyway. I think I won't have to gamble so much. My presentation is on this laptop. Oh, are you thirsty? No, I'm not thirsty. We have four sessions. There are four different speakers, okay? You can use my laptop if you want, but that's... Martin, we're going to need the name... Gentleman with the laptop. Gentleman with the memory stick. Okay, so we must be safe. You're taking your machine away at the end, so it probably... Okay, so there are four. It's 10 minutes each, of course. Is there anyone who thinks they're presenting in this session who hasn't risen to congregate around this area? Whose name might be Kirstie? Going once, going twice, going three times for Kirstie? Okay, well, for the people who are here, and in the short time that we have: this session is from 10.30 till 11.30, and we have three, maybe four presenters. So each presentation will be about 10 minutes, and then there will be an opportunity for questions. So get your questions ready. This is Wes, come on in. Can we have some sound? Martin, can we have a bit of sound? Sound, yeah. On the laptop, though. Okay, so the first presentation is Richard Leary from the RES project, which he will tell us about, but the title of the presentation is "Bastille: a pop group or a French fortress?" Good morning, everybody. I feel I've been set up quite nicely by Jim Groom talking about syndication, because one of my previous job titles at the BBC had syndication in the title. So I think it's very relevant to what I'm going to talk about.
The RES project is a partnership between Jisc, the British Universities Film and Video Council and the BBC. What we are trying to do is to create a platform. Actually, we are not trying to create a platform; we have created a platform. It's an open platform built by the BBC, which organises and indexes the catalogues of publicly held archives. We're also a collaboration, working with public sector organisations to release their digital collections in the form of linked open data, to enhance the discovery of those assets. And finally, and probably most relevant for this audience, we have an ambition to stimulate developers to build products powered by RES for teachers, learners and academics, to help them access the data and collections. So here's a short film about what we're doing. RES finds the places where the same topic is described, so you can track where material about a particular topic can be found. RES then powers teaching products that students, teachers and academics can use. For example, if you search for Bastille on an ordinary search engine, it will display highly ranked material first, making it more likely you'll get references to the pop band than to the French fortress. And it's not always clear who the material belongs to. The Research and Education Space works differently. When you use a product powered by RES, your search will match the topic of your choice, and the platform will provide you with information about the related resources from all the reliable collection holders it knows about. We are asking collection holders to release their catalogues as linked open data. We want product developers to help us shape the many digital products which can be powered by RES. And we want to hear from those in education, to tell us what you want.
We need you to help us make RES the best it can be. And end on a bad pun. So, to recap, RES is a semantic web platform that indexes collections of material from anyone who publishes their catalogues as linked open data. The data must carry an explicit, machine-readable and open licence. And the assets described by the data can be licensed in any way the asset holder chooses, as long as the licensing is described in the data. I'll come back to that point later. So here's something that's available via RES right now, and bizarrely, yesterday I was giving this presentation actually in the Wellcome Library, so I felt quite close to this, even though this is all digital. This is Darwin's On the Origin of Species, held by the Wellcome Library and indexed by RES. Right now we're indexing data from world-class institutions like the British Museum, the Urban University, the Wellcome Trust, the Natural History Museum and the BBC. We're also talking to as many people as we can. Our current list of people we're working with includes Culture24, the Collections Trust, your art, the National Library of Wales, the British Library, the Science Museum, the Royal Opera House, the National Archives, Wikidata and Edinburgh University. I'm really pleased that tomorrow I've got a whole day of meetings lined up with major Scottish cultural institutions as well, because we haven't done enough in Scotland recently, though we have been doing a lot in Wales. We've talked to dozens of other institutions, and everyone we've spoken to has so far been very keen to work with us. And we've made more than a million TV and radio programmes from our digital archive available, dating back to 2007. Further, this week we're launching the BBC Shakespeare Archive Resource, to provide online access to hundreds of BBC television...
...and radio broadcasts of Shakespeare's plays and sonnets, as well as documentaries about Shakespeare. These programmes date back to the 1950s, and include the first British televised adaptations of Othello and Henry V, classic interviews with key Shakespearean actors including John Gielgud, Judi Dench and Laurence Olivier, TV and radio broadcasts of several of Shakespeare's famous sonnets, and more than a thousand stills from Shakespeare productions. There are already thousands of BBC programmes permanently available via our website, though if I'm honest they're not always as easy to find as they ought to be, because they're not collated in one central place. So this summer BBC Radio is publishing linked data about its 13,000 permanently available full-length radio programmes, which include things like, and I shouldn't go off script because I'd just forgotten, In Our Time, that's what I'm talking about, and also 20,000 clips; we'll be indexing them, hopefully, in RES. This adds to the almost 4,000 permanently available TV programmes that my team is collating. And this is a good point to talk about the RES approach to licensing. As I hinted earlier, we understand the difference between licensing data and licensing assets. To be indexed by RES, everything must carry an explicit machine-readable licence, and RES requires that data be licensed under one of nine known open licences. For the avoidance of doubt, a Creative Commons non-commercial licence is not an open licence. But we understand that it might be harder to licence your valuable assets openly, though we'd prefer that you did. Because here's the confession: BBC TV and radio content isn't licensed openly, and can't be; it's just far too complicated, with far too many copyright holders.
But we have made as much of the programme metadata available as we can via RES, and the programmes themselves are available via ERA-licensed organisations that deliver into schools, colleges and universities. And that's how we've been able to make the million-plus programmes that we have digitised since 2007 available in schools, colleges and universities across the UK. So I thought I'd expand a bit on linked open data. If I get too technical here, I apologise; and if this is not news to you, I also apologise. Linked data refers to a set of best practices for publishing, sharing and interlinking structured data on the web. Its main objective is to liberate data from the silos created by proprietary databases, schemas or proprietary websites. And it follows four rules set out by Tim Berners-Lee in 2006. These are: use of uniform resource identifiers, that's URIs, for identifying entities or concepts uniquely in the world; use of HTTP URIs for retrieving resources or descriptions of resources; use of standard formats like RDF for structuring and linking descriptions of things; and, this is the really crucial bit, use of links to other related URIs in the data, to improve the discovery of related information on the web. These principles are described as rules, but really they're recommendations, or best practices, for the development of the semantic web. Data can be published according to the first three rules, but it's only when it follows the fourth rule that it's really linked data, and it's only when it's got an open licence that it's linked open data. Tim Berners-Lee defined a five-star scheme for linked open data, and the fifth star is data that's available in open formats, contains links to other people's data, and has got an open licence. Now, we are unashamedly built around five-star data.
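The four rules can be illustrated with a small sketch. This is not the RES implementation, just a toy in-memory triple store in Python with hypothetical example.org URIs, showing how the crucial fourth rule, links to other URIs in the data, lets a machine discover related resources without a human diving into each silo:

```python
# Toy triple store: things are named with HTTP URIs (rules 1 and 2),
# descriptions are subject-predicate-object triples, as in RDF (rule 3),
# and descriptions link to *other* URIs (rule 4). URIs are invented.
triples = [
    ("http://example.org/work/origin-of-species", "title", "On the Origin of Species"),
    ("http://example.org/work/origin-of-species", "creator", "http://example.org/person/darwin"),
    ("http://example.org/person/darwin", "name", "Charles Darwin"),
    ("http://example.org/person/darwin", "sameAs", "http://example.org/wikidata/Q1035"),
]

def describe(uri):
    """Rules 2 and 3: dereferencing a URI returns its structured description."""
    return [(p, o) for s, p, o in triples if s == uri]

def follow_links(uri, depth=2):
    """Rule 4: discover related resources by following URI-valued objects."""
    found = {uri}
    frontier = [uri]
    for _ in range(depth):
        nxt = []
        for u in frontier:
            for _, o in describe(u):
                if isinstance(o, str) and o.startswith("http") and o not in found:
                    found.add(o)
                    nxt.append(o)
        frontier = nxt
    return found

print(follow_links("http://example.org/work/origin-of-species"))
```

Starting from the work, the machine reaches the author, and from the author an equivalent record elsewhere; that link-following is what makes the data "linked" rather than merely published.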
We have had internal arguments in the team about whether we should make it easier to publish RES-compliant data, but the view that has prevailed, which I think is the right one, is that if we get the data absolutely right, then everything further down the chain follows. So there are lots of resources for educators out there already, but what we hope to achieve with RES is to make the aggregation and discovery of such resources much, much more efficient. And this is the point that Jim Groom was making in the keynote this morning: it's releasing data from those silos. Human searching for information is a manual task. The original web technologies involve a lot of human understanding. It's time-consuming, it's frustrating. And the one thing we know about educators is that they have very little time. Searching for information manually involves going deep into one silo, coming back out, and diving into the next. Linked open data allows the machines to do that work for us. We can use them to find the sources we want, to find where the relevant documents are, and to pull them out and combine them in meaningful ways. So to illustrate the difficulties of human searching, let's look at the example we used in the film: Bastille. Does this refer to Bastille Day, or the historical event, the storming of the Bastille, or the landfill indie band? Because if you use Google, you'll get 10 pages on the landfill indie band, which is probably not very relevant. Your students might like it, but if you're a teacher or an educator, you're probably less interested. And it also illustrates the other reason why we are very, very different from Google. Google doesn't care about provenance, and we do. Google doesn't care about authenticity; we do. Google doesn't really care about licensing, and we do. And Google doesn't really care about permanence, and we do.
Google only really cares about what's contemporary, which may or may not be relevant, either in the GLAM sector or in education. And Google only really cares about the number of links to an asset, and for us that's not particularly relevant either. And because RES is indexing data from the UK's leading cultural institutions, you won't run into the problem I was told about recently, where a history teacher teaching Henry VIII invited her class, or his class, I don't know the gender of the teacher, to Google "beheading", with the catastrophic results that you can possibly imagine. There's one thing I haven't talked about, and that's the user interface. And that's because at the moment there isn't one. We're not building one, and we have no intention of building one. But because, as I said, RES is an open platform, built using open data, linking to explicitly licensed assets, people can build their own user interfaces to do exactly what they like. We don't want to constrain the market here. We went to the Bett show in January, and these are some of the companies we talked to, and we got an enthusiastic response from all of them. RES resources can be used in lectures and classrooms via products made by educational publishers and software providers. As an education professional, you won't be able to use the RES platform directly; you will need to go via one of those third-party products and services. We hope to make planning creative lessons, and searching for engaging and relevant content, significantly easier. And those of you who are using a VLE or learning management system can help us now by going back to your supplier and telling them how much you enjoyed this presentation and would like to work with us, please. Small hint there. So thank you very much indeed. Just a reminder: RES content is relevant, authentic, reliable, and suitable for all subject areas and all levels.
Content is freely available, licensed for educational use, or it requires membership of specific licensing schemes such as ERA. RES is constantly evolving and growing, and it is here to stay. Its resources are persistent and can be cited in coursework term after term. So that's it. Thank you all very much. I like questions. Yeah. The BBC is a very large organisation. Like any big city, we're a collection of villages, so opinions may differ from place to place. I mean, the Director-General has repeatedly talked about open. There are different conceptions of what open means within the BBC. Particularly, I'm arguing strongly against the idea that open means allowing other people to put stuff on iPlayer. That to me is not open. So the short answer is, and I can't speak for the whole of the BBC: unfortunately, no, we're not. There are parts of the BBC, like the team I'm in, which explicitly believe in open; which believe that open is fundamental to the future of the BBC as an organisation; and we're going to continue making those arguments. But there are other powerful arguments as well: that open might undermine value for licence fee payers. It's a good argument that if we made all our programme data open, then TV manufacturers would build their own version of iPlayer and stick it on TVs, and iPlayer, a successful consumer product that the BBC has invested lots of money in, would be badly impacted. So we have to push the open agenda at the BBC, and we have to look carefully at how we're going to do that. For the moment, I think we've gone as far as we can right now in making everything from our archive back to 2007 available via ERA-licensed organisations, and via RES we've got a lot of programme metadata available as well. But to actually watch or listen to a TV or radio programme, you have to be logged into either the BUFVC or Planet eStream...
...or ClickView, coming soon. That particular thing isn't going to change; that would require a change to the law, and I don't think that's going to happen any time soon. But thank you. Madam? Unless I picked it up wrong, you're just trying to understand how that works. We're an open platform, so yes, people can build what they like on it. This is a controversial statement: there's nothing to prevent ISIS building their terrorism portal on top of RES. Clearly we'd rather they didn't, and obviously they're not going to. But if a third party wanted to make a chargeable thing on top of RES, then we would actively discourage them from doing that, and it would be contrary to the ethos of the project. But because it's an open platform, we couldn't stop them, and we are a small team and we don't have the resources to chase after people. So yes, it's possible. Yes. Thank you very much. Thank you very much. Thank you. I'll be over there for all the other questions. Is that right? Okay, we're on. So without further ado: Anthony, to tell us about another presentation in our Wikipedia series. This is about using Wikipedia as a primary course resource. Okay, thank you. Thank you very much. Good morning, everybody. This research work is about the evaluation of the learning effectiveness and the perceived value of Wikipedia as a primary course resource. In my presentation I will give you a brief introduction with the main research questions; afterwards, the general theoretical framework that we have considered in this application; and then we will describe the concrete application and show the main results of its evaluation. So, Wikipedia. We all know that Wikipedia is considered an open educational resource, and it is also one of the biggest sources of information in the world.
But although Wikipedia is frequently used by most higher education students, it is difficult to find courses in which Wikipedia has a central role. Furthermore, we have no evidence of the influence of the use of Wikipedia on the academic performance of students. So in this research work we will try to answer this question. The main research questions are these. First, we want to analyse the perceptions of the students about the quality of Wikipedia. Then we will move to the analysis of the influence of this use on the academic performance of the students. And finally we will link these first two points and see if there is any relationship between the quality perception and the academic performance of the students. So we introduced Wikipedia into the courses through the learning-by-comparing technique, which is considered in research on analogy to be one of the methodologies that permits and facilitates knowledge transfer. This framework, analogical reasoning, facilitates the ability to apply knowledge to various situations, and also enables students to find underlying structural similarities. It's based on comparison between analogues, and this yields stronger solution schemas. For the application, we've conducted a pilot in four different courses, in statistics, human resources, consumer behaviour and marketing, at the Open University of Catalonia, in the degree programme in business administration, and we've involved more than 1,000 students in these analyses. The structure of all four courses is similar, and is based on continuous assessment of the students. We have five assessment activities, one for each part of the course, each with a theoretical part and an applied part.
We also give the students different learning materials in multiple formats: video, web, audio, books, case studies, etc. In this pilot we've added a new learning material to the courses, which is Wikipedia, through the learning-by-comparing technique. We've introduced Wikipedia in two out of five assessment activities, and the way we've introduced it is through four questions. We ask the students to give their perception of the completeness of Wikipedia and the reliability of Wikipedia; we ask them if they consider that Wikipedia is up to date; and also whether it has been useful in their courses. They have to give a value from one to five, ranging from completely disagree to completely agree, and then they also have to give evidence for their answer. What about the results? If we analyse the general perception of Wikipedia, we see that it's mainly positive, because most of the values are greater than three, which is the middle value in the answers. So we can consider that they have a good perception, and the most valued item in this case was being up to date. Do these perceptions change across knowledge areas? We see that there are significant differences between the consumer behaviour and human resources courses and the other two: the marketing and statistics courses have low values with respect to the perception of quality. We will see that although we have these differences, the final influence on academic performance is exactly the same. So it's a kind of strange thing, that there is no relationship between quality perceptions and the effect we will see in the final marks, and getting good final marks is one of the most important objectives of the courses. But that is just perception. Do these perceptions change with the academic profile of the students?
We see that in most of the courses there is no correlation between these two elements, but in the case of marketing we have a negative correlation, which means that the best students are those who have the lowest perceptions of the quality of Wikipedia. So this is just a specific thing with the marketing course; in the other courses there is no relationship between academic profile and quality perception. And what about the academic performance of the students? At the very end we have the results concerning the final mark, compared with the previous semester, when Wikipedia was not used in the classroom. So the end is a good end in these terms. But if you look at the two assessments where we introduced Wikipedia, the second one and the fourth one, we have a variety of results: in some cases we get a better result in the semester where Wikipedia was not used, and in other cases the opposite. Now, in analysing this influence we've done something that is not correct, which is that we were comparing two different cohorts. The people are different, so what we want is to analyse the net effect of Wikipedia on the academic performance of the students. We want to know if this difference is due to the use of Wikipedia, or simply due to the fact that we had cleverer students in the semester we analysed. So we have to make these adjustments, and consider that the difference in qualifications between the two semesters depends on the contents, the cohort and the use of Wikipedia. If we assume there are no differences between the two semesters concerning the contents, we can eliminate that element from the equation, and all we have to do is estimate the cohort effect. To do this, we compute the mean of the differences in those assessments where Wikipedia was not used.
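The adjustment described here is simple arithmetic, and can be sketched in a few lines. The marks below are made up for illustration (they are not the study's data), assuming five assessments with Wikipedia used in the second and fourth:

```python
# Illustrative sketch of the cohort adjustment: per-assessment mean marks
# for two semesters, with Wikipedia used only in assessments 2 and 4
# (indices 1 and 3). All numbers are invented.
previous = [6.1, 5.8, 6.4, 5.9, 6.0]   # semester without Wikipedia
current  = [6.4, 6.0, 6.7, 6.5, 6.3]   # semester with Wikipedia in 2 and 4

wikipedia_used = {1, 3}
diffs = [c - p for c, p in zip(current, previous)]

# Assuming course contents are unchanged between semesters, differences in
# the non-Wikipedia assessments estimate the cohort effect alone.
non_wiki = [d for i, d in enumerate(diffs) if i not in wikipedia_used]
cohort_effect = sum(non_wiki) / len(non_wiki)

# Net Wikipedia effect = raw difference minus the estimated cohort effect.
net_effect = {i + 1: round(diffs[i] - cohort_effect, 2) for i in wikipedia_used}
print(cohort_effect, net_effect)
```

With these invented numbers the cohort effect is 0.3, so assessment 2's raw gain of 0.2 becomes a net effect of -0.1 once the stronger cohort is accounted for, while assessment 4 keeps a net gain of 0.3, mirroring the mixed per-assessment results the speaker describes.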
So in those assessments we have just the differences due to the cohort effect. We estimate this effect through those assessments, and finally the net effect of Wikipedia can be estimated just by subtracting this cohort effect. Taking this into account, we find that the influence of the Wikipedia activities on the students' academic performance is positive in the end. It's also positive in the fourth assessment activity, but this is not the case in the first of the two. We consider that this is because, in the first assessment where Wikipedia is used, the students are unfamiliar with the way they have to work with Wikipedia to analyse these things; but the trend is positive, and the effect is finally positive. And finally, what about the relationship between quality perception and academic qualification? Well, we have almost no correlation between these two issues, just this green cell and these two red cells, and as you can see these are low correlations. What we have to do in future research is analyse what's happening in the marketing area, because as you can see through my presentation, we get those strange results in this area. So maybe we have to check whether it has been introduced correctly in those courses. So, the main conclusions of this work. Students agree with the idea that Wikipedia has an acceptable quality level. Students consider that the most important facet of Wikipedia is that it is up to date, containing recent information and recent references. Across knowledge areas, the lowest quality perception is found in statistics. Usefulness is most valued in human resources and marketing. Quality perception does not depend on the student's academic profile, except in the case of marketing, with a negative correlation. There is a negative or low positive direct effect of Wikipedia usage...
...on the assessments' qualifications, except in the case of statistics. Nevertheless, there is a high impact of Wikipedia usage on the students' final marks. The statistics case is different: although it has the lowest quality perception, it shows the highest impact on academic performance. And finally, in general there is no significant correlation between quality perception and academic performance. So thank you very much for your attention. Any questions? Is the very active engagement with the students as a source making them think that... I think this is because we are analysing degree students in business and economics. They are not there to study quantitative topics, so they feel that this is a topic that does not fit well with their objectives. So maybe they do not have the skills to analyse it correctly, to have the correct perceptions about the quality. Class size? Each class has about 75 students. What we have here is the total number of students in each course, and you have to divide this number by 70-75. You mean this discrepancy with the perception? Well, I think that the fact is that in the first assessment where Wikipedia is used, the students are not familiar with how to use Wikipedia, and how they have to compare different resources; so in the second assessment they know how to do this, they are familiar with it, and then we get this positive effect. So you mean the second one is closer? That's for further research. It's Scots, obviously: runnin' with the deil. What's the Scottish connection, really? Oh, you can say "deil" rather than "devil". Okay, cool. They don't like actually saying devil in Scotland, because they think he might appear behind them, I believe. It's actually a Van Halen reference, of course...
...and we thought we should have some Van Halen this morning, but it's a little early for loud backbeats and primal screams, I think. So I thought I'd try and play the riff on the ukulele, but it sounds horrible. You know what to do. Maybe you could help find some slides and do something useful. Oh, sorry, right, okay, yeah. This is my co-presenter; I'm sure you will do it very well. The inspiration for this talk, really, and this kind of line of inquiry, and I think as you'll see as we get to the end, it's a bit of a mess yet, is thinking more critically about the open movement: where we're going, where we've come from, and what we can learn from the past. I think it's quite important that this frames the things we're going to discuss. It's a community approach to it. And we just want to throw some ideas together today about how we might approach thinking more critically about what we're doing, and whether we need to. The idea came from the OpenEd conference last year, when there were sort of side mumblings within the conference: where's the research in what we're doing? Where are we evaluating what we're doing? And where are we thinking critically about what we're doing these days? And I think it's quite interesting as an academic, when I think about what conferences used to be like 20 years ago: this would be the critical debate. This is where PhD supervisors would stand up and have a scrap with each other and debate each other. But of course today we're working out in the open. So where are our critical spaces? Where are we having those discussions? So if I sound jumbled, it's because I am jumbled, and still sort of finding my way and navigating my way through what I think we need to be doing.
So there were some really interesting blogs that came out of the conference that was in Vancouver last year. If you go on to the OpenEd 15 website and look at their archive, there's some fantastic writing; there are about 15 articles. And these were some really interesting ones. Robin DeRosa is thinking about: where's the pedagogy? We're talking about textbooks, one artifact, one line of inquiry; but actually, what's happened to the rest of the activity that's going on, activity that is so vibrant within this community and so apparent within this conference? These big and little projects: it all works and it all matters, doesn't it? And that was Adam Croom as well. He says, you know, in a few years' time are we going to look back and say, is that what we meant? Do we want to be talking about content and adoption of content? Is that what the open community meant? His blog was absolutely fantastic. So I was thinking, well, if we do want to move in a more open direction, and not just go down one route, what is it we need to do to influence colleagues and influence policy makers? And I think it all comes back to being critical about what we're doing, and providing some robust evidence around what we're doing. So here are just some questions to throw out. Where are we actually exploring what we're doing? Where are we publishing and presenting that? And where are we debating that? Are we critical enough in what we're doing as researchers today? And actually, how do we measure this? And this is where I've got to, really, and the chats across the dining room table have been quite bizarre. How do we start measuring, evaluating and thinking robustly about whether we are being critical? So, an approach I took: I always love starting with a bit of a systematic literature approach. It's such a great way...
...of getting a sample of literature in a robust way that can be reproduced, so I can share my processes with you; I can share the search tools I've developed, which you can then build on and adapt. And so that's what I did. This is a bit of a pilot at the moment, a snapshot of the literature. The intersection I'm interested in is open pedagogy, the impact on learners, and the outcomes of education and for the learner itself. When you're constructing these searches, you think about them as a Venn diagram, and what I want is that bit in the middle. Okay, so it's a really structured way of delving into the literature. I've looked at two databases so far and retrieved about 700 articles. Then you go through a filtering process: some of those articles are irrelevant, and some might not be empirical research or evaluation. So you do a bit of sieving. And as always when you do a systematic approach, you end up with a pitiful number of articles that have actually looked at that thing. I would have thought that after nearly 15 years of open education, within this really important question, does open education impact on learners, there would be a few more. This is just a pilot search, but I wouldn't imagine there were that many more articles addressing that question. So I ended up with five evaluations. That was interesting. So the thing I thought we could do next, in terms of thinking about those evaluations: are they being robust? Are they being critical enough? There are some new techniques emerging around the field of meta-research, where we're actually being critical of the research and publication process itself. So what you can do is look at the outcomes reported within the papers.
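The Venn-style search construction described above, three concept sets whose intersection is kept, can be sketched as a simple filter. The article titles and search terms below are invented for illustration, not the speaker's actual search strings:

```python
# Toy systematic-search filter: keep only records matching all three
# concept sets (open pedagogy AND learner AND outcome), i.e. the middle
# of the Venn diagram. Titles and terms are invented.
articles = [
    "open pedagogy and learner outcomes in first-year courses",
    "a history of open licensing policy",
    "learner motivation in MOOCs",
    "open educational practice: effects on student outcomes",
]

concepts = [
    {"open pedagogy", "open educational practice"},  # concept 1: the practice
    {"learner", "student"},                          # concept 2: the learner
    {"outcome", "achievement", "attainment"},        # concept 3: the outcome
]

def matches(text, terms):
    """True if any synonym for the concept appears in the text."""
    return any(t in text.lower() for t in terms)

hits = [a for a in articles if all(matches(a, c) for c in concepts)]
print(hits)
```

Only the first and last titles survive, because each contains a term from every concept set; real database searches express the same logic with boolean queries (concept 1 AND concept 2 AND concept 3, with OR between synonyms), before the manual sieving for relevance and empirical content begins.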
Are they reporting positive outcomes only? Is negative and less successful data also being shared? You can look at whether studies, as written up, are aware of the limitations of what they've done; that's another handle on whether authors are being critical of what they've produced, and quite an interesting thing to look at. And you can look at the citations: when authors are citing and building on the work of others, are they doing that in a robust way?

It was very interesting when I looked at the five papers I did find. Not surprisingly, we all know publication bias exists: people only publish their positive results. It's quite interesting that some journals have started to introduce a negative-results section, just to surface some of the stuff that doesn't work. That's what we want to hear about, isn't it? That's how we can all learn from each other. So there was quite a strong bias towards publishing what had worked well within these articles. When I looked at the citations, it was really interesting: there's a massive tendency to positively cite the work of others. And the law of averages says that if someone's writing a paper, if I've written something, I'm sure there's something I've not done well within it. So when we're writing our introductions and discussions and pulling all that evidence together, we should be thinking critically about research methods. I think it's an interesting snapshot of a body of literature: maybe we are losing our critical edge, or maybe we're debating and being critical in other spaces. That's something I'm thinking about at the moment. Then we started thinking about what else we can learn from citations, and I need to hand over in a minute
because it gets quite complicated. Just as another point: when I delved back through the literature, there's a massive body of work from the early 1970s around open education, in a different form. When you read some of those publications, I think there's an awful lot to learn. I think someone referenced some of this in their presentation yesterday, which I missed. It's a really fascinating body of literature about what happened in UK infant schools, and that work transferred over to the US. Some of those 1970s papers are absolutely brilliant. But I'll whisk swiftly on, and hand over to maybe a different approach to evaluating our critical nature and our critical abilities, which is around the use of citations.

Okay, thanks, Viv. So this starts as kind of a cautionary tale. If you're not an academic but you hang around with academics, and if you're not a technologist but you hang around with technologists, you start thinking that you're both of those things. You start thinking that you can actually do and understand the things you've read about, and if you know Tony Hirst, you start thinking that you actually are Tony Hirst. That's the warning: I'm not Tony Hirst. I'm not Martin Hawksey. I'm not Grant Potter. I'm not even Brian Lamb. So much for the simple life: there I was, just sitting there saying, oh, but of course all the good literature is on the blogs anyway, not in the papers; and, oh, of course there's a whole field of study of citation analysis, and there's semantometrics and stuff like that. And what happens is that somebody like Viv turns around and says: that sounds great, you should do that at the end of the paper. So, you know, so much for the simple life. So these are my hypotheses. Effectively I'm saying that most of the really good stuff in open education is on the blogs.
And in fact it's so good that people will be citing blogs quite a lot, in preference to papers. My hypotheses are: blog citations can be measured via the parts of the scholarly graph that we have access to, and that's a big issue in itself; and citations of blog posts will appear in papers concerning the kinds of topics the bloggers were writing about, so these blog posts are actually functioning as part of the literature.

These are my methods. I'm collecting basic stats from Google Scholar using Harzing's Publish or Perish tool, which is a nice quick-and-dirty tool for getting lots and lots of detail about papers and citations. Then I look at word frequency in the paper titles, because the paper titles were actually all I could reliably get, compare that to word frequency in the blog corpus, and ask: okay, are these citations in papers that are talking about the same thing as the blog does? It's incredibly quick and dirty; it's not up to any kind of real analysis, but it's a demonstration of what you could do.

The blogs I looked at, you will note, are all serious bloggers: people who have shaped the field, who have been part of the field. I chucked in a couple of wild cards as well, just because I wanted to look at the term "edupunk". Is that referenced in the literature? I was just interested. There used to be a blog called Connectivism, which is somewhere I always went to try to understand connectivism. It seems not to be there any more, but I'm wondering if we can tell, from the literature, whether it has had an impact.

Notable initial findings. George Siemens is a monster, isn't he? The papers citing his blog have an h-index of 38. I'm looking around the room, and I'm thinking that apart from
possibly Catherine Cronin, I don't think anybody here is going to have an h-index of 38. That is some seriously powerful set of papers, and it demonstrates that it's not just scrappy papers like ours that cite blogs; it's big, important research with impact. The Connectivism blog, which doesn't even exist any more (it's a holding page), the papers that cite it have an h-index of 29. Again, that's a lot more than mine, and probably a lot more than yours.

Actually, do people know what the h-index is? Sorry, I'm just getting blank looks here. The h-index is a measure of the influence of your publications. Basically, it is the largest number h such that h of your publications have each been cited at least h times. So, say I've published four papers and each of them has been cited four times: I'd have an h-index of four. If I'd published 927 papers but only four of them had been cited, each of them four times, I'd still have an h-index of four. And if I had four papers, three of which had been cited four times and one of which had been cited nine million times, I'd still have an h-index of four. That's how the calculation works.

Large amounts of the edupunk literature, by the way, are in Spanish; it's actually nearly all in Spanish. People write academically about edupunk in Spanish. And looking at the coincidence of citations: people who cite George Siemens are also apt to cite David Wiley's blog. I just thought that was interesting. So, the idea behind semantometrics, which is what I'm moving into at the moment, is that a better measure of the power of a paper is not the fact that there are lots of citations, but the fact that the citations are doing something meaningful.
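Going back a step, the h-index calculation just described can be sketched in a few lines; the citation counts below mirror the hypothetical examples from the talk:

```python
def h_index(citations):
    """Largest h such that at least h papers each have at least h citations."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank       # this paper still clears the bar at its rank
        else:
            break          # counts are sorted, so no later paper can
    return h

# Four papers, each cited four times -> h-index 4
print(h_index([4, 4, 4, 4]))              # 4
# 927 papers, but only four of them cited (four times each) -> still 4
print(h_index([4, 4, 4, 4] + [0] * 923))  # 4
# Three papers cited four times, one cited nine million times -> still 4
print(h_index([9_000_000, 4, 4, 4]))      # 4
```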
As Viv was saying earlier, people tend to do a shopping list of citations within a particular field, like a "best of the field" at the start of the paper. What's more interesting is a citation of that paper from a completely different field, one that actually makes links between areas of knowledge. So, as you can see on this graph, which I've taken from the Semantometrics project (which is really cool, you should read about it), that citation there, the big long one, is the most powerful citation, because it's joining two different fields of knowledge.

A quick note, I've probably not got time, a little note on what a citation is: read the blog post by Cameron Neylon on this, it's awesome. I'm a big fan of Cameron Neylon, and he's now my go-to person for all of this. Citations are complicated. They don't just mean "I've read this paper and I liked it", or "I've read this paper and I'm borrowing bits of it". They can mean all kinds of things. So get into that, read about that.

A little bit of methodology, then. The dark circles here are the common terms in the titles of the papers that cite each blog. The light circles are the common terms actually in that blog. My reckoning was that if the light circles are a good predictor of what's in the dark circles, the blog is powerful and important within its own field; but it's actually more significant if there's a big difference between the two baskets of terms. Now, you can do that quickly yourself, and you can just map it. And what I find really interesting is that the citing papers all talk about technology, while the blogs themselves don't talk about the technology.
We don't actually write about that; that's not the point of the blogging. So that's all really interesting to look at. Then I thought to myself: Martin Hawksey is going to be here, so I'm not going to just sit there and put things in groups because they look nice; let's get some algorithms going on this. So I came up with a metric of semantic prediction: how good are the top five words from each blog at predicting the titles of the citing papers? If you've got a high number down there, it's a really good prediction: the people who cite you are citing you within the same field that you're blogging about. If you're like Audrey Watters, or to a certain extent like Martin Weller, it means people are citing your work outside of the field you are ostensibly writing in. Which I think is a measure of the fact that, in some cases, the stuff on the blogs is so powerful that it travels wildly outside the field; it becomes people's go-to reference even when they don't know the field that well. George Siemens is probably high simply because there are lots and lots of citations there. So it's still quite a dirty metric. It would be interesting to clean it up, and I'd love to talk to people about that if we can.

So, overall conclusions from my bit and Viv's bit. (You can go back up now, please; I've stopped doing the horrible stuff.) There is still a lack of robust research; this is something I've heard Viv say so many times, that the robust research isn't here, and it's obviously subject to bias. The social media angle is an interesting new way into it; we can't tell much yet, and I'd like to do more of it. So we think that, looking at this field in the future, we need a combination of these approaches and these metrics in order to go forward.
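The "semantic prediction" metric described above could be sketched as the fraction of a blog's top five words that reappear in the titles of its citing papers: a high score means citers stay within the blog's own field, a low score means the citations come from outside it. Everything here (the helper names, the toy blog text, the titles) is a made-up illustration of the idea, not the actual method or data:

```python
from collections import Counter
import re

def top_words(text, n=5):
    """Top-n most frequent words (very naive: no stemming, tiny stoplist)."""
    stop = {"the", "a", "an", "of", "and", "in", "on", "to", "is", "for"}
    words = [w for w in re.findall(r"[a-z]+", text.lower()) if w not in stop]
    return {w for w, _ in Counter(words).most_common(n)}

def semantic_prediction(blog_text, citing_titles, n=5):
    """Fraction of the blog's top-n words that appear in the citing titles."""
    blog_terms = top_words(blog_text, n)
    title_text = " ".join(citing_titles).lower()
    hits = sum(1 for w in blog_terms if w in title_text)
    return hits / len(blog_terms)

# Hypothetical blog corpus and citing-paper titles
blog = "networks networks learning learning open open education education mooc"
titles = ["Open networks for learning", "Education at scale"]
print(semantic_prediction(blog, titles))  # 0.8: 4 of the 5 top words appear
```

A cleaned-up version would want stemming, a proper stopword list and some significance testing, which is roughly what "cleaning up the dirty metric" would mean here.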
Do you want to take over from here? No, I think we're pretty much done. Okay, we're pretty much done; Melissa's standing at the side. I hadn't planned on any requests, so don't give me any requests. We'll end it there. Okay, yes, some further reading. Yeah, that's what we did. At the front here?

Yeah, I've got a comment and a question. Firstly, this happens to be the second presentation including a systematic literature survey that I've seen within a week, in completely different fields, and I'm really encouraged by that, because I think it's a great thing to do. There are some things in common, and I think this is probably always going to happen: you both went from hundreds or thousands down to a mere handful. I think it's a very useful handful, and a really good way of getting a grip on what's going on. And my question is this: I agree that there's publication bias, but perhaps you're being a bit pessimistic about it. For example, we published something on using our own OER, and it went really well, and we acknowledged in the paper that it was a preliminary study by the people who developed it. It would have been a disaster if our evaluation had shown it didn't work, because we're an enthusiastic group and heavily invested. So is it really a symptom of the literature being preliminary? If this were developing a drug, you'd now need a clinical trial, done, unlike our study, without a strong belief in the drug, maybe not even thinking it was going to work. But that needs all kinds of funding and organisation.

I think it's a function of the fact that we don't tackle education in programmes of research. With staff development and postgraduate certificates, you're absolutely right; that's why I started off and did a small study. You know, we get funding for a year, and you might have one evaluation in that.
So you don't build up a programme of enquiry. I think you're right, and I think that leads to small-scale studies that you can't really extrapolate from. That's part of the education research agenda, isn't it? But also, again, there is a tendency to publish stuff that works. What about the stuff that wasn't so successful? The law of averages says that if we're evaluating something, some of it must not be working, but we don't see that literature. I find that interesting. Thank you for your comments.

You mentioned Cameron Neylon, and also research and social media. He offered basically two ways of measuring the impact of your journal article: he suggested that social media was one way, and that Wikipedia tended to do it another way. His point was that social media might have a lot of noise (yeah, it does) and might be quite random, but couldn't we just pick out the interesting stories from social media, like when something is picked up by a health worker or a politician? So I just wanted to hear your thoughts about the use of social media or, conversely, sourcing through Wikipedia.

I'd love to have looked at both of those sources. The subtext of what I did do is that I did it in about four hours on a Sunday afternoon, so I was really just taking a look. I actually wanted to introduce the idea of those tools and those ideas to this wider community, which might not have come across them in previous days. That was kind of what I was thinking of doing. And I particularly wanted to look at the literature, because I was interested in the idea, in this community, that the blog posts
are more relevant than the literature, in a way, because they move faster, they're more personal, and they respond to each other. I was wondering whether they were reflected in the scholarly graph. And of course the big issue here is that people are publishing papers on open education in closed journals, where I can't read them and I can't look at the citations. Why are you doing this? You don't need to do this; it's not part of your job; you don't need to publish in a high-impact journal; nobody is asking you to do that, apart from your manager, who's stupid. If you are that manager, and you make people publish in high-impact journals: the REF doesn't actually want that, the funders don't want that, nobody wants that. Please stop, because I can't look at your citations either.

So I suppose I'd be more interested in the way that the field of academic blogging and the field of formal academic literature overlap. I think Wikipedia would be an interesting indicator too, though of course it's potentially subject to large amounts of bias. Both extrinsic bias, like Josie Fraser tweeting all her papers and saying "look at my paper, it's great, and here's a picture of a kitten" so that everyone retweets it; and intrinsic bias: I mean, I know Josie, I know she's cool, and if she's written something I will probably tweet it before I've actually read it, because I know I'll want to read it later. So it has those aspects. And if you've read James Wilsdon's report The Metric Tide, it's a great read, I strongly recommend it: all of the indicators are partial indicators,
that you need to be looking at a basket of indicators, and that you need to not take the indicators as targets, because if you do that, you start getting into the nonsense and the horror that the publish-or-perish world actually is. We need to stop that, and think intelligently about what indicators can tell us. Sorry, I feel quite strongly about that. Thank you very much.