Hello, my name is Geraldine Gray and I'm here with three of my colleagues from the Institute of Technology Blanchardstown. Myself, Laura and Brian are academics teaching in data science, and Michael Keen is our quality assurance officer. In terms of learning analytics, we completed a three-year project looking at our first-year students. Similar to what UCD were saying, we asked how soon we can identify students who are at risk of failing. The question we looked at was: the day they walk in the door, are there things we can measure at that point that would allow us to identify at-risk students? So we developed an online profiler which measured factors of temperament, motivation, self-regulation and approaches to learning, and we combined that with their Leaving Cert data. What we found is that, even just looking at age and the scores they got in certain Leaving Cert subjects, particularly maths, science and business, we could identify students who subsequently went on to fail with a precision of 72%, and the recall on that was 78%. So the minute they walk in the door, we actually have a good idea of who is potentially going to struggle in first year. Of the non-cognitive measures, the other things that were quite predictive were self-efficacy, which I think could be quite interesting to compare with the university sector, given that the Institute of Technology, or CAO, profile would be that bit lower; a deep learning style; and, in terms of temperament, being open to new ideas and creativity and that kind of thing. Those who don't have a kinesthetic learning modality tended to get on a little bit better as well. So that was the modelling that we did.
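To make the quoted precision and recall figures concrete, here is a minimal sketch of how those two metrics are computed from a classifier's confusion-matrix counts. The counts below are purely hypothetical, chosen only so the resulting metrics land near the 72%/78% figures quoted above; they are not the study's actual numbers.

```python
# Hypothetical confusion-matrix counts (illustrative only, not the ITB study's data).
tp = 72   # students flagged at-risk who did go on to fail (true positives)
fp = 28   # students flagged at-risk who actually passed (false positives)
fn = 20   # students not flagged who went on to fail (false negatives)

# Precision: of the students the model flagged, what fraction actually failed?
precision = tp / (tp + fp)

# Recall: of the students who failed, what fraction did the model flag?
recall = tp / (tp + fn)

print(f"precision = {precision:.2f}")  # 0.72
print(f"recall    = {recall:.2f}")     # 0.78
```

The trade-off between the two matters for an early-warning system like this: high recall means few failing students slip through unflagged, while high precision means support resources aren't wasted on students who would have passed anyway.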
Where do you go next with that? In terms of the opportunities and challenges as I would see it, I've borrowed a lot of the content of this slide from research that was done in Australia. They did a two-year study of all of their universities and how they're using learning analytics, and they identified two distinct clusters of colleges. On the left-hand side were people who looked at learning analytics to analyze progression and retention, the sort of bean-counter stuff: we want to get students to stay on board, and that's very doable. We can see from our models, from UCD's models, and from a number of other models out there that it is quite easy to identify the students who are at risk of failing. There are technical challenges in gathering all that data together, but the modelling of that data is quite straightforward. But there's a premise in that view of learning analytics that the people we should be analyzing are our students, and that they're the people who need to be fixed, or molded, or scaffolded, so they can fit into our current models of teaching. There's a completely different cluster of universities that would see learning analytics as playing a slightly different role: we don't just put the spotlight on the students, we put the spotlight on ourselves as well. So we use learning analytics as a sort of disruptive, innovative force within our day-to-day teaching and learning practices, to ask whether we're actually encouraging the right key performance indicators. In all of our strategic plans, right across the third-level sector, we're all talking about 21st-century skills, and we're promoting critical thinking, learning to learn, all these wonderful soft skills. But are we measuring them, and are our assessments rewarding that softer stuff?
So when we use GPA as our key performance indicator, are we measuring the right thing, and are our models predicting the right thing? Or should we be putting the spotlight on how we're doing our job, and using learning analytics as an enabler of change within what we're doing? So don't put the spotlight only on the students: if we're going to analyze them, we have to be prepared to analyze ourselves as well, and what we do from day to day. In terms of next steps, I would say the workgroups that Lee has set up probably map out the next steps. For me, a key interest would be how we can measure what we're doing, and even measure our students, without actually giving them questionnaires like we did at ITB. What are the stealth measurements that we can analyze? As we become more digitalized, the footprint we're gathering about what we do and what our students are doing is becoming richer, and I think there's a body of research to be done there as to how we use that data to measure things that are interesting and actionable in what it is we do, which is provide education. Okay.
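As a toy illustration of the "stealth measurement" idea mentioned above, the sketch below derives simple engagement features from click-log events rather than from questionnaires. Everything here is an assumption for illustration: the event tuples, the field names, and the two features (total activity and distinct active weeks) are hypothetical stand-ins for whatever a real VLE export would contain.

```python
from collections import defaultdict
from datetime import date

# Hypothetical VLE click-log: (student_id, event_date) pairs.
events = [
    ("s1", date(2023, 9, 4)),
    ("s1", date(2023, 9, 5)),
    ("s1", date(2023, 9, 11)),
    ("s2", date(2023, 9, 4)),
]

counts = defaultdict(int)   # total activity per student
weeks = defaultdict(set)    # distinct ISO weeks with any activity

for sid, d in events:
    counts[sid] += 1
    weeks[sid].add(d.isocalendar()[1])  # ISO week number of the event

# Two "stealth" features per student, gathered without asking them anything.
features = {
    sid: {"events": counts[sid], "active_weeks": len(weeks[sid])}
    for sid in counts
}
print(features)
```

Features like these are attractive precisely because they are unobtrusive and continuously updated, but the research question raised above still applies: whether what they measure is actually interesting and actionable, rather than just easy to count.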