Right, first of all I would like to welcome everybody to today's meeting. It's fantastic to see so many people here, and we've invited a number of people to be part of the panel today. So just to frame briefly what's going to happen: I will say a few words, then we will have the main discussion with the panel, and towards the end Olga will say a few words as well, as a more general summary of the meeting. I just wanted to take the opportunity to welcome you to today's meeting and to remind ourselves about SDCnet. Today is the last meeting of that formal network, shall I say — the Survey Data Collection Network in this form. It was funded by the ESRC, as you know, and run under the NCRM grant, and everything is on our website under the NCRM website, so you can have a look at that. I've put these things here in the slides, and everything will be available from that website. When more reports from this group or online resources become available, we will make them available there too — over the next couple of months there will be more coming up. Just briefly reviewing: we held several meetings on various key topics — for example, innovations in survey data collection methods, the future of face-to-face, the role of interviewers, and so on. Today's final meeting is looking at the commissioner side. We are obviously very keen to continue with this type of meeting and this type of exchange, and in fact we are in the process of sorting out — and Olga will say a few more things at the very end of this meeting — funding from the ESRC to continue with part of this work as part of a new funded collaboration.
It's called the Survey Data Collection Methods Collaboration, and a number of us will be part of it, and there will be opportunities to join in the future as well — but Olga will pick that up at the end. So while this is the final meeting under this particular grant, we very much hope that we can continue with a very similar setup in the future. Today's meeting is basically looking at decision making, particularly focusing on the commissioner side — survey commissioning in a multi-source, multi-mode world, effectively. And I would like to hand over to Jerry Nicholas, who will lead us through the panel discussion.

So I'm Jerry Nicholas, I'm Director of Methods at NatCen, and it's my pleasure to chair and moderate the panel discussion today. Gabby's already mentioned that today's event is focusing on the perspective of those who commission surveys and how they decide on the most appropriate methods and data sources to meet their information needs. We're fortunate to have five very experienced commissioners — at least I hope it's going to be five of you, because one of them hasn't quite joined us yet. Olga and Gabby, if one of you could let me know when he has joined, I will stop and welcome him and introduce him as well. I'm very grateful to you for giving up your valuable time to come and join us today. But before I introduce the panel members, I'd like to explain the format of the event. First of all, each panel member will have about five minutes to tell us a bit about the surveys that they commission or lead on, and how they use the survey data. I'll then ask them a series of questions on the pros and cons of the survey method for their information needs, how they make decisions on the choice of survey mode, the use of new technologies, and the use of other data sources.
And also what they, as survey commissioners, need from survey providers and academic researchers to help them make better-informed decisions on survey commissioning and research design. That should bring us to about 2:30, possibly earlier depending on how much we have to say. We will then still have up to 30 minutes for a discussion with you, the audience. You're invited to put your questions or comments in the chat, which Olga will be monitoring and collating. There will also be an opportunity to raise your hand during the general discussion — so not during the panel discussion, but later on — if you want to ask your questions in person. So let's introduce the panel members. First of all, Mike Daly. Mike works in the Central Analysis and Science Strategy Unit in the Department for Work and Pensions. He works on a range of issues, mostly centred around external engagement with academia — like the people here — and also on data linkage and evaluation. We also have with us Michael Dale, who is Head of Longitudinal Studies in the Central Research Division at the Department for Education. We're still waiting to see whether Alistair McAlpine will join us. Ali is the new Chief Statistician of the Scottish Government; his role is wide-ranging and includes overseeing the quality and management of its household surveys. Martina Portanti is an Assistant Deputy Director in the Social Survey Delivery Division at the ONS, responsible for the delivery and development of the Household Finance Survey portfolio. And then finally, Andrew Spears. Andrew is the Strategic Lead for Research and Analysis at Sport England. So welcome, and thank you all for joining us. At this point I'm going to hand over to you, so you can tell us a little bit about the survey research that you are responsible for. Perhaps I could start with you, Mike Daly — just a few introductory thoughts about the sort of interest DWP has in surveys.
First thing to say is that a lot of the survey evidence we get, we don't commission ourselves directly. We make a huge amount of use of surveys such as Understanding Society, the birth cohort studies, the English Longitudinal Study of Ageing and so on. Some of those we use more than others; some of them we actually co-fund to some extent. I'm not going to talk about those because, although we're desperately interested in the way those surveys are set up and run, the choices about methods are essentially for the people in charge of those surveys, and it would be completely inappropriate for me to start talking about them. We also have a lot of relatively one-off surveys that we run — I was talking to colleagues just before this about some surveys we're running as part of the evaluations we are doing of various employment programmes, and I can say a bit more in a minute about how we make choices about those. In terms of regular large-scale surveys, we have essentially two that we run. The biggest of those by a distance is the Family Resources Survey, which of course Martina will know all about — it's not that far removed from your interests. That's a survey of around 20,000 households conducted face-to-face, with fieldwork running throughout the year, and it's been running for over 30 years now. My colleague Joanna, had she been able to be here, could have said much more about that, but in fact she is busily preparing for the publication tomorrow of the latest results. So if anybody's interested in the FRS, you need to get on the internet at 9.30 tomorrow morning, and you'll see lots of exciting stuff there. The other survey we run regularly, which is very, very different, is our internal customer satisfaction and experience survey, where fieldwork is quarterly, with around 12,000 respondents per year.
And that is one where we are moving from face-to-face data collection towards online data collection. I'll say something about how we choose our approaches — it is very much dependent on the context. When you think about something like the Family Resources Survey, this is something which provides data that is not only a source for published national statistics, including important things like our estimates of how many people are below the poverty line, but also forms the base data for the microsimulation model we have for all DWP policies, and a range of other uses. So the onus on us to get this as right as we possibly can, and our willingness to invest heavily in getting the right results, is huge. As is the pressure to make sure we don't do anything which upsets the consistency of data collected over time. One of the things we have to accept with our methodology for the FRS is that there is a significant time lag in it: the results published tomorrow relate to the financial year 2021-22. Another important consideration in our design is that we collect an enormous amount of data. The questionnaire is very long, some of the questions are quite involved, and we try to collect data from everybody in a household. All those things tend to point us towards face-to-face data collection, as we've done for many years. The customer experience survey is almost at the other end of the spectrum. Although there is still a premium on getting good data out of it, the importance of getting results as quickly as possible — and of having a survey which is as agile as possible to respond to changing policy and operational needs — is huge. The questions we ask are relatively straightforward, and we ask them of individuals rather than households. So that points us in a different direction for the survey methodology.
And then for all the other one-off surveys we do, it's the same sort of criteria we consider: how important is the survey? How quickly do we need the results? How involved is the data collection? What sort of data are we collecting? And we make decisions as we see appropriate in each case.

Good afternoon, everyone. I'm Andrew Spears, and I lead the research and analysis team at Sport England. For those of you perhaps not familiar with Sport England, we're an arm's-length body of the Department for Culture, Media and Sport, and our responsibility, as the name suggests, is to promote community sport and physical activity amongst the population of England. Core to our mission as an organisation is increasing levels of activity amongst people, reducing levels of inactivity — the flip side of that — and, within that, tackling the inequalities that we observe in that form of engagement. That might be by age, gender, socioeconomic status, disability, cultural diversity and other characteristics. For a relatively small arm's-length body, I think it's probably fair to say we've made quite a major and long-standing commitment to population-level measurement surveys. We can go right back to 2005-06, when we launched our Active People Survey, which was a landline telephone survey of adults aged 16 and over. With the exception of a one-year gap in 2006-07, we ran that for ten annual waves. Then in 2015-16 we launched our current adult participation survey, called the Active Lives Survey, which is a mixed-mode, push-to-web survey where we encourage people to complete online. That is, again, a survey of adults aged 16 years and over, and it is now in its eighth annual wave of data collection. In 2017-18 we launched, alongside that adult survey, our very imaginatively named Active Lives Child Survey, which covers children and young people in school years 1 to 11.
And that survey is an online survey where we recruit the young people through a sample of schools. Within each school we randomly select year groups, and within each year group we randomly select a mixed-ability class to collect the data from. All of our surveys perform broadly the same purpose for the organisation: measuring the levels, patterns and trends of engagement in sport and physical activity. Probably the two foundational, most important criteria when we've been thinking about how we've designed and set up those surveys are, first, the need to be able to measure at local authority level across England — to produce a defensible and robust estimate for every local authority in the country — and, second, the need to measure activity levels across a range of different types of sports and activities, some of which will be quite low prevalence. That could be everything from walking and cycling, fitness activities and dance to what might count as more traditional sports like football, tennis, cricket, etc. Those two requirements — the ability to look at quite a local level, and to measure some quite low-prevalence activities — have dictated to some extent the scale of the data collection we do. For the adult survey we typically get about 175,000 responses each year, and for the child survey it's a bit smaller, but we still achieve about 100,000 responses each year. And again — we'll come back to this, I'm sure, over the course of the session — that requirement has dictated some of the choices we've made around the mode of data collection.
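The school-based selection just described — sample schools, then randomly pick year groups within each school, then one mixed-ability class within each year group — is a standard multi-stage design. A minimal sketch of that selection logic (the school names, year-group structure and function are purely illustrative, not Sport England's actual sampling code):

```python
import random

def select_classes(schools, n_year_groups=2, seed=42):
    """Illustrative multi-stage selection: for each sampled school,
    randomly choose year groups, then one class per chosen year group.
    `schools` maps school name -> {year group: [class names]}."""
    rng = random.Random(seed)
    selected = []
    for school, year_groups in schools.items():
        # Stage 2: randomly select year groups within the school
        k = min(n_year_groups, len(year_groups))
        for year in rng.sample(sorted(year_groups), k):
            # Stage 3: randomly select one class within the year group
            selected.append((school, year, rng.choice(year_groups[year])))
    return selected

# Hypothetical frame of two already-sampled schools
frame = {
    "School A": {"Year 3": ["3A", "3B"], "Year 7": ["7A", "7B", "7C"]},
    "School B": {"Year 5": ["5A"], "Year 9": ["9A", "9B"]},
}
sample = select_classes(frame)
```

In a real design the school sample itself would typically be stratified and drawn with probability proportional to size, with selection probabilities recorded at each stage so that design weights can be computed.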
Alongside our specific needs around understanding physical activity and patterns of behaviour, we also collect some data on behalf of key partners. For the Office for Health Improvement and Disparities we collect some additional physical activity data around gardening, which they add to the rest of the physical activity information we collect. We also ask questions around height and weight on their behalf, which enables them to calculate excess weight in the population, and around consumption of fruit and vegetables — so how many people are meeting the five-a-day target. So there are some broader public health measures there, which are very relevant to our mission as an organisation as well. We also do some work with the Department for Transport, who use some of the walking and cycling data that's collected to provide local walking and cycling statistics. There are a few other things the survey covers which are of more direct relevance to Sport England: those people who volunteer to support sport and physical activity; people's attitudes towards sport and physical activity; the extent to which they enjoy it; how comfortable and capable they feel playing sport; and also some information around what we've described as outcomes — essentially positive outcomes which we know are associated with physical activity, where it's interesting to see the correlations. All of the data collection we do starts with a probability sampling approach, and I think that's very important to us. It's all cross-sectional, and at the moment all of it is self-reported data from the respondents. And I guess, in common with most of the other commissioners on this call, that data is really important to us in guiding our decision-making around a range of things. It helps us understand the populations most in need, and the places and geographies that most need our help and support.
It helps us understand overall trends in activity and how things are going there, but also how patterns and preferences are changing between activities, which again is really helpful in guiding our policy and investment decisions. So hopefully that gives a reasonable overview of Sport England and our background and history in survey data collection.

I'm from DfE, and I'm primarily responsible for longitudinal surveys, but I'll try to talk a bit more generally about the surveys that we commission — I've got quite a long list. Our longitudinal studies are primarily focused on children and young people's outcomes. We're finishing off LSYPE2, which followed a cohort of adolescents passing into the labour market. We're currently running a big programme called EOPS — four or five studies looking at the progress and development of children and young people across various different stages of education, from early years through to post-16. And we've also got another longitudinal study, of care leavers. Each of those is a big probability-sample longitudinal study that collects a lot of data that really enhances the administrative records we've got in relation to service use and that sort of thing. They're all very much focused on children and young people and their outcomes, feeding into strategy-level evidence bases and decisions. So that's children and young people, if you like, in the longitudinal part. The other part of the longitudinal work we do is in relation to practitioners, where we want to understand their experience of their job, retention, quality and all those sorts of things. We have one on teachers, which is ongoing, and we also have one on a cohort of social workers, which is just wrapping up. It's been extremely valuable, and perhaps we're looking to do something similar or extend that cohort somehow.
So those are the longitudinal studies, and then we've got more topical omnibus studies, where we're looking to get a quick read on perceptions to help inform rapidly evolving policy and thinking. We've got one of young people, one of pupils and parents, one of school teachers and leaders, and I think one in the post-16 space as well. As the omnibus name suggests, each wave of those surveys is quite an eclectic bunch of questions. They give very good, quick-turnaround information that perhaps isn't quite so robust, but is nonetheless better than having no evidence whatsoever. Then we've got very policy-specific projects, like the childcare surveys, and there's one on skills and qualifications. We also have surveys that service evaluations, etc. These are studies that are narrow in their scope but have a great deal of depth in relation to a specific population of interest or a specific policy, or it might be that they regularly collect trends that are most pertinent to a particular policy. That isn't a complete list, but it's an example of some of the types of policy-relevant research we commission.
Beyond what we commission directly, we've got involvement in international studies — PISA and the like, used for benchmarking; ministers do like a good international comparison, and they get a lot of press. We're also involved as co-funders of the Millennium Cohort Study, and in some sort of advisory capacity in relation to other ESRC-funded longitudinal studies which are run by UCL. Then there's the question of why we collect all this data — what's the point, why are we spending millions and millions of pounds on all this stuff? Essentially it's to help improve policy, or come up with new policies, or ask the Treasury for sustained money or more money. Those are very crude terms, obviously, but I think for a lot of what we do, that's what it boils down to. I can be a little bit more specific if you allow the time, using LSYPE2 as a case study. In terms of targeting policy, it's recently helped us profile pupils with different levels of unauthorised absence, which has been a big issue, particularly after the pandemic. In terms of understanding outcomes, we've been able to brief Number 10 recently on how those taking apprenticeships post-16 seem to fare in terms of material and well-being outcomes. And in terms of raising awareness of issues in the system, we've been able to enable a minister to write to the sector to encourage them to work against the gender issues in STEM — showing that girls are good at science and maths, that when everyone's taking the same exams they do just as well as boys, if not better, but that after that they're less interested and less likely to take a STEM job and go down that path. The minister wrote something for the national press and wrote directly to providers as well, encouraging them to tackle that prejudice and how to do it. There are also lots of things around government, like government reviews and inquiries, and LSYPE2 has been used a lot for that sort of
thing. So we had models of how ethnicity is predictive of Key Stage 4 outcomes, for example, and that fed into the Commission on Race and Ethnic Disparities, which I think was run by the Cabinet Office. There are other things, like emerging issues, where it's just handy to have a really rich data set on something that has suddenly become topical. Lots of studies fed into this, but when the pandemic struck we were able to look, in a longitudinal sense, at how it was impacting the psychological health of a particular cohort of young people — and the differential impact within that as well: who was suffering the worst. And then there are the big strategic questions that sometimes require very rich data to understand — for example, better understanding why disadvantaged pupils in London do better compared with their peers. With a multi-level model, bringing in the admin data and all the survey data, we were able to pretty much explain that, which is helpful, albeit not necessarily the answer ministers wanted, because the idea that there was a strong policy driver behind London's advantage, if you like, didn't turn out to be the case. So yeah, that's me — thanks for giving me some time, and happy to take questions.

Next one. So there was one question in the chat, specifically to Michael, about the care leavers study: "I just wondered if it was LEO you were talking about or something different — I'm the care leavers analyst lead for DWP and wasn't aware of anything other than LEO." No, this is a dedicated longitudinal study following a particular group of care leavers. I think it might have been those on a kinship order arrangement — I'm not sure of the exact terminology, it's not my remit — but if you drop me an email I can connect you with the project lead.

Hi, my name is Martina Portanti, and as Jerry mentioned, I head the area at ONS that looks after our household finances statistics surveys. When I talk about household finances surveys, I'm talking about our Living Costs and
Food Survey, our Wealth and Assets Survey and the Survey on Living Conditions. Obviously, as people on the call will know, ONS carries out a lot of surveys — the quite large Labour Force Survey, the Crime Survey; we're actually still running the COVID Infection Survey — so there are probably more surveys than I can really mention in these five minutes, which is why I'm going to concentrate very much on the three that I look after in my area. I think I'm in a bit of a fortunate position in my role, in that we sit a little bit in between a more traditional commissioning role and a data collection role. ONS doesn't have the policy aspect, so we don't contribute directly to policy; we collect a lot of data that is used by many government departments to inform their policies, and we also collect a lot of data for ourselves. So we can commission, design and carry out the collection all in the same place, which sometimes creates some interesting tensions between the different hats one is wearing. Essentially, the three surveys that I look after are quite large, very expensive, complicated surveys, so I will summarise them. A little bit like what Mike explained for the Family Resources Survey, they are all surveys that collect financial information from households. The Living Costs and Food Survey covers around 5,000 households per year and collects very detailed expenditure information on what UK households buy. It is mostly face-to-face and also has a diary where we ask people, for two weeks, to record absolutely everything they spend their money on — so that is quite burdensome, not only for the respondents but also for us to pick up and process in-house. We then have the Survey on Living Conditions, which ONS has been carrying out to feed into EU statistics, and which we are still carrying out now, even after Brexit. This is a longitudinal survey of households where we collect a lot of detailed information about income and
poverty. The longitudinal element basically allows us to assess whether people stay in poverty, so it feeds into statistics on persistent poverty, which is something very much unique to this survey given its longitudinal nature. The third survey is the Wealth and Assets Survey, and this again is quite a unique survey, also internationally, because we collect quite a lot of detailed information on what people own — their assets, their pensions — in order to produce a figure of overall wealth for Great Britain (this one is not carried out in Northern Ireland). The Wealth and Assets Survey is also a longitudinal survey; I think we are currently collecting its eighth round of data, so it's been running for around 16 years. But some of the other surveys have a much longer history — ONS has been collecting expenditure data since 1957 in some form or another. So I resonate a lot with what Mike mentioned before about the need for consistency over time, which sometimes creates issues when we try to modernise the data collection. The other two things specifically that I find with these surveys are, first, the complexity of the topic. At the moment we are looking at options to redesign these surveys — we are carrying out a large project, the household financial statistics transformation, and some of the people on the call will have seen our recent consultation that closed — but essentially, from a survey point of view, the biggest challenge we have is that our users want very rich information, a lot of variables for the same households, and that makes it very challenging to move away from a more traditional interviewer-led mode, because it just doesn't fit in a 20-minute online interview. The other aspect that is quite challenging is definitely the longitudinal aspect — it's good to know that the other Michael is also aware of some of these challenges with longitudinal data. It's just very difficult: it's all the
engagement in between waves, making sure that people come back and don't get bored — and I think it's fair to say that some of the topics on household finances are clearly not the most exciting. So there are all these sorts of challenges. We are looking at options to try to modernise, and I think there is a lot of pressure to make these surveys less expensive, which is probably something where most commissioners feel the pinch a little at the moment, in the current financial climate.

This is a bit of the context of the Scottish surveys that we've got running. There are three large household surveys run in the Scottish Government: the Scottish Health Survey, the Scottish Household Survey — which also includes the Scottish House Conditions Survey — and then the Scottish Crime and Justice Survey. Those are essential sources of data, and they provide detailed information on all the kinds of topics of housing, fuel poverty, energy efficiency, transport, culture, volunteering, childcare, and crime and justice as well — and, quite importantly, also the equality characteristics; I'll come back to that in a second. Data from each of these surveys feeds into national statistics publications and our national performance indicators, through the National Performance Framework, which is our well-being measure in Scotland. Prior to COVID-19 we were doing all these household surveys with face-to-face interviews, covering around 20,000 households every year, which usually involved visiting between four and six thousand households in a four-to-six-week period. That's obviously changed a little bit. Importantly, across those three surveys we've also got the Scottish Surveys Core Questions — the common questions that we ask across all three of the surveys — which gives us a bit more breadth and allows us to give a bit more detailed analysis at local
authority level and for small population groups, which is why it's important to identify equality characteristics from the core questions as well. Briefly — and I know you're probably trying to catch up with time, Jerry, so I'll not go into too much detail — I'll just highlight a few other big surveys that we've got. We've got Growing Up in Scotland, a longitudinal study and a fantastic piece of work, running since 2005-06, when the children were 10 months old. GUS has followed them since, and it's the best-quality data source that we really have on children and young people of that age group. It's used quite extensively, not just at a national and local level but also at a voluntary-sector level, and policy makers are able to use it; it's available to academics, practitioners and researchers. The longitudinal nature of that study really allows us to do some in-depth analysis of the impact of early life experiences and — now that we've gone through the first, second and third cohorts — later outcomes as well. That first cohort are now on the cusp of adulthood, which is quite important. I was also in a discussion yesterday about the aptly named HAGIS, which is Healthy Ageing in Scotland — a piece of work being done by David Bell at Stirling University. We're looking at that as well, because I think we're also now starting to track those parents who are starting to move into older age. So with all of these surveys together, it's given us a real span of lived experience in Scotland and how that affects outcomes. I'll mention one more, Jerry, and then I'll hand back to you — and I'll mention this one because it's close to my heart, because I used to work in the area before I became Chief Statistician: I worked in agriculture, and we have the Scottish Farm Business Survey as well. That really is an authoritative financial analysis
of farming in Scotland and of farm businesses in Scotland. It's really important to be able to estimate Scottish farm business income — everybody assumes farmers are wealthy, which couldn't be further from the truth — and it really does help us to look at how we replace the Common Agricultural Policy, and what support needs to be provided to help farmers become more environmentally friendly (not that they aren't already) as they support objectives to capture carbon on farms as well as reducing carbon outputs. I'll stop at that point — I could go on, but Jerry, I think you want to move on to further questions.

I'm sure we'll have lots more questions for you as we go along, so that's fine — but thank you all very much for that overview. One of the themes that seemed to pop up again and again in what you were all saying, and something that we're going to try and unpick in the next — what is it, 45 minutes? — is, on the one hand, this need for very rich, very precise, accurate data on very complex issues, and the need for consistent time series, which seems to push people towards the big face-to-face surveys, versus the need for more timely surveys that can react to emerging issues. And obviously how the data are used, and what the reason for collecting them is, affects how you then decide on the survey design. Those are things we will look at in a little more detail in a bit. But first I thought we would step back and think: okay, we've seen, especially during the pandemic, that the survey method has been great — it delivered a lot of information that was needed at the time, and it could be quite timely and agile, possibly at the expense of some level of accuracy, but that's something we can discuss. I just wanted to get your views about what the strengths and the weaknesses of the survey method are for meeting your needs — because of course it can do quite a bit, as you've already explained, but are there also limitations
to what it can do? So maybe I'll start this time with Michael. It's a big question, so just top-of-the-head reflections, but I hope it's helpful. For those studies that we've been setting up recently, looking at the longitudinal progress of young people over the life course, through early years, primary school, secondary school, post-16, despite all the troubles with the pandemic and industry-related problems, we decided to stick with wave one being face-to-face. There's a couple of reasons we decided to do that. First, we wanted to build up a rapport with participants, because we're going to be speaking to them again and again, and having someone in your living room having a cup of tea with a laptop is a good way of getting engaged with participants. The other thing is that wave one is when you're going to have the most people in your study, because you get attrition thereafter, so having that as a substantive data-gathering exercise is important. And also there are just some types of data that you can only collect when there's a level of supervision or presence in terms of a field worker, so in a number of our studies we're doing direct assessments of the child or parent that can't just be administered via remote modes. So it's a bunch of things acting cumulatively to suggest that face-to-face is important in that context of wave one. In subsequent waves for each of the studies there will be online modes in operation, and they will absolutely fit the bill, but we will be going back to face-to-face modes later for the exact same reasons as those that I mentioned before. So in that context at least, face-to-face still is indispensable and worth the considerable cost. Is that a helpful thing to have said to kick us off?
Yeah, that's fine. I'm just also wondering whether there are any particular data requirements that you find currently particularly challenging to meet if you're using a survey, irrespective of mode, just the survey method itself. What does it fail to deliver, is there anything it fails to deliver? So again, just going back to the direct assessment point, that is a super useful way of generating evidence, in as much as you can use a validated, objective instrument to get a good read, a scientific read if you like, of, for example, a child's cognitive development, as opposed to just relying on their attainment records from school, which aren't necessarily a good reflection of that child's cognitive development; all sorts of factors influence whether or not children do well at school. So the survey method is predicated on a sample, we all know that, and that's where it starts to fall down sometimes, because the administrative data sets that are available in a lot of departments are pretty much a census, and you can do detailed subgroup and intersectionality analysis that just isn't available in survey data sets, because, you know, the surveys aren't sufficiently powered to be broken down into lots of cuts; at the point where it would be most helpful to some specialists, the confidence intervals become unhelpful. So in that sense the surveys will always be competing against economists and statisticians in the department who are looking at the giant data sets from administrative records, which are often linked and can be studied in the longitudinal sense as well. I suppose the other thing to say, and I would be interested to hear colleagues' thoughts, is that surveys are typically thought of as a way of getting perceptions and attitudes and opinions and intentions, which broadly fall under the soft evidence rubric, if you like; people's intentions aren't always followed through, and people's attitudes and opinions in a survey context aren't necessarily how they operate in real life, etc.
So I think there are some potential limitations there too. Great, thank you. I think I might hand over to either Mike or Maritina, because of course your surveys collect a lot of factual information as well, so why aren't you just using your administrative records and all the other data, why do you need surveys? Shall I have a go at that first? So certainly there are limitations of surveys; maybe I'll go the other direction and talk about limitations with administrative data. It's certainly true, and it's one of the reasons why we are continuing to develop the potential for linking administrative data to the survey data in the Family Resources Survey, that one of the difficulties is that the administrative data can tell us a huge amount about exactly which benefits people are receiving and how much they're getting, and can produce that on a very frequent basis, but they tell us virtually nothing about the people who are receiving the benefits. So if we want to know even something fairly basic with our administrative data, like the breakdown between people from different ethnic groups, for example, or people's family circumstances, the socioeconomic group they come from, none of that's in the administrative data. So if you're using a single source then there is a choice sometimes between whether you want a dataset which is very large, very accurate, with a huge sample size, but actually very thin when looking at an individual, or something which is incredibly rich on an individual basis but has comparatively tiny sample sizes. So we are doing more work to combine the sources, but for various reasons that can be somewhat slow going. The other thing that is worth bringing out, and I think Sir Ian Diamond has pointed out a few times when I've heard him talk, is that sometimes surveys are actually rather more nimble than administrative data, and you think particularly of the speed with which ONS set up the Covid Infection Survey and all the other survey work
that was done; so for instance the Covid supplementary surveys introduced by Understanding Society and the birth cohort studies, much more quickly than you can adapt administrative records to produce similar information. So I think the assumption that administrative data is always more timely is sometimes a bit of a sloppy one; you need to look at the limitations. I think you also need to bear in mind that there are errors in administrative data as well: there's measurement error, because our administrative system is not perfect, and there are coverage errors, in that not everybody's included in administrative data. And sometimes I think that the kinds of errors we get in surveys are better understood; we're used to dealing with them and taking them into account in analysis, whereas with the administrative data, particularly with people loving to get their hands on those huge data sets, there is sometimes just an assumption that it's all perfect and you don't have to worry about where it comes from, and that just ain't true.
Okay, great, thank you. Just going to check before I move on: can I just check if anyone else wants to add anything about the strengths and limitations? I think the one thing I would say is that I agree with what Michael has said, actually, and I don't think it's either/or; I think there are advantages to both administrative data and surveys, and there's obviously differences in which way we can do it. I think there's real value in interviewing, and I've recently been out with a field force officer and actually seen why that adds value; you just have to sit in one of these interviews and you get it. And I think there's a lot of value in administrative data. For me the crux of it is that we can bring both things together; if we bring all that rich data together then we can bring every bit of value out of it, and as somebody who commissions surveys, what I'm looking for is how much value I can get out of the money that we have spent, either on administrative data or on survey data. So by bringing those things together we get every ounce of value from all of the data that we've collected. Right, okay, thank you. There was just one point I wanted to add: I feel something else we need to be a little bit careful with on admin data is that there are some barriers in terms of being able to share it, particularly at micro data level. At the moment some of the data sharing agreements we've got in place don't really allow us to make that data available to the wider community, so I think that's something else we just need to be a little bit careful about. There is a lot of value in admin data, particularly for small area estimates, which is where we really struggle with the surveys, but there are some drawbacks. Great, okay, thank you. I'm going to move on now to think about modes; we've already started thinking about that and talking about that. For many decades face-to-face data collection has been the primary data collection mode for most
high quality surveys, and I say most because of course the Active People Survey, which was the precursor of the Active Lives Survey, was a telephone survey, and I would like to explore the reasons for that with Andrew in a minute. But first I'd like to ask why face-to-face data collection has been the mode of choice for most high quality surveys in the UK, and of course a lot of you have already mentioned the richness of the data and being able to understand what the data is about. So yeah, why have we stuck with face-to-face for so long? Maybe I could ask Ali first. I think there is a temptation to say because it's the way that we've always done things, and a lot of what I'm trying to change within the Scottish Government, in the way the statistics is done, is to review those things. I'll come back to that point where I walked around Edinburgh with an interviewer: just seeing the value that that person was getting, the understanding of what they are trying to achieve and how they can tease that information out when the person who's being interviewed maybe doesn't understand the question. If you present that in an online form, or if you ask that over the telephone, you're not getting those nuances of the kind of suggestive facial expressions or things like that, and I think those are the parts where face-to-face interviews absolutely trump everything. But I think we've had a big experiment, haven't we: we've had COVID-19, we've had to move away from face-to-face and look at telephone interviews. And if I look at telephone surveys, we know that we're getting lower response rates, we know that it's more difficult to achieve satisfactory sample sizes, we know that it's increasing the bias in the data, and we know that it means we're getting less accurate, less representative analysis, an unclear picture of Scotland's population as a whole. What I don't think is that we can carry on doing just as we have done in the past and just use face-to-face; I think
there has to be a multi-mode system where we think about it. But, as I mentioned earlier on, and I'll just leave it at this point, I mentioned the Crime and Justice Survey, and that can be touching on really sensitive issues like sexual victimisation, partner abuse; those are thorny issues, and you can't do that justice without putting the interviewer in there to ask those questions sensitively. I'm going to bring Andrew in at this point, because of course when face-to-face was the primary data collection mode, you didn't choose face-to-face for the Active People Survey. I know a little bit about the background to that, but I'd like you to share it with the rest: so, given what Ali's just been saying about some of the problems with telephone, you decided back then to opt for the telephone method rather than face-to-face interviews; can you explain something about the trade-offs that you were making? Yeah, and I think you're right, Jerry, to describe it as a trade-off. I think fundamentally we had this central requirement that we wanted to be able to provide local authority level estimates, and we had a finite budget that we could justify spending on the study. When we looked at the various modes of data collection that were available then, and equally when we revisited this when we made the change from the Active People to the Active Lives Survey, face-to-face interviewing was prohibitively expensive for us; broadly speaking we thought it was between 5 and 10 times the cost per respondent to do a survey like that, and there's no way we could sustain that kind of investment. So that's been at the heart of, I guess, our more pragmatic approach to it as well. Whilst I think we probably accept the response rates one can achieve through face-to-face data collection, less so now, and the sampling frames people were able to use for the face-to-face data collection versus the telephone design back in the mid-noughties, there were compromises
being made there, but there was no way that we could get the scale of data that we wanted, to produce the granularity, geographically, of estimates that we wanted, if we went down that approach. So we went for the best data collection method we could afford, and we were quite strong in other aspects of the survey design; again, having a strong sampling frame and a good probability sampling approach felt an absolute central requirement to us, again quite expensive even when you apply it to those cheaper modes of data collection. But yeah, we were pragmatic in the choices we made, to be able to deliver the objectives of the study as we saw them within the constraints of the budgets we had. Great, thank you very much. So before the pandemic we were already witnessing a gradual shift from interviewer-administered modes, either face-to-face or telephone, to online. I mean, the Community Life Survey was the first high profile survey to use a push-to-web approach rather than face-to-face, and then of course there was the switch from telephone to push-to-web for the Active Lives Survey, and also the considerable development work being carried out at ONS on a web-first approach for the future Labour Market Survey. But on the whole there was still some hesitancy to move towards online data collection for most high profile government funded surveys. And so thinking back... well, I don't think we'll miss that one, I'll move on. During lockdown, when most face-to-face data collection was suspended, we saw survey commissioners responding in different ways. Andrew, you were lucky, you had your online/postal method so you just carried on, but data collection was paused on the Health Survey for England, for example; web and telephone follow-ups were carried out among previous respondents to the National Survey for Wales; and for the Crime Survey push-to-web was used for fresh address samples, but possibly less than what we would have expected; whereas push-to-telephone was being used on quite a few government funded surveys, such as the
Family Resources Survey, National Travel Survey and the English Housing Survey. So again, even when face-to-face was suspended, some of these high quality, high profile government funded surveys didn't move online; instead they opted for the telephone. Why? So Mike, we could see you're heavily involved in the FRS, maybe you could explain why push-to-telephone rather than push-to-online. I'm not absolutely sure of all the considerations there. One of the things I was going to say, on the question about why we stuck with face-to-face, is that the experience of having to do it by telephone has to some extent confirmed for us that yes, we were absolutely right to stick with face-to-face. There'll be more in the publications tomorrow, but the FRS was significantly impacted by having to do things by phone. I think it's perhaps a matter of how much expertise is needed to translate an interview quickly: if you've got an existing face-to-face survey and an interview design, then you can fairly straightforwardly say, well, we'll just phone somebody up and ask the same questions on the phone instead of standing in front of them or sitting in their living room, whereas turning something into an online survey takes a very substantial amount of development. I was going to say a couple of things about face-to-face versus other modes and why we stuck to it. One is the long-standing result that you get better response rates face-to-face, and although response rates are not the be-all and end-all, they are an important consideration, not just for the survey quality but for the survey credibility, so that's always been an issue. There's been the issue of sampling frames: we have for many years used the Postcode Address File as the starting point for our face-to-face surveys, and there isn't the equivalent for telephones. Random digit dialling does in principle give you some of the same strengths, but has got a lot of problems as
well, and if you look at the history of surveys, the experience they had many years ago in the US of what happens when you run a telephone survey by looking at the telephone book, and just assuming that people with a telephone in 1940 were a random subset of the population, you can go very wrong. So the sampling frame is really important. And also interview length: I think it was always the received wisdom that there was a definite cut-off to how long you could make an interview on the telephone. I think that's probably expanded a bit in later years, but nevertheless I don't think anybody would contemplate running a survey over the telephone if it's going to take 2 or 3 hours to complete. So those are all the reasons why we tend to stick to face-to-face and are likely to continue with face-to-face. Has the pandemic changed any thinking around mode moving forward, or, if anything, as I think you already said earlier on, has it confirmed that face-to-face is probably the best method for the FRS? For the FRS it's tended to confirm that yes, we were right, whereas there are other surveys where people have said actually it's worked quite well on the phone or online, we'll stick with that. So yeah, I think the experience varies from one survey to another, but that's the whole point of the new survey data collection collaboration: to try and bring together the evidence that's been gathered over the last few years and see what we can learn from it collectively. Great, thank you. If I could just check with the other panel members: has anyone's position on the use of online or telephone changed because of what happened during the pandemic; has anyone's opinion changed about maybe the need to move away from face-to-face at all? I think what I would say from the Scottish perspective is that face-to-face has got an extraordinary cost, especially in extremely rural areas. I mean, if I go around England and look at rural areas in England, they feel like urban
areas in Scotland, and you know, we have more sheep than we have people in certain parts. So you still need to do the face-to-face, but I think we need to think about what the extra cost of doing face-to-face over telephone is. And the other thing that's coming out of what we're doing in Scotland, and the Scottish Government announced a resource spending review as well, which has caused a lot of us to think carefully about how efficiently we deliver services, how efficiently we gather data and things like that, is that there's a natural progression: we're going to have to think about mixed modes, or think about how we do it. And when I say mixed modes, to go back to what I said earlier as well, you can't get away from that value that you get from face-to-face, but we just need to think about it in the mix. I've got a few things to say as well, Jerry, if that's okay. So I think one assumption in the question is that face-to-face is the default in some way; I don't think it is, it's all project specific. And yeah, there's the cost element: even if face-to-face is good value, we are still operating with fixed funding envelopes, right, so it's a zero-sum game; if you spend a lot on face-to-face in one project, then another project is going to be constrained to correspond with that. But yeah, commissioners, I believe, don't naturally gravitate towards face-to-face unless there's a very strong case for it. And I don't know what happens within other departments when commissioning research and surveys, but essentially we have to be very clear about what an evidence requirement is in relation to a particular policy requirement, and go through various iterations of scoping a document that eventually comes to the market, where we say this is what we think we need in terms of sample sizes and mode etc. There are a lot of qualified people feeding into that decision about which mode is most cost-effective, or we'll be advised to ask
the market which mode is best suited to what we're trying to achieve. So I think when we do choose face-to-face it's not something that's done lightly, and there's normally good reasons for it. And, you know, would we even be having this debate if we didn't have the pandemic and the problems with field force recruitment etc.? I think we probably would, just because the world's moved on; the technology was sort of accelerated by the pandemic, and people now are very much used to talking to faces in boxes on screens, and indeed that may be the preference of some people as well. So people may now prefer to have remote modes rather than have people in their living room, and that sort of thing, so I think that will matter as well. But there are just some types of data collection, for some types of studies, that require face-to-face, and I think that's always going to be the case to a degree. Just one final point while I'm hogging the mic: there's just something about conventional data collection that means it's well aligned to the social science type work, the more robust work. For some things we need a certain standard of evidence and that's fine; for other things, well, billions of pounds' worth of policy decisions might be based on it, so you need the highest standard of evidence, and people will be prepared to pay for the best types of data collection. Okay, great, thank you. Andrew, you've got your hand up. Thanks Jerry, I just wanted to build on the point that Michael was just making. I think the choice about mode should be based, as others have said, on the context and the requirements of the particular study, and I also think the pandemic probably has accelerated a few things that we were observing before, but it does feel to me like some of the more traditional modes of data collection have been increasingly finding it challenging to retain the response rates they perhaps historically have, and that's been a challenge
for them. And one of my observations, having worked with a push-to-web survey now for quite a few years, is that it still feels like a methodology that we are improving and refining year on year as we learn how to do it better, and the natural defaults of the population shift as people become increasingly connected and digital and more comfortable giving information through these methods. So I don't think it's necessarily the right answer in all instances, but it feels like a methodology that's sort of on the up, as it were, rather than one that perhaps is being threatened by changing patterns of behaviour, lifestyles, the use of technology in households. So that's just an observation. Great, thank you. Because we are running a bit out of time, I am going to push forward a little bit, because of course in addition to shifts in mode we've also seen an increasing interest in the use of new technologies, or maybe not so new, but new for survey research at least, such as mobile device data and meters that can be placed with respondents. We're seeing a lot of the work in this area being carried out for academic surveys, including the work carried out at the Centre for Longitudinal Studies and ISER, but the use of new technologies in government funded surveys is still quite limited. Given that these new technologies have the potential of reducing respondent burden and potentially improving accuracy, what do you think are the barriers to using new technologies in your surveys, and how can survey suppliers help you? Who wants to go first?
Maybe I can jump in, given that we are looking at some of these issues in our Living Costs and Food Survey. So I mentioned earlier we've got this diary; it's always been done on paper, and with the pandemic we had to quickly move it to a slightly different mode, so at the moment we are getting people to basically take pictures of receipts and send them to us. And there has been a project carried out for a number of years in collaboration with other European countries, in particular Statistics Netherlands, to look at the development of an app, essentially, to try to track expenditure. I think for government surveys we have a bit of a barrier, or, you know, it can be seen as a barrier to work within: clearly there are some guidelines in terms of the Government Digital Service that we need to adhere to, and accessibility, and that does create some additional work compared to being a private survey provider who can just go and try some new technology. I think there is a lot of potential; on expenditure in particular, clearly people do a lot of purchases online, and, you know, I myself only use my Google Pay to pay for anything, a tap of the phone, so there is really a lot of potential there to explore how you go about linking that sort of information back into the survey. I don't think the new technology can completely substitute the survey, because, as was mentioned earlier in terms of the limitations of the admin data, you still don't get the richness, which really is what people want out of our survey data, but there is definitely a lot that we could explore, and it could really help to improve respondent burden as well. Plus there is some evidence, as you noted, that the younger generation have been, you know, kinder about cooperating with this kind of survey, which is always a demographic we struggle a little bit to capture. Okay, great, thank you for your thoughts on that, and also maybe thoughts on what survey providers could be doing, or academic researchers, to help you make
use of these technologies. Just a few, maybe slightly random, thoughts. One is that clearly you have to have a need for data which can be collected through new technologies, and there are some things for which it is appropriate: for those who are not seeing us, we're talking about detailed expenditure data, and others that have been used quite often around activity monitoring. I think there are pitfalls you need to understand: firstly the accessibility issues, who is actually able and willing to use various devices to collect data; sometimes there are issues around selection bias. There have been surveys I've seen done in the past which are essentially limited only to people who happen to possess an iPhone, which gives you lots of nice interesting data and a huge big data set you can do exciting things with, but is by no means robust survey data. And you also need to think about the accuracy of the data. There is a potentially apocryphal story about people trying to collect activity data by issuing monitors to people, and some people fastening their Fitbit to the dog's collar to make sure they recorded more than the usual number of steps; it goes back almost to Carry On films and people putting the thermometer in the cup of tea to give a false reading. So, as with admin data, the assumption that, oh, this is a tip-top high-tech device so the data must be completely accurate, might not bear close examination in all cases. It's a bit of a warning to you, Andrew, if you want to use accelerometers on the Active Lives Survey: cross-tab against people who've got dogs in the house. Michael, you've got your hand up. Yeah, so two quick points; the first one is the most important and I really do think this is an issue. We're talking about potentially future innovations in technology, but there's been a massive innovation over recent years: we're also talking at screens, looking at each other, having a virtual meet-up, and there's a world in the future where face-to-face
doesn't happen and this happens instead. I naively assumed that that was all going to unfold quite quickly after the pandemic, and it didn't; video interviews didn't fill the gap, if you like, and for some very good reasons, it turns out, and some not so good reasons. I just think if somebody gets there first amongst you and your agencies, and starts to make that happen and work at scale, with good data quality and comparatively reduced costs, then I think that's going to be a really big advantage. Yes, some of the reasons were things like we can only video interview people if they've got Teams, talking about teenagers with MS Teams on their phones, because it was GDPR compliant, or something like that, and I think another one was something to do with wiping down phones during the pandemic, or something like that; but it seems to me that could be sorted out, perhaps I'm wrong, we'll see how it unfolds. The other one is apps. So we ran sort of an experiment in Children of the 2020s, the early years cohort study, where we asked participants to do face-to-face data collection but also download an app and take part in activities by logging their children's milestones, observations of their children, and parents' feelings, that sort of thing. I was prepared to write it up as an experiment that hadn't worked, and that the department shouldn't do that sort of thing again, but in that context it was quite successful, so we've got a reasonably representative subset of our early years cohort who are actively using the app, and we're getting a lot of very rich data in much more frequent instalments than the typical annual periods. So yes, apps can work, and that might be of interest to Colin. Great, thank you. Andrew. Just wanted to pick up again on the video interview point: I mean, just because we have the technology to do something doesn't mean people will do it, I think is probably the lesson there, and there's a parallel
perhaps to something that we initially tried with the Active Lives Survey, then went back to, and it has actually been more successful the second time, which was QR codes for people to access the landing page for our online survey. It really didn't work very well in 2015-16 when we first tried it; with the uptake and much more expansive use of QR codes through the pandemic, we gave it another go as we came out of the pandemic and back into sort of normality, and it's worked much better, and I think it is now having a positive impact on the response rates we're achieving through the study. So I think there are some things we just need to keep an eye on, and actually understand not just when the technology is ready but when people are ready to use the technology as well; that's an important consideration. And then, yeah, the accelerometer example of strapping it to the dog is an interesting one. I think we probably also need to be cognisant that there will be a few people that tell a few porky pies on any questionnaire we will ever send them as well, so the idea that, you know, you can have complete faith in all the data from any mode of collection is probably wrong; we have to be strict about validating data. But there are some other things that we're grappling with around accelerometers as well, and how we might use them. They can tell us something really quite objective around the intensity of people's activity and the duration of it, perhaps in a way that people can't self-report, but it leaves gaps in other areas in our understanding, so we don't necessarily know what the person is doing, other than that they are expending a certain amount of energy, and we perhaps know less about how they feel about that activity, which are really important bits of information if we're going to make sensible policy responses to some of these things. So I think for us accelerometers and technology have a place, but we need to figure out where it is, and I don't think it's necessarily at this stage to
wholesale replace our self-reported data; it's how it sits alongside it, complements it and strengthens it. Great, thank you. Allie, you've got your hand up too. I just wanted to come back on your question, Jerry, about what survey suppliers can do to help with new technologies. Quite often people say our governments are risk averse; I don't think we are risk averse at all, there are just a number of hoops that we have to go through before we can go and spend money. So when we're spending money on surveys, we need to justify why we think this is the best way of doing things. I think the job that survey suppliers can do to help is, if you're suggesting different ways of doing things, well, certainly the guys in the Scottish Government know how busy they are and they don't have time to do this, and maybe you don't have the time either, but if you're wanting us to embrace new technologies, bring with you the evidence of why that will deliver either a better outcome, or a more efficient outcome, or some other benefit that we maybe haven't thought about. So that's the thing, I think: not just bringing the new technology but the case for why it will work as well. That will help whoever is trying to procure the survey to build the case when they have to go and get the tenth sign-off that month to get it through. Great, thank you. I've got quite a few survey providers listening at the moment, so hopefully they're hearing you. Okay, great. In the interest of time I'm going to move on a little bit more quickly now. We're starting to notice an increasing interest in reducing the carbon footprint of surveys; for example, the Welsh Government specified that this should be a key requirement for the recent redesign of the National Survey for Wales, and Andrew, you wanted us to cover this at this meeting as well, so it's obviously something that Sport England is interested in. So perhaps, Andrew, you
could tell us: how important is carbon reduction when you're thinking about the design of your surveys? For example, you could drop your paper questionnaires on the Active Lives survey, but that would increase the bias. To what extent is that an acceptable trade-off, and how do you make that decision? How does it weigh up? Thanks Jerry. I guess the thing to say at the outset is that this is a question we are grappling with rather than one we feel we've resolved. We've recently launched a new strategy and, probably in common with pretty much every other public agency doing that, thinking about the sustainability of our strategy, our organisation, how we interrelate with other organisations, everything we do needs to consider this. I've been struck in other sessions over the last few months by other commissioners of surveys thinking about how they deploy fieldwork interviewers to try and reduce the number of miles they're doing to get around to interviews. For us it's essentially about the volume of paper and postage that goes out through the survey, and if we can reduce that, it feels there's an absolute imperative for us to do so; it's the responsible and right thing to do. The thing that's coupled with that for us as an organisation, and I'm sure we're not unique in this, is the inflationary pressure on some of these costs as well. So reducing the amount of paper and postage a study produces is potentially good for the business model of the survey as well as for the environment. It's something we're actively thinking about; we haven't resolved it, and if any of the other panellists or anyone on the call has some brilliant ideas on it, I would love to hear them. Allie, you've put your hand up. Yeah, thanks. I mean, absolutely, in our day-to-day lives we should always be trying to reduce our carbon footprint in all areas of
our life, but nothing is going to reduce carbon output from the UK like turning off a coal power station; those are the big things that we need to do. And, going back to my agricultural days, transport is not as big a carbon emitter as other parts of supply chains. I would just point to, I think it was on the BBC last week, a piece about plastic in food: actually, stopping plastic has unintended consequences, like food spoiling more quickly, which drives up emissions. What I'm trying to say is that in all of these things there are always trade-offs, and I think they're not well enough understood. One of the things that we would want to do through the surveys, though, is think about how we design better policies that lead to systemic reductions in carbon emissions from society. So I'm not particularly concerned, although when I say that, I would like to think that everybody is trying to reduce their carbon footprint; I just think the scale of what we're trying to achieve through the data that we get is more important. But there are trade-offs there, and we should absolutely try to reduce our carbon footprint at every step. Thank you. Can I just check with the other panel members: is this something that's featuring in your departments as well? I can't say it is, no; as far as I'm aware we've not really thought about this. There is an overall question, obviously: the carbon footprint of a face-to-face survey is very much greater than that of an online survey, but I can't see that being, other than in very marginal cases, a reason to go down the online route if we genuinely thought that face-to-face surveying was necessary. I think the question that Chris Martin's raised in the chat is an interesting one, how you actually organise your interviewing, and it relates very much to a case I saw some years ago where there was an alternative survey design proposed which used smaller primary sampling units for a
face-to-face survey, which brought about a significant cost saving, but in fact the effective sample size was reduced to the extent that the cost per effective survey response actually went up. So you might find that it's not as easy as you might hope to reduce the carbon costs of an interview survey. I'd be really interested in any strategies that fieldwork agencies have talked about or tested to bring about reductions, but also in how to measure that carbon footprint reduction; it sounds like it could be more complicated than we think. Michael? I don't think we're doing a great deal on this in DFB, to be frank. Part of this might be around the evidence as to which modes and which scales of research are more or less environmentally friendly. It feels like there should be some cross-cutting work, either across government or across agencies, where we have a ready reckoner that helps us understand the environmental impact our proposed research model would have compared to alternatives, because obviously it's a critical thing to want to address. We have ready reckoners for costing research based on past contracts and invoices; it would be nice to have a similar thing that would at least give us ballpark estimates as to how damaging or not our methods are. There's a point made in the margins about whether running something online, on servers somewhere in the world, is better than having three pages of paper go out by post, and what the tipping point is where one becomes better than the other; it's quite complicated. There's a risk here that this carbon reduction thing just becomes a box-ticking exercise, isn't there, and I think we should be wary of that and try and find better ways of measuring it. Before I move on to the last question that I have, can I just check if anyone else has anything to say about the carbon footprint? I've seen quite a few comments coming up in the chat; I can't absorb them, though, because I'm focusing on what
you're saying, but I'm sure we're going to pick this up in the discussion again. Anything else? In which case, in 30 seconds or less for each of you: what is the main thing that survey suppliers, researchers and methodologists can do to help you make better informed decisions about the commissioning and design of surveys? Who wants to go first? Do you need a bit of thinking time? I can do the cop-out answer to buy everyone else some time, which is that maybe it's for us to involve the market and suppliers earlier and have more open tenders, to allow the expertise from the market to shape what we're doing, rather than coming to the market with a highly specified project which we think we've thought through well. Perhaps it's better to have people on board. We do that to a degree with market warming exercises and pre-tender exercises, but perhaps a more collaborative approach to commissioning, particularly in the current context where we're asking for the moon on a stick in very difficult industry conditions; perhaps some sort of forum along those lines. I like collaboration, that sounds good. Who wants to go next? Well, I'll use the collaboration word as well, which is to work with the survey data collection methods collaboration, because what we really need, I think, is good-quality, readily accessible and readily understandable information on the pros and cons of different survey approaches, partly so that we can think about them ourselves and partly so that when we try to justify our choices to others we've got the ammunition to hand. A lot of that wisdom will come from the fieldwork agencies, so getting that across is going to be huge. Thank you. Yes, the collaboration one is absolutely key. I'll try and do an extension of that: tell us what you think we should be doing as part of the collaboration, and tell us where you think the efficiencies can come from. And I think the other thing I would say is that
we're moving into a world where we have to be much more efficient, and we've all spoken a little bit about that: efficient either in our carbon footprint, or the amount of money that we can spend, or the way in which we get response rates. What I'm encouraging in Scotland is for us not just to carry on as is, but to think about how we can get smarter, so please work with us on that as well. Right, Andrew. In a similar vein, I think some of the things I've valued most about working with suppliers over the years is when they've been able to share their learning from other studies. We certainly wouldn't have gone down the push-to-web route so quickly if we hadn't been made aware of some of the work that was going on elsewhere in government, and that was a very good move for us. Some of the things we've done with suppliers over the course of contracts, to innovate, develop and improve over the life of those studies, have been really, really valuable to us as well. So again, forms of collaboration, sharing learning and best practice, so we can ask better questions and set better briefs, basically. Great, thank you. Martina? I struggle a little bit with this one because I can see both sides of the question here, but I reiterate really what others have said: I think the collaboration is quite key, and in Subyajan we do work in collaboration. But actually, the commissioners certainly need to trust the fieldwork agency, that they don't have a secret agenda to try to squeeze more money out of it. There are certain things they've learned from other studies that work and some that don't, and they need to be able to share that sort of research, and to push back a little, to say, look, this is really not going to work for you, come back and think about it. I think we just need to understand each other's positions a little bit better. And sometimes as well there is not a full understanding of what you can actually get out of people when you
go out interviewing them, and we just need to be a little bit more considerate of both sides of the house. Right, thank you. If I could be so cheeky as to ask a question back: what do survey suppliers think we should be doing better, and what else do they need from us? If people want to put things in the chat, it would be great to see. I think that's a good idea, so that's a question. First of all, I was just going to say the title for our next meeting was going to be collaboration, collaboration, collaboration; we had collaboration five times there. One thing I wanted to pick up on that Andrew said: it's not just collaboration between the commissioners and the suppliers. You also mentioned, Andrew, that you learned a lot from other government departments, and I was just wondering, because that's sometimes a bit of a mystery to me, to what extent is collaboration and knowledge sharing happening among the different departments? Sorry, Andrew, the question was for you, or for anyone, but Andrew, please go first. Yeah, I mean, groups do exist. I think they have been more active and less active over periods of time, depending on the other priorities and pressures of the individuals who have been central to them. The particular example I was giving, Jerry, was actually interesting: it was a supplier telling us about work they were doing with another one of their government clients, so it didn't come through a formal government network. But I think we want all of the eyes, ears and arms reaching into all of these things, because no network individually is going to be complete; it's how you connect into a series of networks that's probably going to give you the most useful and complete picture. Yeah, and I agree, because this is part of the answer to Ali's question: for me, as one of the suppliers, it'd be a lot easier to have a network of government
survey commissioners to talk to, rather than individuals, with a bit more of a joined-up picture and a proper sharing of knowledge. But I'm going to have to draw this to an end because we have gone a little over time, and I want to give the audience a chance to ask you questions, as well as to answer Ali's question. So I've given the very difficult job to Olga; she's been keeping an eye on the chat. There are lots of questions, and I'm going to have to try and capture those before we close the meeting so that we can respond to them. But Olga, have you had any luck in trying to articulate and pull questions together? Well, I was watching the chat and there was an amazing discussion going on. To be honest, I think there were only a few questions, which were then responded to by people from the audience, so I'm not quite sure. What I noticed: obviously there were various examples from various surveys about mode switching and different experiments and things like that; then discussion about the barriers to transitioning, for example, to online data collection, where things like biomeasures and the length of the questionnaire were mentioned; then also interviewers, so interviewer roles changing and the post-pandemic availability of the interviewer workforce. Also, what I was really, really excited to hear was the collaboration; as you said, Jerry, it was mentioned five times. I will mention at the very end of this meeting, obviously, that this is a plan, and I'm really hoping that this plan will be going ahead very, very soon, and I think we will definitely take forward many of the ideas which were discussed today. So going back to the chat, Jerry, when I was looking at it, I think that maybe the best idea is, if there were some questions which were not addressed yet, we open the floor to the individuals and they ask them, because literally there were not that many questions; there were questions and
ideas. So I think the best way is just to open the floor to the audience. That sounds like a good idea to me. So if I could just ask the people in the audience: if any of you have got an answer to Ali's question in the first instance, could you raise your hands? I know, Ali, if you want to repeat your question: just, what would survey suppliers say to us, what could we do better? I see somebody said more time to allow R&D in tenders; yeah, unfortunately I think we're probably hamstrung by procurement rules there a lot of the time, but I don't know. Has Ali frozen for me, or is it me who's frozen? Oh, you're back, Ali; you froze for a second there. That point came from me, as you probably guessed, Jerry, reflecting on the fact that I do a lot of work across government and also speak to suppliers through that work, and a lot of the time they would like to be doing more R&D but the tenders don't allow them to do that. I think really that change needs to come from within, and we need to allow those suppliers to do design work. If they want to advise and be in that collaborative space, we actually need to allow the time in our timelines to enable them to advise and to say we need to do more R&D as well, just to really improve the quality of what we're getting at the end of the day. I see someone's hand up: Fiona Johnson. Thank you, Jerry. My name is Fiona Johnson, I work at the Competition and Markets Authority, and I had a question for the panel about push-to-telephone methodology. I'm familiar with push-to-web; our recent experience of a telephone survey was not great. We wanted a random digit dial, random probability sample; the agency proposed something that wouldn't have been anything like that, because they were going to top up the RDD with panel leads. They implied it was a top-up, but actually it would have essentially been a panel survey using the telephone numbers that panellists have
provided. When we pushed back and said no, we definitely only want RDD, we found response rates were terrible; it was really hard for the interviewers to persuade people to take part. Well, our response rate wasn't too bad, but our achieved sample size was nothing like what we needed. So my long-winded question is how well push-to-telephone actually works, because if we believe the agencies, people don't want to use the phone to talk to researchers anymore. Okay, I'll check in with, yes, Mike. Yeah, this is slightly tangential to the question, but I think there are different problems with different ways of doing telephone surveys. I was reviewing some of the surveys that we've been involved in, a couple of years ago now, and one of the clear difficulties is that most of the numbers people use are mobile telephone numbers, and an awful lot of people will not answer a call which comes from an unknown number. So random digit dialling, I would imagine, is hugely difficult, whereas something where there is some possibility of an initial contact saying please call this number, or this number will call you, would not necessarily suffer from the same problems. It's a point I thought of making earlier, that things can go in both directions: at one point telephone surveys looked like the future, and then with so many people moving to mobile phones suddenly they don't look so clever anymore. There was a time when postal surveys were almost impossible because people were inundated with junk mail coming through their letterbox; now they don't get that, they get spam email instead, and a letter through your letterbox is a rare event to be celebrated. So you have to think about the context in which people are contacted. I would imagine that push-to-telephone could be effective if you think carefully about how you make sure that people's initial contact is one that they will trust. I don't think it's about people not wanting to
talk on the phone; it's about not wanting to answer the phone to somebody you do not know, who is probably going to try and scam you. Martina, you had your hand up. Yeah. We obviously moved all our surveys to the telephone during the pandemic, and we were expecting a drop in response rate, which we observed, but it wasn't as bad as we were fearing, and it goes back to what Mike said: it was the mode of contact. We didn't do random digit dialling; it was literally a letter where we asked households to get in touch with us. It's got issues in terms of bias, and we did get quite a different profile of respondents, but there is something in there for the telephone. Going back to something you asked before, in terms of some of the changes we are carrying on: we are going back face to face, and that is mainly driven by the fact that our surveys are just too long for the telephone; we got to 60 minutes on the telephone and it's still slightly too long. But it's interesting, because we're being a little bit more flexible: for example, where somebody is busy or out of the country, which limits an interviewer face to face, before the pandemic we wouldn't really have allowed that interview to happen over the telephone, and these days we've got a little bit more flexibility, because some of the concerns around quality are not as high anymore. The main concern at the moment is how much we can squeeze through and being able to reach people. Great, thank you. And Andrew, if I remember correctly, one of the reasons the Active People Survey was changed to Active Lives, moving from telephone, which was random digit dialling, to push-to-web, was because the random digit dialling method was not delivering as it initially had. That's right, Jerry. This is obviously nearly 10 years ago now, probably more than that when we started thinking about making the change, but yeah, the coverage the RDD sample was achieving was diminishing, and the profile of response we had
was getting ever older and more mature, which for something like sport, which is predominantly done by younger people, is problematic as well. Then we had the whole complexity of dual sampling frames because of the switch from landline phones to mobile phones, and the potential to select people twice within your sample. And we wanted to achieve local authority level estimates: how on earth do you know where a mobile phone is going to ring in the country if you are just randomly selecting numbers, or even whether it is going to ring in England at all? So there were a whole range of things which made it difficult for us, though it's slightly different perhaps to the sort of push-to-phone approach in the initial question. But yeah, there were a range of reasons it was right for us to move away from that landline design. And I think for anyone who is interested in the push-to-telephone approach, I believe that ONS has produced a report, I think it came out last year, which compared the sample profiles across a number of surveys that moved to push-to-telephone. I'm afraid I don't have the link to hand, but there is information available on that if you want it.
Any other questions from the audience? I think Claire Vardman placed a couple of questions, and I saw her hand was raised but then it disappeared, so I'm not quite sure. No, I can see Claire. Hi, I'm here. I can't remember what my question was, to be honest; in the discussion you've answered quite a lot of questions. I know I've been to seminars where ONS have been talking really helpfully about the differential response rates to different modes, which has been incredibly helpful. And the other thing I put in the sidebar was about when I used to work at the Scottish Executive, where we used to have these open days, I can't remember what they were called, where we had suppliers and academics in and we discussed our upcoming research needs, and they would explain their thinking and constraints and we would explain our thinking and constraints and our ministers' priorities. And they were quite difficult... no, not difficult, they were, what's the word?
Challenging? I don't even know whether challenging is the right word; they were constructive, I think. It was different people explaining their points of view on the same kind of problem, but it was so, so incredibly helpful for everybody in understanding where we were aiming and the constraints within which we were operating. These were with potential suppliers, not existing suppliers: all the big research companies, agencies, consultants and academics. It was just so helpful to listen to their views on the existing evidence and forthcoming policy issues versus government's, and what our priorities and constraints were. I don't know whether we do that anymore, and obviously in the Scottish Executive, way back when, that was a pan-nation thing, so there aren't loads of different departments in Scotland; I don't know whether there's any capacity or inclination at all to do that across government. I just notice that Ali seems to have disappeared, so he might have had another meeting to go to, I'm not sure, or there are technical problems, because otherwise I would have asked him to jump in. I believe that last week the Scottish Government, or it might have been some other group, hosted a session where suppliers were also there, talking about these kinds of issues and what the long-term impact of the pandemic has been on survey data collection in Scotland. So there seems to be a lot of appetite for this, as demonstrated by the panel members, and there's the collaboration. I think this is probably a good moment to hand over to Olga, because of course, Olga, you're going to say something about what's happening after SDCnet, and I think there is definitely a need for more collaboration and sharing of knowledge. Olga? Thank you so much.
First of all, I really would like to say a massive thank you to Jerry for leading and chairing this fantastic discussion, and to all the panel members; that was really interesting, so many interesting points, and I'm sure there are lots of things we will take forward in the next project, hopefully. Thank you all very, very much, and thank you to the audience, because there was a really interesting discussion happening in parallel in the chat, and I'm hoping to read through it in detail after this event. As Gabi mentioned, this is our final event for SDCnet; however, as she also mentioned, we have lots of plans and lots of ideas for the next activities and what will happen next. We are currently in the process of finalising a large grant, called the survey data collection methods collaboration, which we hope will be funded by the ESRC and which involves more than 30 colleagues from 16 institutions in the UK. We are hoping that this exciting project will start sometime mid-April to early May, and then we will be able to start announcing various events and activities, but we are waiting to finalise the contract with the ESRC. So we really hope that all of you will continue contributing to and supporting our forthcoming activities. I'm not quite sure I can say more at this stage, because we are still waiting to hear the final details, but for now I would really like to say, to all network members and to all the people who attended and contributed to our events, a huge thank you for your contributions over the last year and a half, and I really hope to see you all again very, very soon at our forthcoming events. Thank you all very, very much.