Hello everyone and welcome to the webinar. I'm Sarah Kinghill and I work for the UK Data Service. Our main presenter today is Natalia Stutter, who works as a Senior Research Officer at the ONS. Thank you Sarah for the introduction. So this webinar is going to be about the approach we've taken here at the Office for National Statistics, or the ONS, to develop materials. When I talk about materials, I'm talking about letters, leaflets, emails and other communications that we send out for our online and household social surveys. The focus today will be on materials that we've developed for the online labour market survey. It's important to bear in mind that the findings I share with you are specific to this context, so specific to the UK and specific to household surveys. However, that's not to say they won't work or aren't worth experimenting with in your own context. I've tried to pull out examples that I think are more universally applicable, so hopefully there's learning to take away for everybody. So first of all, a brief overview of what I'll cover today. In this session I want to cover three main areas with you. First of all, the why: I'll talk a bit about the context and background for this work and explain why it is important. I'll then talk about the how, covering the approach and some of the tools that we've used to develop our respondent materials. Following that, I'll discuss the what, by providing some examples of our work from both the qualitative and quantitative tests that we've done in the past couple of years. So let's begin with the why. For a bit of background and to set the scene, the government has a digital-by-default strategy to move all services online by 2020. This includes things like paying your tax or getting a new passport, everyday things that we all might have to do. This also applies to us, the ONS, and our surveys.
And this strategy is underpinned by the Government Digital Service, or GDS, which provides principles, guidelines and best practice for developing online services. GDS not only considers the online parts; it's recently revised its guidance so that it now applies to the offline parts as well. So GDS looks at the end-to-end user journey: how do we get people online in the first place, and what might happen afterwards? Coinciding with this strategy, at the ONS and across the industry we've seen an appetite for surveys to be available online. We know there is a public expectation on us to do this, and we're also trying to reduce the burden of our surveys on respondents by providing more choice and alternative modes for people to take part, as the traditional modes aren't always suitable. In addition to this, we've seen a decline in response rates and need to look at ways that we can address this. So at the ONS we're undergoing a transformation programme, which includes taking a push-to-web approach to survey development, with face-to-face follow-up in the first instance, and this is where this work sits. Other aspects of this transformation work include questionnaire redesign for online surveys as well, and that sits with my colleagues. Alex Nolan did a webinar last year about labour market questions, and Emma Dickinson, another colleague, is running a webinar on Monday on socio-demographic questions. So if you're interested in hearing more about that side of things, tune in; links to these are included on the final slide of this presentation, which Sarah can share afterwards. In addition to the questionnaire design work, we're also looking into admin data across the office, to help reduce burden on respondents in self-complete mode. All of this work aims to lead the ONS to producing better statistics, which ultimately lead to better decisions.
In developing a push-to-web approach for our surveys, we face some barriers. As I've mentioned, our surveys are currently conducted face-to-face, often in people's homes, but also on the telephone, as is true for the current LFS. The addresses are randomly selected, which means we don't know who lives in the home. So one of our key barriers in a push-to-web approach is getting people to open the envelope and read the letter, as the interviewer won't be there in the first instance to convince people on the doorstep to take part. The letters and communications that we develop need to fulfil this role just as effectively. But even if we manage to get people to open the envelope, which has been a part of our engagement strategy, they still need to read the letter. And even if they read the letter, they still need to be compelled to go online. And even if we do get them online, we need them to go on and complete the study, ideally in full. So our aim is to create a frictionless user journey that spans both offline and online experiences across all touchpoints. Our challenge has been about designing out the barriers and exploring different ways to get as many people as possible to the next step in the journey, and this materials work in particular focuses on the first point in the above diagram. I've mentioned the word user a few times already, and I just want to clarify that when I talk about a user, I'm not talking about the data user, as is normal when we're talking about statistics. I'm actually talking about the respondent as a user. Recent guidance provided by GDS, the Government Digital Service, gives quite a nice summary about people or users interacting with government services more generally, and picks up on the role of offline communications. They state that letters are an important part of how government and agencies interact with users.
A letter will often be about a thing that a user's never heard of or didn't know that they needed to do. And this is exactly the space we're in when we're asking people to take part in an online, or any kind of, voluntary social survey. So this means it's crucial that the letters we send out are clear and understandable, especially if the user has to act on them or they're explaining new concepts. And as I put on the slide, they go on to say clear letters are better for government too. If a user can understand a letter, they're less likely to get in touch and ask questions via another channel, and more likely to do the thing that the letter is asking them to do, such as pay or register for something online. And this obviously has a benefit to organisations as well, by reducing burden on them. So linked to this wider GDS principle is the approach we have taken to this work, which is focused around what we call, or what is called, user-centred design. For us, this is about understanding user needs, encouraging response and making the experience better for users, so in our case, the respondent. But how have we done this? First of all, we need to understand the organisational goal and the aim of the survey. Essentially, what are the business requirements and what are we designing for? We need to establish our legal basis and understand, for example, what the GDPR requirements are, and establish who our users are. Is it an individual? Is it an entire household? Is it a subset of the population? Who are we trying to get to take part in a particular survey? We've conducted literature reviews, and we've also looked at what other national statistical institutes and other survey organisations are doing, such as NatCen, Ipsos MORI and Understanding Society. Because this is looking at materials and physical objects in particular, we've looked at design community best practice for print materials.
We've explored accessibility guidelines and also drawn on behavioural insights literature as well. I'm not going to go into all of these, but behavioural insights and accessibility are things that I'll touch on next. So as part of our desk review, we have explored behavioural insights literature to see how we might be able to nudge people into carrying out the desired behaviour. We used the EAST framework developed by the Behavioural Insights Team, or BIT. This was a department set up by the UK government, but it's now an independent company. The framework is made up of four core principles: make it easy, make it attractive, make it social and make it timely. This might include things like personalising letters, making your messages salient, or trying out the messenger effect by sending communications from different people. At the ONS we're very lucky to have some colleagues who are experts in this area to draw upon, and we have close links with the behavioural science team, so that's been really beneficial for us, but there are lots of interesting experiments that have been carried out that you can find in the literature. There's a lot of work being done by HMRC and DVLA, and you can find more on the BIT website, which I've included in the notes on this presentation. Accessibility is something that we need to think about in order to address the needs of all of our users, and it should be considered throughout the whole design and testing process. These are just some things that I've picked up throughout the process of developing our work. Some key things to consider are: keep it concise (we aim for a reading age of nine years old, and I'll talk about that in a bit more detail next); use high contrast colours; and don't rely on colour to relay important information, and the same goes for bold.
Any imagery should be illustrative and not aim to convey meaning as a standalone; it should be secondary to what you're trying to communicate. And test with users with different needs where possible. As I say, I'm not an expert in this field, but these are some of the things I've picked up along the way from doing this work. At the ONS, we're very lucky to have over 600 field staff, including telephone and on-the-ground interviewers, who have a whole lot of knowledge that we can draw on, so they form a really important part of our discovery phase. Often in the initial stages of developing a new product or a piece of material, we'll run workshops or focus groups with field staff to explore the current challenges in the current mode, and to understand the techniques and messages they use to overcome those barriers. If you work in an organisation with this kind of setup, then field staff are a really good starting point for this kind of work, and we do this as part of our questionnaire redesign process as well. Moving on to talk about the design process itself. First of all, to ensure that we are researching and developing in a user-centred way, we follow the 10 principles set out by the Government Digital Service. Although the word digital is in there, as I've already mentioned, the guidelines have recently expanded to include offline products as well as online, covering the end-to-end journey of a user. Whether it's filling out a tax return or applying for a new driving licence, it still often involves some kind of paper communication too. Our surveys involve sending people a link to get online and log into the survey to do it. Some of the key principles that apply to the materials are: start with user needs; iterate, then iterate again; be consistent, not uniform; and make things open. Part of this is sharing our learnings with the wider community, such as through the webinar we're doing today.
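Coming back to the high-contrast advice for a moment: contrast can be sanity-checked against the WCAG 2.1 formula for relative luminance and contrast ratio. This is a minimal, purely illustrative Python sketch (Python is my choice here for illustration; it's not a tool used in the ONS work):

```python
def _channel(c: int) -> float:
    # Linearise one 8-bit sRGB channel, per the WCAG 2.1 definition.
    c = c / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(rgb) -> float:
    # Relative luminance of an (R, G, B) colour, 0.0 (black) to 1.0 (white).
    r, g, b = (_channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg) -> float:
    # WCAG contrast ratio: (lighter + 0.05) / (darker + 0.05), ranges 1 to 21.
    lighter, darker = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white page gives the maximum ratio of 21:1;
# WCAG AA asks for at least 4.5:1 for normal body text.
print(contrast_ratio((0, 0, 0), (255, 255, 255)))
```

A mid-grey like (119, 119, 119) on white comes out just under the 4.5:1 AA threshold, which is exactly the kind of marginal case a quick check like this catches before testing with users.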
Hopefully some of this will become a bit clearer once we move on to talk through some of the examples later in this presentation. When we started this work, we obviously had some existing business-as-usual letters. However, we took a blank-page approach and started from scratch. We went to our legal department and asked them what we legally need to say to people we invite to our studies. Then we went out and asked users, potential respondents, what they would need to know when invited to take part in an online survey. This helped us build up content for our materials based on what we know users need to know about, rather than what we think they need or want to know about, which is the approach we've taken in the past. I'm going to talk briefly now about some of the methods we use to test our materials. Hopefully most of you will be familiar with these research methods, and there are plenty of resources available to read more about them if you're not. To start with, pop-up testing is a method that we use early on in research, where you conduct a short interview or conversation with people, often in public or semi-public spaces. It's best used as a tool in the early stages of the development process to gain quick insights, test a high volume of ideas, and help develop the next iteration of your product, which might then involve more in-depth or more expensive research like focus groups or interviews. Essentially, it involves approaching members of the public and asking for their feedback. It's important that research questions are clear, focused and not too time-consuming. In the past, we've used a box of sweets to attract people over by way of incentive, which always seems to work. Pop-up testing is a cost-effective way of getting some quick and useful feedback, helping you identify users' needs and take your products forward. There is actually guidance from GDS on how to run pop-up testing and things you need to consider.
I've included the link in the notes as well, so if you're interested in finding out more, you can go there. Focus groups are a traditional qualitative research method that most people will have heard of. We often run quite big groups to get a variety of people involved, but we'll often do exercises so participants end up discussing things in smaller groups and feeding back. We've found this to be a really good way of testing out our materials and getting lots of feedback from different types of people; definitely one to consider. We also do interviewing, and the type of interviewing we do is cognitive. Cognitive interviews are traditionally used for questionnaire design, but we've applied the technique to materials development as well. We often do this type of research in people's homes, where it's a bit more realistic and a bit more comfortable for the participants. We observe them and ask them to think aloud when they're reading something or answering a question, and then follow up and probe retrospectively on things that they said or did. We might ask them questions like 'How clear or unclear did you find that sentence?' or 'How easy or difficult was the information to understand?', and follow up with some more in-depth probes depending on their responses. Again, there's lots of literature and there are training courses available if you want to learn more about this technique, and the webinars by my colleagues that I mentioned discuss it in a bit more detail. We then qualitatively analyse our data. We conduct thematic analysis, so we'll transcribe, create memos as we go along, group the themes, and then make recommendations for changes and redesign if necessary, to take out for retesting. I'm going to talk now about some of the tools that we use to help us develop our products. First of all, as I said, when we talk about users, we're talking about our respondents, who are more often than not the general public.
These are a widely accessible group of people who we can use to make sure the designs we send out, whether questionnaires or letters, work for them. So where possible in our research, we try to establish what kind of vocabulary or words people use to describe whatever it is we are trying to convey. By trying to write in a way that uses familiar language, we help reduce burden and cognitive load. Once we've tested a word and established that it works and makes sense, we try to recycle it as much as we can across different mediums, whether that's the letters, the leaflets, the questionnaire or the website; it's about creating the common tone and common language that we aim for. Some of the tools we use to assist with this include hemingwayapp.com, a readability calculator you can use to sense-check where your sentences might be a bit tricky to read, and to help you identify where to make improvements. It's certainly not perfect. Our materials are proofread by our comms team as well, because we're lucky enough to have one, and I'd recommend working with a colleague or peer, or perhaps someone not as close to the work as you are, to help spot mistakes. Microsoft Word has a built-in tool to calculate the Flesch-Kincaid grade, and we aim to write for the age of a nine-year-old, as I said before. This is because by the age of nine, most people have built a vocabulary of 5,000 words; they stop reading words letter by letter and start recognising their shapes instead, which allows people to read much faster. I've linked to a good blog here on the slide which has more details about how people read and why things should be laid out in certain ways.
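The Flesch-Kincaid grade that Word reports comes from a fixed formula, so you can reproduce a rough version of it yourself. This is an illustrative Python sketch with a deliberately naive syllable counter (it's not the tool we used, and the syllable heuristic will miscount some words, e.g. silent final e):

```python
import re

def count_syllables(word: str) -> int:
    # Naive heuristic: count runs of consecutive vowels (min. one syllable).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    # Flesch-Kincaid grade level:
    # 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words)
            - 15.59)

grade = flesch_kincaid_grade("We would like you to take part. It is quick and easy to do.")
```

Short sentences built from common one- and two-syllable words score a grade of around 1 to 2, i.e. comfortably below the reading-age-nine target, which is the kind of quick check this formula is useful for.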
To help with understanding the accessibility of your product, VisionSim goggles or VisionSim apps offer a way of illustrating what it might be like for a person with visual impairments to use the product you're developing, and this goes for offline and online products. Obviously this is no substitute for testing with real users, but it can help in the prototyping and design stage and highlight any major issues early on. So if you have access to something like that, it's a really useful tool. Seeing AI is another application that I've come across recently. It's now available in the UK, and it basically converts physical documents into audio so that they can be read aloud: you can take a picture of a letter and it converts it into a PDF-type document. I won't demonstrate this now, but if you search for it on Google or YouTube, Microsoft has made a video that demonstrates all of the app's features. It's currently only available on iPhone, I think, but hopefully we'll see it on Android soon as well. We've used this to see how our letters perform, and it's helpful in identifying places where we might want to move things like instructions around, or relabel them. For example, we have a step one, step two, step three in our letters, and if that's not in the correct order, the audio will read it in a bit of a jumbled, illogical order, so it's really helpful in identifying where you can make those quick improvements. Google Trends is another tool we've used in the past. It allows you to compare the popularity of search terms based on countries and time periods. So when deciding what to call an incentive for one of our experiments, we compared different terminology to see if we could find the best language to use.
The incentive was what I would personally call a tote bag or a reusable shopping bag, but other people had different ideas, so we compared the popularity of 'tote bag', 'canvas bag', 'reusable shopping bag' and just 'shopping bag' in Google Trends, and based on this we found that 'tote bag' was the most common term. However, when we took this out to focus groups, the term was not understood by everyone, so the members of the group were asked what they would call the tote bag, and we have since changed the wording to 'reusable bag' so it's clear what we're referring to. There could be a number of reasons why 'tote bag' came out as more common yet was poorly understood by a particular demographic. Firstly, we know at the ONS, because we have the data, that younger people in Britain are heavier internet users than older people, so we'd expect Google Trends to be biased towards them. Also, it's important to think about what Google is used for. Sometimes it's for shopping, sometimes it's for research, and in both of those cases it would be more logical to search for a tote bag than for the other terms; people might be looking to buy a tote bag, for example, rather than a shopping bag. Google Trends clearly has some limitations, and there are lots of things you should think about before relying on it as your sole tool. However, it can offer a starting point with some evidence behind it, rather than simply plucking something out of thin air, as long as you validate with other methods, like we did in our focus groups. So just to recap our research approach: we use social and user research on our materials and engagement strategy to make sure that, firstly, they're understood; secondly, that they enable the user, or the respondent, to conduct the activity they're being asked to do; and thirdly, that they feel motivated to take action.
Our innovative approach to putting users at the heart of the journey ensures that we're developing products that meet the needs of the business as well as the users, and that we're not leaving any users behind or having to go back and revisit them. Getting this right not only takes time, it also takes failures; for us, trying things out that don't work and starting again is still learning, and we've taken an iterative approach to this work, which is illustrated by the circular part of the diagram on the screen. The final stage for us, in my team, is to produce materials to be quantitatively tested. This often involves some kind of experiment where we test different conditions, and I'll provide a few examples of those later in this presentation. But most importantly, our research is about finding out what works, not what's popular, and remembering that we are not our users, which is why it's really important to go and test with people who are. So I'm going to talk through and show you a few qualitative examples of things we've tested and found, which I think might have some wider benefit. One of the several things we explored in our qualitative testing was tone. We took out three different versions of a letter: one which was really friendly, one which was more authoritative, and one that was somewhere in between. We found that users expected us to be somewhere in the middle. Not overly friendly and pally, which a lot of modern companies can get away with these days; people said no, that's not what we expect from an official organisation. But they also didn't expect us to be using the Queen's English and be super authoritative. So that was really helpful. And in these letters, as well, you might be able to tell by looking at them.
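When quantitative experiments like the ones just mentioned compare response rates between conditions, a standard check is a two-proportion z-test. This is an illustrative Python sketch with made-up counts (not ONS code or data; the 520/2000 vs 580/2000 figures are hypothetical):

```python
from math import sqrt, erf

def two_proportion_z(successes_a: int, n_a: int, successes_b: int, n_b: int):
    """Pooled two-proportion z-test; returns (z, two-sided p-value)."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    phi = 0.5 * (1 + erf(abs(z) / sqrt(2)))  # standard normal CDF at |z|
    return z, 2 * (1 - phi)

# Hypothetical example: 26% response in condition A vs 29% in condition B.
z, p = two_proportion_z(520, 2000, 580, 2000)
```

With samples of 2,000 per arm, a three-point difference in response rate comes out significant at the 5% level; with the much smaller samples typical of early pilots, the same gap usually wouldn't, which is why the small qualitative and mini tests are treated as indicative rather than conclusive.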
We also explored some behavioural insights ideas qualitatively and got feedback on different components. On the letter on the left we've got a sort of commitment device that people can cut out and put on their fridge as a reminder; on the middle letter we used icons, and we put headlines on all of the letters; and on the right-hand side we used a picture of our Director General at the time. Generally, people told us they wouldn't use the commitment device. Icons went down really well, and you'll see shortly how this is something we took forward. The picture of the Director General, which is that messenger effect I talked about earlier, was quite emotive for people, and there were comments that it looked a little bit like a letter from a politician, which is obviously something we want to avoid. We also looked at infographics and how we convey complicated processes in a digestible and easy-to-understand way. In an existing leaflet we tested, we had version one, and we knew version one didn't fulfil its purpose and explain the process very clearly, so we used our insight from testing to design a new one. Version two was a mock-up created in Paint to try and show the design team what we wanted, and once we sent that to them, they returned version three. We took this out for testing and subsequently made some changes. Step one showed a nuclear family, so we changed this to something more inclusive, because for people who are not in that type of household it didn't speak to them; they didn't relate to it. Step two showed a single computer screen, and people thought that they could only take part on a desktop. At the ONS we work really hard to make the survey accessible on all devices, designing for mobile first, so we added the extra tablet and phone there to make it really clear that people could do it on different devices.
Step three was a little bit too abstract, so we added people to the dots to show the data going from people to the ONS. Step four people thought looked quite good, but unfortunately not all of the public knew what a bar chart was, so we replaced it with numbers, which were easier to understand. And then in step five we had one target over England, but, as we test all over the UK, we found that there were some tensions about the south of England getting an undue amount of attention, so we wanted there to be more targets, reflecting decisions being made all over the country thanks to people taking part in the survey. We've had a really positive reception to this infographic, and the image here shows how the diagram has been applied in practice on our A5 double-sided leaflet. We've had various iterations, and we recently changed the colours to make it clearer and more accessible, and throughout the testing it's something we consistently get positive feedback on, even when we don't probe or ask about it.
The padlock on the right-hand side, on the back of the leaflet (the left is the front and the right is the back), is another interesting thing we found in our testing. By including the padlock, people felt really reassured. Our confidentiality statement is something we were told we couldn't change, and, a bit like terms and conditions, people said they would be unlikely to read it, or would just skim-read it. By adding a padlock, highlighting the statement in a box and not making it super small, people said that they felt reassured and didn't think we were trying to hide anything. You can also see on the leaflet where we've used a sort of nudge theory: in the box at the bottom left it says that to take part all you need to do is complete step two, and then we've highlighted a box around step two to join those two things up. This is an example of endowed progress, making the process look easy for the respondent, because all they need to do is one thing out of those five, and they've already done step one. Our envelope research offers another example of iteration. Initially we took envelopes out to an expert panel, and then on to some pop-up testing, to get quick feedback on lots of ideas; this is an example of where we've applied that method. The envelopes were then refined and tested further in focus groups, and the final designs were created for the quantitative test. We've used behavioural insights nudges on these letters, and this was some learning we took from the Australian Bureau of Statistics, who were doing some experiments on messaging across their different materials at the time, and we wanted to explore this in our context. Our surveys are a little bit different: they're voluntary, whereas in Australia they're often mandatory. Without doing this testing, we might have ended up choosing what we liked best, or what we thought would be best received by the public, rather than choosing something people actually felt they would respond to. By listening carefully and considering how people felt when they saw the different envelopes, we were able to avoid taking something forward that could have potentially provoked negative feelings, which could have been detrimental to our survey response. Just to illustrate that: in England, for example, we really struggled to find branding that was received positively. We tried things like an English rose and a lion, but most had negative connotations, so in the end we went with something much plainer that just had a simple call to action on it. These are the envelopes we used in labour market survey test one; they were sent out in white and brown, and I'll touch on that in a moment. We used statements such as 'Scotland, make sure you're counted', 'Wales, make sure you're counted', and 'Play your part in shaping the UK' for the England envelopes. I also wanted to talk about something a bit more recent. In recent months we've been developing products for an online-only attrition test, to explore whether sending communications in between waves of the survey can help retain respondents. We're exploring the difference between three groups: a no-communications group, an email group and a postcard group. We've done quite a bit of qualitative work to develop these products. The first step was looking at what types of communications would be expected by respondents: we included some questions in the LFS dress rehearsal in 2017, and also did some telephone interviews with previous respondents. In this research we found that people wanted to find out about how their data is used and about the study results, and we also found that they'd like to be thanked. Working with these ideas, we then did a series of focus groups exploring different ideas around data presentation, and once we had refined that, we did some further one-to-one interviews. So we ran a number of focus groups, the aims of which were to find out
what types of information respondents would engage with, and to establish the best forms of presenting data. Here are some pictures from a focus group where we asked participants to sort materials on two axes: from clear to unclear on one, and from engaging to unengaging on the other. Doing this sorting exercise really helped us narrow down the options to take forward into future development and testing, because ideally we wanted to aim for the ones in the top right-hand corner, which were both engaging and clear. From this research we identified a number of user needs. First of all, the feedback, or the presentation of the results, needed to be more engaging: bar charts in particular were said to be very unengaging, and users commented on the use of space, with too much white space making it look like we hadn't put much thought or effort in. The facts needed to be consistent in terms of themes and topic and not jump around too much, and the research really helped us identify problematic terminology that people didn't understand. The reference points we used needed to be relevant to users. For example, one of the facts we tested was 'Did you know the number of people in employment in the UK is higher than the total population of Australia?'. We did this to try and make the facts a bit more tangible and a bit more informal; however, it only really worked for those who had been to Australia and could comprehend its size. We tested a few different versions of that on varying scales, and unless the person was familiar with the reference point, they didn't really work, and it's difficult to find something that works for everybody. Really, people wanted plain and simple facts: not dressed up, just clear and accessible. They did say comparisons also help make the data more meaningful. So these are some of the things we took forward into the different experimental conditions that we are creating these materials for. There were some things we had to think about in creating between-wave engagement for a quarterly household survey. First of all, the timeliness of data release meant that we couldn't reference the actual quarter that the respondents took part in, as this data wouldn't be released in time for printing. For the experiment, we needed to keep the content and message the same across the email and the postcard in order to measure the effects, so we had to design for the postcard first, because of the amount of space available. We also had to consider online accessibility for the email, because this affected how we could lay out the information, so we had to design for both formats. Another interesting thing we found was that people felt citations were important, as they helped them validate the information; however, too many asterisks led them to believe there would be too many caveats. So we've tried to stick to one data set and one citation, which also helps us with space. This is what we've ended up with. It has just gone out into the field now, so we don't have any results yet, but we've tried to create a consistent user journey across waves: the incentive that respondents would have received had something similar in design and feel on it, so we're hoping that allows the respondent to join everything up into one. And this is just a quick screenshot of the email, to show how we've had to change the layout compared to the postcard. I'm very aware of time. As part of our discovery work for the between-wave engagement, we came across the opportunity to conduct a mini A/B test on subject lines using the office's email platform. These are the two subject lines that we tried. We sent them out to a sample of 400 people who had recently completed the online labour market survey as part of another test, and the email platform randomly sent either email A or email B to each respondent. Drawing on the feedback from the qualitative testing, we tried subject line A, and then, in discussions with the behavioural
insights team at the ONS, we decided to try out something different: something much shorter, so that when it appears in the inbox on a phone you can read the whole thing, whereas with subject line A you can't quite see all of it. We also tried to come up with something that was quite intriguing. And then we had some results, and this is quite interesting. The sample was obviously quite small, so it isn't the most robust A/B test we could do; however, we gleaned some really valuable feedback. As you can see, out of the two emails, B, 'You've been counted', had the most opens, so we're going to use this in our large-scale attrition test. We were also able to get some other metrics from this test: we found 97% of the email addresses given were valid, and only 2.3% of emails bounced back. The highest number of opens for email B was 13, and for email A it was 8, and those who responded online were more likely to open the email than those who responded face-to-face, which is perhaps what we might expect. We had no unsubscribes and no phone calls made to our survey enquiry line asking why people were sent it, or complaining about receiving it, so that was really positive for us as well, and we hope this is replicated in the large-scale test. So I'm going to talk briefly about some of the large-scale tests we've done in the past in relation to the materials, to illustrate what all this work feeds into. The tests started back in 2017, so some of you might already be familiar with the results, but bear with me while I explain for those of you who aren't. In 2017 we did an uptake test to explore how many people we could actually get online, and we tried different mailing strategies as part of this. Group 1 was sent an invite and a reminder; group 2 got a pre-note, invite and reminder; and group 3 got an invite and two reminders. You can see that the third option produced a higher uptake rate; however, we actually found the second option to increase
timeliness of response: being given a heads-up primes people for the survey. We've taken this option forward in particular for the Labour Market Survey, because the timeliness of data collection is really important for a quarterly survey. Also, something to note is that in the qualitative testing, people continuously tell us that they don't want a pre-note; they say they just want to get on and do the study and not be told about it beforehand. But obviously, as our results demonstrate here, it's proved to be a really effective strategy for priming people, and this highlights the importance of doing quantitative testing if you have the opportunity, because then you're able to measure people's actual behaviours versus what they say they do or want. So that's been really good, and we've taken those findings forward. This is just an overview of the package of materials that we sent: the pre-note on the left with the leaflet; in the middle, the invitation letter with the incentive slip and the incentive, which has been a tote bag; and then the reminder letter on the right. This is a matching suite to promote confidence and legitimacy in the brand, and consistency between online and offline: we're trying to create a similar look and feel across the different touchpoints and, as I mentioned earlier, recycle that language where possible. As part of this uptake test, we also trialled different-coloured envelopes and different branding on envelopes, which I talked about in the qualitative research earlier. We found the brown envelopes outperformed the white envelopes, and we found a statistically significant result for envelopes with Welsh branding in Wales versus the plain branding. We didn't see this replicated in Scotland; however, it was around the time of the referendum that we conducted the experiment, so that may or may not have had an effect, but it's something to consider. We also looked at mailing on different days of the week, so we trialled
a Wednesday second-class mailing, meaning the letter lands on a Friday or Saturday, and we trialled a Friday mailing, where the letter would land on a Monday or Tuesday. We saw spikes in completion on landing days, particularly over the weekend, and we assume this is because people have more time available to complete the study then, as opposed to on a weeknight or when they're at work. Later in 2017 we conducted another test, which was focused on incentives; again, there's a link to the full report on the slide. We used the mailing strategy that we had identified to be most effective in the previous test, and we found in this experiment that the £15 total incentive value produced the highest response rate. The unconditional £5 was also high, but the tote bag was actually the most cost-effective compared to sending an unconditional £5, and obviously £15 is a very high incentive for a large-scale survey, so that's something we've taken forward as well. So this is an example of an experimental test we're currently running involving materials, and this is where the email and the postcards that I spoke about earlier fit in. The primary aim was to explore the effectiveness, for survey response, of sending engagement materials in between the waves of the LMS. This is just a three-wave experiment, but in reality the current LFS is five waves, and there's not much literature on between-wave engagement, so we hope this is going to be a really valuable study in understanding its effects. We're also looking in this experiment at whether or not a pre-notification letter is needed at waves 2 and 3, and how effective it is. As I mentioned earlier, we found this to be effective in increasing timeliness of response, which is important, but if people have already taken part it might not be necessary. So, on to the other qualitative work we've got in progress. I'll run through this really quickly, because I'm very aware that we're running out of time and it'd be good to answer some of your questions. So,
interviewer calling cards. As I say, we're building a push-to-web approach for our online Labour Market Survey, with face-to-face follow-up for those who did not or could not complete online. So, to ensure the user has a consistent experience throughout the survey journey, we're developing interviewer calling cards that match the tone and style of our invitation materials. These were trialled in LMS test 3, which was an experimental test to do with conditions; it was just an opportunity for us to create something that was used by Ipsos MORI interviewers. The next stage of this is to test with the public, test with our own field force, and make them fit for purpose for another, smaller operational test. Again, we did something creative: we developed this calling card as a one-off, using learning from the 2017 Census test, where they carried out a behavioural insights experiment. As I say, this wasn't used as part of a quantitative experiment; it was more playing with ideas, as we had an opportunity to develop what we've called a nudge-to-web card. This is where the interviewer just posted the card through the door, with no face-to-face follow-up, and it uses different behavioural insights, such as a commitment device, salience and endowed progress. As I say, another part of the communications includes the ONS social survey website pages, and we're in the process of testing and updating these pages using some of the research findings from testing materials, trying to recycle that language across all mediums. For example, we've changed the title from 'Information about our household surveys' to 'Our studies: what you need to know', which reflects the title of the leaflet that we send out. So, okay, just to recap then, some top tips for developing materials. First, identify who your users are; establish what their needs are and what they need to know. Design and prototype: mock up different ideas, get out and research them, get that feedback, iterate, and don't
be afraid to drop things that don't work. Recycle content where possible: it not only helps you, but also helps the user become more familiar with your tone, your brand and your style across the different platforms. Draw on expertise from people outside your own field; for example, I've worked with user researchers, designers, communications experts, accessibility experts and behavioural insights specialists, to lean on them and use what they know to inform how we've designed these materials. And most importantly, don't leave it until the last minute: research takes time, but so does professional design, so does printing, so does dispatch, so it's really something that should be considered upfront, and it's obviously really important for getting people online in the first place. I've included some useful sources and references, with links to the various blogs I've mentioned. And just to say, as I've mentioned as well, Emma Dickinson's webinar is taking place on Monday, and that focuses on redesigning the socio-demographic questions for household surveys.
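[Editor's note] As a small aside on the subject-line A/B test described earlier: the speaker's caveat that the sample was too small to be robust can be sanity-checked with a standard two-proportion z-test. This sketch assumes the 400 recipients were split roughly evenly (about 200 per arm) and reads the figures of 13 and 8 as total opens per arm; neither detail is stated explicitly in the talk.

```python
import math

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sided z-test for the difference between two proportions.

    x1/n1 and x2/n2 are successes/trials in each arm; returns (z, p-value).
    """
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)               # pooled open rate under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))    # two-sided normal tail area
    return z, p_value

# Email B ("You've been counted"): 13 opens; email A: 8 opens;
# assuming ~200 recipients per arm (hypothetical split of the 400).
z, p = two_proportion_ztest(13, 200, 8, 200)
print(f"z = {z:.2f}, p = {p:.2f}")
```

With these assumed numbers the p-value comes out well above 0.05, so the difference in opens is not statistically significant, which is consistent with the speaker treating the result as directional feedback rather than a conclusive finding.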