All righty. Welcome, everybody. I really want to thank everyone for taking the time this afternoon to hear from Catherine and myself as we reflect on the rebuild of classification.gov.au on GovCMS SaaS. The site launched about four weeks ago, which is pretty cool, because I was actually getting a little concerned it might not make it live in time for the conference. But it was also good in a lot of other ways. It was a pretty long journey, and I think Catherine and I have got some scars from the project, but we're there now, and that's what really counts. Today we can share some of those trials and tribulations and, more particularly, the lessons we picked up along the way. I know a few people in the room, which is cool, but obviously I don't know everyone, so by way of introduction: I'm Paul Morris from Salsa Digital. I've worked at Salsa Digital for probably 10 years now, so as Alfred knows, it's a long time. I've been delivering IT projects for 25-odd years across different states of Australia, and sometimes internationally, in the US, Singapore and the like. On this particular project I was the engagement manager, and I worked closely with Catherine as the product owner; I think we made a pretty good team there, Catherine, and I'll mention more about that. A little fun fact: some people know this because I'm quite proud of it, but I was actually born and bred in Hobart, so I'm really, really pumped that Drupal South is here this week. I haven't lived in Tasmania for 25 years now, but I still call myself Tasmanian and identify as that, and I'm really proud of it.

Hi, everyone. I'm Catherine Driessen. I work at the Department of Communications and the Arts, where I've been for the past 15-plus years, working in various roles across web, IT and communications. I've spent the last few years managing or working on GovCMS projects, the first being the migration of the department's site back in 2014, and the latest being the classification website, which, as Paul mentioned, launched a few weeks ago. My role is often project manager and product owner, and for the classification project I spent quite a lot of time doing both of those roles.

So here's an overview of what we're going to cover today. I'll try to give a really quick rundown of the project: what is classification, what were the key issues we needed to solve, and what were our goals. I'll then hand over to Paul, who will touch on just a few components of the project. He'll cover the GovCMS assessment we conducted before jumping into the bigger build, the user testing approach we applied at the design stage, how we applied the Australian Government Design System, and the delivery methodology. We'll then briefly touch on some of the key challenges and lessons learnt along the way, and we'll finish with a demo of the site to highlight some of the key functionality. We're also happy to answer questions at the end, or feel free to approach Paul or me if you want to discuss anything in more detail.

So what is classification? In short, Australian classification is responsible for the classification of films, computer games and publications. I'm sure most, if not all, of the people sitting in this room have come across classification ratings in one form or another: when you've been to a movie, watched a show on Netflix or bought a computer game.
A film might be rated G or MA 15+, and the ratings might be accompanied by consumer advice, such as mature themes or coarse language. It's these ratings and consumer advice that assist people in making informed decisions about the content they engage with. The classification site provides a record of all classification decisions dating back to 1972, so that's approximately 1.8 million records, which is a lot. It provides information for both industry and the public, and both audiences need to view the records as well as the ratings.

Going back to the start of the project, we were facing some really clear issues. The site was last redeveloped 10 years ago, and we knew it was not meeting user needs or providing a positive user experience. It was dated, content was unstructured and duplicated, and there were a number of accessibility issues. We kicked the project off by doing some user research, which confirmed what we already knew, but it also hugely informed the requirements for the new site. It gave us a better understanding of who our users were and the information they were looking for. In addition to the external-facing issues, we were also facing some internal technical challenges: the existing site was on a legacy SharePoint platform, and it was connected to a number of interrelated legacy systems as well.

So our goals were pretty straightforward. The overarching one was to redevelop the classification website to meet the DTA's Digital Service Standard. By building the new site to the DSS, we could ensure we were addressing the key usability and accessibility issues. We also wanted to create a modern and relevant design, improve the IA and content, and enhance the key user journeys, particularly the experience of searching for and discovering classification titles. We also knew we wanted to move the site to GovCMS. The department had a number of sites on the GovCMS SaaS Drupal 7 platform, but what we didn't know was whether we could achieve what we wanted with SaaS Drupal 8, particularly in relation to the data integration and the possible use of an API. It's at that point that we engaged Salsa Digital to conduct a GovCMS site assessment, which set us on our journey to build and migrate the classification website.

Thanks, Catherine. So as Catherine mentioned, we did embark on a site assessment. Salsa has a fairly standard methodology for site assessments; we've done assessments for GovCMS SaaS before, and for other platforms as well. Essentially what we do is build a list, or matrix, of project requirements. In this case we had 79 requirements to assess. On the right-hand side you can see a set of complexity rankings, and we assign each requirement a ranking from low to extreme. The greens are the low ones, and yellow is mid-range. In the case of GovCMS, the greens are all config-type exercises, so completely achievable with GovCMS, as is yellow, which is a theme-based change. If we start getting into orange or red, that's when things get really hairy for GovCMS SaaS, because it involves highly complicated work, and that's what we really wanted to weed out in the assessment: have we got any oranges or reds, and if so, what do we do? In terms of the actual assessment results, firstly, most of the requirements were a fit for GovCMS SaaS D8. 57 of the requirements could be achieved with just configuration.
13 were theme-level changes, so standard GovCMS work, so that's all good, but there were five gaps. In terms of the gaps, we had to have a plan for each of them. Data import was a showstopper gap; scheduled publishing and site notifications were also gaps, but they were very minor, so I'll only talk about the data import here. The data import was a showstopper because classification.gov.au requires a feed of recently classified titles: titles that have recently received a classification are fed into the system via the API that Catherine mentioned, sourced from the national classification database, and we had to prove this could be done. The movie posters that you'll see against titles in the demo come in via the same API, so getting that working was absolutely critical.

How we set about seeing whether it could work was to do some detailed module assessment and build a proof of concept. This was outside GovCMS at this point, but it used the distribution, just to make sure it worked. So we built a working proof of concept that could ingest the data we were talking about. At that point we were confident it could work, in fact we'd proven it could work, but we still needed some dialogue with GovCMS to see whether our approach was something that could be put inside the SaaS solution. With the likes of Nathan, Toby, Theresa and others, we talked through the actual modules we'd chosen in detail. We had chosen the Migrate module set for the proof of concept; we also looked at Feeds and some other modules. Anyway, we had a good conversation, and we got in-principle agreement for those modules to make it into the distribution. That was a real relief, because our one showstopper blocker was essentially ticked off, the modules would make it into the distribution, and we had a viable path forward. I'll put up a rough sketch of what that kind of migration configuration can look like in a moment. So that was the assessment.

We also did user research, and it was quite extensive. For the user research we partnered with Today.Design, and they were absolutely awesome, both in their user research and in the broader UX work. The first part of the UX was really cool, I thought. We created some paper-based templates of components, and we had big sheets of paper, like the signs you can see over here, and we manipulated these elements on the page to quickly arrive at possible homepage, listing page and detail page designs. So we were very quickly able to try permutations, discuss, and so on. This was just the project team, about four or five groups, trying to get the best ideas, but it did allow us to arrive at an initial design, and from that initial design we could create our first interactive, clickable prototype. This prototype is what we took to external testing, where we looked at two user group types: citizen and industry. The citizen scenarios you can see here are really concerned with age-appropriate content, while the industry testing was very concerned with drilling deep into the classification details and making sure they can see the details they need; we'll see some of this in the demo. So we had a strategy, and we had to do some recruiting: six citizens and six industry participants were recruited, and then we sat them down with the prototype to run the tests.
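Before we get to the test results, here's that rough migration sketch I promised. To be clear, this is not our production configuration: it's a minimal, hypothetical example of how a Drupal 8 migration, assuming the Migrate Plus and Migrate Tools contributed modules alongside core Migrate, can pull records from a JSON API into nodes. The endpoint URL, item selector, field names and content type machine name below are all made up for illustration.

```yaml
# Hypothetical sketch only: a migrate_plus migration config entity that
# ingests "recently classified titles" from a JSON feed into nodes.
# The URL, selectors and field machine names are illustrative, not the
# real classification.gov.au configuration.
id: recently_classified_titles
label: 'Recently classified titles (sketch)'
migration_group: classification
source:
  plugin: url
  data_fetcher_plugin: http
  data_parser_plugin: json
  urls:
    - 'https://api.example.gov.au/classifications/recent'
  # The array of records inside the JSON response.
  item_selector: decisions
  fields:
    - name: decision_id
      label: 'Decision ID'
      selector: id
    - name: title
      label: 'Title'
      selector: title
    - name: rating
      label: 'Rating'
      selector: rating
    - name: consumer_advice
      label: 'Consumer advice'
      selector: consumer_advice
  ids:
    decision_id:
      type: string
process:
  title: title
  # Map feed values onto (hypothetical) node fields.
  field_rating: rating
  field_consumer_advice: consumer_advice
destination:
  plugin: 'entity:node'
  default_bundle: classification_decision
```

With Migrate Tools in place, a config along these lines can be run, and re-run on a schedule, with something like drush migrate:import recently_classified_titles --update, which is broadly the kind of thing our proof of concept had to demonstrate. Anyway, that's the data side; back to the design work.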
In terms of the test results, we had really good general feedback. I'm not sure if you can read it, but all those post-its up there are positive thoughts about the site. People loved the design in a general sense, and that built confidence in our design. The actual scenarios I mentioned for citizen and industry also tested well in terms of the flows through the system; I think we had something like 13 greens and maybe a couple of yellows, certainly no reds. So the flows themselves were good, valuable to understand, and largely ratified. The outcome was that we could compile a series of key recommendations, and our design could move to a production design, something we could build upon.

We mentioned the design system, so I just want to quickly talk about what a design system is. There's a whole other track on design systems, so I don't want anyone telling me I've got this wrong, though they probably could. Anyway, I'll just give my perspective based on my research and what we did with the design system. A design system is a set of existing components that lets a design be assembled, and they're also reusable components that allow a foundation build to be compiled. The Australian Government Design System comes from the DTA, and we used that design system; I mentioned it with the paper cutouts, for things like buttons, accordions, navigation cards and many others. I've ringed a couple here, but there's a whole matrix and many more: go to the site and you'll see them. These are really, really useful components. Why are they useful, and why would you use a design system more generally? Because the components are proven and standards compliant. They're user tested, they're accessible, they're maintained, and there's also a community that sits behind them. That community idea was really powerful and something we wanted to embrace, so we made sure we'd give back to that community where we could. And in the lead-up to launch and since going live, things have happened: our project is now on their community showcase thread, we contributed our design Sketch files, the latest sprint they're doing is actually adopting our pagination component, and some of our other components are also under consideration, in particular around search.

I also wanted to quickly talk about the delivery methodology. I'll probably fly through this; anyone can ask me more about it afterwards. There's a lot of detail there, so don't worry too much, but just know that Salsa only uses Agile, so this was all Agile. This was the initial sprint plan we compiled when we first set up the project, even before discovery, to try to size the project. We were looking at two design sprints, the yellow columns on the left, and five build sprints, the blue columns, followed by go-live and BAU. But this was just a plan, and when you get into an actual project, things change and you need to adapt. What happened is we did go with two design sprints, but we actually had eight build sprints, so we had to cater for a lot of change. Some of the changes we had to grapple with were extra creative scope and some functional scope: the search itself was complicated to build to match the current site, the API itself changed as we went along, which took some work, and we invested in extra pre-production testing.

In terms of challenges and lessons, I'll just fly over these so we can get to the demo. Firstly, credit to Catherine.
An empowered and committed product owner is absolutely key. In Catherine's role, she made decisions, or brokered decisions with the business, and it was really exceptional how quickly she could turn those decisions around for our team. Her commitment was above and beyond, so that was great. Investing in user research: we explained that, and it was great. Early integration and a proof of concept: I spoke about the proof of concept we had for the data import. The design system and all its virtues. Issues happen, and you need to be agile. And the final one: GovCMS SaaS can support a feature-rich site. That was one of the goals, Catherine, that we had at the start. Now we've got the demo.

Okay, so we'll just do a very quick demonstration of the site, and what we might do is jump straight into a classification record. Here we've got the movie Frozen, because everyone loves Frozen, and most of the content that you're looking at on this page is being sourced from the API. Let's quickly jump across. Catherine loves this stuff. I love data. Here's all of the data coming from the API, and basically it looks like what you can see on the Frozen page; that's how it's presented on the site. Let me skip back to the Frozen page. If we do a quick scan, we can see that some of the key data fields are being presented, things like title, classification, date and duration, as well as the title's rating and consumer advice. If we skip down to the bottom of the page, this is where we see a lot of the industry detail, things like year of production, category, producer and alternate titles. If we go to the top, we'll have a closer look at the rating image. This is being dynamically generated: it's pulling in the correct image for each rating, as well as the separate text for consumer advice. Each rating also clicks through to a general description of what that rating means in more detail. In the middle of the page you can see what we call the classification matrix. Again, this is how we come up with the rating: the title is assessed against all of the different themes, things like violence, language or drug use. And again, the matrix itself is generated from the data coming from the API, so the matrix will be different for every title. If relevant, we pull in a film poster. This adds some visual interest to the page, but it also helps users validate that they're looking at the right record. Multiple titles can also share a file number, which means the titles are somehow related, so we've been able to pull these onto the page, allowing users to see what variations exist. For this particular example, we can see that Frozen has a Collector's Edition, it comes on Blu-ray, and there's also a 3D film. We'll skip across to a Search Results page. If we have a look here, you can see that we've put in the key information, title, date and format, as well as a thumbnail image, and we've also got a number of filters down the side to help users further refine their search. The format you're seeing on this page is also replicated across some of the key listing pages, including Latest Decisions and Upcoming Releases. If we skip to the Upcoming Releases page, it looks quite similar. This is all of the films soon to be released.
We can see the film poster, the title and a visual representation of the rating, and we also have the ability to filter by classification again. So that kind of wraps up our demo. Hopefully everyone's intrigued enough to jump on the site and have a look for themselves, and we're happy to answer any questions you might have. If we don't have time for questions, we're happy to just stick around and talk to people if you like.