Thank you, everyone, for the opportunity to present this. This is ongoing work, and I'm very keen to see what discussions lead from it and how we can help contribute. So this is a bit of Geoscience Australia's perspective on matters of quality, and it starts with a story. Sometime in 2020 there was a geologist at a junior mineral exploration company needing to make some very important decisions; we'll call her Jo for the sake of the conversation. Her company was interested in applying for a licence to explore for mineral resources in an area near Tennant Creek, which is basically bang in the middle of the Northern Territory. The area is prospective for resources like copper, which are a very important part of an electrified, low-carbon future. Some data that Geoscience Australia had acquired in the region had just been released, and she would have been very conscious that other companies were interested in it too. So time was critical for her company to put in an application for what's called an exploration licence, permission to go and explore in an area. There was a lot of tension, a lot riding on that, and what it comes down to is basically a question of trust. Could she trust the data that Geoscience Australia had provided? Could she rely on that data in her decision making? And if she needed to poke and prod a bit to test the quality of the data, did she have the information on hand to do so? I'll walk you through some of the background to that story. My name is Keith Surkham, and I'm the Laboratory Director here at Geoscience Australia, in slightly soggy Canberra today.
So what I'll do is take you on a journey through some of the conversations we've been having around leading quality and, perhaps more importantly, around building the culture for having that conversation: getting people engaged and recognising that quality is an important part of the work we do. Again, I'm very keen to hear your perspectives on that as well. I'll quickly run through what the laboratory actually does, so you've got some context, and give you a quick taste of some of the things we're working on now and some of the directions we're trying to go in. So, quickly running through the core capabilities of the laboratory: we have four core capabilities, and I'll run you through some of them, but I'll really emphasise that this is just a very quick highlights reel. There are many capabilities I'm skipping over in the interest of time; I'm happy to talk about them more after the presentation, but I just wanted to give you a quick flavour of the variety and complexity of the data sets we need to work with when we're defining and capturing what quality is. We'll start with sample preparation, because this is right at the start, at the guts of things: we handle thousands of samples a year from all over Australia. If we get it wrong here, then everything else afterwards is rubbish. No matter how wonderful our analysis is, no matter how precise our instruments are, if we're analysing the wrong sample, or always taking the wrong bit of the sample, then the results are rubbish. So our team spends a lot of time here checking samples against lists of field numbers, relabelling things, checking them again, subsampling, and checking again. That's quite an important part of the quality journey.
This area also specialises in grain-size analysis: essentially taking a sample of loose sediment like this, say from the seafloor, and measuring how much is mud, how much is sand and how much is gravel. That's an important data set because it gives you clues as to what sort of ecosystems might be living in that area of the seafloor, and once you've got enough samples you can start building up maps like this across the northern part of Australia that tell you something about the ecological systems that may be living up there. We do that traditionally with sieves, and also with more modern equipment that measures the scatter of light as the sample falls through the water column. Another area is organic and isotope geochemistry. The grain size might give you clues about what's living there today, but the Earth has a history, particularly of life, well over a billion years of it, and that life leaves behind what we call fossil biomarkers. When plants or animals die, they leave behind chemical fossils, basically, which provide a fascinating insight into the history of those environments, how the hydrocarbons in a region formed, and how they're related to each other. We've got instruments like this classic gas chromatography instrument, and from the data these instruments collect we can build up diagrams like this, relating different samples to each other, in this case from different wells across the Browse Basin on the North West Shelf of Australia. As well as being interesting science, because it's amazing to think we can go back and figure out what sort of environment was there hundreds of millions of years ago, this data has important implications for how those resources might be developed.
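To make that mud/sand/gravel split concrete, here is a minimal sketch of how a sample might be classified once its fractions are measured. The function name and the simplified thresholds are my own illustration, loosely in the spirit of the Folk textural scheme, not the laboratory's actual procedure.

```python
def classify_sediment(gravel_pct, sand_pct, mud_pct):
    """Return a coarse textural class from weight-percent fractions.

    Thresholds are simplified for illustration only.
    """
    total = gravel_pct + sand_pct + mud_pct
    if abs(total - 100.0) > 0.5:
        raise ValueError("fractions should sum to ~100%")
    if gravel_pct >= 30:          # gravel dominates the texture
        return "gravel"
    if sand_pct >= mud_pct:       # sandier than it is muddy
        return "sand" if gravel_pct < 5 else "gravelly sand"
    return "mud" if gravel_pct < 5 else "gravelly mud"

# A couple of invented seafloor samples:
print(classify_sediment(2, 68, 30))   # a sandy sample
print(classify_sediment(1, 20, 79))   # a muddy sample
```

In a real workflow the fractions would come straight from the sieve stack or the laser-diffraction instrument, and the class labels would feed the kind of seafloor habitat maps described above.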
It also shapes what industry does with that data. The fourth area is geochronology, microanalysis and mineral separation, which is about working out how old a rock is. The Earth has been going for four and a half billion years, so there are a lot of rocks from that time that need to be dated, and once we know the dates we can work out where rocks sit in relation to each other and what's happened to them. It's a fundamental data set that Geoscience Australia collects. We do it by getting little grains like this, called zircon, out of a rock. They contain trace amounts of uranium, which decays to lead, and we can do the maths and work out how old they are using very big instruments like this one. It's called, tongue in cheek, a SHRIMP, but you're actually looking at about 12 tonnes of stainless steel sitting there, so it's anything but shrimp-like. It does, however, analyse very small grains: this is one of those sand grains, and you can see the little spot the SHRIMP is actually analysing for uranium and lead; to give you context, there's an average human hair for scale. So that's another distinct data set, with isotopic ratios and ages, and very complex calculations that all need to be kept track of and understood through a data quality process. So why do we do it all? I started with the example of Jo from a junior exploration company, but Jo could just as easily be an agronomist wanting soil chemistry information to support pastoral or cropping production, a government regulator wanting reliable baseline information to monitor potential environmental changes, or a community leader in a remote community wanting to understand more about her community's options for a new water supply. This pathway, developed as part of the Exploring for the Future program at Geoscience Australia, helps map out how the data we collect is used by people like Jo to create benefits and impacts right across Australia.
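The uranium-to-lead dating just described rests on a simple decay equation: the age falls straight out of the measured isotope ratio. As a hedged sketch, here is the 238U chain only, with an invented ratio for the example; the decay constant is the standard Jaffey et al. (1971) value.

```python
import math

LAMBDA_238 = 1.55125e-10  # 238U decay constant, per year (Jaffey et al. 1971)

def upb_age_years(pb206_u238):
    """Age t from a radiogenic 206Pb/238U ratio: t = ln(1 + D/P) / lambda."""
    return math.log(1.0 + pb206_u238) / LAMBDA_238

# An invented measured ratio, just to show the arithmetic:
age_ga = upb_age_years(0.3367) / 1e9
print(f"apparent age: {age_ga:.2f} Ga")
```

A real SHRIMP analysis works with several isotope systems at once and carries the measurement uncertainties through, which is exactly the "very complex calculations" bookkeeping the quality process has to keep track of.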
Zooming in on that pathway: all of that systems thinking, all of the data we acquire, all of the talent of our people and the instruments we have, come together to produce a set of data that someone like Jo can use. In the exploration case, she'll make a pick about where to go exploring. If they get lucky, they might find something prospective, attract a bit more investment and keep exploring. If they get really, really lucky, they might find a new copper deposit and gather more investment, and if they get to the point of actually mining, in some situations that brings investment into a region that otherwise might not happen: rail lines, roads, power and things like that. Importantly, for a low-carbon future, some of the minerals they're looking for, like copper and rare earths, are going to be quite vital. The critical point in that pathway, and the reason I bring it up and link it to the discussion we're having about quality, is the leap between outputs and outcomes. We produce the outputs, and we work very hard to get them right, but we can't make people use that data, and we can't make them make decisions using it. Trust is the key part of that leap from taking our products to using them, and using them well, to make good decisions. So that gives you the idea of why we're talking about quality, and it's increasingly the conversation we're having at the Geoscience Australia laboratory with our clients and collaborators. This is work in progress: an attempt to develop a framework that our collaborators can engage with, to understand why we take quality so seriously and why we think it's
important to them and to the broader community. I'll quickly run through some of the concepts we're trying to bring to people's attention, along with a quick example of what that looks like. The first is quality uncertainty. This builds on the work of Simine Vazire, professor of psychology, ethics and well-being at the University of Melbourne, whose ideas in turn build on the Nobel-prize-winning economic ideas of George Akerlof about how quality uncertainty affects a market. The concept is that you have a buyer and a seller. In our case Geoscience Australia provides the data for free, but the person who uses it still has to invest their own resources in it, so while it may be free to acquire, it's not free to use and to process. There's still a cost they need to bear, so they need to know they're getting it right. In that market model, the seller has more information about the product's quality than the buyer, because obviously they put it together. In Akerlof's model, which used second-hand cars as the example, the person selling the car has more information about its history: whether it's been in an accident, whether it's had major work done, and so on. So the buyer is uncertain and may be hesitant, and if they've had bad experiences, or bad luck, they may come to distrust the whole market. What then happens is that the supplier is forced to sell at a lower price and cut corners to save on expenses, and that eventually reduces the overall quality, so you can get into a vicious loop where the user becomes more and more distrustful. The way out of that is to improve the transparency of quality information, and try to offset, as much as
you can, that imbalance between the seller and the buyer, or in this case between the provider of the data and the user of the data. So takeaway number one: transparency of quality information is critical for good science. The second concept I like talking to people about is expanding their horizons on what quality means. Most scientists will focus immediately on the product. I've done little informal surveys that show that if you ask them what makes a high-quality geochemistry data set, they'll start waxing lyrical about particular elements or particular attributes of that data set. Whereas I quite like this model from Kenyon and Sen, where quality is actually a lot more than that. From a user's perspective, it's also about how the data, the product, was delivered. Internally, it's about what the organisation does to support quality: what the processes are, and how you're capturing information about the processes that produce the product. And importantly, it's about what supply lines fed into that process; in some cases, if you're outsourcing for example, how you're controlling the quality around that outsourcing. So takeaway number two, which I try to get across to people, is that product quality isn't just about the product itself, the data set, or the uncertainty in its measurements; from a user's perspective, there's a lot more to it. Concept three is fit-for-purpose quality. We'd all be familiar with this triangle, where you can have any two of the three options: a product or service that's good and fast won't be cheap, for example. This gets used a lot, and you might often think, well,
perhaps, as a high-quality, upstanding government facility, good is never optional, so the trade-off always has to be between cheap and fast. But as I'm sure many of you know, when the pressure's on, good can sometimes become a little bit optional, and there's always pressure to find cheap and fast options as well. In those situations I try to have a conversation about this particular model, which again I'm sure some of you are familiar with: the prevention, appraisal and failure cost model, which describes how you find optimal quality. As the level of quality goes up, the cost of achieving it rises: it'll be a more expensive instrument, a longer process, a more detailed process. Whereas the cost of failure, if we collect data with something wrong in it and need to go back and reanalyse, or someone makes a decision based on that data and it's wrong, is higher at lower levels of quality. The idea is that somewhere in the middle there's a sweet spot, the optimal quality, right about here. What I like to talk about is that this optimum actually changes over time: what's optimal now may not be optimal 5, 10 or 20 years from now, and presumably there's some lower limit at which the user no longer wants the data because it's no longer useful to them. We can actually show this with a real example, and it's a good example of why capturing quality information, the metadata behind these results, is really important. This goes back to that SHRIMP instrument I showed you. Geoscience Australia has been operating these instruments since basically 1990, and we've collected all of that data, so we have quite comprehensive data sets. In this particular case we're measuring a quantity called the one-sigma uncertainty, which is one of those things scientists get excited about when discussing quality.
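The prevention-appraisal-failure idea described above can be sketched numerically: one cost curve rises with the quality level, the other falls, and the sweet spot is wherever the total is smallest. The cost functions and coefficients below are invented purely for illustration.

```python
def total_cost(q, prevention_rate=2.0, failure_scale=100.0):
    """Total quality cost at quality level q, with 0 < q < 1 (invented shapes)."""
    prevention_appraisal = prevention_rate * q / (1.0 - q)  # rises steeply near perfection
    failure = failure_scale * (1.0 - q)                     # expected failure cost falls
    return prevention_appraisal + failure

# Scan quality levels and pick the sweet spot.
levels = [i / 100.0 for i in range(1, 100)]
optimal_q = min(levels, key=total_cost)
print(f"sweet spot at quality level ~{optimal_q:.2f}")
```

The point of the talk's argument is then that `failure_scale` is not fixed: as the intended lifetime of the data stretches out, the cost of getting it wrong grows, and the sweet spot shifts toward higher quality.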
Essentially, it's a measure of how well the machine measures the same thing: we give it a homogeneous standard and see how consistently it measures it, again and again. You can see from this data set that our quality has improved over time; there's been a steady decline, on average, in that uncertainty, which is fantastic. In fact, in the last few years a lot of those numbers were actually set arbitrarily: the analysts couldn't quite believe it was that good, so they just left it at one percent, for example, when the data was actually much better. So quality does improve over time, and we have to account for that. The conversation we try to have is: rather than saying "I need it now, I need it fast and I need it cheap", let's consider what longevity you want the data to have. Do you still want this data to be good in 20 years' time? Let's extend that window of the data's lifetime. It's also important to point out that this doesn't necessarily mean more expensive instruments or a more detailed process. You can improve the quality of a data set by doing things like capturing some of that metadata I just showed you from the SHRIMP, by investing in capturing it so that it lives alongside the actual analytical data. And from a user's point of view, the delivery of that data is also a function of quality, so the quality cost is a bit more nuanced. The takeaway there is: consider quality now and in the future. One thing we do, because we now outsource some of our inorganic analyses, is spend a lot of time thinking about how to manage quality control when it's outsourced.
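The one-sigma figure discussed above, repeat analyses of a homogeneous standard summarised as a relative standard deviation, is simple to compute. Here's a minimal sketch; the measurement values are invented for illustration, not real SHRIMP data.

```python
import statistics

def relative_one_sigma_pct(measurements):
    """Relative 1-sigma (%): sample standard deviation over the mean, times 100."""
    return 100.0 * statistics.stdev(measurements) / statistics.mean(measurements)

# Ten invented repeat measurements of the same homogeneous standard:
runs = [0.0910, 0.0906, 0.0912, 0.0908, 0.0911,
        0.0907, 0.0909, 0.0910, 0.0908, 0.0909]
print(f"1-sigma = {relative_one_sigma_pct(runs):.2f}%")
```

Logging this number session by session, rather than pinning it at a round one percent, is exactly the kind of metadata capture that lets the long-term trend in the plot be reconstructed later.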
This is part of the supply aspect of quality. One of the things you do in inorganic geochemistry, where you take rock samples and want to know their whole-rock geochemistry, is take splits of samples at various points: you split things in the field, take samples from either side of an outcrop, for example, and split them again and again to get a set of analyses. Then you add in things like project standards, and in our case blanks as well, because you also want to check for contamination through the analysis. This particular diagram comes from the Geological Survey of Norway, who have faced similar issues. You take all of those samples and then randomise them, so you're basically running a blind test with the analytical laboratory you're outsourcing to: you mix everything up, but you know what each sample is, you know which ones are the controls, and you know which ones are the standards. The takeaway there is that you can't outsource quality control. That's a particularly important one, because if we buy data from an external laboratory, publish it and do all of those sorts of things, it's our data, and if something's wrong with it, the user isn't going to complain to the laboratory it came from; they're going to complain to us. We own it. Then there's a question we're starting to really engage with: how do we capture that quality metadata? I'm very keen to be part of the conversations about how we might standardise some of that across commercial, academic and government labs, so that capturing that information and passing it on to users becomes that much easier.
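The blind-batch assembly just described, mixing project samples with standards and blanks, randomising the submission order and keeping the key in-house, can be sketched like this. All the sample and standard names are invented, and this is an illustration of the idea, not the survey's actual system.

```python
import random

def build_blind_batch(samples, standards, blanks, seed=42):
    """Mix project samples with standards and blanks, randomise the order,
    and keep an in-house key mapping anonymous IDs back to the originals."""
    batch = ([("sample", s) for s in samples]
             + [("standard", s) for s in standards]
             + [("blank", b) for b in blanks])
    random.Random(seed).shuffle(batch)  # fixed seed keeps the key reproducible
    key = {f"LAB-{i:03d}": item for i, item in enumerate(batch, start=1)}
    return list(key), key  # the external lab only ever sees the anonymous IDs

submission, key = build_blind_batch(
    samples=["RS-001", "RS-002", "RS-003"],
    standards=["OREAS-24b"],   # invented standard name
    blanks=["BLANK-1"],
)
print(submission)
```

The `key` stays with the commissioning laboratory, so when results come back the standards and blanks can be unblinded and checked before any of the project data is accepted.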
easier. There's also a question about how we do this sort of work for analyses where mixing the samples up doesn't work, where you actually need to know the order the samples came in, or where the analyses are high cost. Each of these geochemistry analyses might be around $30, for example, so your project may be able to afford hundreds of them; but if each analysis were $3,000, you wouldn't be able to do that same degree of quality control. So there are emerging issues there too. And finally, when all else fails, I just appeal to the fact that it's good science. I love this quote from Terry Pratchett about science not being a body of facts but a method for making sure we're getting it right, and for not simply believing things that give us comfort. I think that's very important: a key part of science is that it's self-correcting, it checks itself and checks itself again, and quality is a critical part of that. There's also opportunity in it. When you really hone in on quality, and really understand the measurements you're making, you can get those uncertainties down and compare and contrast meaningfully between samples, and between different labs, and you start finding that there are gaps in the science. This is a great example from Alisa Ocolic about carbon-14: there's a gap between the age you measure with carbon-14 and the actual age the sample came from, which puzzled people for a very long time and only became apparent when they really, really honed in on the quality. It turns out it's in large part to do with how radiocarbon cycles through the oceans, so that gap is actually telling us something about how climate changes. So there's an enormous amount of science in driving quality into those gaps and looking
into where things don't quite meet our expectations. So what are we doing at Geoscience Australia? We're rebuilding the laboratory, and Geoscience Australia has its big Strategy 2028, which we're part of. The laboratory contributes to key parts of it around resources, groundwater and our marine jurisdictions, as I mentioned with the seafloor samples, but also to an informed Australia: making sure the data we provide is as good as it can be and has a long shelf life, so that someone can use it today, come back in 20 years' time, and it's still useful, it still has meaning. We're very conscious of things like the FAIR principles for data management, which I'm sure this group is well aware of. We're also looking at how we automate some of our systems. In the last 10 years in particular we've seen a huge swell of new instrumentation, particularly on the microbiology side of things, but it's starting to flow through to other science systems as well: miniaturising and automating a lot of the analyses we do, which again has data implications for how we capture the fire hydrant of data these machines are now capable of producing. And ultimately it comes back to people like Jo: what do they want, and what do they see as quality in their samples? I think that's quite a critical part of it. Yes, we can get very hung up on the uncertainty of measuring a standard on the SHRIMP, for example, but perhaps the user of that information is actually interested in something else about the quality, and they want to see that in the data set or in the metadata as well. I won't go through all of these, but we're developing a strategy for the lab that's looking at a lot
of things, trying to bring all of those threads together, and I'll quickly run through some of them. In particular, we're building a new laboratory as we speak: we're packing up the old laboratory, the new one is being built, and all going well, COVID and supply lines willing, it will be finished by mid-May and we'll be moving in then. As well as a new physical facility, we've got the opportunity to update some of the network infrastructure. As we talk about that automated data management, we can upgrade the network so that all of the instruments are plugged in and the appropriate software sits on a server the data can automatically flow to, so we can start capturing it. Again, I'm very interested in how we manage that going forward: what parts of that information are useful to people, and how we get it out to them. It's also a chance to review our workflows and really hone in on how we recommit to, and lift, our quality management game. We invested in a laboratory information management system three or four years ago and have been steadily working to implement it across all our workflows and processes and bring all of that online. In this particular case it's a STARLIMS application, and we're not yet using all of its capabilities: there's a lot in the back end we still haven't fully utilised for capturing quality information about laboratory processes and what's happening with those samples as they move through them. The ultimate aim is NATA accreditation. NATA is the National Association of Testing Authorities, Australia's laboratory accreditation body, and we want to get to a point where we can gain accreditation for some of our processes. We've had a look at this over the last few years, and in many of our processes we are actually already capturing the data
that would be required for accreditation. It's just not usually in a form that's accessible, easy to find, or brought together in a central place, so we do have some data management issues there to work through to get to that point. What I'm saying is: how do we make all that quality information accessible, both to ourselves and to the outside world? Then there's a very important thread, something we're really working on within the laboratory team but also across the broader Geoscience Australia, which is the idea that quality is its own pillar, right next to the product. As I said, a lot of scientists will focus on the product, the data, the information, but it's no good without that focus on quality as well. Some of you may be familiar with these concepts of continuous improvement, whether it's called Lean, Six Sigma or total quality management; they all have this kind of set-up, particularly around quality and product. Obviously you want the best quality, delivered on time, at a reasonable cost, and safely, which in laboratories is very important. But finding the time and space to emphasise that working on quality is just as important as working on the product, and that both need to be well resourced and given time, is a key part of building that culture. One thing we've been doing, and it was a particularly good exercise during lockdown when we couldn't get into the laboratory, is going back and looking at our workflows, trying to understand where some of the issues and inefficiencies in the workflow were, but also thinking about where the value-adding is that we're doing, and
where the quality information is that we should be capturing as we go through those processes. So I hope that gives you a whirlwind visit to the Geoscience Australia laboratory, and some idea of what we're doing to renew our commitment to quality management and to data issues. We're very keen to contribute to this community, and to the other communities involved in this work, and keen to understand how we can help and how our developments can align with what you're doing. I hope those key takeaway messages that I try to drum home to our clients resonate with you, or have some meaning for you, and I'm very keen to hear your thoughts. At the end of the day, yes, it's about people like Jo: getting her the information she wants so that she can trust our data and make valuable decisions based on it, both now and, who knows, 20 years or more into the future. That's partly why we have the mantra that today's quality is tomorrow's reputation: we want what we do today to still matter many years from now. So thank you very much. I'll leave it there, and I'm happy to take some questions. Thank you.