Good afternoon and welcome to this webinar on novel synthetic opioid detection. My name is Ellen Mantis and I'm the director of the Chemical Sciences Roundtable at the National Academies of Sciences, Engineering, and Medicine. For those not familiar with the Roundtable, it provides a neutral forum to advance the understanding of issues of importance to the chemical sciences and promotes the exchange of information among the government, industry, and academic sectors. This year we are excited to launch a series of webinars on emerging topics. This is the second in our series; our first topic was bio-organic hybrids, and the presentations and recordings from that webinar are available on the CSR website. Today we will be discussing the challenges of detecting novel synthetic or counterfeit opioids in the field and ways to address those challenges. The format will consist of one overview presentation followed by two in-depth presentations. There will be time for one or two clarifying questions after each presentation, but all other questions will be addressed in our discussion time after the presentations have been completed. Dr. Linda Broadbelt will be our moderator for this webinar. She is the co-chair of the Chemical Sciences Roundtable and the Sarah Rebecca Rowland Professor in the Department of Chemical and Biological Engineering and the Associate Dean for Research of the McCormick School of Engineering and Applied Science at Northwestern University. She will be taking over for me after the overview presentation, and she will be asking the questions on behalf of the audience. Questions can be submitted via the Q&A button on Zoom, located in the bottom control panel. The chat feature has been disabled for audience members. For those tuning in via live stream on the CSR website, please submit questions by email to csr@nas.edu. With that, I would like to introduce our first speaker, Jonathan McGrath. Dr.
McGrath is a senior policy analyst with the National Institute of Justice in the Office of Investigative and Forensic Sciences. He has experience with examinations of forensic evidence, including controlled substances and mobile field operations, and currently supports several forensic science initiatives at NIJ. The floor is yours, Dr. McGrath.

All right, thank you, Ellen. I just want to thank the NAS and the Chemical Sciences Roundtable for the opportunity to provide this overview of this very important topic. I also want to thank the other staff of NAS and our presenters, Barry Logan and Marcela Najarro, who have been great partners in addressing these issues from both a forensic science point of view and a public health and public safety point of view. I'm going to advance my slides now, and I want to provide a DOJ disclaimer that any opinions or points of view that I express are my own. To start with, in providing an overview of the opioid crisis and the role of chemistry in combating emerging drug threats, including synthetic opioids, I want to give a quick overview of some of the data that comes out of the CDC mortality data sets. CDC has identified several trends in opioid emergence, going back to the 1990s with prescription drugs, then a turn towards heroin in the early 2010s, and then a significant rise in synthetic opioid use and overdose deaths starting in about 2013. You can see from the death statistics provided by CDC through 2018 that there has been a dramatic increase in the use of prescription and other opioids. You see a slight decrease in total opioid deaths over the last year or so, but that trend has not been the same for synthetic opioids: those such as fentanyl, fentanyl analogs, and other chemical structures related to those opioids have actually surged even further, even though we're seeing slight decreases in overall deaths.
This is a very important issue from both the public health and public safety sides, so it is one of the considerations that we should look at for today's topic. So what does the drug landscape look like? DEA, the Drug Enforcement Administration, schedules controlled substances between Schedules I and V, where Schedule I substances have no currently accepted medical use in the United States, Schedule II substances have some accepted medical use but a high potential for abuse, and the lower schedules have correspondingly lower potential for abuse. Between DEA and the Office of National Drug Control Policy, a National Drug Control Strategy is issued every year and provides really good detail on the landscape, drawing on both public health and public safety data sets across the federal, state, and local levels. So what information is needed to address these issues of novel synthetic opioids, in particular for today's talk on field detection and identification? The information is needed because the chemical structures of novel synthetic opioids are similar to those of known controlled substances, and these synthetic opioids are being designed at a rate and pace to stay ahead of the federal and international laws that restrict the distribution and sale of specific drugs. Traditionally, in order to investigate and prosecute controlled substance analog cases, the structural characterization of the drugs or their metabolites is essential to demonstrate that the substance is both substantially similar in chemical structure to a controlled substance and has a substantially similar pharmacological effect, such as stimulant, depressant, or hallucinogenic effects on the central nervous system. Going back a few years, DEA has been able to use its temporary scheduling authorities to issue new regulations on these new drugs and their analogs.
In 2018, DEA was able to issue a regulation, commonly referred to as core-structure scheduling, for fentanyl-related substances. That was to allow additional actions to be taken to avoid imminent hazards to public safety from the new analogs that are hitting the drug markets. So one of the operational gaps is in collecting this preliminary information, predominantly at the field level. And who collects this information? It may be law enforcement, crime scene techs, or death investigators. As we'll see from the presentations, there are other considerations for how to collect this information in the field. The goal here is to collect this information in a timely manner and provide actionable information that others can use, both to identify hot spots and to identify trends across the country and within jurisdictions themselves. And it's not just the technology that needs to be developed; it's also taking that technology and implementing it into current workflows to ensure that the information is accurate, is provided at the right time, and can be used to assist the drug chemists with their identifications of the drugs. So how can chemists get involved? The final two slides in my overview provide a set of links to resources and further information on new ways to advance research on these types of issues, and you'll see from our presenters where you can find additional information as well. I want to refer to DEA's National Forensic Laboratory Information System (NFLIS), which is a central repository for data coming out of our nation's crime labs at the federal, state, and local levels. As I mentioned before, getting the preliminary information within a case to the labs, to the medical examiners and coroners, and to the toxicologists is incredibly helpful in initiating those types of testing strategies.
The NFLIS data demonstrated a significant increase in the number and diversity of new and emerging fentanyl-related substances in particular in forensic casework starting in 2015. Each newly identified substance requires additional research, development, and implementation of laboratory methods, testing protocols, and advanced technologies and equipment to ensure sufficient sensitivity and specificity to detect these emerging drugs in forensic casework. Increased education and awareness are needed to standardize the analysis and reporting of these substances, particularly for drug death investigations and working closely with medical examiners and coroners. The death certificate needs specific details of the drugs involved in the death to ensure the completeness and accuracy of drug death statistics and to avoid undercounting. Comprehensive toxicology data also comes from driving-under-the-influence-of-drugs cases and motor vehicle fatality cases, to ensure that more information is provided on the prevalence and use of these opioids, fentanyls, and other analogs. In 2019, the Department of Justice, alongside NIJ, issued a report to Congress on the needs assessment of forensic laboratories and medical examiner and coroner offices. I heavily encourage those in the audience to read the 13 pages in the report that discuss the opioid crisis and emerging drug threats in particular. It discusses the continuous emergence of new drugs and drug mixtures that need to be analyzed by the stakeholders that I've mentioned. It also states that an additional $270 million was needed in 2015 simply to address the impact of these new drug and toxicology cases coming into the laboratories. This cost estimate is associated with the optimal balance of getting requests into the laboratory and issuing reports back out to the customers in a timely manner.
This report also identified that expenditures increased for both drug and toxicology analysis in these years: a 37% increase in expenditures for drug and controlled substance analysis and a 25% increase for toxicology analysis, compared with about a 3% increase in expenditures for other forensic disciplines over the same timeframe. There has also been a dramatic increase in turnaround time for drug casework. Between 2011 and 2017, the turnaround time for drug cases increased from 58 days to 79 days, and for these same drug cases, backlogs soared from 385 cases to over 1,200 cases, a 225% increase. So the preliminary drug testing that can occur in the field is incredibly important to help create efficient workflows within the laboratory environment and get the appropriate information about the drugs to the stakeholders. This report also identified that resources are needed to help forensic laboratories coordinate with public safety and public health officials in order to implement field detection equipment, develop actionable information, and share data in a timely manner. And it identified a promising practice in deploying field-portable drug detection instrumentation to provide preliminary results and triage casework in coordination with the labs for operational management and oversight. I want to turn to some of the forensic science research that's been conducted both at NIJ and across the federal landscape. In 2015, NAS issued a report on the support for forensic science research at NIJ and our role in this area, and we also issued a report looking at federal investments in forensic science R&D nationwide. As I mentioned, in these final two slides I've provided links to additional resources; they're free and available to the public.
So whether you're a student, researcher, or practitioner, or just want to learn more about these forensic science issues, or are interested in applying your chemistry and scientific expertise to these criminal justice system issues and their intersection with public health and public safety problems, please visit these resources and funding opportunities for additional information. As I mentioned, several of these additional papers came out of workshops, forums, and symposia that we've been a part of across the forensic sciences to address these issues, and they provide additional background for this topic. I want to point out in particular the Opioid Detection Challenge funded by DHS last year, which was specifically meant to address the detection of opioids and other drugs in field environments. So I encourage you to check out these resources, and I thank everyone again. I'm happy to answer questions either now or at the end of the presentations. So I'll turn it back over to our moderator.

Thank you very much, Jonathan. This is Linda Broadbelt, and there is one question that was submitted during your presentation. Perhaps this will be addressed in the additional presentations, so feel free to punt it forward if you'd like. The question asks: have you seen phenethyl-4-ANPP? And if so, would you classify it as a precursor for fentanyl, like 4-ANPP? So maybe this is about the range of compounds you're seeing, with this specific one of interest.

I'm going to punt that to my colleagues who are speaking next, but I'll also do some background research and provide additional information where I can on that particular chemical. Thank you.

Specific compounds are going to come up during the subsequent presentations, so this might be addressed there.
All right, if there are no further clarifying questions (and as a reminder, we'll save the substantive questions for the discussion at the end), it's then my pleasure to thank you, Dr. McGrath, and introduce our next speaker, Dr. Barry Logan, who will speak to some of the analytical considerations for in-field detection. Dr. Logan is Senior Vice President of Forensic Sciences and Chief Scientist at NMS Labs and Executive Director of the Center for Forensic Science Research and Education at the Fredric Rieders Family Foundation. He works in forensic toxicology and analytical chemistry to understand the effects of illicit and prescription drugs on drivers and in drug-caused and drug-related deaths. His recent work has focused on the analytical and interpretive toxicology and chemistry of novel psychoactive substances. So without further ado, I open the floor to Dr. Logan.

Thank you, Linda, and thank you to the NAS for the opportunity to share some of this with you today. Just by way of my disclaimer, I will mention some techniques and products here today, but that should not be construed as an endorsement of any particular product or organization. So I'm going to talk about the technologies that are available for detection of novel opioids in the field. Many of you may have experience with taking technology from the laboratory out into the field and recognize that there are compromises and considerations that have to be accounted for when you make that transition. They include things like what's an acceptable level of sensitivity for a test; what degree of complexity of a test can be managed in the field versus in the laboratory; and, in the case of testing for drugs, whether you're testing against a menu of things you may be looking for versus needing the ability to identify something novel or new that hasn't been seen before, which may be better suited to a laboratory environment.
And then considerations about the stability of an instrument in the field, its robustness if it's being moved around, and its portability if it has to be moved from place to place mean that we don't test for substances in the field the same way that we do in the laboratory. Many times we think of drug testing as being a forensic test, which often it is, for a criminal justice purpose, but there are also numerous other applications that we'll talk about. In the forensic environment, we are constrained by the expectations of the court in terms of having both presumptive and confirmatory data to support an identification. The field is increasingly, and appropriately, regulated to ensure the forensic and scientific validity and reliability of the testing that is performed. The accreditation environment in our crime laboratories is very different today from what it was even 10 years ago. The National Academy of Sciences report on strengthening forensic science in the United States went a long way towards accelerating the implementation of quality systems and validation in some of the science that's used in laboratories. But that's definitely a lot easier to manage in a laboratory environment than out in the field. So if the primary use or application of the field testing is not forensic, then what is it? There's a variety of different stakeholders and applications, and they include personal safety. So first responders, which can be law enforcement, EMS personnel, or medical examiner personnel who are attending scenes of a death or a suspected crime, want to ensure that they're not going to be exposed to a hazardous substance in that environment. It can also extend to harm reduction efforts.
There's a movement in harm reduction to institute drug checking opportunities, particularly in Europe, where drug testing facilities are set up at concerts or parties so that people who are using drugs can have them tested to manage the risk they may be taking from consuming things they've acquired. That is not a current practice in the United States, but it is gaining momentum in Europe. Another application is decisions about detaining or arresting an individual in the field: if you have a test that can be performed on site, the officer can make that decision in the field. In Jonathan's opening remarks, he talked about the burden on laboratories from the opioid crisis; if you can triage in the field the evidence you're sending to the laboratory, that's going to reduce the burden on labs. Other appropriate places for field use of drug testing equipment would be deployable laboratories that may be moved from location to location, where you may be looking for instrumentation that's more forgiving of being set up and taken down on a frequent basis and more robust in terms of being transported from place to place. And then if you need to be able to do a drug identification at a remote site, for example aboard a ship at sea, at a mining operation, or on an oil rig, where you don't have access to a laboratory, you may be willing to accept the capabilities that are available from field testing devices. Other considerations are who's going to be performing the test. Sometimes it may be laboratorians, but it can be a variety of different professionals: emergency medical technicians, death investigators, or law enforcement officers. In some of the harm reduction work, it may be a social worker, and in some of the security applications, it may be customs or the military. So just as the needs differ for in-field testing, so do the requirements of the operators.
Some of these folks will have a higher level of technical competence, for example not just for operation but also for troubleshooting, diagnostics, or repair of an instrument that's in the field. They also have different capabilities and needs in terms of whether it is simply a black-box operation or something where the results are going to require more interpretation. That will impact the capabilities that are required of a field-deployable technology. And then there's the user environment. We go to great lengths in the laboratory to standardize and stabilize things like power, temperature, humidity, operating on a stable surface, having no impact from weather, and having appropriate lighting, for example, to read the test. You lose a lot of that predictability when you take a test out into the field. So what is the optimum technology that we might aspire to? For those of you who don't recognize this, this is a Star Trek medical tricorder, a device that was capable of making just about any measurement in any environment, giving results immediately, and it always seemed to work. In reality, we're not quite at that level, although interestingly, some of the devices I'll show you today do bear at least a passing resemblance to the fictional device. So what are some of the options for in-field drug testing and opioid detection? Progress is certainly being made, and the technology is changing all the time. Canine detection is a whole science of its own, involving detection of what's now called the volatilome, the volatile landscape, and canines are capable of detecting extremely small amounts of targeted compounds, including drugs and explosives, even at the part-per-trillion level. From a chemical test point of view, we'll go through what some of the current options are: chemical color tests, and the application of immunological testing that's being repurposed from a toxicological application to field testing for the identity of drugs.
Then there are more instrumented technologies: handheld FTIR and Raman, and field-based mass spectrometry. And we'll conclude by coming back and considering what the contribution of the laboratory continues to be, even if you are relying on field preliminary or presumptive testing or screening. The chemical color tests that have been a staple of drug identification for a number of years are actually things that were developed back in the 1800s, in the very early days of chemical analysis. While they weren't necessarily designed for testing for drugs, but more for the investigation of natural products, one of their persistent uses today is for field detection of drugs. One of the most common tests for the detection of opioids in the field, the Mecke reagent, a mixture of concentrated sulfuric acid and selenious acid, actually gives positive reactions for a number of alkaloids besides the opioids. It gives a characteristic dark bluish-green color when exposed to a number of alkaloids, including heroin and morphine. The general limitations of these tests are that they can be subject to interference from other things that are not controlled substances, and oftentimes the material being tested may itself have some coloration: for example, opium is a blue-brown tarry material, which can make it difficult to read the color change. There have been modifications to some of these tests. Another common reagent, the Marquis reagent, is able to distinguish, for example, between fentanyl and heroin. That is of interest to people working in the field evaluating drugs that are in circulation amongst drug-using populations, because of the significantly greater risk of overdose with more potent drugs like fentanyl. But the test has limitations: it's not good at determining whether both drugs are present when there are mixtures.
Also, many of the fentanyls, fentanyl analogs, and some of the newly emergent potent opioids are present at very low concentrations, at least in street-level drug materials, and may not give a distinctive color reaction. The capabilities and experience of the person performing the test matter as well, because they may have to differentiate, for example, between something that's a deep purplish red versus a deep reddish-brown or a strong reddish-orange, which is the way these color changes are described. Issues with color blindness, or just people's perception of color changes, can influence the reliability of the conclusions they draw from these tests. There are a couple of products on the market now that have attempted to make this less subjective by performing the color test on a chemical cassette and then imaging that with smartphone technology, allowing the color to be interpreted by a cell phone camera. These devices are largely proprietary at this point; there isn't a lot of information in the public domain about their performance, or evidence of their validation, which would be needed for forensic purposes, but it is an attempt to standardize the interpretation of what's a fairly basic chemical color test. The next technology I'll talk about is the use of fentanyl test strips. These were designed to be dipped into urine samples as part of toxicological screening, and they are quite sensitive, so they have appropriate sensitivity for detection of fentanyl, for example in a solution of a seized material. Depending on the product, they do claim cross-reactivity with a number of fentanyl analogs, but that's going to vary from product to product, depending on the antibody that's employed on the test strip.
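[To make the smartphone color-reading idea mentioned above concrete, here is a minimal illustrative sketch of how a captured reagent color could be matched to the nearest reference outcome by distance in RGB space. The reference colors and labels are invented placeholders, not validated colorimetric data from any product.]

```python
import math

# Hypothetical reference colors (RGB) for reagent outcomes; placeholder
# values for illustration only, not validated colorimetric data.
REFERENCE_COLORS = {
    "heroin/morphine (dark bluish green)": (20, 70, 60),
    "fentanyl-type (orange-brown)": (160, 90, 40),
    "no reaction (clear)": (230, 230, 225),
}

def classify_color(rgb):
    """Return the reference outcome whose color is nearest in RGB space."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(REFERENCE_COLORS, key=lambda name: dist(rgb, REFERENCE_COLORS[name]))

# A reading close to the bluish-green reference classifies accordingly.
print(classify_color((25, 75, 55)))
```

[A real cassette reader would also need lighting correction and validated reference data; this only shows the nearest-color classification step that removes the subjectivity of reading the color by eye.]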
The response may be concentration dependent, and particularly today, when, as I'll discuss later, we're in the post-fentanyl-analog era and dealing with other opioid agonists, antibodies that are specific for fentanyl are not going to give any kind of positivity or cross-reactivity with something that is sufficiently structurally different. So color tests are attractive for the field because of their low cost and relatively low complexity. However, that comes at the cost of poor discriminating ability, the fact that they are not necessarily going to identify all compounds that may be scheduled, controlled, or harmful, and limited sensitivity for some of the more potent compounds; and the fact that the newer approaches are not extensively validated would limit their application for forensic purposes. In addition, the sample preparation for color tests potentially exposes the operator to the substance, and there are multiple stories in the media about adverse effects associated with that kind of exposure. This was a story from last year in Houston, where an officer picked up a flyer from the windshield of a vehicle and started to experience some adverse effects. The flyer was suspected of being contaminated with fentanyl, and it was tested with a field test, which gave a positive result. The deputy was sent to the hospital for treatment. But a couple of days later, when the flyer went to the laboratory for follow-up testing, the field result was found to be a false positive. So there can be significant outcomes at some decision points if you don't have reliable technology in the field. Moving on to higher complexity testing, the two options really are FTIR and Raman, or mass spectrometry. Raman has certainly been successfully deployed into an almost tricorder-looking device here.
That is truly handheld. It can be transported easily in a vehicle, and it's a robust piece of equipment. This particular device offers both FTIR and Raman, so you have complementary analytical techniques, and you get pretty good discrimination capability for controlled substances that are in the library or database of the device. With Raman, you do get some reflectance back from the container, but you are able, in many cases, to read through paper or through a small thickness of plastic to get a Raman scattered spectrum from the substance inside the container. That reduces the risk of exposure of the operator to the substance, which is one of the things that has made this technology so popular. And while the device is capable of making identifications against an onboard library, some agencies use it simply to collect data, which are then sent back electronically to a laboratory professional for assistance with interpretation. This technology is evolving. There's another handheld device on the market capable of equivalent discrimination that uses spatially offset Raman. In spatially offset Raman, the excitation and detection features of the instrument are separated, which reduces some of the backscattering and interference and improves the signal-to-noise ratio; it allows you to take a subtraction of the surface spectrum from the subsurface spectrum, which gives you better insight into substances contained inside plastic, paper, or cardboard containers. So this gives better capabilities for identifying the contents of packaging without actually having to physically open it. The cost of the technology in handheld format is pretty high, in the high tens of thousands of dollars. It is of moderate complexity, so it requires some training and judgment on the part of the person reading the results. You can mitigate that by using that reach-back approach.
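[The spatially offset Raman idea described above, subtracting a scaled surface-dominated spectrum from an offset, subsurface-enriched one, can be sketched with toy numbers. The spectra and mixing factors here are invented for illustration, not taken from any instrument.]

```python
def sors_subtract(offset_spectrum, zero_offset_spectrum, scale):
    """Scaled subtraction: suppress the container (surface) contribution."""
    return [o - scale * z for o, z in zip(offset_spectrum, zero_offset_spectrum)]

# Toy spectra: the container contributes a peak at channel 1,
# the contents a peak at channel 2.
container = [0.0, 5.0, 0.0, 0.0]
contents = [0.0, 0.0, 3.0, 0.0]

# The zero-offset measurement is surface-dominated; the offset measurement
# picks up relatively more of the subsurface contents.
zero_offset = [c + 0.2 * s for c, s in zip(container, contents)]
offset = [0.3 * c + s for c, s in zip(container, contents)]

# Choosing the scale to match the container's contribution cancels its peak,
# leaving the contents' peak at channel 2.
recovered = sors_subtract(offset, zero_offset, scale=0.3)
```

[In practice the scale factor is estimated from the data rather than known in advance, but the principle is the same: the subtraction removes the packaging signal so the contents can be searched against a library.]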
It certainly is subject to interferences from mixtures. When you have low concentrations of the drug in the exhibit, mixed with other excipients, binders, or adulterants, it can sometimes be difficult to see the forest for the trees. That's not so much of an issue higher up in the drug distribution chain, where drugs are being imported into the country and their concentrations are high, but it's more of an issue when the substance is further cut and diluted by the time it's sold on the street. And with both Raman and mass spectrometry, you're relying on having up-to-date libraries in the device; as the market changes all the time, it's important to keep those libraries updated. Handheld or portable mass spectrometry has been more of a challenge to get into the field. These historically are high-vacuum, somewhat fragile systems that certainly do best in a laboratory environment, so there has had to be a lot of technology evolution to make them suitable for use in the field. Currently there are micro-scale ion trap platforms available which have reduced the need for high vacuum, but they do still require vacuum, so there is a significant power requirement and instrument integrity requirement. Samples are introduced through thermal desorption from swabs: you can swab the environment and then put the swab into a heated zone at the inlet of the instrument, where the mass spectrum can be collected. And then there's an onboard, user-expandable library for identification of the substances. That particular device is purely a mass spectrometer. There are also now some portable gas chromatography-mass spectrometry platforms. These are not handheld; the device shown here weighs about 40 pounds, so it's certainly portable and deployable in the field. What you get in the trade-off is the resolving power of adding gas chromatography to help with the identification of more complex mixtures.
This device has a micro-scale source and is a quadrupole instrument, so it has a little more stability and is less sensitive to overloading than an ion trap, which makes it more forgiving of the circumstances and constraints in the field. Samples are introduced by injection, which requires a little more manual dexterity to operate. This technology is less configurable in terms of changing column conditions or instrumental conditions than the instruments we're used to using in the lab. It also requires more supplies, being gas chromatography, so it has a little micro helium bottle that needs to be carried and used with the instrument. So for portable mass spectrometry: moderate to high cost, comparable to what the Raman platforms currently cost; very good discriminating ability; and it also reduces the operator's risk of exposure. Again, a limitation would be the reliance on libraries and making sure that these are up to date. With full electron impact mass spectra, you can sometimes interpret or identify unknowns, at least to a drug class, which is not necessarily possible with the Raman platforms. And then of course, the lab is the gold standard. For many of the things that we have to compromise on in the field, in the lab we have better separation science and options, a much more stable environment, which typically allows for more sensitive testing, safer sample handling options, greater computing and analytical power, and access to higher resolution platforms like NMR and accurate mass spectrometry. There's some work being done on ways to make sampling safer, such as this publication that came out of NIST last year, which showed that with more sensitive lab-based technology you may be able to swab the exterior packaging and get information without actually opening the package. So that's some interesting work being done at NIST.
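[Both the Raman and mass spectrometry platforms described above depend on searching an onboard library, which is why keeping libraries current matters so much. The following toy sketch shows the cosine-similarity scoring that spectral library searches are commonly built on; the spectra, entry names, and match threshold are invented for illustration, not taken from any vendor's algorithm.]

```python
import math

# Toy library of normalized spectra (invented values for illustration).
LIBRARY = {
    "fentanyl (toy entry)": [0.1, 0.8, 0.3, 0.5],
    "heroin (toy entry)":   [0.7, 0.2, 0.6, 0.1],
}

def cosine(a, b):
    """Cosine similarity between two spectra represented as vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def best_match(unknown, library, threshold=0.9):
    """Return (name, score) of the best library hit, or (None, score) below threshold."""
    name, score = max(((n, cosine(unknown, s)) for n, s in library.items()),
                      key=lambda t: t[1])
    return (name, score) if score >= threshold else (None, score)

name, score = best_match([0.1, 0.82, 0.28, 0.52], LIBRARY)
```

[Real instruments use vendor-specific scoring and much richer spectra, but the principle is the same: a novel opioid absent from the library can at best produce a low score, never an identification, which is why library updates have to keep pace with the drug market.]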
So just in the last couple of minutes, I wanted to reinforce from a strategic point of view what some of the challenges are with field analysis and how rapidly the demands are changing. In 2010, we were testing for a relatively small number of opioids. By 2020, we've gone from testing about 10 to testing a number that changes on a weekly basis. New categories of opioids, new opioid agonist drugs, are being illicitly manufactured and distributed. And then analogs of drug classes like fentanyl that become established are manufactured and enter the drug supply. So it's extremely important to be able to update your libraries, as I mentioned, but also to change your analytical capabilities to keep up with that. This just demonstrates the turnover in some of the illicit opioids over the last five years. And you can see that they increase and decline in popularity and prevalence on about a six to nine month cycle. Basically, as soon as a drug is identified as being harmful and is scheduled, the illicit drug manufacturers move on to other drugs. And so we run a program at our research institute identifying novel opioids as they appear in the country. We identified 12 novel opioids in 2018 and another 12 in 2019. And as you can see, most of these now are not fentanyl related. There was just one fentanyl-related analog that we identified as a novel opioid in circulation in the United States in the latter part of 2019. The other substances are either drugs that have been pirated from patent information or from literature and publications dating back as far as the 1970s, or novel analogs of those substances that are now being introduced into the drug supply. So that's the contrast, basically, between the capabilities in the laboratory versus the field. There are definitely compromises in deploying technology to the field, but sometimes there's a compelling need for that. You really have to understand the user needs, the capabilities, and the environment.
Certainly solutions and platforms are evolving and improving. We're really only in the first five to 10 years of deployment of some of these technologies for this purpose. Cost is currently a barrier to their being more extensively used. And then, irrespective of technology, making sure that the menu of compounds you're looking for is kept up to date through investigations that are probably more suited to the laboratory. The last slide is just a reference to a publication from NIJ, a landscape study discussing some of the technologies and approaches that I presented in this talk today. So thank you very much for your attention, and I'm gonna hand you back over to our moderator. Thank you very much, Barry. So there are a few questions that came in during the presentation. I think that the one about false positive and false negative results you addressed a little bit in the example you gave related to the flyers. But this will also, I believe, be very nicely addressed in the next presentation. So if it does not get addressed in the end, please feel free to recast it. The ambient MS question was, I believe, addressed or asked before the MS section. So hopefully that was covered well. In addition, the direct analysis in real time (DART) time-of-flight MS question was asked just before the example from the NIST work. So hopefully that gave a glimpse of that. But I will ask this one question. What is the typical excitation wavelength for the field-deployable Raman device? That's something I don't have information on at hand, I'm afraid. Okay, then we can follow up with that at another time. Okay, perhaps in the interest of time, and to ensure we have a robust time for the discussion at the end, we'll move on to the next presentation. So thank you very much, Barry. I'd like to introduce the final speaker, Marcela Najarro, who will speak to some of the data challenges for in-field detection. Ms.
Najarro is a research chemist in the Surface and Trace Chemical Analysis Group at the National Institute of Standards and Technology, NIST. She is responsible for driving the strategic direction and execution of the forensic science research program. She also manages the forensic science research program across the drugs and toxins and trace evidence portfolios. So I'd like to give the floor to Marcela Najarro. Thank you so much for your introduction, Linda. Again, my name is Marcela Najarro from NIST. And I will be discussing the data challenges associated with the field detection of novel synthetic opioids. Could you advance my slide, please? Thank you. So I'll start off with a quick disclaimer. The points of view expressed in this presentation today are my own, and they do not necessarily represent the official position or policies of the National Institute of Standards and Technology. Next. So before delving into data challenges, I'd like to briefly discuss quality management in forensic science and the differences that may exist in terms of data quality needs between presumptive and confirmatory analysis. So the application of a quality system in forensic science supports the validity and reliability of evidence in a court of law. Nowadays, as Barry mentioned earlier, the majority of forensic laboratories meet accreditation requirements that qualify the processes and procedures within specific laboratory systems. However, quality management requirements in the field tend to be more application dependent. So, for example, an intelligence agent or officer who's looking to advance an investigative lead, or perhaps get probable cause to open a package in interdiction efforts, or make a controlled delivery under surveillance, may only need a greater than 50% confidence that there's a likelihood of contraband being inside of that package in order to move forward.
However, on the other side of the spectrum, if it's a forensic chemist or a forensic technician looking to do confirmatory analysis in the field, then of course the threshold is much higher. They, on the other hand, need to demonstrate the validity and reliability of the tool that they're utilizing and of the results that they'll get from it. Next slide. So regardless of the intended application, field drug data can in fact impact criminal proceedings. A federal survey in 2013 found that about 62% of crime labs do not test drug evidence when the defendant has already pled guilty. There's a myriad of reasons why defendants take pleas rather than take their chances in court in terms of their sentencing. That's obviously beyond the scope of this presentation, but it's important to note that a large number of jurisdictions across the United States accept guilty pleas based on field results alone. And as Dr. Logan already discussed, those results could have been obtained by a gradient of technologies that differ in their capabilities, right? And so I think we could all agree that perhaps there's a need to raise the bar. So field technology needs rigorous validation methods and data to demonstrate its validity and reliability, much like laboratory quality management dictates. And as the scientific community, we understand that the way to raise the bar is through measurement science, through reference standards, and through standardized methods. So as I continue to discuss quality management in the field, I will touch on how each of the areas that you see here can in fact affect data quality, in particular when we're talking about complex samples such as fentanyl. The first topic is sampling, and why standardizing sampling methods in the field is important. Next slide. So the number one reason is because sampling can impact your data quality.
If you're talking about using a trace detector, too large of a sample can shift your characteristic peaks, and it can also produce additional non-representative peaks, which could lead to a misidentification. When you're talking about techniques such as the color test, too large of a sample can saturate the color changes, making it more difficult for the end user to differentiate between the color changes and interpret the result. The second reason is safety, and this is a specific concern with the fentanyls. Click. Verkouteren and colleagues at NIST recently developed a standard method for sampling bulk materials in the field using a fine needle probe. This is actually a biopsy punch that's used in the medical field to obtain biopsy samples. A few studies have already adopted this standard method of sampling, and it provides two great benefits, right? It reduces the exposure risk of the end user: instead of opening a big seizure and potentially aerosolizing particles and becoming exposed to the threat, it creates a barrier there. And it also increases the reproducibility of your sampling. So on the next slide, a pillar in quality assurance is the validation process. And while instrument manufacturers may do a basic product validation prior to launching a new product or a new technology, end users are really best served by independent validation of field technology. So in this case, Angelini et al. from the US Army Aberdeen Proving Ground evaluated the effectiveness of the commercially available lateral flow immunoassays that Dr. Logan mentioned earlier. These are designed to detect the synthetic opioids in either urine or saliva. In this case, the performance metrics that they decided to measure were limits of detection, and they had 10 test compounds for that. They also evaluated cross-reactivity.
They wanted to see whether they could still detect the compounds after in vivo exposure, in this case using rabbits, and they did limited testing as to whether the LFIs would detect the fentanyls in adjudicated real-world samples. And so it was nice to see that the authors went beyond testing pure compounds and in fact also assessed how they would do with real-world samples, for example in a forensic laboratory or in the field. A second example of a validation framework, on the next slide, is work that Verkouteren et al. did for ion mobility spectrometers. This is technology that's already been deployed, for example, to interdiction efforts, because it detects explosives. Most of these instruments are dual mode and can also detect drugs in the positive mode. And so they were trying to boost those efforts by adding the fentanyls into the libraries and also evaluating the capabilities these instruments may have for this particular application. There's a need for standard methods in order for end users to be able to compare and validate performance using common metrics. Something that I wanna highlight from this study is that they based their sample set, and what compounds they were gonna test the equipment on, on DEA's NFLIS. As Jonathan mentioned earlier, that's the National Forensic Laboratory Information System, and it has data regarding drug seizures from crime labs. And so it was actually a really informed way of selecting what people typically see in terms of drug seizures and what may be being infiltrated through our borders or through the mail. They also moved away from just testing pure compounds and added other additives. So in this case, we're talking about your heroin, your procaine, quinine. A lot of times those will be found in mixtures with the fentanyls. And they also added pharmaceutical formulations that may test for false positives. Finally, they identified a compound, benzylfentanyl, to use for sensitivity.
And in this case, they did so because it's safer. Even though it's 17th on this list, it is considered safer because of its lower potency. So the method proposed here provides a very basic level of validation for the detection of fentanyl and other fentanyl-related substances on IMS. It allows end users to continue to build their library, it includes resolution measurements, and it also provides limits of detection in a very standard way. On the next slide, you'll see that standard terminology and a robust measurement platform allow end users to compare apples to apples when they're trying to decide which technology best suits their needs. So in this case, because there's already a method where the LOD is very well defined, users can make data-driven procurement decisions. ASTM E2677 describes a web-based tool that's meant for manufacturers, vendors, testing labs, and end users to estimate a robust limit of detection. This standard was recently updated in 2020 to include some of these fentanyls as well. And what I like about this tool, again, is that it's web-based, so it's very accessible. It's extremely user-friendly. It pops up with your statistics afterwards and your measurement uncertainty, but also, if you click one more time, this web tool has a data quality check. And so if the data quality that an end user has put in there to make their calculation is marginal, a message will provide suggestions on how to improve that quality. And once you've met that quality standard, then the web tool will perform the calculation and return what we call an LOD90, a limit of detection at 90% detection probability with an upper confidence bound, including a measurement uncertainty. Next slide. So even though, as Barry said, we're really building on this foundation of improved testing methods and validation efforts, they don't all address the expected sources of error that arise when technology is actually deployed into the field.
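For readers curious what an LOD90 estimate involves, here is a deliberately crude Python sketch: it linearly interpolates the concentration at which the observed detection rate crosses 90%. The actual ASTM E2677 procedure fits a response curve and attaches a confidence bound; the replicate counts below are made up for illustration.

```python
def estimate_lod90(trials):
    """Estimate the concentration at which detection probability reaches 90%.

    trials: list of (concentration, n_detected, n_total) tuples, sorted by
    increasing concentration. This linear interpolation is a crude stand-in
    for the curve fit in ASTM E2677 and carries no confidence bound.
    """
    target = 0.90
    rates = [(conc, detected / total) for conc, detected, total in trials]
    for (c0, r0), (c1, r1) in zip(rates, rates[1:]):
        if r0 < target <= r1:
            # linearly interpolate between the two bracketing concentration levels
            return c0 + (target - r0) / (r1 - r0) * (c1 - c0)
    raise ValueError("detection rate never crosses 90% in the tested range")
```

With, say, 20 replicate swabs at each of five concentration levels, the function returns the interpolated 90%-detection concentration in whatever units the levels were expressed in.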
So here are three ways that you can consider to bolster either validation protocols in the laboratory or further verification that may take place once the instruments have been deployed. One of them is potentially adding standard dirt to your testing. NIST has a couple of different types of standard dirt that represent different regions of the country. And the reason this is important is because dirt may be included in the matrix when you do your sample collection. So you wanna have a really good idea as to how this may cause interference or maybe serve as a masking agent on your type of technology. Second, in the study that was referenced earlier, Sisco et al. were proposing sampling the outside of the bag to reduce the risk to the analysts of opening these packages: could the outside of the bag predict what's inside of the package? So they also tested contaminants that are expected to be found on the outside of plastic bags, to see whether those affected their ability to predict the inside contents. And finally, Forbes measured the contribution of environmental background to instrument response by partnering with a field-deployed instrument and analyzing a large number of true negative samples, in order to create ROC curves and better understand false positives. But following a thorough, controlled validation in the lab, verification must also be performed in the operational environment in which the instrument has been deployed. A lot of these instruments already have an internal calibrant that accounts for shifts that can occur as a function of temperature, humidity, or pressure. But I think where there's still a need to improve is in our use of quality controls or quality checks to ensure that, as a function of time, your instrument has not declined in performance, and to have those metrics available.
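The ROC-curve idea just described, built from scored responses on known negatives and known positives, can be sketched in a few lines. This assumes only that the instrument reports a numeric response score per sample; the scores below are invented, and real studies would use far larger sample sets.

```python
def roc_points(negative_scores, positive_scores):
    """Build ROC points from instrument response scores.

    negative_scores: responses from known-blank (true negative) samples.
    positive_scores: responses from known-positive samples.
    Returns (threshold, false_positive_rate, true_positive_rate) triples,
    one per candidate alarm threshold.
    """
    thresholds = sorted(set(negative_scores) | set(positive_scores))
    points = []
    for t in thresholds:
        # a sample "alarms" when its score meets or exceeds the threshold
        fpr = sum(s >= t for s in negative_scores) / len(negative_scores)
        tpr = sum(s >= t for s in positive_scores) / len(positive_scores)
        points.append((t, fpr, tpr))
    return points
```

Sweeping the threshold this way makes the trade-off explicit: each candidate threshold pairs a false positive rate against the detection rate it buys.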
And so of course in the laboratory, we're very used to running positive controls and negative controls, but a potential challenge with having a positive control here is, of course, safety, right? Having these materials out in the field with known fentanyl could present some potential issues. Another critical thing to do once your instrument has been deployed to its operational environment is to continue to fine tune or even customize your detection threshold. If you're operating in an environment that's known to have higher drug background levels, then your detection parameters should really reflect that. This includes border crossings and international mail facilities. You would expect your drug background level there to be higher, so you wanna take that into consideration in order to be able to reduce your false positives. One other thing that I will add is that this data is extremely valuable to instrument manufacturers. And so I would encourage end users to share this background data with them, in order for them to create better algorithms and even smarter data analytics. There have been several studies that have measured drug background levels in public spaces, and those can inform end users as to what type of sensitivity they really need in a technology and what they should be targeting. A lot of end users will get trace detectors, but their background levels are just not suited for that; they really need bulk material detection levels. The next slide discusses evidence data management. In the field of forensic science, it is critical because it ensures that the results are gonna be court admissible, but it also provides the defense with an opportunity to review the original data. And so the three main goals are to protect the validity and accuracy of the data, to maintain the chain of custody, and to comply with evidence retention laws. These vary depending on your jurisdiction, but it's something to definitely keep in mind.
Manufacturers, I think, understand pretty well these unique requirements of forensic science. And so they've come up with data solutions that automatically store the results for reporting or for evidence submissions. They also include some metadata in the data files, such as time and date stamps, and some even include coordinates as to where you were when you took that sample. On the next slide, I discuss something that I mentioned earlier, which is the ASTM standard that's being developed for the field detection of fentanyl. I think this is gonna be a really great resource for the community. It's being drafted by ASTM Committee E54 on Homeland Security Applications and supported by DHS Science and Technology. It's mainly for first responders, but the reason I think it's gonna be so valuable is that it includes active participation from vendors and from subject matter experts. So everybody's coming together to really understand what type of requirements document is gonna be needed that satisfies both the manufacturers and the end users. This effort is being led by Pacific Northwest National Laboratory. And again, it will define your testing protocols, it will define acceptable error rates for instruments that are deployed, and of course it will provide some safety guidance. So, shifting gears a little bit now to discuss some detection strategies for the field detection of drugs, and the increased complexities we face in detecting fentanyl in the field. As Barry mentioned earlier, there are definitely some interpretation challenges that arise from novel synthetic opioids. The data acquired in the field can differ significantly from what the libraries have. And so there's a need for enhanced data analytics, especially for those applications where the end user needs to be able to differentiate between analogs or isomers. And then, as Barry also mentioned, sometimes these instruments are being operated by non-technical operators.
Some of the things that can add to those complexities: as he mentioned, the fentanyls almost always arrive in multi-drug mixtures, unless you have a pure sample that's coming in through interdiction efforts. And so you have issues such as competitive ionization and peak resolution, because of the trade-offs that have to be made for instruments to be fieldable. When we try to keep our end user safe and perhaps do the analysis through packaging, for example, then you need to worry about clear versus opaque packaging and fluorescence issues that may arise from that. And of course, there are the environmental contributions that contaminants, dirt, et cetera, make to your analysis. So the data analytics for the detection of novel synthetic opioids really requires a layered approach. First, it's obtaining traceable and reliable reference data on these novel substances. And as Jonathan mentioned, this needs to be done on a timely basis, right? We need to keep up with how frequently they're being synthesized and entering the drug market. That reference data can then be used as labeled data to populate threat libraries and make them more robust, or it can be used to train algorithms. However, due to the structural similarity of these analogs and the resolution issues that we expect on field equipment, it's probably wise to also consider machine learning, because it can aid in predicting that a novel substance is likely a fentanyl derivative. And finally, there's a need to validate statistical models using real-world data, in order for developers to continue to improve the initial feature selection that they used. So one of the challenges in the ever-changing landscape of synthetic opioids is, as I mentioned, the availability of reference standards. Luckily, CDC has sought to address this challenge by developing and distributing what they call TOM kits, for Traceable Opioid Materials.
They include 150 opioids and 100 fentanyl analogs. And while these kits were manufactured to support the needs of forensic laboratories, I think they can also be extremely useful to manufacturers who are looking to add to their threat libraries, and also to end users. These kits are freely available to US labs in a variety of domains, as long as you have a DEA registration for Schedule 1 substances. So one potential challenge there is that instrument manufacturers don't always have this DEA license, and are therefore precluded from getting these samples. In terms of housing reference data under one roof, NIST recently collaborated with the DEA and also the BKA from Germany, and they developed a web-based, community-driven analytical data repository. So here, of course, the thought is that there's a need for reliable data in order to identify these novel substances that are being synthesized. Here, we're using a community approach. The global forensic community would help facilitate the identification of these unknown substances and, at the same time, eliminate duplication of elucidation efforts. And so then you have to consider providing some guidelines in terms of what samples you'd like. So they've identified three tiers of samples. Ideally, the reference data comes from a reference material that's been purchased from a commercial supplier, with a lot number and a catalog number. But it could also be an internal control material produced by an institution, hopefully also with a lot number. And finally, the last tier is an exhibit or a seized material. Here, you're looking for reasonable purity. But in any case, there's a validation step where quality metrics are in place, so there's more trust in that data.
So once you have the reference data for these new substances, it can be used to test the algorithms that you already have, those existing algorithms, and to understand whether they have the right feature selection in order to predict and identify the right content. For instance, if your detection scheme is based on identifying two or three backbone structures, you can validate your feature selection using this data. On the next slide, enhancing machine learning. I think this is an important concept to discuss. The reason to perhaps enlist machine learning is because we need to build richer test data sets using real-world samples. This can help improve, or help you validate, the robustness of your statistical model. So here, somebody builds a statistical model and tests it with their clean data, but then they also use real-world data that can help refine the feature selection of the model, and it continues in an iterative fashion. Of course, there are always challenges with this. This requires continuous data sharing from a variety of entities. As Jonathan mentioned earlier, partnerships here are very important, because everybody has a piece of the puzzle. There are the standard issues that come with any data sharing: what platform are you gonna use, a standard data format, and who maintains it. In this case, access could be problematic, because some information could be considered law enforcement sensitive or public health sensitive. But here I think that forensic labs hold a big piece of the puzzle: they have whatever's being seized, they have that piece of information. Medical examiners also have a piece of information, because they know what folks have been overdosing on; hospitals see the non-fatal overdoses; and finally, intelligence agents have a really good idea of what's been coming in through our borders or through the mail facilities.
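As one hedged illustration of the kind of statistical model being described, a nearest-centroid classifier over a handful of spectral features might look like the sketch below. The features, the vectors, and the two-class framing are all hypothetical simplifications, not any lab's actual model.

```python
def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(vec[i] for vec in vectors) / n for i in range(len(vectors[0]))]

def looks_fentanyl_related(sample, fentanyl_examples, other_examples):
    """Nearest-centroid call on a feature vector (for example, normalized
    intensities of a few diagnostic fragment ions): is the sample closer to
    the centroid of labeled fentanyl-class spectra than to the non-opioid
    centroid?"""
    centroid_fent = centroid(fentanyl_examples)
    centroid_other = centroid(other_examples)
    dist_fent = sum((a - b) ** 2 for a, b in zip(sample, centroid_fent))
    dist_other = sum((a - b) ** 2 for a, b in zip(sample, centroid_other))
    return dist_fent < dist_other
```

The iterative refinement described above then amounts to appending newly confirmed real-world feature vectors to the labeled example lists and recomputing the centroids.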
Any time you're designing a tool, and you're doing so from the comfort of your laboratory, you have to consider what it's going to take to throw it over the fence and actually operationalize it. And so there are two things, in terms of what we're talking about right now: software accessibility, and education and training. In order to continue to make strides in data analytics at the rate required, it's going to really take a village. Software and algorithms need to be open and transparent. This allows them to be freely available, facilitating accessibility, adoption, and scalability. Here's an example: Arun Moorthy from NIST recently published an open-source software implementation. It generates mass spectral similarity mappings of an unknown against a library of type one fentanyl analog spectra; those are molecules that differ from fentanyl by a single modification. The goal is that by making it open source, more end users can evaluate its utility in its current state. They can aid in the validation of the statistical model that supports it, and the result is hopefully a better product for the community. Next, triers of fact are already looking for decision-making algorithms to be more transparent to the public, and to have a higher level of documentation that discusses what the assumptions are based on the data, what the limitations are, and what the feature characteristics are that support the model. A federal judge recently ordered a New York crime lab to unseal the algorithms it was using to analyze data. And so this tells us where this movement is heading. And of course, education and training need to be a part of this. End users need to have a basic level of understanding of the decision-making tools that they're gonna utilize. They need to understand what sort of false positive and false negative rates they're gonna get using these.
And then, what are the limitations of those tools? All of those are, I think, a unique concern for forensic scientists who have to go and testify in a courtroom; they need to have a good understanding when they're using these algorithms. So in conclusion, field data can be indirectly used beyond its intended scope. And so we need to try and bring that quality management system, just as we have it in the laboratory, into the field when possible. Questions still remain about what the quality management criteria should be for criminal investigative purposes, right? When we're only looking for probable cause, again, that threshold is lower than if we're looking to confirm a result in the field. How do we ensure that the quality systems we're looking to uphold don't impede the adoption of new technologies, right? It's always that gray area when we wanna pilot new technology and see its capabilities in real-world situations. How do we adjust our quality checks in order to allow for that to come online? How do we manage the trade-offs of field technology against the very high threshold of admissibility requirements in a court of law? So as you saw, there are a lot of different researchers and other academics who are working to strengthen field quality management and to mirror what's going on in the laboratory. I think, again, that the ASTM standard on the field detection of fentanyl is gonna be a really good resource for the community, and I'm excited for that to come out and to see how the community responds. And then finally, in order to enhance detection strategies, we have to think beyond just using libraries and consider using machine learning; and then, of course, data sharing, open-source software, and education and training go hand in hand. With that, I'd like to acknowledge my colleagues and the peers whose work I referenced, and thank you for your attention. Thank you very much, Marcela.
So at this point, we're going to move into the discussion portion of the webinar. Just to remind the audience, you can submit questions through the Q&A button on Zoom at the bottom of your screen. Or if you're not on Zoom, then you can email questions to csr at nas.edu, as shown on the screen there. We've already had some questions come through, and so I'll go ahead and begin the discussion portion. Perhaps throwing out a question: Jonathan really highlighted the pace at which these structures are emerging. And so, maybe I'll open this up to anybody, but given the ever-changing landscape of structures, are there prospects for an assay perhaps based on biological activity, rather than having to keep up with this pace of structures? This is Barry Logan; let me go ahead and give you some response to that. So that is an area of ongoing research. As I mentioned with the repurposing of immunoassay test strips or immunoassay kits to try to identify opioids, the limitation of that is that the antibodies are a lot more selective in terms of their binding than the receptors are. So receptors are going to be activated by a variety of different kinds of chemistries with some commonality in terms of structure-activity relationships. That's what makes them effective agonists. So if we could develop an assay that was based on receptor binding rather than on antibody binding, it could conceivably bind things that we haven't encountered before, but that we know are still going to be active as opioids. So there is some work going on at the University of Ghent. A colleague of ours there, Christophe Stove, is looking at ways to develop assays for the biological activity of substances, rather than identifying them based on their chemical structure. So in terms of a screening method, I think that has a lot of promise. There are issues to be addressed.
For example, since the receptor is going to bind both agonists and antagonists, if you're trying to use a receptor-based assay to determine somebody's drug use, but they may also have been administered Narcan, then you're going to get a response both from the Narcan and from the active opioid. But it certainly has the potential to be useful for the identification of new, previously unknown opioid agonists. Thank you very much. Maybe on this theme of alternate detection strategies, there's a question: are there any viable electrochemical detection strategies being looked at? And thinking about continuous, minimally invasive, transdermal analysis using microneedles as monitoring for relapse, can those perhaps be adapted for detection in the field? This is Barry Logan again. I can tell you that there are some commercial products in development that are based on electrochemical techniques, cyclic voltammetry, looking at characteristic potentials for the identification of opioids. It certainly is not a currently deployed technique, and I'm not aware of any commercial devices in use or on sale that employ that approach. But the opioid crisis has spurred a lot of research into alternate technologies, particularly those that may be more amenable to field deployment. Jonathan mentioned in his opening remarks the opioid detection challenge that NIST and NIJ and Homeland Security had launched last year, which was really to encourage technology developers and academic researchers to apply techniques or technologies that maybe haven't been used in the past. Particularly attractive is the idea of passive sensing. So being able to detect somehow, for example in the mail, or with packages on a conveyor belt in a mail sorting facility: is there a technology either that can see through packaging and detect drugs, or that can detect trace amounts of drugs in the environment?
NIST, I know, has done some work on techniques to dislodge surface trace materials from packaging into the air, where they may be detected by a variety of techniques. So the challenge is new, and the need for innovative technologies is new. And there's definitely interest on the part of organizations like NIJ in trying to bring some of those technologies to bear on what might historically have been considered more forensic purposes. The second part of your question, I'm probably not able to answer. Then I can move to a different topic that seems to be of great interest to the audience. Marcella, you perhaps really covered this in its entirety, but let me pose the two questions that came in related to this idea of false positives and false negatives. How are the risks of false positive and false negative results assessed and established for both field and laboratory tests? And related to that, is there a standardized decision tree for recommending confirmatory testing after a positive field test? I think you gave us a summary there, but perhaps this would be an area for additional comment. Sure. So what I could add is just making sure that your validation protocol includes as many compounds as possible that are known to potentially cause a false positive, because those compounds are going to be found in drug mixtures, and if you're thinking about counterfeit-type samples, there are going to be other pharmaceutical formulations associated with your sample. You really need a thorough validation plan that includes all of these compounds. And just like anything else, it's important to measure that and to understand what some of your limitations are going to be. I think the validation framework that I referenced earlier actually takes that into consideration and gives some metrics in terms of what false positive rates to expect from that particular technology, so I can definitely see that being used to validate other types of technologies.
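The false positive and false negative rates discussed here can be estimated directly from such a validation panel. A minimal sketch in Python, using entirely hypothetical panel results — the compound mix and the pass/fail outcomes below are illustrative, not real assay data:

```python
# Sketch: estimating false positive / false negative rates from a
# validation panel. All samples and outcomes here are hypothetical.

def error_rates(results):
    """results: list of (contains_target, test_positive) boolean pairs."""
    fp = sum(1 for truth, test in results if not truth and test)
    fn = sum(1 for truth, test in results if truth and not test)
    negatives = sum(1 for truth, _ in results if not truth)
    positives = sum(1 for truth, _ in results if truth)
    return fp / negatives, fn / positives

# Hypothetical panel: opioid-containing mixtures plus known cross-reactants
# (cutting agents, common pharmaceuticals) chosen to challenge the assay.
panel = [
    (True, True), (True, True), (True, False),     # opioid-positive mixtures
    (False, False), (False, True), (False, False), # blanks / cross-reactants
]
fpr, fnr = error_rates(panel)
print(f"false positive rate: {fpr:.2f}, false negative rate: {fnr:.2f}")
```

In practice the panel would be far larger and, as noted above, should be re-run against the backgrounds seen in the operational environment, not just laboratory-clean samples.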
And then secondarily, as I touched on, really having a good understanding of your operational environment and the challenges that it's going to bring in terms of chemical background. A lot of times, the dichotomy here is that fentanyl and its analogs are found at much lower concentrations because of their potency, so you have to have an analytical technique that's actually going to be able to pick them up in the presence of other drugs as well. And it's difficult for end users sometimes to gauge what level of detection they need, whether they're looking for trace detectors, something mid-range, or even bulk detection. So that's a particular challenge that these types of samples bring. But I would highly encourage that validation to go beyond what's done in a laboratory and to also do some testing once the instrumentation has been operationalized, and again, to customize or fine-tune your detection parameters based on that. The only other thing I'll add is that your background tends to change as a function of time. So this could perhaps just be part of your quality management: at a certain interval, you continue to monitor what your background looks like, continuously tweak your detection threshold, and monitor that signal-to-noise ratio so that you can minimize your false positives. Just to Marcella's point, another consideration is what the application is and what your tolerance for false positives or false negatives is under different circumstances.
So for example, if you're a first responder and you want to know whether an environment is safe to go into, you would have a fairly low tolerance for false negatives and maybe a little more tolerance for false positives, the decision being just whether to apply more personal protective equipment. Whereas if you're using the test to make a decision about whether to place somebody under arrest, you may have a different tolerance for a false positive, particularly if you're using the test as a screening test and you know that there's going to be a more discriminating confirmatory test done in a laboratory. So that's something the user really has to determine based on their application as well. Thank you both. Now I'll pose a question that came in that looks very globally at this issue and asks: what is the current status in the area of source apportionment of illicit drugs? If anyone can comment on that? I can comment on it a little. There are programs in place; one has been run at the DEA's Special Testing and Research Laboratory for a number of years, looking at signature analysis of trafficked drugs based on all kinds of properties: everything from the presence of adulterants, to the presence of byproducts that might tell you something about the method of manufacture, to trace metals present in different batches of drugs that may allow you to identify things coming from a common source, up to and including the use of stable isotope ratio mass spectrometry for identifying in which regions of the world natural products may have been grown; that discrimination can be made based on the incorporation of different stable isotopes. So there are a number of ways to make that assessment.
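The application-dependent tolerance described above — low tolerance for false negatives in a safety screen versus low tolerance for false positives in an arrest decision — can be made concrete by weighting the two error types when choosing a detection threshold. A minimal sketch, with purely hypothetical signal values and cost weights:

```python
# Sketch: choosing a detection threshold by minimizing application-specific
# error costs. Signals and costs below are hypothetical illustrations.

def best_threshold(neg_signals, pos_signals, cost_fp, cost_fn):
    """Scan candidate thresholds; a sample is 'detected' if signal >= threshold."""
    candidates = sorted(set(neg_signals + pos_signals))

    def total_cost(t):
        fp = sum(1 for s in neg_signals if s >= t)  # blanks flagged positive
        fn = sum(1 for s in pos_signals if s < t)   # opioids missed
        return cost_fp * fp + cost_fn * fn

    return min(candidates, key=total_cost)

neg = [0.1, 0.2, 0.3, 0.4]   # blank / background samples
pos = [0.35, 0.6, 0.8, 0.9]  # opioid-containing samples

# First-responder screening: missing an opioid (false negative) is costly.
screen_t = best_threshold(neg, pos, cost_fp=1, cost_fn=10)
# Arrest decision: a false positive is costly.
arrest_t = best_threshold(neg, pos, cost_fp=10, cost_fn=1)
print(screen_t, arrest_t)  # the screening threshold comes out lower
```

The point of the sketch is just that the same instrument data yields different operating thresholds once the user's error costs are stated explicitly, which is the determination the panelists say each application has to make.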
There are some things, for example stable isotope analysis, that are more applicable to the analysis of natural products than to synthetic drugs, because the synthetic precursors could come from all over the world, be shipped someplace, and then be assembled into the final molecule. But that's definitely an area of active research, and I think there's a lot more that can be done there. Thank you. So I might want to build on Marcella's comments about machine learning, data analysis, and computation. There was a question posed along these lines: it's possible these days to use computational chemistry, not machine learning but computational chemistry, to predict spectra with almost experimental accuracy. So is this also being brought to bear in the toolbox for drug detection, complementing experimental efforts? I'm not aware of that, although there was somebody, I think at UC Berkeley, who was considering that application, but I'm not quite sure how far they've gotten in those efforts. Do you know of anybody, Barry, who's using that? No, I don't know of anybody. But I think it's an excellent idea, and again, that could be another tool in your toolbox, as you mentioned. Yep, sounds good. There's a question that came in: what are the current efforts leading to the creation of a robust measurement infrastructure for untargeted analysis of emerging drugs? I think this really can speak to anyone on the panel. This is Jonathan, can you hear me? Yes, go ahead, Jonathan. Yeah, one of the links in the final slide of the overview included a paper that was published, I believe, within the last year, and I believe Barry Logan and myself contributed as co-authors. It really talks about what a holistic approach to developing a robust system can look like, based not only on the types of technologies we've discussed today but also on the stakeholders and when and how they use that information.
So I believe that might be a good place to start, as well as the needs assessment. But really, this comes down to the concept of developing systems-based approaches across all stakeholders, customers, and users of the data and collectors of the information. And in some ways, it needs to be somewhat jurisdiction dependent, based on the needs and the resources available within the community. The New Jersey Drug Monitoring Initiative, or DMI, is a really good example of how New Jersey has reached across all of its public health and public safety stakeholders to share information. So those types of resources might be a good thing to check out. That sounds like a great recommendation for the audience. So we're nearing the end of our time for this webinar. There was a question that came in via email, related to Barry's presentation: what is the best colorimetric app available on the various platforms, iPhone and Android? I think I'm going to pass on commenting on any particular vendor's technology. Okay, fair enough. There are a number of these in development. As I've said, and I've done some looking, I'm not familiar with any published validation of the technology, but it certainly is an attractive evolution of current chemical color tests, because it removes some of the subjectivity and makes the assessment of the color change a little more systematic. So development of cell phone-based or smartphone-based technologies, whether related to color tests or to other kinds of chemical processes that can be done in the field, I think is a fruitful area for exploration. Thank you, I appreciate that insight. So at this point, I'd just like to thank everyone who has tuned in, and also thank the three speakers for providing such an insightful overview of and detail about this exciting area.
I just want to note that the three presentations and the recording of the webinar will be posted to the Chemical Sciences Roundtable website by the end of the week; that URL is on your screen. There are still some questions residing in the Q&A, so this is obviously an area of great interest. If anyone has any additional questions, comments, or concerns, you can email the csr at nas.edu address that's also on the screen. And please mark your calendars: the date is still TBD, but our next webinar will be held in June and will cover issues within the chemical supply chain. For more information, make sure to subscribe for updates, which can also be done on the website. So with that, I'd like to close the session and again thank the participants for the excellent presentations.