All right, we're going to go ahead and get started. Welcome back, everyone. Our next topic is broker interaction. Obviously the brokers, and I would add software developers as appropriate, were key components to getting the Alpha Pilot up and running. So we've got some questions here, some topics to kick around, and I'd like to open it up to the importers for our understanding. How did you get the information to the brokers? We're talking primarily about the reference number here. How was it transmitted, and how did you align it with incoming entries? Give us a little background on that.

This is Bob from F&T. We tied it to our StyleMaster files. When we created the import, we actually kicked off a comma-delimited file that was EDIed over to the broker with the registration number on it.

We did two different ways of transmitting the information. We're not EDI-connected to our border broker, so what we did is we gave them the LPCO code with a product database, and they programmed that into their article product library and used it every time we made a shipment. So that was the manual process. 99% of our business, though, is EDI with the broker, and we do tie it to our product master file. Within our product master we use a series of flags to mark an article as subject to the CPSC pilot, and from there that kicks off a series of business logic to go out and pick up the full PGA message set with the GCC data. That then gets incorporated into our XML to the broker, who puts it into the CATAIR transmission.

From a Walmart perspective, we have EDI transmission set up with our brokers for purchase order data. We don't have specific product data that they call through and pull; we push them specific purchase order data depending on the clearance port.
So we couldn't necessarily tie the registry number to an EDI feed. What we had to do was take what Kara had entered, run reports to identify which purchase orders those products were attached to, and send the broker, via an Excel spreadsheet, a complete listing of purchase orders that that registry number would apply to for the two ports we were working with. So it was manual.

Sounds like a very manual process. Yeah.

For Luna it was also a manual process. We provided spreadsheets for both our full message set data and our reference certificate numbers to the broker, matched up with the correct product ID, and they added it to their internal product library.

So I'm hearing that if it wasn't EDI, it was a two-file transfer where the broker then had to do a join for the filing. Okay.

For Procter and Gamble, we also followed the manual process. We separated this from our normal flow of information for this pilot specifically, because we limited our participation to a particular supplier and port. We simply added the reference number to our product data master file for that supplier, with the anticipation that it will eventually become an electronic process once all of the data elements are finalized.

This is Garrett with 7th Avenue. Ours was pretty simple. We basically just tacked the reference number onto the end of our EDI feed to the broker, and that related to our complete parts database that we transferred to them.

Okay. From the brokers' perspective, what were some of the issues or challenges that were faced? We had heard about some issues, and we knew there were delays in getting the code from the software vendors and getting that code tested and implemented. So we'd love to hear more about the challenges the brokers faced, and perhaps even the challenges as they related to interaction with software vendors.
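As an aside, the manual mapping several importers describe above, joining a registry number to the purchase orders that carry the covered products, amounts to a simple lookup. This is only a rough sketch with invented identifiers and field names, not any filer's actual system:

```python
# Hypothetical sketch of the manual join: products registered under a
# CPSC registry number are matched to the purchase orders containing
# them, producing the spreadsheet-style listing sent to the broker.
registry_to_products = {"REG12345": {"TOY-001", "TOY-002"}}

purchase_orders = [
    {"po": "PO-1001", "port": "LAX", "items": {"TOY-001", "HW-009"}},
    {"po": "PO-1002", "port": "SEA", "items": {"TOY-002"}},
    {"po": "PO-1003", "port": "LAX", "items": {"HW-010"}},
]

def pos_for_registry(registry, ports):
    """Return the purchase orders, at the pilot ports only, that
    include any product covered by the given registry number."""
    products = registry_to_products.get(registry, set())
    return [po["po"] for po in purchase_orders
            if po["port"] in ports and po["items"] & products]

print(pos_for_registry("REG12345", {"LAX", "SEA"}))  # both pilot ports
```

Automating exactly this join (registry number to purchase orders to broker feed) is what an EDI-connected setup does in-system, and what the spreadsheet process does by hand.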
Hi, this is Ian Smith from Expeditors, and we actually did not have a lot of trouble. For one thing, we are a self-programmer, so we program all our ABI software, and we opted for just the simple transmission of the registry number. That was quite simple, quite easy to get up and running pretty quickly. We could have done the full data message set, but it made much more sense to go with the simple transmission of the registry number, which was very easy to get from our importers. And from a technical standpoint, I don't know if anyone from our Seattle office is on the phone, but I think everything went just fine.

To add to what you said, Ian, from our perspective, because Expeditors was our broker of choice for the pilot, the feedback I got from them was that as long as they had the data, they could supply the data, and that was fine. However, there was a disconnect between what we had indicated on the survey in terms of the message set total versus what they filed. They stated they filed 140 entries on our behalf; we have 121 message sets. Their thought was that there was a potential disconnect between CBP and CPSC via the ACE transition. So that's something you may want to check. Their entries were 105, lines were 140 for us; they say there were 121, and 136 was listed on the survey. I'd be interested to see where the drop-off was.

Yeah, just something to check. If they have that good of data, and we could get it from them with the entry numbers and entry lines, we can work with you to try to figure out where that dropped. That would be great. We can absolutely look into that. But other than that, they didn't have any problem, as Ian had indicated.

Expeditors again: you only programmed to be able to transmit just the reference number, correct? Yes, Jim, that's correct. Okay.
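The difference between Expeditors' registry-number-only transmission and the full message set is essentially payload size. As a rough illustration only, with invented element names rather than the actual CATAIR PGA layout, the two options might compare like this:

```python
# Illustrative only: invented element names, not the real CATAIR
# PGA message set layout. Shows why registry-number-only filing is
# the lighter option: one element versus the full set of fields.
import xml.etree.ElementTree as ET

def registry_only_filing(registry_number):
    """Minimal filing: just the CPSC registry (reference) number."""
    root = ET.Element("cpsc_pga")
    ET.SubElement(root, "registry_number").text = registry_number
    return ET.tostring(root, encoding="unicode")

def full_message_set_filing(data):
    """Full filing: every data element supplied individually."""
    root = ET.Element("cpsc_pga")
    for name, value in data.items():
        ET.SubElement(root, name).text = str(value)
    return ET.tostring(root, encoding="unicode")

print(registry_only_filing("REG12345"))
print(full_message_set_filing({
    "product_id": "TOY-001",
    "testing_lab": "LAB-3",
    "certificate_date": "2016-07-01",
}))
```

The registry-only path pushes the data-management burden upstream (the certificate data lives in the registry), which is why several filers called it the easier method.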
So I think, Geodis, if you're on the line, you also programmed to be able to file the full PGA message set. Can you give us a little background on that process?

Sure. We actually use an outside software provider; we don't do our own programming. So our software provider did that programming for both the full message set and the registry. We actually have two separate applications for ABI. This is Chris, and I can speak on one; maybe I'll have Jennifer speak on the other. For the one, everything went fairly smoothly. I think our biggest challenge was adding the CPSC information to our internal product file. We had to come up with a way of doing that quickly, which long term probably won't work, so we'll need a better method for the larger pilot or the actual go-live. But overall, everything went fairly smoothly. I think the registry was definitely the easier way to go, and we would want to encourage our importers to use it as the preferred method if possible. The easier that is for importers to use, the better. Jennifer, if you're on the line, maybe you can speak about our other system and how it went.

Sure. Well, I can't really add much, except that I know one of the challenges our software provider was facing at that time was some of the other changes they had to make for the ACE implementation, so certain things were put at a different priority. But once we did get the programming in place and tested it, we were able to send the filings through successfully.

The ACE implementation was interesting. I think we have that as a bullet point on the next slide with filing, but as you all know, the ACE schedule slid. We were hoping to hit a sweet spot in between ACE implementation milestones, but it ended up that we got stuck right in the middle of prime programming time for ACE.
So we do recognize that there were higher priorities with the mandatory ACE transition that we bumped up against, and I think we lost out on priority a number of times. We do recognize that. Did you have anything to add?

Yeah, Geodis filed for us. At F&T, of the 27 products we did, there were 359 entry lines and 342 CPSC message sets, and I think everything went okay. I do know they had challenges setting up some of the programming, with the ACE challenges you just mentioned, along with some challenges with the transmissions, but once they got past that it worked fine.

Yeah, I would echo that. Geodis was the one that filed for us, except in Laredo, where Border Brokers filed for us. The ACE programming definitely contributed to some of the delays, in that they had to dedicate their resources to the ACE implementation that was mandated, so it pushed things a little. One of the other things we struggled with at Geodis is that initially they had programmed to file at the 7501, which is the standard trigger for CPSC filing on the summary. When they went to program it, the error messages coming back, or the accept messages, didn't make a lot of sense against the 7501, the summary filing. So we switched to the 3461. I think that piece also caused a couple of delays in when we should have been filing, because of the nine- or ten-day gap between the 3461 and the 7501. So that was the other issue we saw.

So you were getting errors when you tried to attach it to the 7501 but not the 3461? Correct. And when we got the acceptance messages against the 7501, they didn't make sense, because they're really designed, I think, to attach to the release, which is an indication of hold authority rather than just contributing information.

So I think you're all familiar with the CATAIR implementation guide that CPSC put together working with CBP and in review with the Trade Support Network.
Just a general question about that guide: we're looking for feedback on anything we could do better in the future as far as interacting with the trade, and I would include brokers and importers. If we're able to move to a beta phase, what could we do better as it relates to communication and documentation, anything that would make the process smoother, for example for companies that might join that are brand new to this?

I have something that could relate to both this issue and previous issues that we haven't discussed. Sorry, this is Magnus Beer from Ikea. I don't know if any of the others have been thinking about disclaimers. Meaning that we will have a flag on certain customs codes, HTS codes, that says we need a filing, but the product requirement definitions and the customs codes don't quite match up perfectly, so there will be a need for disclaimers. For us internally, that seems to be a more difficult situation to handle because it's a non-requirement. We are clear; we know exactly which products we should have a certificate of compliance on. The question is how we correctly identify which products should not have a certificate of compliance and should have a disclaimer instead. I don't know if any other company has been thinking about this.

This is Jennifer from Walmart. We have been thinking about it, not necessarily from a CPSC perspective, but Fish and Wildlife is another good example. The disclaim process is potentially burdensome, because it's not a yes/no anymore; it's a no, but here's why. Sometimes that's more information gathering than you have with a yes answer, and you have to maintain records associated with it, where in the past you didn't. So yes, we have thought about it.
It's not something we've necessarily addressed yet, but it is a good issue to talk through. It is an issue we definitely need to think more about as well, given the nature of the HTS. Some codes, toys for example, are a lot more straightforward, and I would expect fewer disclaims; but when you get into some of the apparel, disclaims can be a really big issue, and how do you address that? So again, it comes back to minimizing the burden and not having to spend a ton of time disclaiming.

To add to that, two things. One is that a product may be subject to CPSC but with no testing required; is that a disclaim, or is that a GCC? The second is that some items tend to be in a gray area, where you wouldn't necessarily flag the HTS, but they're probably still subject to CPSC. How are those accommodated?

The general process envisioned across the jurisdiction for CPSC, at a very high level: if there's no certificate expected, you won't have to file; the HTS will not even be on a list, and we won't even be looking. If the proportion on a certain HTS is below a certain percent threshold, there's going to be a lot of burden, because the majority of filings would be disclaims. This was an initial thought as we dug through this: above a certain threshold percentage, say 70% or 80% of the HTS expected to apply, the exemptions are expected to be on there, and the burden is potentially worth it in that type of instance. The fine line is going to be those HTS codes like the toy code for ages above 12: how many of those are really ones we're watching? Imitation jewelry is another. Those are the questions where you just have to ask what the proportion is, and what the burden is versus the benefit. It's a hard one for us, definitely a hard one, and we identified that early on.
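The threshold idea just described can be stated very simply; the 70% cutoff below is only the illustrative figure mentioned above, not a settled policy:

```python
# Sketch of the flagging heuristic discussed above: flag an HTS code
# for CPSC filing only when the share of products under it expected
# to need a certificate exceeds a threshold, so that disclaims never
# become the majority of filings. The cutoff is illustrative only.
def should_flag_hts(expected_certificate_share, threshold=0.70):
    """True when most entries under the HTS code are expected to
    carry a certificate, making the disclaim burden worthwhile."""
    return expected_certificate_share >= threshold

print(should_flag_hts(0.85))  # e.g. a toy code, mostly in scope
print(should_flag_hts(0.10))  # e.g. a mixed apparel code
```

The hard part, as the discussion notes, is estimating the certificate share per code, which is exactly the gray-area question for codes like imitation jewelry.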
If it's a burden or it's hard for you, it's hard for us as well, because we're trying to second-guess what we think you might be needing or wanting. So some clarity around that would be great. I think that is definitely an area we need to work on.

Well, as Jim said a little earlier, the Alpha pilot was intentionally really broad, because we were trying to get volunteers with whatever products they wanted. One of the things we are looking at and thinking through with regard to the beta is what makes sense. We're not necessarily thinking it's going to be that kind of free-for-all, but more, how do we prioritize what we're looking for? Not only to help CPSC, but to help the people who sign up and want to come on board, to make this logical, focused on what's really important, and to give you more guidance.

One other thing to consider is that the Alpha pilot was pretty straightforward in the tariff classifications being used. Essentially it's called for in the tariff; it's a GRI 1 classification in most cases. So easy to recognize, easy to file, easy to say yes to. But further to her point, if you have embedded items in a kit that are subject to a CPSC requirement, but the nature of the kit itself isn't a toy, then we have to be able to account for the testing on that particular item, but how do we declare it at entry? Because of the way the entry is declared with X, B, and V lines: the X line is going to be declared, and the V lines are going to be supportive, there but not necessarily seen by your side. Those are the kinds of things that need to be considered as the level of classification gets more complicated beyond GRI 1.

Right. And in terms of kits, this is John from CPSC.
In terms of kits, I do know there's a nuance within the UPC that supports the identification of kits, specifically bundles, beyond a classification, which is one of the reasons why that's so attractive. If it's a kit within a certain classification, that's something we're going to be potentially interested in; if it's a kit outside, it's still identified as a kit, and we know there are going to be component elements, and that's something the HTS can't do. In some instances the products are supposed to be declared separately and in other instances they're not, and that nuance is a tariff issue in terms of how they break it out, but it's not a product characterization issue, so it's hard for us to get to a standard policy there.

I understand that, but for us it's going to have to be exception management that we account for, and that will be difficult to automate. Agreed. We are very sensitive to the exception issue, and from the very beginning, as we've been looking at the overall products of risk, how that gets managed has been on our minds. A lot of it goes back to looking forward at how we define scope: what are those tariff numbers and product types we are really interested in? What's the data we need to be able to do import targeting? You can get into the weeds very quickly, and it can get very complicated when you start getting into kits and jurisdictional issues. We do our work based on a feed of data we get from CBP, and that's defined by tariff number. So how does all of that mix together? It definitely is a very complicated issue. If all the HTS codes were as clear as toys for children under three, life would be wonderful, but we're going to have to nuance this together.

All right, I've been told it's a good time to move on. Again, I know there's overlap with all of this.
So feel free, if you have comments on previous issues we've talked about, to come back to them. We want to make this as informal and conversational as possible. But the next topic we had is around filing. I hit on this a little already with what we could do better, but thinking about filing, are there any suggestions on how we could do this in a way that puts the least burden on industry? Again, there is a trade-off here in the sense that we're trying to get data that is going to help us do our job better. Part of doing it better means we don't stop shipments unnecessarily; we can identify the risky shipments based on the data we have. So part of this is that we do see a trade facilitation aspect: if we get the additional data, we know what not to look at. With that in mind, any thoughts on how this could be set up in a way that imposes the least amount of burden on industry would be welcome.

This is Ken from Walmart. I think we just have to be very selective about the data that's required. At the end of the day, whether it's automated or manual or some combination thereof, the more fields that are required, the more tariff codes that are required, and so on, the more work it is, regardless of what that system looks like. So I think we really need to find that sweet spot, and I know you're working to do this, I'm not telling you anything you don't know, between what the burden is, because the burden can be significant, and what the utility of the data is. We need to continue to have that conversation: what is it that the CPSC is really looking to get out of this?
And what is the smallest universe of data that trade can provide to you to accomplish that goal? Let's see if we can find a way to settle there. I'm worried that we'll take one step forward, then another, then a little bit more data, and a little bit more data, and all of a sudden we're overwhelmed, providing millions of lines of data to the CPSC while you're really utilizing maybe a fraction of a percent of it. It wouldn't even be designed that way; we'd just find ourselves there before we know it. So I think we need to be really disciplined about what you need and how we can provide it in the least burdensome way.

Yeah, I agree 100%. The chairman, in his opening remarks, used the phrase that the Alpha pilot was testing the plumbing, and that really is what it was. We set up the structure, we set up the process, and the volunteers and the brokers really helped us in testing it to see if it worked. The beta has always been envisioned by staff as testing the elements: how does the data itself help us do our job better? That's where we get into the conversation about whether those are the right five pieces of information. In the end, is it less that we need? Is it more? I don't know. I'd have to think it's going to be hard for us to say we need more when we're not testing other data elements. But the commission really listened to the feedback from trade as we were discussing e-filing at the beginning, to try to come up with the data elements we think would enhance the risk assessment. That's where the beta is really hoping to go: to test that data to find out whether it is, in fact, the right set of data elements.
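"Testing the elements" could, at its simplest, mean checking whether a candidate data element actually separates entries later found violative from compliant ones. This is a toy sketch with entirely fabricated data and field names, just to make the idea concrete:

```python
# Toy sketch of evaluating a data element's targeting value: for each
# candidate element, look at the violative rate per distinct value.
# A wide spread across values suggests the element discriminates;
# a flat rate suggests it adds burden without targeting benefit.
entries = [
    {"test_lab_id": "LAB-3", "manufacturer_id": "MFG-1", "violative": True},
    {"test_lab_id": "LAB-3", "manufacturer_id": "MFG-2", "violative": True},
    {"test_lab_id": "LAB-9", "manufacturer_id": "MFG-1", "violative": False},
    {"test_lab_id": "LAB-9", "manufacturer_id": "MFG-2", "violative": False},
]

def violative_rate_by_value(element):
    """Rate of violative entries per distinct value of one element."""
    counts = {}
    for e in entries:
        total, bad = counts.get(e[element], (0, 0))
        counts[e[element]] = (total + 1, bad + (1 if e["violative"] else 0))
    return {v: bad / total for v, (total, bad) in counts.items()}

print(violative_rate_by_value("test_lab_id"))      # clean split: useful
print(violative_rate_by_value("manufacturer_id"))  # flat: no signal
```

Real targeting scoring is far richer than this, but the exercise of measuring each element's contribution against outcomes is the sort of question a year of beta data could answer.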
And I'll just give you my first concern around the beta, and this is one of the reasons that in our responses back to you we didn't fully commit to participating. My concern is that in order for you to obtain the amount of data you need to really test what's useful and what's not, it's going to require either an enormous manual lift on our part or some sort of systems development. And until we know what that system is going to look like long term, that's not an investment I can go to my bosses and say we need to make. Because as soon as we make it and we finish the beta pilot, we're going to say, oh well, actually, now that we've had 200 participants or whatever it is, the system should look like this, and then we're going to throw money out the window. So I'm not sure how we're going to balance your need to acquire a lot more data from all your participants against our need for some finality around what the automated process looks like before we start developing toward it. I'd almost say we need to focus on that systems piece, that automated piece, and figure out what it looks like before we move to the massive amounts of data to determine what is really beneficial.

Currently the CPSC is directly sharing information with CBP to be able to assist our processes out at the ports. Jurisdictionally, we're talking about some 90-odd million records a year being passed so that we can understand what's going on. The volume of information there is huge, absolutely huge, and the process we've established with CBP was one mandated under the SAFE Port Act, so we needed participation in ITDS to get this done. The standard setup there is something that could easily apply in the other direction. We don't want to have multiple different tools either.
We have limited staff as well, as you know, so we can definitely take the lessons learned from that and apply them back out. The technology piece is something we can definitely work with you all on, and I do believe developing some type of technological standard is not going to be a hard lift to propose. That's something we can work through, and we're doing it right now; the agency has a standard exchange going. In terms of cost, we'll work through that. You know where I'm headed.

I was just going to say, I think the objective, because I agree with you on the technology part, is making it as easy as possible to transmit the information. It's also having it be flexible, so that if in the end, say, CPSC determines we don't need five elements, we need three, the technology can change without incurring a tremendous amount of cost.

In that type of instance, you'd have the same hardware and probably the same general programming; it's just that your XML layout would change. The difficulty is that we as an agency don't want individualized feeds from each company. That would just be ridiculous; we don't have enough staff to do that kind of thing. So envision a standard technology, like an SFTP drop-in process or an EDI exchange, where on a periodic basis we would say, okay, this data's not working for us, we're going to drop these elements, or maybe let's add this one. There's no way, in my mind, that anything beyond five, six, seven data elements is ever going to be useful to the agency, just because of the nature of what we're trying to do.
It's product risk, it's entity risk, it's location risk; there aren't that many elements we need to be able to determine this. So that's my view at this point.

All good points, but to what you said specifically, Ken: we do hear you. When we sit around talking about what this beta looks like, our question is how we design it so people will actually participate, so that it makes sense in the short term and leads to something CPSC can get behind long term, something we can tell trade, the importers, and the industry that we're committed to. Whether that's minimizing HTS codes based on what we feel we really will need and will use to indicate what we want to look at and what we don't, we're looking at a lot of different ways to make sure that what we're asking for is what we're actually going to use, and that we're not, to your point, just collecting data to collect data. That is a huge part of what we're working through with this report we're putting together and with the options for what the beta looks like.

Yeah, under a beta, the hope is, and we have to work this through, that the information would be directly tied into the system we're using out at the borders, to flag folks, differentiate the product, and demonstrate how the scoring and the overall methodology would be significantly impacted within that structure on a larger set of data coming through. The one-off collection that we did, we didn't integrate, so that's something we're going to be working towards.

I know we have a question here about warning messages, and I think there was a little discussion already, but is there any other feedback about the warning messages you received? Were they instructive as to what the issue was?
Any thoughts on how they might be made better moving forward, so that they're more instructive about the issues? I think most of the issues we had during testing were resolved. There were a few that were unclear, where we had to go back and say, I'm not sure what this means. There were a few that Geodis encountered where we had to work with CPSC to clear them up; those were on the CPSC side. And if we're timing that with the One USG release through ACE, I think that's even better, so that conceptually, importers know that CPSC is part of One USG.

Before we move on, any other last thoughts or comments on filing?

Let me speak again for Ikea. One thought is, when we go further into the beta pilot, we would need to know as early as possible whether we're going to have the same elements or different ones, to be able to prepare in a good way, and that applies also to when it's time for the real thing. It's very good if we know well in advance exactly what type of data it is and what structure it should have, because once it's real, we cannot say to CPSC or anyone else, sorry, we don't want to participate anymore. We would need to have everything in order and everything as it should be. Understood, yeah. That takes time.

And to add on to what Magnus says, I absolutely agree. It's quite easy to add information to an XML to transmit data, but think through that process: it's quite easy to add something to an XML, but it requires mapping on somebody else's side to receive it, to put it someplace, and to map it back out through ACE. The other thing is that when we add a data element, we all have to go back into our source systems and say, hmm, where do I get that from? How do I interface it in a good way to our entry data so that it flows through without a lot of manual intervention?
And I think that's where Magnus is coming from: when you think, I just need an extra data element, think about the timing it might take to add that element and what each and every one of us will have to go through to ensure you obtain what you need.

Yeah, I think, go ahead, Magnus. Yeah, just to give you one small example: someone changed a bracket in the feed, and the whole information flow stopped. That can happen. If we add something new, you don't know exactly how everything will behave until you have really tested it through. Right. That's something we want to do in very good time before we go really live with something, and preferably the same for the beta pilot.

Right. And for the staff that has been working on this, that is certainly something we have our eye on in terms of scope, in order to provide some options to the commission. The two big areas of scope are scope as it relates to products, the HTS codes that would be in the beta, and scope as it relates to data elements, the data we would be looking for. Again, the alpha was really designed to set up and test the process. The elements themselves really have not been tested for their usefulness in targeting, so, just my own personal opinion, I'd expect them to remain the same going into the beta, because I don't think we have a basis for recommending changes at this point.

I had one other technicality I wanted to add. We operate a foreign trade zone, and I'm not sure if anybody has brought up the use of a foreign trade zone and when the most appropriate time to file is, but what we have found with other government agencies moving into the PGA message set is that it seems to be exceptionally problematic.
So if there's an opportunity to think about that and how you handle it, that would be fantastic. We've met with foreign trade zone representatives several times to understand the general process and when filing would be ideal; for example, with a standard weekly release that you're going to be pulling out on, when to declare and when to get that information through has been something we've been actively studying as part of this overall process. So we're cognizant, and we would actually love to have pilot participants under the beta who are going to give us curveballs, such as the FTZ issue, or such as "I'm only ever bringing paper to the CBP desk." Under a beta, you want to understand the full breadth of the process to make sure you're addressing it as best you can before you turn it on for full production, and we really want a full understanding of all the different processes and nuances, so we can share and discuss policy prior to flipping that switch.

Yep, we did meet with the NAFTZ folks, and they were great. They came in and really laid the foundation for educating us. FTZs are a complicated process, but they were very helpful, and we did commit to continuing the dialogue with them to understand it. We talked a little about e-filing; they were particularly interested in certificates of compliance, which is all mixed up together with this. So we have had dialogue with them and will probably continue to.

All right, so we're going to move into our last topic, and it's really recommendations. We've touched on a number of these things already, but this is an opportunity to give us your thoughts on what can be improved as we move into a beta. We're always looking for that feedback.
We do really feel, from our perspective, that this has been a great situation in that we've really worked closely with some really good importers and brokers to be able to develop and test this process, and we wanna keep that dialogue open. We wanna continue to try to learn from you, as you're the experts in what you do. So we wanna try to learn from you as far as, you know, what is the best way, again, striking that balance, what's the best way to create the least burden on industry while still allowing us to get data that we hope is going to make our job and what we do better. So, you know, with that, I would open it up. There are some specific questions here: improvements moving into a beta, onboarding and communication, the testing phase, the production phase, post-implementation, the pilot timeline. So I'll just open it up, and anything that you all have, we would really welcome hearing. So first, Ikea would like to say thank you very much to the CPSC for doing the pilot. Compared to other PGAs that have gone live in the PGA message set, this has been very organized, very pragmatic, very open to questions, with responses available to the community. So thank you very much. We were very pleased with the onboarding, the communication, and the phases of the pilot. It went quickly. We didn't anticipate that it would go as quickly as it did. We were hoping we'd be ready at the beginning instead of a little closer to the end. So that brought us to the other areas: for Ikea, we felt that maybe the production phase could be a little bit longer, so that we could really get a handle on what we were seeing, what we weren't seeing, and where we would need to make modifications or changes to enhance the process or gain efficiencies, while at the same time getting you what you needed as an agency. So that was Ikea's feedback. I do think on timing, we were really hopeful. We went live in July. 
And we were really hopeful that everyone would be ready in July and get a full six months of filing, and obviously that didn't work for everyone. In the way that we had set this up with CBP, we had defined the timelines, and we had put a Federal Register notice out with some specific timelines, so it really did box us in, so to speak, on scheduling. I would say the staff's view on the beta is that it would be a pilot that would be much longer in its data collection phase, probably about a year's worth of data or of time to file, because we would be looking to get that data to determine its usefulness, and to have a much larger number of importers participate. So we are thinking about a longer implementation time there. Yes, definitely a broader number of participants. You all were the cream of the crop, and my expectation is that if we did integrate this with targeting, we would not have found anything. So the issue is that, for us, we would want to integrate it potentially with recalls that are announced, and be able to integrate it with other processes that we have within the agency, and spend that time. It's going to take time to queue all that up and to process it through, and then to deal with all the different ways people file, all the different modes of entry where we're going to have issues, and where this actually provides a different benefit, potentially, in the sea environment than maybe at a land border. Given those kinds of nuances, at the land border it might even be better, depending on how active our field presence is over time. So these are the kinds of questions that a beta would allow us to demonstrate a lot more benefit on over time, and to really be able to provide a much better set of procedures to support everybody, not just the cream of the crop. So. I have a question or a thought, actually. 
In terms of data validation on your side: we are, I think you heard, entering and reviewing our product detail data requirements in the registry, and we're dedicated to doing that. Is there a way to establish a timing aspect in terms of a pass/fail kind of thing? So if we as Walmart have entered 3,300 products into the product registry within a certain amount of time, or even a rolling amount of time as products are being entered and reviewed, is there a way for your validation process to identify a yes, we're gonna target this, or a no, we're not gonna target this, and notify us ahead of time, or notify the trade ahead of time, so that we can take care of whatever the issue may be prior to entry? That's a tough one, in that obviously the classification would need to line up to be able to give us an indication that the cargo is out there. If there were specific warning flags on a manufacturer that you may have provided information on, or if there is a known recall on a product that you're providing us information on in the registry way in advance and you weren't aware of it, there might be something that we could work on to do that, but we've got to think this process through with you as part of that beta. It does not serve us at all to want you to send through the process something that we know is gonna be recalled, that's gonna be stopped. We wanna give you as much advance notice as possible, because it then provides us the ability to not have to look at that product at import. I want my people looking at other stuff; if I can already give you a warning ahead of time in my systems, it would make sense. How that works out, we'd have to think through. Yeah, it's interesting because the way that we do risk analysis, we do it at the line level, basically. That's where our rulesets and our scoring kick in. 
So, if I understood your question, I think what you're looking at is for us to push that risk assessment prior to entry, at the product level, and we'll have to think about that. That's just not how we're set up now, because the way that our targeting system is set up, we get this feed of data from CBP based on our jurisdiction, and then, based on the entry or on the line, we do our risk assessment. The e-filing data would be additional data that would feed our rulesets; that's the way that we've always thought about it. Right, and if it's coming in way in advance and we've got something to tie it to, we'd have to set up programming to match that up, do the join, provide some type of notification, do some research, and then reach out. That's a new process for us. Right. Yeah, that would be new, but it's definitely in the realm of what's doable. Yeah, I mean, I think it's definitely something to consider, because, as you heard from everybody, they're tying the registry to their product. We're an anomaly in that way: we have to tie the registry to the product, to a purchase order, to an entry. So if we can tie the registry to a product and have that product reviewed, then obviously we can still assign a registry number to that product for entry; but if it's not tagged at entry, then there's not as much pressure from a timing perspective to have everything on our side done, completed, and reviewed prior to entry. So I just think it's something to consider. It may also eliminate the need to justify a disclaim, because if we've had that product reviewed and we know it's not subject to the requirements, even though the tariff says it may be subject to them, then there's no need for us to go through a disclaim process against it. 
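The "match that up, do the join" step described above is essentially a chain of lookups from an advance registry record down to the entries it would affect. A minimal sketch under stated assumptions: the flat in-memory tables and every field name (`registry_no`, `product_id`, `po`, `entry_no`) are hypothetical, standing in for whatever the registry and entry feeds actually carry.

```python
# Hypothetical tables: registry records filed in advance, the purchase
# orders each product sits on, and the entries raised against those POs.
registry = [
    {"registry_no": "R-100", "product_id": "P-1"},
    {"registry_no": "R-101", "product_id": "P-2"},
]
purchase_orders = [
    {"po": "PO-9", "product_id": "P-1"},
]
entries = [
    {"entry_no": "E-55", "po": "PO-9"},
]

def entries_for_registry(registry_no: str) -> list[str]:
    """Walk registry -> product -> purchase order -> entry, so a
    notification could go out before the entry is filed."""
    products = {r["product_id"] for r in registry if r["registry_no"] == registry_no}
    pos = {p["po"] for p in purchase_orders if p["product_id"] in products}
    return [e["entry_no"] for e in entries if e["po"] in pos]
```

In production this join would run over database tables rather than lists, but the shape of the matching is the same: the registry number is only actionable once it resolves to concrete entries.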
I would argue that, as our process matures, if you ever filed a disclaim against a product code, you would not necessarily need to do it more than once, because we would develop a history file that would state that this was a disclaim. And potentially at that point, we would then know, and not send you an error message saying that there was ever a problem with that code in the past. I'm just thinking this through with you out loud; I'm not promising anything, but these are the kinds of processes that, as we develop this over time, can get built. Right, but you're thinking of that at the code level, not the product level? I'm thinking of it at the product level. If you disclaim at the product level prior, and it's been shipped in and there's a verification, maybe some type of checkbox that might have been done saying, yeah, okay, we didn't need a certificate on that. If I have that sitting somewhere, I can match it against any registry coming through, and then, if you file it, it makes sense. It's not built, but it definitely makes sense. The theory makes sense. I think the theory does make sense, because, just thinking about the information that you get right now, that we've gotten into the registry, it has a lot of those core pieces of information that you would use for targeting. You don't have the entry information, you don't have the line information, but you've got a lot of the key components. It's just, like John said, we're not built like that; not to say that that's not a good place to go at some point. Right, because when you get right down to it, it's about the product, not the tariff. Right, exactly, exactly. Yeah, thank you for that. I think as we consider those topics and that type of process, though, we have to consider the interface with ACE and the interface through U.S. 
Customs, and the brokerage, who may have programming tied to saying this HTS number is subject to CPSC; they may outright reject it from an ABI standpoint, you can't get it through, so I think we... I want to clarify one thing in the vision that we've seen longer term, and this is a staff opinion at this point in time. I don't think we'd ever have CBP reject on our behalf in advance within a certain HTS code; we would be sending signals from our system to their system, interacting. There's not gonna be much in the way of those bucket-one failures, unless something is just typed badly, where they're gonna say you need to fix things. On our behalf, it's gonna come all the way back to our system. Now, if there's a registry filing, there's no reason why we couldn't have our system talk to your system, potentially independent of CBP; that's added expense, that's added work, and we'd have to figure that out. Or, potentially, ACE was originally designed to try to pass everybody's trade documentation out and back through; how well would they be able to support that? We're still working with CBP to test that as well, in terms of the coordination of documents and filing documents directly through ACE. So yeah, that's gonna take some time. We're working on that element as well. This is Charles from Freedom Loom. Got a question. A lot of our products may be subject to the rules but exempt from testing. So would you expect all those products, in the future beta program, to still be uploaded to the registry or to send the full message set? Well, the way we had it set up, and I'll ask Lisa to jump in if I don't get this right, but I think the way that we had it set up is we had the ability for the testing entity to indicate that no testing was required. But under the alpha, yes, we were asking for that information to be filed. 
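The product-level disclaim "history file" floated a few exchanges back can be sketched as a simple lookup: once a verified disclaim is on record for a product, a later filing for the same product is matched against it instead of triggering a fresh error message. This is purely illustrative of that thinking-out-loud idea, nothing that was built in the alpha; the table and field names are hypothetical.

```python
# Hypothetical history file: product_id -> verified disclaim record.
disclaim_history = {
    "P-1": {"reason": "not subject to testing rule", "verified": True},
}

def needs_error_message(product_id: str) -> bool:
    """Suppress the error only when a verified disclaim is already
    on file for this product; otherwise flag it as before."""
    record = disclaim_history.get(product_id)
    return not (record and record["verified"])
```

The point of keying this at the product level rather than the tariff code, as the discussion notes, is that an HTS code can cover both subject and non-subject products, while a product-level record is unambiguous.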
Can we look at that as something to consider, how to deal with that in the future? Certainly. But that's the way it was set up in the alpha. Under the PGA message set, when we evaluated the full PGA message set, there's the line exemption, and then there's the test exemption. The PGA message set architecture, from CBP's perspective, does have that capability. It also has the capability to nest multiple manufacturers within its architecture. That being said, I don't think in this pilot we went that route, and in the beta we want to test more complicated designs. So you do have the ability to do test-level exemptions, and you do have the ability to do an entry-line, this-HTS-is-exempt. That's how we'd do it, how we'd test it. We want to be able to get a little more complicated under the beta. But just to clarify, in that situation, Charles, we weren't expecting you to put it in the registry, and we weren't expecting you to file a full PGA message set. There was an alternative message set, which we believe we will move forward with, where you basically declare the exemption and say, I didn't need to test this. So it's not a full PGA message set; it's small, kind of like the reference PGA message set, where it's very minimal information, and you just say, hey, I know this product falls under an HTS code for which you might be looking for a GCC or a CPC or basically the testing information, but I have an exemption, and here's why. So it's a small, minimal amount of data to try to lessen the burden, but, from our perspective, to know that you understood that you needed to file something and that you didn't just not file testing information, we need something to tell us, hey, yeah, I know about this, and this is why I'm exempt. Thank you. This is Ken from Walmart. I guess I have a couple of thoughts on recommendations for the beta. 
And I think it could be challenging when you have 100 or 200 participants, but I think some more routine conversations like this throughout the course of the beta would be helpful, even if it's once every quarter, or a six-month tollgate, something like that. Certainly we've submitted some written information, but I can't imagine you get as much from that as you have from the conversation here; I know I certainly wouldn't have. So I would encourage us to try to find opportunities like that, just to touch base on how things are going and what we need to test that we haven't tested, that sort of thing. And then the other thing I would recommend, if you haven't thought about it already, is seeing if there's some way we can test the trusted trader program as part of that beta too, just anticipating that that will be a component of whatever full implementation is. I think it would be a good opportunity to see what that might look like, and certainly, to the extent that we're a beta pilot participant, we would be happy to be a trusted trader and test it out for you. Thank you. All right, I would open it up for anyone that has any last comments, suggestions, recommendations, anything they would like to say as part of this meeting, and then we'll go from there and close up. Okay, so again, I wanna thank everyone for all of their participation in the pilot, for making yourselves available, and for participating today. I agree with Ken: you can get written feedback and that's helpful, but to actually be able to have a conversation, ask questions, and have a dialogue, there's no substitute for that. So I agree with you as we move forward. I hope that everyone has recognized that we've been willing and available to talk, and having these public meetings is also very helpful; if we're able to move to the beta, I certainly expect that we will continue to do that. 
So again, thank you for all of your work over the last several months to prepare, to program, to participate, and to file data. This has been very helpful for us. Like I mentioned earlier, we're gonna prepare a report for the commission with an evaluation of the alpha, some options, and a recommendation for moving to a beta, and then we'll await commission guidance as far as the path forward. So, thank you again. And I think with that, we will conclude our meeting. Thank you to everyone on the phone and everyone who viewed the webcast today. Have a good day.