Okay, so I had some trouble with OBS and had to restart everything; that's why everything looks different. The music is just there to test the audio coming from the machine. This tool, by the way, is available on the landing page. There's a link, so if you haven't checked the site, go check it out. I'm currently just updating the info section, and there are a couple of questions there. I know we're not currently streaming on all our platforms, only YouTube. Again, that's because my streaming software was causing trouble and I had to reinstall everything, so for the moment it's YouTube only. We don't have much time, about an hour, and there's a bunch of things we want to cover. I want to add a neural network example. We had something running on the website in the past, predicting a sunset; that's roughly what it looked like. Enough, this music is too annoying, let me turn it off. That's also a good reminder for me to edit out the funny face I just made if I turn this into a video. Let's try a digit detector. We had mixed success trying to train a model for this before, mainly because we couldn't convert the trained model into a JSON file so we could use it with TensorFlow.js. That's the problem. And the one model that is available online was only trained on five digits instead of all ten, zero to nine, so that's not fun. There are other tools like that running online, but they're not great, so we'll try to make a new one of our own. There were a bunch of Kaggle competitions before that achieved very high scores with digit detection. Currently I don't think ours is running, so we're doing everything as a Flask application.
So the idea is that you draw a digit and it recognizes what you drew. The main to-do here is the model, which is currently non-existent. We were able to do this before using a different model, but again, it was only trained on digits zero to four, only five digits. That's not cool. Not cool. We have a bunch of files here: assets, HTML, and the JavaScript, which is what we're primarily relying on. The idea was to create a copy of that. The problem was getting the created model into a JSON file so that it can be used in the browser; the model we did get working that way was, again, only trained on five digits. It runs and it works, and this one is our own model, running locally. If I remember right, there was something else in the HTML that had to be changed for it to work. The current model doesn't load. We kept the prompts in a text file; we had a long prompting session on this one. I'm trying to reinstall TensorFlow for the digit classification. That's essentially where we got stuck. We're using TensorFlow.js. There's some styling, that's okay, but everything is basically controlled by the JavaScript. Oops, that's another thing I'll need to edit out later in video editing. So: we need to convert the model into something TensorFlow.js can use, and then use it from JavaScript. That's what we have at the moment. We tried a simplified version, but there was a problem with the conversion, essentially; the conversion wouldn't work. Can we give it
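Before any of the model questions, the drawing itself has to become the 28x28 grayscale array that MNIST-style models expect. Here is a minimal sketch of that preprocessing step in plain Python, assuming the canvas gives you a flat list of 0-255 grayscale values; the function name and the flat-list input are my assumptions, not code from the project.

```python
def preprocess_canvas(pixels, size=28):
    """Scale raw 0-255 grayscale values to [0, 1] and reshape to size x size.

    `pixels` is assumed to be a flat list of size*size ints, as you might
    collect from a drawing canvas before feeding an MNIST-style model.
    """
    if len(pixels) != size * size:
        raise ValueError(f"expected {size * size} pixels, got {len(pixels)}")
    normalized = [p / 255.0 for p in pixels]
    # Reshape the flat list into rows: row i holds pixels i*size .. (i+1)*size-1
    return [normalized[i * size:(i + 1) * size] for i in range(size)]
```

The same normalization (divide by 255, reshape) is what the JavaScript side would do on the canvas data before calling the model.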
a quick try again? Sure, we can give it a quick try. It is a Flask application, but essentially there is nothing happening on the back end; it's pretty minimal. However, we do have this Python code that generates the model, and then the idea is that we need to convert it into something JavaScript can use: a JSON file. So essentially the model needs to be saved as a JSON file. If any of you have done anything like this before, it would be really good to know. Okay, so I have this text file, and that was the first prompt. There's a first session and a second session as well; crazy long prompts. Let me show you what the idea is. In the HTML we have these three model sources, and we'll comment the first one out. We have this local model, a copy with a relative path, and then we have the five-digit model. Somebody else generated that one, but it's only five digits instead of ten. And we have the digit demo, so that actually works. Yes, that's the idea. So it says digit 1: 56%, and digit 4: 20%. Oh yeah, now it says 99% for digit 4. Let's see what happens if you draw some rubbish. Yeah, I can see the percentage for digit 0 going up. You get the point. But we're trying to show all ten digits, not five, and we couldn't do it. I just realized it wasn't recording this session on my local machine either; I actually need to change the settings later so it records automatically. You know, the percentage likelihood for the digit being, say, one goes up as you draw. It seems like the percentage keeps going up. What if you add to it? No, it kind of stayed the same. So you can kind of override it: if you keep drawing toward a four, the percentage likelihood for the digit being four will start increasing. Same for five, but we don't have five.
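Those per-digit percentages come from a softmax over the model's ten raw output scores, which is why they always sum to 100% and shift smoothly as you draw. A plain-Python sketch of that step (the logit values below are made up for illustration):

```python
import math

def softmax_percentages(logits):
    """Convert raw model outputs (logits) into percentage confidences.

    Subtracting the max logit before exponentiating keeps exp() numerically
    stable without changing the result.
    """
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [100.0 * e / total for e in exps]

# Hypothetical logits for digits 0-9; digit 4 has the largest raw score.
scores = softmax_percentages([0.1, 1.2, 0.3, 0.2, 3.5, 0.0, 0.1, 0.4, 0.2, 0.3])
```

This also explains the "override" behavior seen in the demo: adding strokes that push one digit's raw score up necessarily pulls every other digit's percentage down.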
That's probably the thing to fix, so let's quickly try what we did yesterday again and see if it works. The goal is to be able to run a local model, but we do not want to run it in the back end; we want to run it in the browser. For that, the model has to be converted to a JavaScript model, stored in a JSON file. So why can't TensorFlow just generate the model in TensorFlow.js format to begin with? Let's quickly check; it won't work again. This first part of the prompt is so that the bot knows it's a JavaScript application. Then we have this: we're using TensorFlow.js, and we have a local model.json, but the conversion from a regular Keras model, whichever way it was saved, didn't really work. So we have this model being saved in a folder and then converted. We actually have the model_js folder. Where is it? It's in the static folder, that's fine: model_js. There's a binary in there, and then model.json looks like this; it's actually a formatted document. So that's the model saved in JSON format, but something is wrong with it as well. The problem is we get this error: failed to load the model. While GPT-4 is generating its answer: it looks like you're encountering an error when trying to load a TensorFlow.js model using tf.loadLayersModel. The error message indicates an issue with the configuration format of the model, or with the way it is being interpreted by TensorFlow.js. Here are a few steps and tips to troubleshoot and resolve this issue. Check the model files: ensure the model.json file and the associated weights files (like group1-shard1of1.bin) are correctly placed in the directory specified in the path, and that they're accessible from the script. Verify the model format: TensorFlow.js requires a specific format for the model files. If you used tensorflowjs_converter to convert a TensorFlow model to TensorFlow.js format, ensure that the conversion process completed without errors. Here is the basic command for conversion that you might have used. Double-check that
you specified the correct input format, output node names, and paths. Inspect model.json: open the model.json file in a text editor and check for any obvious anomalies, such as missing brackets, typos in keys like dtype or shape, or misplaced commas, which might corrupt the JSON structure. Other causes: an incomplete or corrupted model conversion, or a mismatch between the TensorFlow.js versions used for saving and loading the model. Ensure that the version of TensorFlow.js used in your project matches, or is compatible with, the version used for model conversion, as noted in the convertedBy field in model.json. Version compatibility: if there's a version mismatch, either upgrade or downgrade the TensorFlow.js library in your project, or reconvert the model using the version of tensorflowjs_converter corresponding to the TensorFlow.js library version in your project. Simplified testing: as a test, try loading a simpler or different pre-trained model using the same method, to see if the problem is specific to the model or to your setup. Debugging the script: use console logs extensively around the loading function to ensure paths are correct, and log the outputs and errors from each step for easier debugging; here is an example snippet of how you might set up logging around your model-loading code to catch and diagnose issues. If the issue persists after these checks, you might want to regenerate the TensorFlow.js model files using tensorflowjs_converter, ensuring all parameters and the environment are correctly set up. Yeah, we'll probably have to regenerate. We have these two files. This one was saved yesterday; that's just because we reformatted its structure, the way it looks. And "verify the model format: ensure the conversion process was successful." Well, how do we know if it was successful or not?
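The "inspect model.json" step can be automated with nothing but the standard library. Below is a sketch of a checker for exactly the fields that the later "class_name and config must be set" error complains about; the key layout follows the TF.js Layers format, and the handling of both the `model_config`-nested and flat layouts is an assumption, since different converter versions have produced both.

```python
import json

def check_layers_model(model_json_text):
    """Return a list of problems found in a Layers-format model.json string.

    Only checks the fields involved in the "class_name and config must be set"
    error; it is a diagnostic aid, not a full schema validator.
    """
    problems = []
    try:
        doc = json.loads(model_json_text)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    topology = doc.get("modelTopology", {})
    # Some converter versions nest the config under model_config, others don't.
    config = topology.get("model_config") or topology
    if "class_name" not in config:
        problems.append("top-level class_name missing")
    inner = config.get("config")
    layers = inner.get("layers", []) if isinstance(inner, dict) else []
    for i, layer in enumerate(layers):
        for field in ("class_name", "config"):
            if field not in layer:
                problems.append(f"layer {i} missing {field}")
    return problems
```

Running this over the converted file would have answered "how do we know if the conversion was successful?" without waiting for the browser error.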
Yeah, I used the model.json. Hey, maybe you can do those checks. GPT-4's answer: the question is whether TensorFlow.js can parse the model.json file. The contents of your model.json look correctly formatted for a TensorFlow.js model, and the conversion process also appears to have been performed correctly. The error you're encountering suggests a potential issue with how the model is being loaded or interpreted by TensorFlow.js. Here are a few troubleshooting steps and modifications you can try. Make sure that the model files (model.json and group1-shard1of1.bin) are correctly served by your Flask app; you should be able to access these files via the URL in your browser, like http://localhost:5000/static/model/model.json. If these are not accessible, TensorFlow.js won't be able to load them. Ensure that the TensorFlow.js version used in your project matches the version mentioned in convertedBy (TensorFlow.js Converter version 4.17.0); discrepancies between versions can sometimes lead to issues, and you can update your TensorFlow.js library if needed. Modify the model loading path in the JavaScript file to ensure it's correct relative to where the script is being run from; if your HTML file is in the root and the model is in static/models, your path should probably reflect that. Look for any errors in the browser console when attempting to load the model; errors there can give more specific clues about what might be wrong, such as CORS issues, file-not-found errors, and so on. If you're running into CORS issues, which are common when testing locally due to security restrictions in browsers, you might need to configure your Flask server to set the appropriate headers; you can do this by adding the relevant setup to your Flask application, and make sure to install flask-cors if it isn't already. As a sanity check, try loading a known-good model that's pre-hosted or comes from the TensorFlow.js examples, to confirm your setup can load models at all. If none of the above work, try reconverting your
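The first item in that checklist, whether Flask can actually serve the files, can be pre-checked on disk before involving the browser at all. A small sketch, assuming the converter output lives in static/model_js and produced a single weight shard with the usual name (both of those are assumptions about this project's layout):

```python
from pathlib import Path

def missing_model_files(static_dir, model_subdir="model_js"):
    """List model files TF.js would fail to fetch because they don't exist.

    Assumes tensorflowjs_converter wrote model.json plus one weight shard
    named group1-shard1of1.bin, which is typical for a small model.
    """
    model_dir = Path(static_dir) / model_subdir
    expected = ["model.json", "group1-shard1of1.bin"]
    return [name for name in expected if not (model_dir / name).is_file()]
```

If this returns a non-empty list, the browser's "failed to load the model" is just a 404 in disguise, and no amount of JavaScript debugging will help.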
TensorFlow model to TensorFlow.js format, ensuring there are no warnings or errors during the conversion process. These steps should help identify or resolve the issue with loading your TensorFlow.js model. If problems persist, check for specific errors thrown by TensorFlow.js in the console and address them accordingly. Okay, so: the contents of that URL, that should be okay. Version compatibility, checking. The loading path is fine. Loading is okay. The browser console, that should be fine. CORS should be fine already. Okay, so it will probably just say to re-convert. Yep, it says to re-convert. GPT-4 continues: given the detailed error message and your successful test with a known-good model from TensorFlow.js's model storage, the issue seems specific to your model's configuration or conversion. The error message stating "class_name and config must be set" suggests a fundamental mismatch, or a missing critical configuration detail, in the JSON structure that TensorFlow.js expects. Double-check the model conversion: ensure that the TensorFlow model was properly saved and contains all necessary layers and configurations before conversion; if there's an optional parameter or structure that TensorFlow.js cannot infer, it might cause such issues. Reconvert with specific outputs: when converting with tensorflowjs_converter, specify output nodes if your model structure might cause ambiguities; here's how you might specify output nodes if not done previously. This ensures that the conversion tool knows exactly which outputs to expect and configure. Version check: although you matched the TensorFlow.js version with the converter's version, it's also essential that the TensorFlow version used to save the original model is compatible with the TensorFlow.js converter; check for any known issues or compatibility notes between these versions. Inspect the model architecture: examine the TensorFlow model architecture to ensure that all layers are compatible with TensorFlow.js.
Certain TensorFlow functionalities might not be fully supported, or might require specific configurations, to work in TensorFlow.js. Manual JSON edits: as a last resort, you might try manually editing the model.json file to ensure that every layer has a class_name and a proper config. This is risky and not recommended unless you're very familiar with TensorFlow.js's expected format, but it can be a diagnostic tool to see if specific layers are causing issues. Simplify the model: if possible, simplify your model architecture and reconvert, to see whether a particular layer or configuration is causing the problem; this can help isolate the issue. Community and GitHub issues: check the TensorFlow.js GitHub issues for similar problems, or post your issue there; sometimes specific bugs or undocumented behaviors cause such problems, and the developer community might have insights or workarounds. Use an alternative hosting method: since the models you tested from Google storage work, consider uploading your model to a similar environment, or use a different static-server setup, to rule out environment-specific issues. Make sure that no firewall or browser security settings are interfering with the model loading. Validate the paths and URLs again, even though they seem to be correctly set up. Implementing these suggestions should help you pinpoint and resolve the issue with your TensorFlow.js model loading. Next question: was the conversion successful? From the logs you provided, it appears that the TensorFlow.js conversion process completed successfully, despite the warnings about GPU availability and TensorRT integration. These warnings indicate that the system attempted to utilize GPU resources but couldn't find the necessary drivers or libraries; however, this doesn't impact the conversion process itself, which can proceed using the CPU. The key takeaway from your logs is that there are no errors directly related to the conversion process itself, implying that the conversion from the TensorFlow SavedModel to
the TensorFlow.js format should have been successful. The model should now be usable in a TensorFlow.js environment, assuming the conversion was correctly configured and there were no issues with the model itself that would prevent successful conversion. Test the newly converted model: since the converter didn't throw errors explicitly related to conversion, test the newly converted model again in your application, and make sure you refresh any cached versions of the model files in your browser, to avoid loading an older, possibly faulty version. Verify model loading: use the newly converted model path in your TensorFlow.js code, and check the browser's developer console for any new errors or warnings when loading this model. Check the TensorFlow.js version: ensure that the TensorFlow.js version used in your project matches, or is compatible with, the converter version used (TensorFlow.js Converter version 4.17.0); it's good practice to use the same or very close version numbers to avoid compatibility issues. Re-examine model usage: if the model still fails to load, consider whether it uses any features or layer types that might not be fully supported, or that might require specific flags or settings, in TensorFlow.js. Consult the TensorFlow.js community if issues persist. I'm still getting this error; how do I fix it? GPT-4's answer: the error message "class_name and config must be set" suggests a specific issue with how the model.json file was structured or created during the conversion process. This problem typically arises when the model configuration does not include details that TensorFlow.js expects for reconstructing the model from the JSON file. Here are some detailed steps and checks you can perform to resolve the issue. The model.json file should contain class_name attributes for each layer and a config section describing each layer's configuration; open model.json and check whether these attributes are present, because TensorFlow.js requires these details to understand how to instantiate
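The "check the TensorFlow.js version against convertedBy" advice is easy to mechanize: the convertedBy string in model.json carries the converter version, and matching major versions is the rough compatibility rule being suggested. A sketch, assuming convertedBy looks like the usual "TensorFlow.js Converter v4.17.0" string:

```python
import re

def converter_version(version_string):
    """Extract (major, minor, patch) from a convertedBy-style string.

    Returns None if no dotted version number is present.
    """
    m = re.search(r"(\d+)\.(\d+)\.(\d+)", version_string or "")
    return tuple(int(g) for g in m.groups()) if m else None

def versions_compatible(converted_by, runtime_version):
    """Rough check, per the advice above: require matching major versions."""
    conv = converter_version(converted_by)
    run = converter_version(runtime_version)
    return bool(conv and run and conv[0] == run[0])
```

This is a heuristic, not a guarantee; two libraries with the same major version can still disagree on a particular layer, which is why the advice also says to prefer the same or very close version numbers.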
each layer. Make sure the versions of TensorFlow used to create the model and of the TensorFlow.js converter are compatible; incompatibilities here can affect how the model is saved and converted. Rerun tensorflowjs_converter with flags that help include all necessary information; sometimes adding --skip_op_check can bypass checks that incorrectly flag usable operations as unsupported. As a last resort, you can manually edit the model.json file to add the missing class_name and config fields for each layer, though this approach requires a good understanding of the expected format; here's a basic example of what part of the JSON should typically look like for a layer. If the model is complex, try simplifying it and rerunning the conversion, to check whether a specific layer or configuration is causing the issue. If not already done, test your TensorFlow.js environment with a known-good model, like a simple model built and exported directly from TensorFlow. So: can you change the model-creation script to add the missing components? And also, can you regenerate the command for converting the model to JS?
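The "manual JSON edits" last resort can also be sketched in stdlib Python. To be clear, this mirrors the diagnostic-only suggestion above: filling in a guessed class_name will make the file parseable but is almost certainly wrong for real layers, so it's only useful for seeing whether loading gets further. The default_class value and the handling of both model.json layouts are my assumptions.

```python
import json

def patch_layers(model_json_text, default_class="Dense"):
    """Diagnostic-only: fill in missing class_name/config fields per layer.

    A guessed class_name is not a fix; this exists only to test whether the
    "class_name and config must be set" error moves past parsing.
    """
    doc = json.loads(model_json_text)
    topology = doc.get("modelTopology", {})
    config = topology.get("model_config") or topology
    inner = config.get("config")
    layers = inner.get("layers", []) if isinstance(inner, dict) else []
    for layer in layers:
        layer.setdefault("class_name", default_class)
        layer.setdefault("config", {})
    return json.dumps(doc)
```

The safer route, which the stream takes next, is regenerating the model and reconverting rather than hand-editing.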
Yep. To ensure the model is correctly exported and loaded into TensorFlow.js without compatibility issues, here are a few tips and the correct command for conversion. Ensure output names: while TensorFlow typically handles naming implicitly, explicitly setting output layer names can help in scenarios where specific output nodes need to be identified, especially during conversion for TensorFlow.js. Simplify training: for debugging purposes, reduce the complexity of the training to ensure the model itself isn't causing the issue. Here's an updated version of your model script. Now let's convert this saved model to a format suitable for TensorFlow.js; the following command should be used after installing the TensorFlow.js converter (via pip, if not already installed). Then convert the model. This command assumes you've saved your model in Keras format (mnist_model.h5); if you choose to save the model using tf.saved_model.save, adjust the command accordingly. Make sure to specify the output format explicitly, and use the correct signature if required; this helps in scenarios where there are compatibility issues or specific configurations that need to be adhered to during the conversion process. And retrain. Right, this might take a minute or so. Yeah, well, let's let it finish. It looks like your model training went well, and the model has been saved using the older HDF5 file format. TensorFlow recommends using the native Keras format for saving models, which provides better compatibility and feature support in newer versions of TensorFlow. You can update your model-saving code to use the recommended format; this might help avoid issues, especially when dealing with different TensorFlow or TensorFlow.js versions. Here's how you can modify the saving part. This saves the model in a form that TensorFlow can use directly for various purposes, including further training or conversion to other formats. After you've saved your model in the new format, you
can convert it using the TensorFlow.js converter; here is the command updated for the Keras format. TensorRT warning: the warning about TensorRT not being found is informational, unless you specifically need TensorRT for optimizing inference on NVIDIA GPUs; if you don't plan to use GPU-accelerated inference in production, it can be safely ignored. oneDNN operations: if you encounter discrepancies in model behavior, consider disabling oneDNN optimizations as suggested, but this is usually not necessary unless you face specific performance issues. GPU libraries: the warnings about GPU libraries not being found indicate that TensorFlow is set up to use a GPU but can't find the necessary CUDA libraries; if you want to train models faster on a GPU, consider installing CUDA and cuDNN from NVIDIA. By following these updated steps, your model should be well configured for both local experimentation and deployment in a web environment using TensorFlow.js. If you run into further issues with TensorFlow.js after these adjustments, it might be helpful to check the TensorFlow.js version compatibility, or the specifics of how the model is being loaded and used in your JavaScript code. Yeah, that's the problem we had before; there's some sort of incompatibility. GPT-4 again: it looks like you're encountering an issue with saving your model, due to the file extension or method used. Keras now expects a specific file extension when saving models, and it will prompt you to use a method appropriate for your use case. Here's how you can handle this. If you want to use the Keras format, which is what the error message recommends, you should specify the .keras extension; this format is good for continuing or resuming training later. If your goal is to convert this model to TensorFlow.js, or to use it with TensorFlow Serving, saving it as a TensorFlow SavedModel might be more straightforward; this method does not require a specific file extension and will create a directory with the
model data laid out as a SavedModel, which is compatible with many TensorFlow tools and services. Once your model is saved in the correct format, you can then convert it to TensorFlow.js: if you saved it as a SavedModel, use one command; if you saved it as a Keras model, you'd use the other. Make sure the path where you're saving the model has the correct permissions and exists on your system. Also, ensure that your tensorflowjs_converter command matches the format in which the model was saved; each format has slightly different requirements for how the model is loaded later in JavaScript. If you continue to experience issues, review the documentation for the specific versions of TensorFlow and TensorFlow.js you're using, as compatibility details can sometimes change. Okay, we'll be trying this. While we wait, let me cover something real quick. Someone was asking: does BionicHaos help people analyze public health data, or does it only provide a platform for doing it?
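The two conversion commands discussed above differ only in the --input_format flag. As a sketch, here is a small helper that builds the argv for either case without executing anything; the paths are placeholders, and running the command of course requires the tensorflowjs package to be installed.

```python
def converter_command(model_path, output_dir, saved_as="keras"):
    """Build the tensorflowjs_converter argv for the two save formats discussed.

    "keras" covers .h5/.keras files; "saved_model" covers a SavedModel
    directory. The command list is returned rather than executed, so it can
    be inspected first (then run via subprocess.run(cmd, check=True)).
    """
    formats = {"keras": "keras", "saved_model": "tf_saved_model"}
    if saved_as not in formats:
        raise ValueError(f"saved_as must be one of {sorted(formats)}")
    return [
        "tensorflowjs_converter",
        f"--input_format={formats[saved_as]}",
        model_path,
        output_dir,
    ]
```

Keeping the command in one place also guards against the mismatch GPT-4 warned about, where the converter is invoked with a format flag that doesn't match how the model was actually saved.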
It's a bit of a convoluted question, but we'll have a go. Here's what GPT-4 said: primarily, bionichaos.com offers a platform for analyzing public health data, rather than directly assisting users in the analysis. Yeah, that's more or less correct. It provides various tools that simplify the process of working with and understanding biomedical data; that is correct. This includes an array of dashboards and applications for exploring signal-processing topics, such as neural networks and more; also correct. And the third part: bionichaos.com does not currently support the uploading of personal data. So I don't want your personal data. Instead, we use publicly available data sets, from places like PhysioNet or GitHub. The tools are experimental and not certified for clinical or commercial use; that's the bottom line. In simple language: if you want me to look at your data, put it on a public repository such as GitHub or PhysioNet. We have a whole list of resources here: Figshare, Zenodo, PhysioNet, the IEEE portal, Kaggle, or anywhere else that is, importantly, in the public domain. And yes, send me a link.
I'm happy to take a look at it. Hopefully that answers the question. Reading from the summary: bionichaos.com is an online platform that focuses on biomedical data analysis. It provides a variety of tools and dashboards that users can use to process and understand complex biomedical data. The site aims to democratize access to advanced data-processing techniques, covering a range of signal-processing topics such as statistics, neural networks, fuzzy logic, and more. The platform is designed to be open and user-friendly, offering an array of example applications and practical tools to help predict and understand signal characteristics and events, which are vital for advancing biomedical research and applications. All of this is available without downloading or installing any software, as the tools are accessible directly via the website. Moreover, bionichaos.com is committed to open access, releasing all its content under a Creative Commons Attribution license, allowing users to use the tools for any purpose, including commercial use, as long as they credit the source. Back to the original question: do we help people analyze public health data, or do we only provide a platform? Well, to analyze data you need a platform, so the question is a bit convoluted. In other words: yes, we, meaning me and GPT, want to help you analyze your data. I need it, we need it, to be in the public domain, so a published data set is the best option for someone like myself, or anyone else, to look at. I hope that helps, and I'll try another stream later. Again, we had some trouble with my streaming software and I had to reinstall it, so the format is a bit different, as you can see. Let me know if you like it or not, and ask if you have any questions; we'll be doing more of this later on. Yeah, I'm still figuring out the best format for these videos. I'll have the model retrained.
That's great. Hey, let's check. It's a bit odd that there wasn't a success message or anything like that. Let's see which file it wrote, and whether we can reach that file. Why? No, wait, that's not the correct one; that's not the model. It's saving right here. Yeah, it was written like ten minutes ago; that sounds right. We only have five minutes to go. The model_js folder, and yes, it was updated. Run the thing: the model is there, but loading still doesn't work. So what's the problem now? Make sure we're using the right command, and that the generation script is correct, then execute the command somehow. Yeah, I'll have to continue this next time. I'll see you in a bit.