Welcome everybody to our final demo day of 2023. Yay! We've got an action-packed agenda today and I'm excited to introduce our presenters from the EngRes teams and from our PL network companies. If this is your first time attending the Mother of All Demo Days, the goal of our demo sessions is for folks across the community to learn about what other teams are doing and ask questions about their projects. It's an opportunity to problem-solve, ask questions and explore new ideas. As we get through today's demo day, I encourage you to share your feedback for presenters in the chat or follow up after to collaborate.

So just a quick run-through of who we have presenting today. DAG House is presenting console, your friendly web interface to web3.storage. Textile is presenting Textile Basin, replicating data to Web3-native storage. PolyBase is here to demo private USDC transactions with AML compliance via ZK proofs. Lava will demo a modular backend for crypto apps. There's going to be a recording from Station demoing the Filecoin Station app, and Lilypad's demo is on running AI and ML compute jobs with Lilypad, for coders and non-coders. Just a reminder, presenters: please have your cameras on during your demo, and I will have a timer on my screen to keep time. And so first up, we have Alan.

Hello, I'm Alan. I'm here to talk to you about the web3.storage console. So the first thing is: in November, we launched a new version of web3.storage. It's a big step forwards for the product in terms of verifiability, and it enables it to take a big step back from centralized infrastructure. It uses public key cryptography: UCANs for decentralized auth and DIDs for decentralized identity. Now, the web console is one of the ways you can interact with web3.storage, and it's what I'm going to demo today, but we also have a CLI tool, a JS client and a Go client. So when you go to console, it will create a key pair for you.
This is your identity in the browser, and it's what you'll use to sign UCANs to perform operations like storing data and submitting pieces to Filecoin. And we use email to share permissions across devices. So let me just put my email address in here. When I click authorize here, all I'm actually doing is claiming delegations that already belong to me, and that's done by verifying that the key pair in this browser is owned by the same entity that owns the email. We didn't call this "login" for that reason, since there's no centralized service that we're authenticating against or logging into here. So let me just click that email and verify, and get into the console. Okay.

All of that authentication flow is optional. You can always export a delegation and import it on another device. But it just makes it really, really easy to share access to your things across devices: your mobile, your laptop, your big computer, whatever. So when you're in, it looks like this. You've got a list of your spaces here, and spaces are just places that you register uploads to. They have DIDs as well; you can see them listed here. From here you can import a space, so you can actually gain access to someone else's space and upload to it, just by sharing your DID with them. And you can obviously create new spaces.

With spaces you can then click in and see all the items that you've uploaded to your space, paginate through them, and see them nicely displayed for you there. Then, if you want to share your space with someone, you get hold of their DID, paste it in there, and you can delegate permissions to them and download the delegation. You share that with them, they import it on their side, and then they can use your space as if it were their own.
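The delegation model Alan describes can be sketched in a few lines. This is a toy model, not web3.storage's actual implementation: signatures are omitted entirely, the capability names are assumptions, and a real UCAN chain carries signed tokens, not plain records. But it shows the core idea of claiming and re-delegating access, with each link attenuating (never widening) what the holder can do.

```python
from dataclasses import dataclass

# Toy model of a UCAN-style delegation chain (signatures omitted; the
# capability names "store/add" / "upload/add" are illustrative assumptions).

@dataclass(frozen=True)
class Delegation:
    issuer: str               # DID of the party granting access
    audience: str             # DID of the party receiving access
    capabilities: frozenset   # what the audience may do

def chain_is_valid(chain, space_owner_did):
    """Check a delegation chain from the space owner down to a device key."""
    if not chain or chain[0].issuer != space_owner_did:
        return False
    caps = chain[0].capabilities
    for prev, link in zip(chain, chain[1:]):
        if link.issuer != prev.audience:    # chain must be unbroken
            return False
        if not link.capabilities <= caps:   # no privilege escalation
            return False
        caps = link.capabilities
    return True

# Example: the account delegates to a laptop, the laptop re-delegates to a phone.
account = "did:key:zAccount"
laptop = "did:key:zLaptop"
phone = "did:key:zPhone"

chain = [
    Delegation(account, laptop, frozenset({"store/add", "upload/add"})),
    Delegation(laptop, phone, frozenset({"upload/add"})),  # attenuated
]
print(chain_is_valid(chain, account))  # True
```

This is why "authorize" isn't "login": there is no session to create, only a chain of delegations any party can verify on its own.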
And they can also delegate access to that space to anyone they want. Obviously you get to upload stuff. I'm going to go into this other test space that I've got here and just upload a file. I've got a nice file here which I'm sure we're all familiar with. So I can just drag and drop, and now it should upload. Here we go. There it is. Very good. And then, what is it, LW7U? That should appear in my list here. LW7U, there it is. Pretty cool, huh?

So what else did I want to do? You can click on these, see the root CID, and then just access that data on the gateway; it's a picture of me. Great. There we are. So that's that view. And what happens when you upload data to web3.storage? First of all, on the client side, we take your data and transform it into what we call a DAG, which basically allows it to be content-addressed. That gets serialized into something called a CAR file, and the CAR file content-addresses that set of bytes as well, so this is the CID of the CAR file with your DAG in it. And if you have a big DAG, then what we'll do is shard it into multiple CARs. That helps with resumability: if you upload a bunch of things and then for whatever reason your internet drops out, you can resume from where you left off and don't have to start again from the beginning, which is quite cool.

Anyway, that's the simple part. The awesome new thing that I really wanted to show you is that you can actually click through to any one of these shards and see more information about its status in the world of Filecoin as well. There's the shard CID, as you saw just before, but this is the piece CID. The piece CID is roughly equivalent to the shard CID, but it's what you would use to refer to your data in Filecoin. You see its size, which is pretty small.
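The shard-and-resume behaviour Alan mentions can be sketched like this. This is illustrative only: real web3.storage clients build IPLD DAGs and CAR files, and real CIDs are multihash-encoded rather than the bare SHA-256 hex digests used here. The point is just that content-addressing makes uploads idempotent, so a resumed upload skips shards the service already has.

```python
import hashlib

# Toy sketch of sharded, resumable uploads. Shard size is tiny for
# demonstration; real shards are on the order of megabytes.
SHARD_SIZE = 4

def shard(data: bytes, size: int = SHARD_SIZE):
    return [data[i:i + size] for i in range(0, len(data), size)]

def cid(block: bytes) -> str:
    # Stand-in for a real CID: hash of the block's bytes.
    return hashlib.sha256(block).hexdigest()

def upload(data: bytes, stored: dict) -> int:
    """Upload shards, skipping any the service already has (resume)."""
    sent = 0
    for block in shard(data):
        key = cid(block)
        if key in stored:
            continue            # already uploaded before the connection dropped
        stored[key] = block     # stand-in for the actual HTTP upload
        sent += 1
    return sent

service = {}
data = b"hello web3.storage!"
print(upload(data, service))    # 5: all five shards sent on the first attempt
print(upload(data, service))    # 0: nothing re-sent on "resume"
```

Because each shard is addressed by its content, the client never has to track byte offsets; it just asks which blocks are missing.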
And that's kind of an issue when you want to get stuff into Filecoin, because what you actually want is a big chunk of data to give to a Filecoin storage provider, rather than a very small thing. What web3.storage does is play the role of a data aggregator: it takes in a load of data and aggregates all of these pieces together into a bigger piece, which we're calling here an aggregate CID.

So here's the interesting part: we've got an aggregate CID. This is a bigger piece that includes your piece, aggregated together with others. It's also got the height here, because whenever we refer to V1 piece CIDs you also have to refer to the height. V2 piece CIDs have the height encoded in them; they're currently in an FRC, which you can read about at the very least. So we've got that listed there, and it means we don't have to carry the height with us all the time; it's encoded in the actual CID.

Then after the aggregate CID, the CID of the thing that has your thing in it, we've got what's called the inclusion proof. This is a Merkle proof that your piece is contained within the bigger aggregate piece, and it shows you the direct path from the root of the aggregate to your data. A piece gets transformed into a binary tree, and these are the nodes on the direct path you'd have to take to reach your piece from the root of this aggregate. So this is the proof that your piece is in that data. You can actually use the toggle here to expand it out and see other elements of the proof that are used but are not on the direct path to your data, because, like I said, it's actually a binary tree.
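The inclusion proof Alan is showing is a standard Merkle proof, which is easy to sketch. Note this is a generic toy: Filecoin piece commitments use their own specific hash tree construction (defined in the piece-CID specs), not the plain SHA-256 pairing below, and this sketch assumes a power-of-two number of leaves.

```python
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def build_levels(leaves):
    """Build the binary Merkle tree, level by level, from the leaf data."""
    levels = [[h(leaf) for leaf in leaves]]
    while len(levels[-1]) > 1:
        prev = levels[-1]
        levels.append([h(prev[i] + prev[i + 1]) for i in range(0, len(prev), 2)])
    return levels

def prove(levels, index):
    """Collect the sibling hashes along the direct path from leaf to root."""
    proof = []
    for level in levels[:-1]:
        sibling = index ^ 1
        proof.append((level[sibling], sibling < index))  # (hash, is-left-sibling)
        index //= 2
    return proof

def verify(leaf, proof, root):
    node = h(leaf)
    for sibling, sibling_is_left in proof:
        node = h(sibling + node) if sibling_is_left else h(node + sibling)
    return node == root

pieces = [b"piece-a", b"piece-b", b"piece-c", b"piece-d"]
levels = build_levels(pieces)
root = levels[-1][0]
proof = prove(levels, 2)                 # inclusion proof for piece-c
print(verify(b"piece-c", proof, root))   # True
print(verify(b"piece-x", proof, root))   # False
```

The proof contains only the sibling at each level, so its size grows logarithmically with the aggregate: a handful of hashes is enough to prove a small piece sits inside a very large aggregate.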
So every node here actually has two children, and we can expand it out even further to illustrate this. Even though these nodes are not part of the proof, they're still there and other data is in them, but you get the idea that this is an inclusion proof. And this, combined with the storage providers here, which you'll eventually get, is what's called a data aggregation proof. That's another FRC, FRC-0058, which is really interesting; it's all about this proof that we're generating here.

The other awesome thing is how we get hold of this information: thanks to UCANs, our storage service gives us receipts for every operation that happens. So when you want to store something, you sign a UCAN that says "I'd like to store this", and you get a receipt from the service that says "okay, you may store it, here's the URL, put it there". Then you put your stuff there, and you issue another UCAN, signed by you, saying "I want you to put that piece in Filecoin, please". Our service will say "okay, we're going to aggregate that into a bigger piece, just wait a minute". So what you can do is check in later and follow this proof chain, or receipt chain, of actions as your piece makes its way through the aggregation pipeline into deals.
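The receipt chain Alan describes can be modelled as a linked series of records, each one pointing at the next task in the pipeline. Everything here is illustrative: real receipts are signed UCAN invocations and receipts, and the task names and payload fields below are made up for the example, not web3.storage's actual capability names.

```python
# Toy receipt chain: each receipt records the result of one operation
# and names the next task in the aggregation pipeline (None = done).
receipts = {
    "store": {"ok": {"url": "https://example-upload-target/put"},
              "next": "offer-piece"},
    "offer-piece": {"ok": {"piece": "piece-cid-placeholder"},
                    "next": "aggregate-accepted"},
    "aggregate-accepted": {"ok": {"aggregate": "aggregate-cid-placeholder",
                                  "inclusion": "<merkle proof>"},
                           "next": None},
}

def follow(receipts, start):
    """Walk the receipt chain from the initial invocation to the final effect."""
    trail, task = [], start
    while task is not None:
        trail.append(task)
        task = receipts[task]["next"]
    return trail

print(follow(receipts, "store"))
# ['store', 'offer-piece', 'aggregate-accepted']
```

The useful property is that the client can poll at any time and see exactly how far its piece has progressed, with each step backed by a verifiable receipt rather than a trusted status page.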
So when we've got enough pieces to make an aggregate, you'll get a receipt that says: your piece is here, and this is the inclusion proof. And then once the aggregate makes it into a deal with one or more storage providers, you'll get a receipt that says: this is the data aggregation proof, these are the storage providers where you can find your data, and this is the inclusion proof for your piece of data. The storage providers are listed here with their IDs, and also the deal IDs they're in. You can take the V1 piece CID here and just click through, and this goes to Filfox, which gives you more information about the deal, so you can actually verify with Filfox that your piece is in that deal as well. Obviously it's also on chain, but this gives you visual proof. And I think that's about all I wanted to cover, which is good because I'm two seconds over time. So thanks very much for listening. Happy to take questions in the chat, because we haven't got any time, and yeah, go and check it out, it's really cool.

Awesome, thanks Alan. So next up we have Dan presenting Textile Basin.

Yeah, so today we're going to walk through just a brief overview of the Textile Network, what we're building, and then show a quick demo of how you can actually use the network. Basically, what we've designed is an initial version of the protocol, and to interact with the Textile Network you can use this Basin tool. It's a CLI tool that lets you do a few different things; you can see the commands on the right-hand side. Basically you start by creating a vault, which is a container for data. You can stream data from existing Web2 infrastructure, like a Postgres database, or upload raw Parquet files, and over time we'll add more support there. But basically, as you stream this data into a vault, it gets automatically replicated to Filecoin for cold storage
via a deal made using an EVM-compatible, or FEVM-compatible, smart contract on the back end. So as you push, the data gets replicated to Filecoin, and then there's also an optional hot cache feature, so you can retrieve data on a shorter time frame, giving you a full data availability layer with the Textile Network. In the future we'll be adding a number of features around greater access control, and we'll have a theme of stateless versus stateful vaults. A stateless vault would be something like purely fingerprinting, via homomorphic hashes, all the data changes for that vault; the stateful side is the actual data availability piece, where it's replicating to Filecoin and maybe to a hot cache if you need that data within a TTL time frame. So we're trying to cover that full provenance, verifiability and data availability story in Web3 data.

As far as the demo: we have one partner that we've been working with a lot to really drive some of the features. WeatherXM has been pushing data to vaults, so what I'll show you is, with them as the data producer in this scenario, what a data consumer can do. And since vaults are all public (the stored data is public), I can basically go and fetch the data via an API or via the CLI and actually run queries over that data. Just note that everything they're pushing to the network comes from devices all across the WeatherXM network, so it's a fully decentralized stack in a way. So yeah, let's just dive into things. On the right you can see some simple visualizations of the data, but let's talk through the general project setup of how I used the WeatherXM Basin information and built a Python script to actually run these queries. Just note that the jobs that are running are just GitHub Actions, but I do want to move this to Bacalhau in
the future, so just keep that in mind. Hey everyone, today we're going to walk through the Textile Network and Basin, and how it's helping WeatherXM replicate data to cold storage but also making that information accessible to external users. You can see the repo here, shared in the deck as well, for the source code. Basically, what it does is run a job through GitHub Actions that writes data to a summary file, starting with a CSV that stores, for every run, average and aggregate metrics. There's also a Markdown data file that stores a snapshot of information across the dataset. You can see some of the various columns we have there, such as temperature or wind information, device ID, some aggregate metrics over a specific query range, and then plots of the information. Since there's only one run in this example it shows a single plot, but you can imagine how that gets interesting over time.

To give you an idea of what this information looks like throughout the different commands we just showed, or the script itself: you can fetch the vaults associated with an account. As noted, XM data.p1 is one of the vaults that WeatherXM is using to push data and replicate it purely to cold storage, though we do have a hot cache feature that I'll touch on in a second. Once we have one of those vaults, we can fetch all the records associated with it, which gives us each and every CID for the files that have been uploaded. And to describe how caching works, since I only have 10 seconds: there's also a caching feature, so you can see in this view that when you create a vault you can also set up a cache, so you have that high availability. Hopefully that covered things; sorry to cut things short, but I ran a little long there.
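The summary job Dan describes (fetch records, compute aggregate metrics, append a row per run to a CSV) can be sketched like this. The field names (`device_id`, `temperature`, `wind_speed`) are assumptions for illustration, not WeatherXM's actual schema, and fetching from a vault is stubbed out with in-memory records.

```python
import csv
import io
import statistics

# Hypothetical sketch of a per-run summary job over weather records.

def summarize(records, run_id):
    """Compute one row of aggregate metrics for a run."""
    return {
        "run": run_id,
        "devices": len({r["device_id"] for r in records}),
        "avg_temperature": round(statistics.mean(r["temperature"] for r in records), 2),
        "avg_wind_speed": round(statistics.mean(r["wind_speed"] for r in records), 2),
    }

def append_summary(csv_text, row):
    """Append a row to the summary CSV, writing the header on first use."""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=row.keys())
    if not csv_text:
        writer.writeheader()
    writer.writerow(row)
    return csv_text + out.getvalue()

# Stand-in for records fetched from a Basin vault.
records = [
    {"device_id": "dev-1", "temperature": 21.0, "wind_speed": 3.2},
    {"device_id": "dev-2", "temperature": 19.0, "wind_speed": 4.8},
]
summary_csv = append_summary("", summarize(records, run_id=1))
print(summary_csv)
```

Run on a schedule (GitHub Actions today, a compute network like Bacalhau later), each invocation appends one more row, which is what makes the over-time view interesting.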
No worries, we can share that out as well, but thank you so much Dan, appreciate it. And next up we have Sid from PolyBase.

Hi everyone, yeah, I'm ready. I'm also going to share my screen and present a YouTube video, because we did this demo before and this will be a little bit easier, but we'll be here for Q&A, or you can just email us as well after. Very high level: PolyBase is a ZK layer 2 rollup on Ethereum with private USDC transactions; that's the first application we're building on PolyBase. And one of the key features is that we have compliance proofs for AML using zero-knowledge proofs, so you don't have to reveal any of your transaction data; you can just produce a proof that shows you haven't interacted with any bad actors. So let me share my screen and get started.

All right, cool, welcome everyone. Today we're going to do a brief demo of PolyBase. PolyBase is a layer 2 ZK rollup for private transactions. Specifically, today we're going to demo two of the most important features of PolyBase. The first is private USDC transactions. This is not possible today: any USDC transacted today is public forever, and obviously that's a problem for a lot of the use cases businesses use USDC for, including payroll, B2B transactions, trade finance, and even remittances for consumers. So we're going to show how we can do that, basically for the first time ever, on a public network. The second is compliance proofs: we're going to show a mechanism for generating AML compliance proofs on private transactions, again something that hasn't really been possible before. And we have built this network from scratch and written the ZK circuits from scratch to be able to do both of these things.
Calum, go ahead and just scroll down. So, a quick, really simple overview of what's going to happen in the demo. We have two participants, Bob and Alice, who both have wallets on PolyBase. Bob is going to send Alice money, and then Alice is going to generate a compliance proof that proves she hasn't interacted with any bad actors. We're going to have a PolyBase node running that's going to act as the sequencer and the prover, and then we're going to roll up to Ethereum, where we have a smart contract deployed for verifying the proof generated by PolyBase.

Great, yeah, thanks Sid. Okay, so I'll just give a quick overview of all the different steps we're going to take. We're going to start the rollup server, that's the PolyBase node. We're going to create two wallets, commit some test USDC to Bob's wallet, and make a private transaction from Bob to Alice, and we'll see some of the steps that happen when we do that. Then we'll generate the compliance proof, then we'll actually ban Bob's address, and then Alice will generate another compliance proof, and we'll see how that fails when Bob's address is banned. So Calum, take it away.

Oh yeah, thanks Sid. So the first thing I'm going to do is start the PolyBase server, because the clients will need to interact with that when submitting their proofs. Then I'm going to create the wallets for both Bob and Alice. On the top left I've got Bob, so I'm going to create his address, and you can see I've created a wallet here for Bob, and this is the address. This address is similar to a MetaMask or Ethereum address, and it allows people to send money to Bob; it works in a very similar way. And we can create a wallet also for Alice.
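The demo flow just outlined (notes carry a source address that survives transfers, and a compliance proof reduces to showing that source is not on a banned list) can be modelled in plain Python. In PolyBase this check happens inside a ZK circuit without revealing the note; the toy below makes everything visible for clarity, and all names and fields are illustrative assumptions.

```python
from dataclasses import dataclass

# Toy model of PolyBase-style notes. In the real system the note is
# private and the compliance check is proven in zero knowledge; here
# the check is done in the clear just to show the logic.

@dataclass(frozen=True)
class Note:
    amount: int
    owner: str
    source: str   # original minter's address, preserved across transfers

def mint(amount: int, owner: str) -> Note:
    return Note(amount, owner, source=owner)

def transfer(note: Note, new_owner: str) -> Note:
    # Ownership changes, but the source address is carried along.
    return Note(note.amount, new_owner, note.source)

def compliant(note: Note, banned: set) -> bool:
    # The compliance proof asserts exactly this statement about the source.
    return note.source not in banned

banned = set()
bob_note = mint(100, "bob")                 # bridging money in mints a note
alice_note = transfer(bob_note, "alice")    # private transfer to Alice
print(compliant(alice_note, banned))        # True

banned.add("bob")                           # simulate banning Bob's address
print(compliant(alice_note, banned))        # False: the note's source is banned
```

This mirrors the demo's ending: once Bob's address is banned, Alice can no longer produce a valid compliance proof for the note she received from him, even though she herself did nothing wrong.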
You can see she's got an address as well, which is different from Bob's address. Okay, so then I'm going to mint some USDC for Bob, and what this is going to do is send a transaction to Ethereum; effectively, this is bridging money into the network, so money in its original form gets minted on PolyBase when we submit that transaction. You can see I have the note here, which is really just another word for the transaction that's occurred: we have 100 USDC here, and we have Bob's address here to see who it's from, and then we also have the source address, which is the same as Bob's address. That source address remains consistent across many transactions: any transaction that derives from this transaction will still have the same source address, and that's important for compliance, which we'll talk a little more about later in the process.

So the next thing we want to do is send a transaction from Bob to Alice, and this will be a private transaction. Bob is going to create the entire proof that he needs to ensure he's compliant, and then he can send that off to the prover. To do that I'm going to say "polybase transfer", Bob, and we're going to send it to Alice's address, and I'm going to send the 100 USDC. So here we have the zero-knowledge proof being generated, and it's been generated in one and a half seconds. It's often thought that ZK proofs are pretty slow; we've worked really hard to get this down as much as possible, and we're pretty proud of being able to get it down to around a second on commodity hardware. And the proof is only 16 kilobytes in size, so we could send it from a mobile device or the browser; even a bandwidth-limited device would be able to send this proof. This proof has all of the private data mixed into it, so all that needs to be sent to the prover is the proof itself and a hash. So the transaction has been added to the rollup mempool, which we can
see there's stuff happening over here. I'm going to go through that in detail in a moment, but that transaction was sent to the prover, and the note was sent to the receiver. So Alice received the note to say that money's incoming, and as soon as it's committed to Ethereum she'll have that money.

I just want to mention something here: the proof is generated on the client side, and so it's pretty amazing that we're able to generate that ZK proof on the user's computer, in an app or in the browser. That's what allows us to keep the transaction data actually private to the user, but then generate the proof and send it publicly, like Calum mentioned. Yeah, exactly. And so again, now we have a new transaction, or note, being generated for Alice. It's gone from Bob's address to Alice's address, but notice the source address still remains, and we keep that trace across all the transactions, so it will remain as Bob's address throughout any other transfers that occur in this process.

So the proof has gone up to the aggregator, and this is something that we're going to continue to work on. Whilst that's running in the background, I just wanted to show you the diagrams, so you've got a high-level view of what's happening on the network, and I can explain where we're up to in the process. The client here, this client is for example Bob, and there could be many other clients on the network. All those transaction proofs are bundled by the rollup prover. As Sid mentioned, all of the private data is kept on the client; the prover only receives the proofs. That one proof is sent to Ethereum, which verifies the proof, and once it's verified it updates the root hash, and that enables everyone in the network to independently verify that it's happened. So it looks like this proof has gone through, and just to follow it through step by step, we
have the proof being sent to Ethereum, we have the transaction of the proof being verified on Ethereum, and we can even look at Etherscan and see the transaction being applied to Ethereum. So now, you can see here on the left that the root hash has changed: we've got a new block that's been created with this new root hash, and Bob's now confident that the note did get sent to Alice, and Alice could do the same check. So let's jump ahead and have a look at Alice's balance, and you can see that the money has indeed gone over to Alice, and we can check Bob's balance as well.

Now, when Alice wants to generate a compliance proof, she checks the compliance list on the client, and then she can generate a ZK proof on the client to prove that the note she has is a valid note, and that's done by checking the source. To explain that in a little more detail, I've got a diagram here which shows what's happened so far. We have Bob minting the note, and that note was sent to Alice. This is the process of her fetching and checking the OFAC list, or any other list that would determine whether someone is a bad actor, and she's able to create a proof that this source address, Bob, was not included in that OFAC list. Notice that she's not checking the note specifically; she's checking the source, and she's proving that the source of this note was not a bad actor, because Bob is not a bad actor. And the only thing that's required is for the proof to be sent to Ethereum, or to any off-ramp like Coinbase, and they just need to look at the proof to know that the note Alice has is compliant, and then they can pay out her money. So again, we're hiding all the details and sharing only the minimal information: that this is a transaction that is allowed.

And now, for an example, let's see what would happen if for some reason Bob was actually a bad actor on the network. We can simulate that process
of adding Bob to the list. So I'm going to do that here: I'm going to ban Bob's address. Now this address has been banned, and if Alice were to try to create a ZK proof to prove that her transaction is valid, she obviously can't do that, because the note is not compliant anymore due to the banned source address. Cool, I'll stop it there.

Thanks so much, Sid. Up next on the agenda is Kajemni from Lava Network.

All right, so I am Kajemni Karimu from Lava, very excited to be giving this demo today. Just to explain a little bit: you may have heard of Lava Network as an RPC provider, but it is much more than that. Lava is a modular back-end for Web3 apps. It's modular and can expand beyond RPC, and with Phase 2, the most recently revealed set of new Lava features, we're really excited, because Lava is a whole lot faster and supports a lot more. So let's get into it.

We all know that Web3 blockchains traditionally promised serverless applications, but even though we have serverless in Web2, Web3 has a plethora of options which do not necessarily yet meet the needs of a truly serverless application, because behind the scenes some of the same technologies are employed to connect dApps to blockchains. So right now we don't really have the serverless thing, but we do have multi-chain: many, many chains sprouting up, which are intended to make this more decentralized reality a possibility. The thing is that as new chains pop up, dApps require RPC on every chain. There needs to be some way for these dApps to communicate with these chains, and because there are different chain architectures and different ecosystems, this has become kind of a mess and a difficult infrastructure problem to solve. More chains means more RPC providers, means more people trying to spin up nodes on these new chains and provide services to people who are building dApps. This is the closest we're coming to serverless right now: more chains, more pain. Public RPCs get launched on small chains, they get
overwhelmed oftentimes, and then developers who want to build their dApp end up running a node on new chains in order to service that chain and run their dApp. So Lava Network offers modular data access for Ethereum, Filecoin and all of Web3. The basic idea behind Lava's approach: there's a decentralized network of node operators and open endpoints, and we use specs, a modular system for representing the APIs and chains anybody can serve. The Lava protocol then pairs data consumers to nodes and incentivizes node operators to serve fast, reliable, accurate responses.

This modularity is one of the key ideas: it's basically taking these building blocks (this is a representation using a Tetris graphic), and these building blocks get added onto Lava, which means from the gateway, the SDK or the server kit you can actually see these different chains and supported APIs and make calls to them. With the SDK in particular, you can do it in a decentralized, peer-to-peer way. So it's really great, because you can initialize multiple chains with very simple calls and then make calls independent of the way that chain communicates. This is an example with StarkNet JSON-RPC: you can see that the SDK is initialized here, and then a block number call is made using the Lava SDK's sendRelay. This is available on npm, for those who didn't see: lavanet/lava-sdk. Basically, behind the scenes, it's sending the request to the fastest of several providers, which it queries the Lava blockchain to find, and the request gets sent.

Because I have very little time left, I just want to quickly show something. Let's see. This is a really cool Node application that I'm working on which uses the Lava SDK, and I actually have a pre-built example of multi-chain calls with a system called badges. It initializes several of the chains simultaneously, so the SDK right now is communicating, and then it gives the results of these calls to
various chains. So we were able to get block numbers from various chains with one SDK making all these calls and coordinating all the communication, with simple interfaces, in a matter of seconds. And that's really what Lava is all about right now: making this developer experience a lot smoother and a lot more connected. We're now working with Subsquid to add subgraphs to Lava and making additional advances with Lava Phase 2. We're on the road to mainnet; mainnet is really exciting, it's around the corner for us, and we've been making big improvements to our protocol and doing a lot of integrations with partners. So for anybody who's looking to integrate: check us out at docs.lavanet.xyz. I personally write a lot of the docs, so I appreciate any feedback that comes that way. Once again, this is Lava, presenting on a modular data access network for Web3. Thank you.

Awesome, thank you so much for sharing. The next demo that we have is from Patrick Woodhead from Station, and this is going to be their demo of the Filecoin Station app.

Hey, my name is Patrick Woodhead from the Station team, and I'm just going to do a quick demo to show you a bit more about Station. So the first thing you can do is navigate to filstation.app, the website, and this gives you a brief introduction to everything that's going on with Station. So what is Station? Station is a product, an app, that allows anyone to join the Web3 economy. You don't have to have the technical knowledge or financial backing to start running nodes and participating; all you need is a computer to get going. What you do is download the Filecoin Station desktop app, or you can even run Station on a server with Station Core, and you can start to simply earn FIL in the background while Station runs passively and you just carry on doing what you do with your day anyway. What's it actually doing? We've identified that everyone's computers have a lot of spare resources in
terms of storage space, compute capacity, and also the bandwidth you pay for, and it'd be great if we could leverage these things in the same way that Airbnb allows you to rent out your spare room: you'd be able to rent out your spare hardware to Web3 networks. So it's very simple: you just download the desktop app or start running Station Core, connect to the network, and start contributing and earning FIL in return. There's a bit more on the website; you can check out the FAQs if you're still not clear or you've got any concerns, as well as heading to our docs site, docs.filstation.app, where you can find out a lot more.

So if we head back here, we can also just click download and get going. Now, I've obviously already done this, so I'm going to open Filecoin Station here, and we've got the onboarding steps. I've obviously already onboarded as well, but I've just reset that so you can see what these onboarding steps look like. It tells you a bit more about Station and what it does, and about how you can be rewarded once you set up your Filecoin wallet inside Station: FIL will be transferred to that wallet once you've participated enough in the economy. So we click through, and then we click on create wallet. This is something I've already done, so behind the scenes I'm just going to reset my Station connection. I'm going to quit this and reopen, and it's actually just opened up here in the top bar, because Station runs in the background so that you can carry on with your day-to-day while Station is doing its thing and you're earning FIL.

I'm going to open Station from the top bar, and you can see that it shows the number of jobs I've completed as well as my scheduled rewards. You can see here also that it's got a few lines in the log table: Zinnia has started. Zinnia is the runtime underneath Station, and we're working towards anyone being able to deploy any sort of module to this special runtime. And then Spark, which is the first module
running on Station. Spark is a storage provider retrieval checker: it checks retrievals from Filecoin storage providers and reports details about them. We really hope Spark is going to start improving the retrieval success rate in the Filecoin retrieval market, and that's what we're rewarding Station operators for doing. We can also open our wallet. Apparently I have zero FIL in my wallet, but once the scheduled rewards reach the threshold of 0.5 FIL and we do one of our payouts, you'll see the scheduled rewards actualize into real FIL in your Station wallet, and then you'll be able to transfer that FIL out to a destination address of your choosing. Over time we're going to build features into the app so that's not the only thing you can do with your FIL; you'll also be able to contribute back into the Filecoin network. What's coming up next? We're in talks with a bunch of other teams about potential other modules, both for the desktop Station nodes and for the server-side Station nodes, and we're thinking about ways to improve the experience in this app for end users, to help them learn more about Web3, understand how valuable they are, and see what else they can do to contribute. If you've got any other questions, please reach out and let me know; I'll be happy to speak to anyone. And please just download the app and get contributing. Thank you very much.

We have one more demo, from Lilypad, and since Ali is in a different time zone we're going to be playing a recording for her. One second.

I'm working on the Lilypad team; we've been building out Lilypad for the last six months or so now. Lilypad creates a kind of decentralized marketplace for compute jobs, so it enables users to leverage the power of distributed computing without needing to manage complex infrastructure. This platform really represents a significant step towards a truly open and trustless internet, and unlocks the
next generation of internet-scale models and applications in the Web3 space, enabling better outcomes. With Lilypad, we're aiming to provide a permissionless, distributed compute network that enables these internet-scale computing jobs, and you can do all sorts of cool things with it; as long as you know how to prompt engineer, you can make pictures like this too. But let's dig into how we're making all this happen. Firstly, Lilypad is live and running already. Under the hood, Lilypad keeps the compute implementation details off-chain while providing on-chain coordination and verification. It's built in Go and Solidity, and it's currently running on IPC. It uses its own ERC-20 Lilypad token to pay for services, and it's fully EVM-compatible. So how do you use it? Get on with it, Ali! Well, there are three ways you can use Lilypad: from the Lilypad CLI, from smart contracts, or from the brand-new, recently released Lilypad AI Studio, which is a complete web interface flow; you don't need to know any coding to use that. We'll get to that in a minute, though. Firstly, demo time: let's see what we can do with it. I already have Lilypad installed, so the best thing to do is just try it out. Our first and initial Lilypad command, our "hello world" of Lilypad if you will, is actually cowsay. I don't know how many of you out there have heard of cowsay; it's a fun little thing that has nothing to do with AI, actually, so it can just run on any CPU. Basically, it's just a fun little ASCII-art character. I'm going to run it now. I've got my terminal a little bit too large to see things there, but this is actually Lilypad's own ASCII art. Lilypad is now running on the network. Basically, what it's doing is advertising that there is a job on the network, and compute nodes are bidding for that job.
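The marketplace flow being described (advertise a job, nodes bid, a deal is agreed, the winning node computes and returns the result) can be sketched as a toy model. To be clear, this is an illustrative Python sketch, not real Lilypad code; all class and function names here are made up:

```python
from dataclasses import dataclass
from typing import Callable, Optional

# Toy model of the marketplace flow from the demo: a job is advertised,
# compute nodes bid, the cheapest bid is recorded as a deal, and the
# winning node runs the job and posts the result back.

@dataclass
class Deal:
    job: str
    node: str
    price: float
    result: Optional[str] = None

class ToyMarketplace:
    def __init__(self) -> None:
        self.deals: list[Deal] = []

    def advertise(self, job: str, bids: dict[str, float]) -> Deal:
        # Cheapest bid wins; in reality the agreed deal goes on-chain.
        winner = min(bids, key=bids.get)
        deal = Deal(job=job, node=winner, price=bids[winner])
        self.deals.append(deal)
        return deal

    def compute(self, deal: Deal, run: Callable[[str], str]) -> str:
        # The winning node computes the job and returns the result
        # to the original requester.
        deal.result = run(deal.job)
        return deal.result

market = ToyMarketplace()
deal = market.advertise("cowsay", {"node-a": 0.5, "node-b": 0.2})
output = market.compute(deal, lambda job: f"ran {job}")

assert deal.node == "node-b"   # lowest bidder won the deal
assert output == "ran cowsay"
```

The real system adds collateral from the winning node and checks that the job was actually run, as the demo goes on to describe.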
Once the deal is agreed, that's put on a smart contract here. So the deal is agreed, and then the compute node that agreed to do the job, after putting up collateral, computes the job and sends the results back to the original requester; there are also checks in there that the job actually got run. You can then go ahead and have a look. This saves to IPFS, actually, but we're also saving to a local folder to make things easy on the CLI, and if we open this up, we get a little "how do you do?". That's the message we put in there, and that's the Lilypad "hello world". Okay, so that's pretty cool, but you probably want to do a bit more than write some ASCII art. So how about the stable diffusion text-to-image generation we're seeing everywhere? I want to show you how you can use that on Lilypad as well. I used a prompt earlier to create this kind of futuristic hacker girl in a cafe, and I'm going to create a similar one, except this one I want to have purple hair, so I've deliberately put the purple-hair prompt in here as well. Let's have a look at what happens when I run this one with the prompt here. It's going to do the same thing: submit the job and send it to compute nodes; someone will accept the deal, it'll get run on the network, and the result will be returned to the user. Now, I ran this earlier, because it does take 20 seconds or so, and this is the image I got for it. Pretty awesome, right? I really love this image. There's a video up on YouTube as well if you want to see more. If I do want to use a different seed, all I'd have to do, as you can see here, is pass a different random seed as an input to the CLI, and I would get a different image. For example, I did this earlier and... oh, it looks like this one's finished. Awesome. So I put a different seed in for the same prompt.
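The seed behaviour just mentioned (same prompt and seed give the same image; a new seed gives a new image for the same prompt) can be illustrated with a toy stand-in for the sampler. This is not the real diffusion code, just Python's seeded RNG playing the part:

```python
import random

def toy_generate(prompt: str, seed: int, n: int = 16) -> list[int]:
    # Stand-in "sampler": a deterministic function of (prompt, seed),
    # just like a real diffusion run with a fixed random seed.
    rng = random.Random(f"{prompt}|{seed}")
    return [rng.randrange(256) for _ in range(n)]

a = toy_generate("futuristic hacker girl, purple hair", seed=42)
b = toy_generate("futuristic hacker girl, purple hair", seed=42)
c = toy_generate("futuristic hacker girl, purple hair", seed=7)

assert a == b  # same prompt and seed: identical "image"
assert a != c  # new seed: a different "image" for the same prompt
```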
Earlier, for this one, I got this purple-haired hacker girl in a futuristic suit. Pretty cool. All right, what else can we do on Lilypad? Well, how about an LLM? I'm sure you've heard of ChatGPT, which is, you know, a large language model. What about running something like that on Lilypad, is it possible? The answer is, of course it is. If you wanted to do that, we have an open-source model called Mistral 7B deployed to Lilypad as well, which you could go ahead and use. To do that, you just put the prompt in here, lilypad run with the Mistral 7B module and whatever question you want. Here I'm just putting in "name 10 Renaissance artists"; I'm sure you can think of something much more interesting to ask it. Again, this will submit to this global GPU network, one of the GPUs will take on the job, and the results will be returned. Same thing here: the returned answer is saved to IPFS and also returned to the user. And again, I've put up a demo on our YouTube if you want to go down the rabbit hole a little further on that one. Okay, so you can do way more than that as well. You could do things like fine-tuning a model. For example, if you had a bunch of Claude Monet's art and you wanted to create a model that, no matter what text prompt you put in, would come out with an image that looked like a Claude Monet artwork, you could do that with fine-tuning: you could submit, say, 30 or 40 of Monet's pictures to a fine-tuning algorithm and it would fine-tune an SDXL (Stable Diffusion) model to Monet's artwork. Or, say, fine-tune on your technical documentation, so it would know way more about your technical documentation but still have all the benefits of the foundational LLM underneath. As another example, this one does SDXL fine-tuning, and you can then go ahead and run that. This is pretty cool.
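The fine-tune-then-run flow described here is essentially two job submissions: one that takes training images and produces model weights, and one that runs the new model against a prompt. A rough sketch follows; the module names and input fields are made up for illustration and are not the real Lilypad module API:

```python
def make_job(module: str, inputs: dict[str, str]) -> dict:
    # Illustrative job spec; real Lilypad modules define their own inputs.
    return {"module": module, "inputs": inputs}

# Step 1: fine-tune SDXL on a set of training images pinned to IPFS.
finetune = make_job("sdxl-finetune", {"TrainingImagesCid": "<cid-of-images>"})

# Step 2: run the freshly trained model with a text prompt.
generate = make_job(
    "sdxl-run",
    {"ModelCid": "<cid-of-finetuned-weights>", "Prompt": "a pond of water lilies"},
)

assert finetune["module"] == "sdxl-finetune"
assert "Prompt" in generate["inputs"]
```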
You can bring in these images' CID from IPFS, train the model on them, and then run the model you just trained, all on Lilypad. So try that out at some point. In fact, this is pretty much exactly how waterlily.ai, for those of you who've seen it before, runs under the hood: an artist submits their work, a model is trained specifically on that artist's work, and then a user can come in, pay a small fee, and generate a stable diffusion image in that artist's style. Okay, so we've seen the CLI demo. You can also use Lilypad from a smart contract, so you could trigger jobs like I've just shown you directly from a smart contract as well. If you want to do this, the best thing is to have a look at our docs; we're going to put more information in there, and I'll put a video up on this soon, but it is possible to trigger jobs directly from a smart contract, or from a front-end dapp using our smart contract interface. Now, what if we don't have the model or the compute job that you want to run on Lilypad? Well, this is where you can make your own. This is kind of an advanced thing to do at the moment, so here's a developer challenge: we would love to see more modules on Lilypad, so have a go at making your own module and submitting it to Lilypad for the benefit of everyone. Now, what if you're on the other side of the fence? What if you want to run your own node, or contribute compute resources to the network? Well, docs for that are also going up in 2024, so check out this bit.ly link if you are a GPU provider and you're interested in adding your node to the network. All right, what about the no-code option I mentioned? Lilypad literally is for everyone, and so is AI: you shouldn't need a degree to access open-source models or to know how to use them for your use cases. So we've actually built out a Lilypad AI Studio,
which is a web UI. You can see it logging into the app here: you just sign into your account with a Gmail or similar, you don't need a wallet. You can then go ahead and choose one of these modules, all of which I've just shown you, such as Stable Diffusion or Mistral, which is that LLM. Choose one of them, ask it directly in the web UI what you want to know, and you'll get the response straight back here. This is a video I made while I was in Istanbul, so that was just a bit of fun: a rainbow unicorn in Istanbul. You can also ask Mistral for, say, the top ten things to do in Istanbul while you're there, and the LLM will give you the answer. So you can go to app.lilypad.tech and use that immediately, and we'd love to see your creations; we'd love to see your prompt engineering. This is one that one of our users made. We'd love you to join us and come contribute to this future of compute: join us in the Bacalhau project Slack (bacalhau-project.slack.com), or follow us on Twitter at lilypad.tech. Otherwise, I hope you have some fun playing around with AI and ML on Lilypad, and I look forward to catching up with you again soon.

Awesome, thanks so much for sharing. That was our final Mother of All Demo Days for 2023. A special thank-you goes out to all of our presenters, who made this demo session a success, and we're looking forward to seeing more of your demos in the future. Thank you, everybody. And I know Molly might be on the call, if you have anything you might want to say for the last demo day.

Yeah, I only caught my name because the audio was going to the wrong headphones, but I'm assuming this is a happy-holidays sort of message. Thank you so much to all of our wonderful demoers, and I'm excited to see y'all again with some snazzy new demos and code in the new year. We might be adjusting our format a little bit to have a little
bit more theming, so we get some teams demoing things in similar domain areas on a call, but it's also been really fun to have the diversity of different things, from Polybase to Lava to Station to Lilypad. So thank you all so much for this, it's been really awesome, and thank you to Misty for making this all happen.

Great, awesome, thanks so much, everyone. Have a great day, and happy holidays!