We don't have slides — okay, we have been lazy. This session is mainly a breakdown of the problem, and also an attempt to find people willing to put in the huge amount of work that shipping fetch in Node will take, so I hope somebody will get excited. Question for all of you: do you like the current HTTP client in Node? Okay, yeah, that's kind of the feeling of everybody. On top of that, there is fetch, which is highly popular in the browser, and it is a completely different paradigm from what's in Node, with its own set of limitations. It is also specced: there is a spec you need to follow if you want to call it fetch. It also sits on the global object, and we just had a presentation about taking that back. Maybe we should set up a laptop so that people joining remotely can see this — probably not a good idea. Okay, so there are a bunch of things I want to cover about fetch: problems of fetch itself, problems of bringing fetch into Node, and problems the current HTTP client in Node has. So there are two sets of problems. I'm writing this down — we are very far away, so unless you have super high eyesight you might not be able to read it; feel free to come closer. So, problems of fetch. Fetch is a complex spec with a bunch of nice properties for the web. First of all, it's tightly integrated into the caching model of the browser: the fetch spec has a full chunk dedicated to how caching works. Do we do any caching for HTTP in Node? No. Our current HTTP client does not perform any kind of HTTP caching. There is no HTTP cache anywhere; we are not writing anything to disk; we are not doing anything related to caching at all.
We are leaving that to the ecosystem, and it turns out the ecosystem is not doing it either. So there is that. Then there is the security model — CORS and all that fantastic stuff for the web. Does it even make sense for Node to have those? Those are open questions. Then there is the connection pool. From a browser you can make up to four parallel requests — or six, I don't know if that number has changed — to a given HTTP server, and there is the concept of HTTP keep-alive. In the browser context all of this is managed for you. Nice, right? In Node.js we handle those through an agent. Now, one of the greatest changes Node ever made was disabling pooling on the global agent, because everybody was having problems with it. Our experience is that a global connection pool — a global set of kept-alive connections to servers — is probably a bad idea for Node.js. So those are the more high-level topics, and I'm just flagging my opinion; if you have a different opinion on any of these, please let me know. Please interact with me whenever you want. On top of these high-level concerns, we have other things. It's a global, and that is problematic because all of those behaviors are tied to a single global object. We could ship fetch in Node as a global, right? But a single function with a single cache, a single security model, and a single connection pool attached is still problematic for people. And it's implemented on top of WHATWG streams, which we currently don't support. We could have those. It's not that it's conceptually difficult — it's just a lot of hard work. It's possible; it's just a lot of hard work. To be clear, I'm not blocking it.
It's a lot of hard work making sure everything is compatible. And then there is another topic. Because fetch is tied to WHATWG streams, we have an ecosystem problem: if I want to serve a file, or do anything useful, I now have a Node stream on one side and a WHATWG stream on the other, and those two do not pipe into each other. So we are a little bit stuck on how fetch would interoperate with the rest of the Node API, the ecosystem, the entire system in general. So that's the list of problems for fetch. Are there any questions? Yes — the web browser also has a slightly different mechanism for adding and removing event listeners, EventTarget, with new concepts coming, and we don't do that at the moment; we have EventEmitter, with on and off. So if we do fetch in Node, we need to reach parity on that too. Yeah. So those split into behavior problems and API problems — two different categories, two sets of problems. Okay. Now I want to briefly talk about the current issues of HTTP requests in Node — http.request — because it's loosely connected to this to some extent. I want you to understand how the agent model works in http.request. Maybe somebody else could explain it better than me. Essentially, the problem is that it has a complex agent model, and the tension is that driving http.request in a performant way is hard — I've been trying to optimize this for a while. You need to configure the HTTP agent with your set of parameters: you have maxSockets, you have keepAlive, and then you have a series of timeouts for all of those.
So it gets complicated real quickly, especially if you're building a microservice architecture on top of HTTP; these settings all need to be tuned. Now, popular modules like node-fetch use this: have you ever tried passing an agent to node-fetch? You can. But if you then want to use fetch as a polyfill for the browser API, you also need a global agent sitting somewhere, and blah, blah, blah — and you end up with the same problem, a global agent that you might not even want in the first place, because it creates problems of its own. Now add the configuration model: if you want to do both HTTP and HTTPS, you need two agents, and it gets even more complicated. And then it's tied to HTTP/1. That is problematic because we have HTTP/2 and we are going to have HTTP/3 in the future, so we need a higher-level API — this thing is going legacy, it's aging very badly. That's my feeling. Is there any objection to that point? Does somebody disagree — somebody who really likes http.request and is really passionate about it? Please speak now. Okay, fine. That's settled. So that's http.request. I would like to have Myles come up: would you like to speak for two minutes about why we need fetch in Node? Because I've just listed some very negative things, and I want somebody who really wants fetch in Node to actually articulate it, so that I'm not the only bad guy here. Sure. I mean, as an API, I've always found fetch to be a rather intuitive API to work with. Within the ecosystem it's become a fairly idiomatic way of writing code: when I sit down and want to make some sort of request to get some resource over the wire,
I reach for node-fetch almost right away. And if we want to write code that can be platform independent, it's really — I will go and write something like: if there is no global fetch, then require node-fetch. And I think that, especially as we move towards ESM, that kind of dynamic polyfilling at the top level gets a little easier, but it's still not a pattern we're jumping up with joy about following. So it would be really great not to have to polyfill it that way, and it would be good to have it shared. Having that one interface across all the major JavaScript platforms would significantly simplify things for a lot of people. So, there have been a number of exercises to implement fetch. There is the node-fetch library out there. I've gone through and actually implemented "fetch" — I'm going to put quotes around that — for Node core. It didn't implement the security model, it didn't implement the caching, it didn't implement WHATWG streams; it didn't implement most of what's on that list. So I didn't throw it out there, because it wasn't actually fetch: it had the name of the function, and it returned promises the way fetch does. So the challenge is that we can only get part of the way there right now. Even if we did make changes to http.request, it would take a significant amount of work. And with HTTP/2, we specifically did not do the compatibility API on the client side — that has just been an open problem; we didn't want to duplicate the same problems HTTP/1 has. Yeah, there is also the problem that the current agent model is based on the concept of a TCP socket or a TLS connection: each request gets assigned a socket or a connection. And in a multiplexed model like HTTP/2 or HTTP/3, that makes absolutely no sense.
When HTTP/3 comes, it's QUIC-based, which is UDP-based, which is going to use an entirely different model that nothing within Node core right now applies to. So that is more or less the problem — I just wanted to state the problem, because shipping this is going to be a journey. It's important that we have two requirements: we need a better HTTP client for Node, shipped within Node itself, and we will need to support fetch, or some sort of fetch-like thing, for web compatibility. By the way, node-fetch is amazing. We also have promise APIs for a bunch of things in core, and we don't have a promise API for HTTP requests anyway, so we might as well try to make it as close to the web API as we can. Yes. So the first goal of the session was framing the problem. Now, I have some ideas. So, one of the API-level problems is WHATWG streams. At the beginning there were some arguments about this, some very bad blood spilled. My opinion is that, because of the sheer number of downloads and the usage of Node streams, we will never be able to replace Node streams with WHATWG streams. We need to live with Node streams forever. We can't remove them; we cannot even deprecate them. If you don't know: the userland implementation of Node streams, readable-stream, is the fourth most downloaded module on npm. Just so you are aware. The ecosystem relies on that thing to keep being there and working; it's in some very, very deep dependency chains. I have broken things just by doing minor releases there.
So, just so that you know, it's everywhere. npm uses it, for example — npm ships it in their bundle, and so on. So we need some interop. My idea for interop — and I'm open to a lot of feedback — is using async iterators: passing through a language-level construct that both stream implementations support with the same semantics, so we can actually move data around. Now, this is not free of challenges, but it's something, and it should be possible right now — essentially, it's only not possible today because we don't have an implementation of WHATWG streams around. You could write a function that reads from a Node stream and reads from a WHATWG stream with the exact same code. Seems pretty cool, right? So this is possible. Another way of shipping this interop is with the pipeline function: essentially, make sure that you can use pipeline with a Node stream and a WHATWG stream and vice versa, so you can combine those things, and the logic for combining them is built into the runtime, so users don't have to deal with it. Is this hard? Yes. Why would you want it? Because you want to read from a file and pipe it into fetch, sending it via POST — how do you mix those? Are you recommending we implement the second stream type fully on all our APIs — create a WHATWG-stream variant and a Node-stream variant of everything? And what about the ecosystem? Wait a second — a WHATWG stream in Node.js? Yeah, using async iterators as I described. So how is that different from just an async iterable object? It is the same thing. A WHATWG stream is an async iterable?
Yes — well, it's in the spec; I don't know about the browsers. And Node streams are async iterables too. Right. So right now, if we want to get a stream out of fetch, we need to call getReader(), which returns a reader, and then we need to build an async iterator on top — Symbol.asyncIterator or so forth. Maybe we can ask the fetch spec to add a method that just returns an async iterator directly, and not the WHATWG stream. And then, when (or if) they add that, we can expose only that, and not expose WHATWG streams at all. Yes — that is a possibility for fetch. So we might not have to implement WHATWG streams in Node at all. This is a question mark, but it's a good idea — I love the concept: essentially, providing a way of not having to ship that bit. How we do this is still open, but there is this. Okay, what else can we talk about? Just going to say, one of the other alternatives — you know Mafintosh has been working on this alternative low-level streams idea — one of the possible options here is that we just don't implement fetch: we provide a better, lower-level mechanism for fetch to be built on top of, so somebody could implement WHATWG streams more easily. It doesn't solve the http.request API issue right now, but if we at least make it easier to get the parts right to implement it, it gives us some room for experimentation around what the API should be before we bring it forward. Yes, absolutely. So, the big question here, for all of you who are here and listening on the live stream — hey, hi — is that this needs people to work on it. This is a complex problem; it is not simple in any form, so it needs a lot of people working on it. There are a lot of people complaining that fetch is not there, and not enough people saying: how can we have this?
How can we make this work? How do we address compatibility with the rest of the ecosystem and work on this? So, one option, if there is a bunch of people willing to help and work on this, is to spin up a repository in the Node organization where we can tinker and experiment, and then do a massive PR to Node when we are ready. This model was used by HTTP/2 and by modules to ship, so it's a proven model for managing very big and complex changes with a lot of ramifications. We might want to do that — but, as I said, it requires people. So, are there other volunteers? Hey, we have one. Okay, fine, I'm happy now. So, it's a very interesting problem. I also have a proposal — I don't know if it makes any sense; I just want to float it to you, and you can throw rocks at me, or tomatoes, or full bottles of wine. No — full bottles of wine, don't throw those; open them and let's drink them. So, I have this idea. Some of the problems of fetch — given the way Node works and the way we compose Node.js applications in real life — come from having it as a global, with a global connection pool and so on; all those global things are potentially problematic for Node itself. So I have this idea of a makeFetch API. That is a possibility: we are not exposing fetch as a thing; we are exposing a factory. And what you can put in there is: I want this many connections, these timeouts, I want HTTP/2, /3, /1, whatever compatibility, and so on and so forth. So essentially you can define — you can actually create your own fetch.
You can actually customize it so that it's specific to your needs. You might also want to allowlist or blocklist some URLs that can be fetched. So, I don't know — this is more or less my take. There is also the possibility, as we said, of not shipping Node streams or WHATWG streams from it, for a lot of reasons — that's a question. All those problems are there. This is more of a call to action than a full-blown proposal; this session is about a problem we should be aware of, and about providing something better for the ecosystem. There is also one more thing that matters to a lot of folks: the current HTTP model is very hard to inspect. It's very hard to intercept calls made to external services, so there is very little traceability in there — just to complicate things further. I don't know; I am more or less done, and I'm early, so I'm asking for questions and feedback. There are a lot of complex things in there. Currently most of us pull in node-fetch, or whichever module we like best. Is it not also an option not to do this — to use a module outside core and not bring it into core at all? Using node-fetch still has all the issues of our current HTTP model. So rather than bringing this straight into core, which is a huge undertaking, perhaps an alternative middle ground is to take a currently existing module and work on it — bring in those other features, HTTP/2 — and let people use it, and at some future point bring it in. In order to do that, that would require significant changes to Node core itself. Yes — that's my point.
So essentially, yes — but in order to ship that, we need to ship new APIs in Node core for a lot of things that would need to be low level, and that would probably not be great. Is the largest problem the Node-streams-versus-WHATWG-streams problem? Not necessarily. The main issue — the major problem — is the HTTP agent model that we have; at least, that's my current take. And if you look at how things are written, we have IncomingMessage and OutgoingMessage, and it's very, very weird — at least for me; it took me so much time to figure out how things were moving around. It also makes certain edge cases, like proxies, painful: if you want to use a proxy in Node with http.request, you have to use an ecosystem module that monkey-patches some of the stuff on an agent. It's a problem. There is a question from Jeremiah: I don't know if anyone has explored this at all, but are we sure that we aren't going to want a lower-level HTTP/2 and/or HTTP/3 (QUIC) client — lower level than fetch — so that something like node-fetch could then be implemented on top of it for those protocols in a more, let's say, compatible way? Because currently they're kind of stuck. I have a plan for that, but James, take it. The answer is: yes, we're absolutely going to need them. Fetch, as a higher-level abstraction, does not expose enough detail to allow you to actually make use of the specific features of HTTP/2 and HTTP/3 — there are push streams, for instance, and there's nothing in fetch to do that. So we actually need a low-level client in order to have a client side that exposes those. For HTTP/3, we're actually going to have two clients: there's the QUIC protocol, and HTTP/3 is an application on top of QUIC.
There will actually be a really low-level QUIC client, and the HTTP/3 API will be a layer on top of it. So there are multiple layers of clients, multiple levels of detail, and different features that need to be exposed at each layer. I'm a little bit confused, then, about why we need fetch — those protocols were kind of listed as a reason we would want this, even though it seems we will inevitably need lower-level clients anyway. There are a couple of reasons. The first one is that fetch is probably the number-one complaint from folks coming from the web platform. And currently, using node-fetch has a certain series of compatibility issues — as I said, around connection pooling. By default it has no connection pooling, which means that if you're writing a server-side-rendered React application — or Vue or whatever — and it's using fetch from the global object, you are actually creating a new TCP socket every time you make a request. And if that's an HTTPS connection, you're instantiating a new TLS context every single time you make a connection. Which means that you're going to get very bad performance out of your app. So those are the reasons — I'm not sure if they are enough for you, but that's one of the first questions that tends to be asked. Can't node-fetch pool connections? No — you need to pass in an agent. So you basically need to move an HTTP agent around, or set a global HTTP agent, causing all sorts of the problems we discussed. Are there any more questions, Maya? No? I'm just flagging that I'm exploring this area and looking for people who want to deal with this and try to sort it out somehow, in some form, in a way that is decently interoperable with the web.
So, it's a problem that we need to solve if we want to stay relevant, somehow. Myles — I don't know where Myles is... there he is. Hi, Myles. You have done some work on trying to embed node-fetch in Node. What roadblocks did you face, and what are the main pain points? Oh, hi. So, I have a branch, and nothing works, but I have, like, one or two commits. Basically, I vendored node-fetch into the deps folder, and I brought in the Web Platform Tests — I don't know if Joyee is in here; Joyee did some really awesome work documenting and automating the process of vendoring in Web Platform Tests, and there's a whole bunch of tests for fetch. So I got the Web Platform Tests set up, and then I was trying to wire up node-fetch. It was super naive: just instantiating it and putting it onto the global and into the module cache. But for some reason, the way I was wiring it up, it wasn't actually available on the global and it wasn't getting required — and then it was 3 a.m., and I was like, screw this, and I shut my laptop and have not motivated myself to ever open it again. I've mostly been offline since. But yeah, I guess the biggest blocker was that last little bit of actually how to wire it up and expose it; it was just something I was doing wrong. I bet if we sat down and looked at it for two seconds, Joyee or Anna or someone more familiar with some of the internals of Node would be like, "you just need to do this one thing," and then we would have that fetch at least working experimentally. I'm pretty sure I have that branch on my computer. I could just go rebase it, and then someone can try to get it working, and we could just play with that.
Anyway — I don't know if you expected something different from this session or anything. I will explain in my talk on the second day where we should add something like this, so that may be useful. Also, I think for testing fetch in Node we need to have a server, because the Web Platform Tests have this kind of server setup: if you actually run them, they will start a server so that you can test things like fetch against that server, instead of just testing the interface without actually exercising the functionality. I think we would need a little bit more hooking up to do in order to actually run the tests. Yeah, that's a good point. For all of you who don't know: there are these Web Platform Tests — tests from the web platform that we are pulling into Node, and we can run them as part of our testing suite. If we write our own tests, we can spin up a server; that's how we test our current HTTP thing. But with WPT we need to do it in some other way. Did I understand your point correctly? Yes. There's also some redirect handling, and we would also need to eliminate — in the web platform fetch tests, there is a large chunk of tests dedicated to things like CORS and caching. If we want a minimum set of functionality for fetch, we would have to exclude all of those. Yeah, it's an important point. If our implementation of fetch, for instance, is not 100% compliant with the fetch spec, can we actually say we are implementing fetch?
We've had some arguments within core around web platform API compatibility — that it either has to work the exact same way it does in the browser, or we're doing something wrong. We've had these kinds of arguments around promises and some other things. So one of my concerns is: if we sort-of implement fetch — we provide the API, but we make modifications to it so that it works better in Node — what kind of issues are we going to run into with the differences from what's in browsers? To avoid that, we really should be looking at going back to the WHATWG and trying to propose some changes, such as making makeFetch, or whatever API we come up with here, part of the standard. Right. So, that's my concern. So, there are one or two extra commits on this branch that I don't remember writing, and it looks like it might work. It's compiling right now. Let me show you what I did — it's bad — but before I do that, a quick tip: if you're using a computer and putting it on a projector, and that projector is on a stream, and you don't want people to see what you're doing, close all your messaging clients first. There's a setting for when you have a second display attached — there's a way to do that automatically. There's some Mac app called Muzzle that does it, doesn't it? So, I made a new experimental-fetch flag. If you ever want to make a flag in Node, essentially you have node_options.cc and node_options.h. This used to be not nearly as nice — Joyee, did you do a lot of the refactoring for this, for the node options? It was part of the C++ refactoring work. Okay, well, thank you; it's so much nicer. It used to all just be in node.cc, if I recall correctly. Myles, can you share it on Zoom? Uh, yes. Nope. Nope. Sorry.
Oh yeah, this is a cool article I was reading — and Joyee, look, patent grants in the MIT license; that's the fun part. Okay, so, GitHub, node... no, we haven't even hit V8 yet; this is going to take a while. So here's the bad code I wrote. You call AddOption with the name of the option and the variable it maps to, and you declare the variable here — so experimental_fetch, false by default. Then, inside of http here, from internal/options you can call getOptionValue, and getOptionValue of the option itself will give you the flag that you passed. So that's how you make flags in Node now, which is fairly straightforward. Then you can see `const fetch = require('internal/fetch')`, and internal/fetch is just a module that does `module.exports = require('internal/deps/node-fetch')`. And if we go here to internal — yeah, maybe that's what I did wrong in deps. Doesn't internal/deps just map to the deps folder? Yeah, so that just works; I'd have to look up why it just works. Are you looking for why fetch is not showing up? We'll see in a second. I don't even remember writing this, but essentially all we do is check the option value; if it's true, then we require internal/fetch, which in turn requires the vendored node-fetch, and that's what defines the property. I defined it on http. So this was one way I thought we could fix some of the problems — at least, people not being happy about it being called fetch, because it would be http.fetch: it would be part of our HTTP API, it wouldn't be on the global, and it's just vendoring node-fetch. My plan for this — and I do this sometimes — is to write something that you know won't ship, but it's some code, and then open a PR, and then people will often come in and swoop in, and that's what happens. So this is still building; we're at the beginning of V8.
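In plain JavaScript, the wiring being described — expose `fetch` on the http module only when a flag is on, loading the vendored implementation lazily — might be sketched like this. Every name here is illustrative, not an actual Node internal:

```javascript
'use strict';
// Define a lazily-loaded `fetch` property on a target object, gated
// behind a boolean flag (standing in for --experimental-fetch).
function defineLazyFetch(target, flagEnabled, loadImpl) {
  if (!flagEnabled) return target; // flag off: property never appears
  Object.defineProperty(target, 'fetch', {
    configurable: true,
    get() {
      const impl = loadImpl(); // e.g. the vendored node-fetch
      // Replace the getter with the loaded value so loading runs once.
      Object.defineProperty(target, 'fetch', {
        value: impl,
        configurable: true,
      });
      return impl;
    },
  });
  return target;
}

// In core this would be roughly:
//   defineLazyFetch(http, getOptionValue('--experimental-fetch'),
//                   () => require('internal/deps/node-fetch'));
```

The lazy getter matters because vendored dependencies shouldn't be parsed and loaded on every process start when the flag is off, or even when it's on but the API is never touched.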
So it's going to take a little while before we can actually test it. What is going on here... I think we have some time, though. Oh, the projector just turned off — the HDMI is sketchy. So, we're at four o'clock; is it break time anyway right now? Maybe we can take a break, and by the time the break is done, maybe this will be done compiling. So, thank you for looking at my hack. You can go now.