All right, everybody, I'll ask you to take your seats because we're going to get started. Really excited to be hosting this panel here at Bocoup tonight. It's going to be a really great panel and we're going to have a great conversation. For those of you who don't know, Bocoup is an open web technology company; we develop and adopt open web technologies, and our mission is to move the web forward. Speaking of which, a quick plug: one of the companies in our co-working space has made a really awesome open web game called Collide. It's built entirely on open web technologies, it's a multiplayer space shooter, and so if you care about the open web and you want your colleagues to know that we can make really great games with it, definitely check it out and tweet about it. I'm going to hand the mic over to Anne, who's going to be moderating this panel. You can ask questions through Google Moderator, which is linked from the meetup page and also from the live stream on Google Plus, or if we run out of those, we're going to take questions from the audience. So thanks a lot for coming, and let's have a quick round of applause for Anne and the panel.

Thanks, Willis. All right, so this is the TAG; there are a few members in the room as well who can raise their hands so we can see them. The TAG is the Technical Architecture Group of the W3C. It's an oversight group that reviews the work of other groups and tries to steer them in the right direction, and at the moment we believe that direction is to unify the web so that it becomes, basically, an actual operating system: expose more primitives to developers and make the platform much better for applications. I'll briefly introduce the panel members; they'll each say something about themselves and what they're actively working on, and then we can go to questions. Alex Russell, Yehuda Katz, Sergey, and Tim Berners-Lee. I'll just pass the mic around.

Hi, I'm Alex.
I work on Chromium and Blink, and these days I'm primarily working on the design for something called the service worker, until Tom changes the name, which will be any time now. If you have an idea for how to name it, please talk to Tom. Generally I'm thinking about how to decompose the primitives in the platform down to the lowest level possible.

Hey, I'm a JavaScript programmer by day and night. Most of what I've been doing for the past few years has been getting involved in groups like this, and also trying to get other JavaScript practitioners involved in these committees. I've been working with Alex for a while now on basically the same thing: figuring out how to let people like us have access to things that until now were hidden away as magic inside the browser somewhere. I think the service worker is a good example of how to deal with the fact that when browsers tried to build offline, they built AppCache; what we would like instead is primitives that we can work with, and to let the JavaScript library ecosystem build things out instead of having the browser guess. So yeah, I've been doing a lot of stuff, but it's all focused around that.

Hi, everyone. My name is Sergey and I am from Russia, surprise. I am very new to the W3C, so I don't know what answers I could give you, but if you're interested in something about Russia, you may ask me. I may also note that I am a JavaScript programmer, surprise. All the previous was a joke, because I really do work on large JavaScript projects, for instance the Yandex Maps API. So if you have any questions regarding the architecture of big JavaScript systems, you can address them to me. Thank you.

So, Tim Berners-Lee. I suppose, back when the web team moved over here to Massachusetts, the important thing to explain to everybody about the architecture was how declarative it was.
And HTTP was very nice, in a way, because it was this function: you put in a URL and you get something out, and, more or less, if you put the same URL in again, you get the same thing out again. There was this sort of invariant, even though things changed over time, and CSS was nice and declarative too. The whole thing had this declarative rush about it, and it replaced a lot of things which had been coded up. Even though everything was produced by programs running behind it, to the user it looked declarative; it just looked like this static space to move through. As time has gone on, JavaScript has become more and more important, and we're on this arc now toward building this incredible computing system, which I think is going to be very exciting. But I also wonder whether, after a while, we won't start getting desperate to refine these huge heaps of code that are all dependent on each other and really difficult to manage, because code always is, in any system, and whether we don't end up starting to make higher-level declarative things again.

All right, thanks, everyone. We had a Google Moderator set up, but there was only one question submitted, so if you can all start thinking about questions while we address this one, that would be great; just raise your hands, because I'm not looking at the Google Moderator at this point. The one question that did come through was about promises: promises seem done now? What were some of the challenges faced, and who specifically on the TAG has been involved?

I've been involved, but I'll let Alex answer as well.

So, how far back do you want me to start? Everybody know what a promise is? Hands up, yes? Okay, we'll call that three quarters.
For the quarter of you who didn't raise your hands: a promise is a standard contract for something that is going to happen later, or has already happened. It's one thing, it may have already happened, and you'll be informed about it later. That's it; that's the whole contract. Also, it could fail. TL;DR. So it turns out that in that description there are so many opinions that you can't possibly shake a stick at them. Luckily, we've had quite a few libraries that have tried and failed, my own included, and some that have succeeded, not mine. From all of that, you get some sense for the problem space. So, early this year, we started a design process with a bunch of people who care (God, this mic is loud) trying to come up with a basic pattern for what the...

You may want to briefly mention Promises/A+.

Yeah, a basic set of constraints on the design. The Promises/A+ stuff was sort of ongoing, but we didn't really take it as a constraint. We ended up all in the same place, which is a happy accident, but it sort of points at convergence, which is great, because it means that we may all be wrong, but at least we're all wrong together. So this spring and summer we've gone through a bunch of iterations in what is an unfortunately large design space, to come up with a standard contract: between many people, first in a GitHub repo that I was running and later in a GitHub repo that Domenic was running, folks like Mark Miller, Yehuda, Domenic Denicola, and Anne, who put promises in the DOM spec in the first place. It's been a bunch of time iterating on it. Yeah. We wound up getting a proposal accepted in this room last week to get promises into ES6. So you may not like them, and there are parts of it that I don't like, but hold your fire, because having it done is better than having exactly the thing that you want. I promise you that's true in this regard. Promise. Yeah.
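The contract Alex describes (a value that may already be settled, that notifies you later, and that can fail) can be illustrated with the standard Promise API; this is an editor's sketch, not code from the panel:

```javascript
// A promise is a one-shot contract: it settles exactly once,
// either fulfilled with a value or rejected with a reason.
const done = Promise.resolve(42);                 // may have *already* happened...
const failed = Promise.reject(new Error("nope"));

// ...but consumers are always informed asynchronously, later.
done.then(value => {
  console.log("fulfilled with", value);           // fulfilled with 42
});

// Failure travels down the same channel.
failed.catch(err => {
  console.log("rejected with", err.message);      // rejected with nope
});

// Settling is final: later resolve/reject calls are ignored.
const once = new Promise((resolve, reject) => {
  resolve("first");
  resolve("second");            // no effect
  reject(new Error("too late")); // no effect
});
once.then(v => console.log(v)); // first
```

That "settles exactly once" property is most of what the libraries Alex mentions converged on.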
So, in general, the way to think about promises, if you're not heavily using them already, is that they allow you to make a request and not have to deal with the response yourself; you can hand off that thing that is going to happen to somebody else. When you use callbacks, the way you have to write JavaScript code is that the person who makes the request also has to supply the callback that says what should happen when the request succeeds. But sometimes you want to write code where the person who knows what request to make is not also the person who knows how to handle the response. As you build increasingly decoupled JavaScript applications, you really do want to be able to say: hey, I know I need to make an AJAX request to this URL, but Ember is going to put the result into the DOM, or Angular is going to put it into the DOM, and I don't actually care about that part. So having a thing which represents a future value, which some other library or other part of your code can tack onto and see when the thing succeeded, without forcing you to write both pieces of your code in the same place, is basically what this is about. It's a way of saying: here is a future value; you don't have to care where it came from; you don't have to know anything about how the request was made in the first place. It may not even be an AJAX request; it may be pulled out of some local cache or something. But you care about applying it. That's basically the point of it.

All right. So, audience questions? Tom.

[Inaudible audience question about applications on the client.]

All right, to summarize into the mic: how do you strike the balance between a Turing-complete system and the declarative system, which is what the web originally was, given that it's turning into more of a Turing-complete system? Since Alex and Yehuda already talked, Tim can give his views first.
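Before Tim answers, the decoupling Yehuda described a moment ago (the code that makes the request need not be the code that handles the response) can be sketched like this; the function names are hypothetical, and the data source stands in for an AJAX call:

```javascript
// The "requester" knows *what* to fetch. It returns a promise and
// never sees how the result is used.
function loadPost(id) {
  // Stand-in for an AJAX call or a local-cache lookup; the
  // consumer can't tell the difference, and shouldn't care.
  return Promise.resolve({ id, title: "Hello, web" });
}

// The "renderer" knows *how* to display a post, and never sees
// how (or from where) it was loaded.
function renderTitle(promiseOfPost) {
  return promiseOfPost.then(post => `<h1>${post.title}</h1>`);
}

// The two halves are wired together somewhere else entirely.
renderTitle(loadPost(1)).then(html => console.log(html));
// → <h1>Hello, web</h1>
```

With callbacks, `loadPost` would have to accept the rendering logic as an argument; the promise lets the two concerns live in different libraries.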
Well, you can look at it in two ways. JavaScript is not a declarative language, but it is quasi-static: it's a set of files, and so it fits in as part of the model of the web. And in the future, hopefully, you'll also be able to invoke one bit of JavaScript from another much more fluidly, more like linking between HTML files, without being so constrained by the sort of files you have when you run Python or Java or something, so that you can use the power of the web and links. But then what typically happens is that people write a big procedural system, and they realize that the parameters it takes are quite complicated: first they pass in some parameters, then they pass in some options, then they pass in a set of options which are in fact themselves nested, quite complicated data structures. Then they realize that writing it all in JSON is a pain and there's a better language for it, and so they write a parser for a language they define. Now you've got another declarative language, with some nice declarative properties, just implemented in terms of the big procedural layer. So I wouldn't be at all surprised if we see new declarative languages being bootstrapped using the procedural stuff, and in a way we might be able to keep a synergy going like that.

Yeah, I think that's exactly right. I want you to speak after me. I think the idea is that we've seen the story where we say declarative is very important, so let's just design the declarative part from the top down. We've seen that story over and over again: like I said, Cache Manifest, how the image tag works, how the script tag works. We've seen this story.
And over and over again, what we've found is that when we start out by providing primitive forms, or eventually provide them, people are able to do a lot of things with those primitives, and eventually, like Sir Tim said, people end up building declarative forms on top. Now, one thing that sucks about the current situation is that building a declarative form on top itself requires a big blob of JavaScript: just getting at the HTML that's on your page requires letting the parser run and then going and scanning the DOM for something. It would be much more desirable to have hooks inside the browser's parser process that let you say: the browser's going through the declarative form, now it sees something it doesn't understand; script, go do something. So you could have your own image tag that maybe uses canvas under the hood, or something like that. I think that's where a lot of us are heading with this: exposing primitive forms that aren't really designed to be used as ever-increasing balls of JavaScript, but are designed to allow people to build declarative forms once we figure out what it is that we want, instead of trying to figure out everything ahead of time, which I think doesn't work.

Yeah, so I think Yehuda's passing the mic to me because I spent a couple of years working on Web Components. Having built one of these very large piles of JavaScript which attempts to do exactly what you heard him saying, which is to say: oh look, I've got a gigantic DOM of completely meaningless divs which happen to have attributes hung off of them, which I have imbued with meaning via the power of JavaScript; you eventually go: oh my god, seriously, can I just plug in here? Because there's so much utility in HTML's ability to tie commonly agreed semantics to standard UI. But that has really drastic limits, and those limits are largely based on what we expect our platform to do.
To the extent that HTML has been a very good carrier of stuff that we already know how to do, it's been wonderful at that. And I think it's not tarnishing HTML's legacy to suggest that what an HTML engine does when it encounters an HTML element is to find a tag name, look up the constructor in a map, create an instance of that thing, pass in the attributes as arguments to the constructor, and carry on its merry way. That's exactly what we do inside of Blink, and I think that's exactly what every other HTML engine has ever done. All we are suggesting is that that machinery should be made user-accessible, so that we don't have to give up on HTML and go speak slang, which is what it is when you sit down and take a div or a span and add a bunch of CSS classes or extra attributes, data-attributes, that mean something HTML didn't have a word for: a configurable list, or a grid, or something. You mean something else, something HTML doesn't say yet. And if enough of us say it, well, we would eventually like a path back to HTML encoding it. If there's anything I think we can do here, it's to use little bits of JavaScript that are empowered to plug into the parser, to help inform how we should evolve HTML in the future. Because I think what we're doing today is alchemy; we're not doing any sort of real science. When people talk about the semantics of HTML, I think they're mostly guessing, based on a little bit of data. So since early this year I've been putting together a project called Meaningless, a Chrome extension which attempts to look at the actual real-world semantics that JavaScript imbues HTML elements with, based on DOM mutation observers. We look at not just the stuff that came down the wire when it was sent, but also all of the ARIA roles and states, all of the schema.org attributes, all of the microformats, and pretty much all of the other ad hoc stuff that you might add to one of these documents.
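The tag-name-to-constructor machinery Alex describes can be modeled outside any real HTML engine as a simple registry. This is an illustration of the idea, not Blink's actual code:

```javascript
// A toy model of what an HTML engine does when it meets a tag:
// look up the constructor by tag name, instantiate it, and hand
// the element's attributes to the constructor.
const registry = new Map();

function define(tagName, ctor) {
  registry.set(tagName.toLowerCase(), ctor);
}

class GenericElement {
  constructor(attributes) { this.attributes = attributes; }
}

function createElement(tagName, attributes = {}) {
  // Unknown tags fall back to a generic element, much as real
  // parsers fall back to HTMLUnknownElement.
  const Ctor = registry.get(tagName.toLowerCase()) || GenericElement;
  return new Ctor(attributes);
}

// "Custom elements" are just user code entering the same map.
class FancyImage extends GenericElement {
  constructor(attributes) {
    super(attributes);
    this.src = attributes.src;
  }
}
define("fancy-image", FancyImage);

const el = createElement("fancy-image", { src: "cat.png" });
console.log(el instanceof FancyImage, el.src); // true cat.png
```

The `customElements.define` API that eventually shipped in browsers is essentially this registry made user-accessible.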
You have the Facebook-invented elements, all the custom elements stuff. So this is an attempt to take an observatory to the web, to see what we're actually doing with HTML. What is it that we're turning HTML into? And it turns out almost all of the web is divs. Almost all of the web that you interact with every day turns out to be divs and spans, and those are largely about styling: divs and spans are about putting together a block or an inline block. Those are your choices. And we mean a lot more than that. We mean a lot of things that HTML doesn't have words for today, or we mean things in addition: we take HTML as far as it will go, and then we turn it a little bit with an adjective or an adverb here and there to get it to carry other meaning. So letting you into the parser, letting you play as a first-class citizen in this world to do smooth extension of the language, as Guy Steele would say, is not heresy. It's just us voting with our feet for a better world that HTML can then come back through and standardize. I would like the Oxford English Dictionary of HTML. Can I have it now, please?

Yeah, I think a good example of this whole thing on the HTML front is back when Apple released the Retina MacBook Pro; I don't know if that was the first Retina device. What happened was that all of a sudden all these websites couldn't display well on Retina screens, and it turned out to be really hard to make that work. If you looked at what Apple was doing, they were doing insane things to make it work on their own websites, and so there was this mad dash in the W3C to try to figure out a way to standardize a solution. What ended up happening was pretty messy and complicated, but I think largely the reason it was so messy and complicated was that the process had to happen top-down.
It was like an emergency, essentially. Instead of our first instinct being "let's write a little library that creates a new element," and we can call it the picture element, which is what they ended up calling it, and have it have the behavior we need, have it go figure out what the screen size is, look at its child elements, and render the right thing, maybe into a canvas or maybe using an image tag; instead of having all that machinery available to us, our first instinct was to say: W3C, please, this is an emergency, solve this problem for us right now. And the problem is that then a year went by (I don't know if we'll ever standardize anything here; it may end up being impossible), a year went by before really any meaningful progress was made, and that was because we felt like we needed to do it top-down. So in general, I think what we want is to expose enough primitives that people's first instinct is not "oh my god, what kind of crazy shenanigans, what kind of crazy work, will I have to do to hook into that process?" but rather "of course I'll make a custom element; that's obviously the right thing."

I think that answers one of Paul Irish's questions; he's been submitting a bunch about whether or not new features should be polyfillable. So we'll go to the next one.

Sure. I don't know why it took them a year, but if you want to do something quickly, you can go to www.w3.org and start a community group, and if something comes up, don't wait for a year; just do it. You can do it within a community group, which gives you a way of firing things up. I don't feel that it has to take a year.

Okay, we'll go to the next question; we don't have to say everything on this topic, I think. Trailing behind native's feature set doesn't seem like a strong strategy in the long term.
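The behavior Yehuda sketches for a homegrown picture element (pick a source based on screen density and size) can be prototyped as a pure function, independent of the DOM. The candidate format here is hypothetical; the real `<picture>` element's selection algorithm is more involved:

```javascript
// Each candidate pairs an image URL with the minimum device
// pixel ratio and viewport width it is intended for.
function pickSource(candidates, { dpr = 1, viewportWidth = 1024 } = {}) {
  // Keep candidates this environment qualifies for...
  const eligible = candidates.filter(c =>
    dpr >= (c.minDpr || 1) && viewportWidth >= (c.minWidth || 0)
  );
  // ...and take the most demanding one: the best quality that applies.
  eligible.sort((a, b) =>
    (b.minDpr || 1) - (a.minDpr || 1) || (b.minWidth || 0) - (a.minWidth || 0)
  );
  return eligible.length ? eligible[0].url : candidates[0].url;
}

const sources = [
  { url: "hero.jpg" },
  { url: "hero@2x.jpg", minDpr: 2 },
  { url: "hero-wide@2x.jpg", minDpr: 2, minWidth: 1200 },
];

console.log(pickSource(sources, { dpr: 1, viewportWidth: 800 }));  // hero.jpg
console.log(pickSource(sources, { dpr: 2, viewportWidth: 800 }));  // hero@2x.jpg
console.log(pickSource(sources, { dpr: 2, viewportWidth: 1440 })); // hero-wide@2x.jpg
```

In a browser, this selection policy would live in a custom element's setup code; the point of the argument is that the policy is just JavaScript, so it could have been iterated on in libraries while the standard caught up.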
What are we missing from the platform that takes advantage of the web's intrinsic strengths? Do URLs work for discovery in our multi-device future? Who wants to tackle this one?

So let me just make sure I... oh, hi. Hey, what's up? So the question was something like: it seems bad that we're spending a lot of time trying to catch up with native, and this is probably not a war we can ever win; we should instead be thinking about what things the web is already better at than native and try to leverage those things to make good applications, and URLs are one of those things. Something like that, roughly.

Cool. Okay, awesome. Yeah, my opinion, from the last year or two of my own personal work, is that URLs are probably the web's most important strength. I think URLs are the thing that has been here from the beginning and is probably the thing that's going to be here in 10 or 20 or 30 years; even if the entire application runtime is replaced with a WebGL context, I think people will still be typing URLs into a URL bar. So URLs are crucially important, and the platform has added a few primitives recently, like pushState, that have enabled frameworks like Ember to build tools around this. I think a couple of things need to happen. One of them is that we as consumers of frameworks need to demand that URLs are treated as first-class citizens, and that typical usage of a framework does not result in bad URLs. This is something that I care a lot about, and I've been spending a lot of time thinking about how to build something like that.
The other thing is related to the service worker stuff, which I think is crucially important: it should be possible to intercept navigations. If what you want to build is a JavaScript application that is only using JavaScript for rendering (you have a shell and you're replacing a center area as you click around, and you basically just want increased performance, so maybe you download a JSON payload or markdown and do the rendering on the client, something like that), then we need a mechanism that allows us to intercept URL requests on the client and have some code that actually answers them, instead of always having to go back to the server, or do everything in JavaScript à la Ember. That's pretty exciting. But just to reiterate the thing I said first: I think it is crucial that we as web developers wake up to the fact that most people writing JavaScript applications for the first time screw up URLs, and increasingly, people who use JavaScript applications feel like things are broken. For a long time Gmail, sorry, Google Maps, had this problem where if you wanted to share something with a friend, you couldn't just grab the URL; you had to go punch a ticket and pull your URL out from some random icon on the screen. I think that was terrible, and I think as JavaScript developers and as users, we need to make sure that if JavaScript causes us to lose the URL, then we have lost one of the platform's strengths; our strength is only there because URLs actually are a good thing. So we need to demand good URL support from our frameworks.

In short, I cannot imagine the web without URLs. The URL is the thing that made the web, and I don't think we can lose it; if we were losing URLs, then the web would begin dividing, and that's not the thing we want to face, in my opinion.

Yes, right, URLs are the fundamental building block of the web.
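The interception Yehuda asks for is what the service worker's fetch handling eventually provided. The routing decision itself can be sketched as a plain function; the route table here is hypothetical, and in a real service worker this logic would run inside a fetch event listener:

```javascript
// Decide, per URL, whether the client answers the request itself
// or lets it fall through to the network.
const routes = [
  { pattern: /^\/posts\/(\d+)$/, handler: id => `{"post": ${id}}` },
];

function respond(path) {
  for (const { pattern, handler } of routes) {
    const match = path.match(pattern);
    if (match) {
      // Answered on the client: no round-trip to the server.
      return { source: "client", body: handler(...match.slice(1)) };
    }
  }
  // Anything unrouted goes to the network as usual.
  return { source: "network", body: null };
}

console.log(respond("/posts/42")); // { source: 'client', body: '{"post": 42}' }
console.log(respond("/about"));    // { source: 'network', body: null }
```

The key property is that unrouted URLs keep their normal, shareable, server-backed behavior: the client only augments the platform rather than replacing it.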
Now, HTTP is secondary to it, and HTML is tertiary. You can easily use HTTP to move things other than HTML; you cannot so easily use a URL with another protocol. If you introduce another protocol besides HTTP or HTTPS, there's a big cost, so it's important, and it would be a good idea to get such things discussed with these people, for example at the TAG; there should be hoops to jump through. It's an important flexibility point for us to keep: when we realize we need a namespace with very different properties from HTTP, and we don't want to just keep morphing HTTP (which is also a good option), then we need to have that colon; we need to keep the "http:" on the front as a flexibility point for making a really significant change. What could mess this up is if people say: actually, we're going to start renting out URI scheme names, we're going to put them in DNS, or we're going to allow every app to register 60 URI scheme names; and of course apps out there are generating URI scheme names every few seconds. So there's a heads-up; there's a danger we might lose that flexibility point.

The next top-voted question is from Boaz. There was a panel here about a week and a half ago with TC39 members, the committee behind JavaScript, and in that panel Allen Wirfs-Brock, the editor of the specification, said that the web as a platform would probably last at least 30 years, just like the 30 years of mainframes and the 30 years of PCs. The question to the panel is: when do you think... well, maybe you can correct my timing, but if it's 30 years, we've got 10 more, right? Twenty-four and a half, so five years. So start getting your resumes ready, kids.

Yeah, I don't agree, because the URL is the fundamental building block. I think a better analogy would be the telephone system, and in the telephone system you still mostly use telephone numbers to call people, unless you're in a very, very tech-focused space. So a hundred years later, things are changing a little bit, but it took a long time and a lot of infrastructure changes, and I think the URL is a powerful enough concept for sharing. The internet is not going away, and increasingly the things we do with the internet are: share the content you're looking at, save it for future use, fork off where you are and look at it in another window; and all these things are features of the URL. So I would be much more bullish than a 30-year time frame. If you look at the URL as the core building block, it's more fundamental; it's not fundamental infrastructure, because that changes, just like the first person who ever used the telephone didn't have an automated exchange. It's fundamental naming. What we have with the web is a way of describing a thing, getting to it easily, and sharing it with your friends. I don't know what the upper bound on that is, but I don't think we are particularly close to it.

You know, if you look at the new things that are coming into our virtual reality, you should note that every one of the things going online has a URL, because a URL is a method to address a thing. So I would expect the role of URLs to increase, not decrease, and I see no reason to think that the current web platform will degrade in 30 years. It's not like the mainframe, which is a mode of operation; it's the thing that addresses everything in the world, and it can be extended very much further. There are lots of things that should be addressed and still have no URL, and they will have URLs. So the future of URLs is quite bright, not dark.

Just talking about the underlying internet technology: it's actually already been around for 44 years. It was 1969 when they put the internet together and designed the internet packet, and basically the same internet packet would go over
a 100 or 300 bits per second line, and now it may go over a 300 megabits per second line, or maybe a 300 gigabits per second line. Can you think of any other technology like that? Nobody's designed a train which would then, over time, evolve to go a million times as fast; that just doesn't happen in any other field, any other technology. So there is a certain amount of scalability and time-proofness here; we're already doing better than some of the other engineering.

I have a follow-on question: how many of those original bits per second were used for sending ASCII cats?

All right, there's another one from Paul Irish; his questions seem to be particularly popular. Where are the bottlenecks in evolving the web platform: with the spec writers, with the people hacking on browsers, or with the anchor browsers? And is it possible to move faster? Anyone particularly keen? Yehuda?

Let me tell you a story. I did something that I pinged Alex about for the first time this week, and it made me very, very happy. Somebody submitted a bug to Ember that said "I'm hitting this issue in Chrome," and someone investigated and said it exists in the current Chrome but it's not there in the beta channel. And I replied: it's an evergreen browser; we're not fixing it; it'll be fixed soon, within a few weeks, and it doesn't make sense to put some hack into Ember to work around this obscure case in Chrome. That was the first time I was ever able to do that, and it felt very good. So I think, for sure, the anchor browsers are a problem. And I've said this a few times; am I being recorded? Awesome. I'm personally very, very scared of the situation with Safari. I've said this a lot; I know other people are much nicer than me. IE at this point is a slow-and-steady browser for me, in the sense that it's not the first browser to pick up features, but I have not seen any important features IE has basically nuked, and for all the important features IE is interested in, once they make it far enough along, it seems like
they'll do it, and they have a good history of that. Safari, on the other hand, is definitely building up a bolus of features that they are essentially avoiding implementing, and I think it's reasonable to assume that they may choose to never implement a whole bunch of important features. I think it's good of the Chrome team, and the Firefox team, to keep the pressure up on that, but I definitely have a lot of fear there. And because there are still slow-and-steady browsers, and browsers that are not implementing anything at all, and many of us have to write code for those browsers, I would love for the specs to move faster, but they're not the thing holding us back at this point. The specs are probably moving at roughly the same pace as IE: slow and steady. If all browsers were as fast as Chrome and Firefox, then I would say the bottleneck was the standards process; but at this point, Safari really needs to deal with this.

All right, the next question is from Matt Andrews in London. AppCache has gotten a lot of flak for its flaws, but it does work; the Financial Times has good browser support, and its replacement, the event worker, feels a long way off and is not declarative. Can we not fix caching (master entries, prefetch, offline fallback), add a JavaScript API, et cetera?

Sorry, I just have to take this. I'm sorry. So the answer to the question was: can we implement a bunch of new stuff in browsers? And I'm happy to tell you that the answer is yes, we can. We could fix the set of relatively small warts in AppCache that were listed, and they would have an impact in about the same time frame that it will take us to ship the event worker (the service worker, that's what we're calling it, service workers). And so the choice is: do I get the very powerful low-level primitive that lets me get myself out of jail in the future, or do I get a couple of band-aids? Declarativeness is a virtue once you know where you're going; being locked into a particular trajectory is immensely wonderful once you know that's a good place to be going. My, I think informed, I hope informed, statement about what we've done with offline today is that we have no idea where we're going. We have not given web developers enough power to express what they need to do in script, often enough, for us to even be able to look around the world and say: yeah, most people want this. We have an idea about what most people might want to do, but we're not most people; we are all individual, peculiar people who have specific needs. And until you can go poll the whole world and ask, "hey, if you put yourself in this situation and imagined this was your problem and came up with a solution, would it look exactly like mine?", until you can actually do that by observing the world as it is, you're not doing science; you're doing alchemy. So I reject the thesis that declarative is necessarily good up front. I would like us to have a vibrant ecosystem of people who are trying to solve their own problems, which we can observe and say: okay, great, everybody's doing this; it's really expensive; it's slow; let's go put that in a spec; let's extract the things that are really valuable and turn them into a high-level form that happens to be extensible, should you need to get
yourself out of jail in the future yeah I also so it may well be that the financial times eventually figured out how to make apcash twist to their whims but almost everybody I know of including Facebook and many other large companies have basically fallen back to the point where mostly what they do with apcash is produce better error messages in the case that you are offline so you may have seen for example that if you go to google when you're offline it doesn't show you the normal 404 page it shows a nice like hi I'm google and you're offline you're learned some more about this and I think Google docs has tried really hard to make apcash work and it's pretty annoying experience offline if you can make it work I just I actually reject the claim that apcash is mostly working I also reject the claim that it's mostly interoperable I think the core problem with apcash was that it was designed to solve a particular problem that it turns out that very few people need so everybody was forced to build their applications in that model. The example that I always give that was in one of those Band-Aid hacks is and it just shows you how hard it is to design stuff up front. Imagine that you want to build a website and when the website is offline you would like to show offline data but when the website is online you want to hit the URL so imagine that you are building a blog and you would like to when the user is offline obviously you want to show them the last thing that you downloaded but when the user is online you don't want to show them the last thing you downloaded you would like to always get up-to-date information because it's a blog. 
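This "network first, fall back to cache" behavior is exactly the kind of policy a script-level primitive makes easy to express. A minimal sketch of the strategy in plain JavaScript; note that `fetchFromNetwork` and `cache` here are hypothetical stand-ins for the ServiceWorker fetch and Cache APIs, which were still being designed at the time:

```javascript
// Network-first strategy: try the network, keep a copy on success,
// and fall back to the last cached response when the network fails.
async function networkFirst(request, fetchFromNetwork, cache) {
  try {
    const response = await fetchFromNetwork(request);
    await cache.put(request, response); // save a copy for offline use
    return response;                    // online: always fresh content
  } catch (err) {
    const cached = await cache.get(request);
    if (cached) return cached;          // offline: serve the stale copy
    throw err;                          // nothing cached either: real failure
  }
}

// Tiny in-memory cache standing in for a real Cache API backend.
function makeCache() {
  const store = new Map();
  return {
    put: async (request, response) => { store.set(request, response); },
    get: async (request) => store.get(request),
  };
}
```

In a real service worker this logic would live in a `fetch` event handler; the point is that the policy is a dozen lines of application code rather than a mode that has to be baked into a manifest format up front.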
AppCache does not allow this. This very simple scenario is not possible with AppCache, and there's no escape valve. So what you have to do, and you've probably seen some apps that try to do this, is show the old content and then show a little yellow bar that says "this app has refreshed, please reload". Except it's a blog, so no, that's a silly thing to do. Anyway, the point that I am making is that solving offline in a declarative way involves, as Alex said, much, much more knowledge of what the problem is than we actually have, and, much worse than that, it makes it impossible for the ecosystem to actually try to solve the problem themselves. So instead of now having two or three years of experience with people like Facebook and the Financial Times and Google actually using the primitive tools to figure out what the requirements are, we have three years of people twisting their applications into the broken AppCache model. I think in general we are better off shipping primitives that libraries can use, and I'm actually a big fan of browsers using those primitives to build their own libraries that they think might be a good idea. But in that case, if the browser gets it wrong, if Google ships a piece of Polymer that is offline support, and it uses the primitive, and Google got it wrong: totally fine, everybody can go on and do their own thing and not wait three or four years to convince the spec editor to make a change. So, yeah, you want to say something, Dan? You have to say it in the mic.

I'll just make one short point on this. I'm Dan Appelquist, hi, one of the co-chairs of the group. I said I didn't want to be on the panel, but I just felt like, on this particular topic... I used to say this exact thing two years ago (no, no, I really don't want to be on the panel, okay, shoot): well, the Financial Times is using AppCache, so it must be okay, right? But two years later I found myself saying: the Financial Times is using AppCache, so it must be okay, right? And nobody else has built applications to the level that the Financial Times has, and I think they've only done that because they've put a lot of engineering resource into it, to do a lot of very custom stuff. I think that's really proof that we need something new, a new approach. It's a real strong proof point that you don't see a huge number of great offline web applications out there, so we need something new, obviously. And basically every single platform is saying: abandon ship on AppCache; if you want to build offline apps, zip up your stuff and send it to the app store. And people are actually thinking that this is a good idea. I think that's also a proof point that we have done something wrong here.

All right, the next question is in a different direction: if URLs are significant, and the content bound to the resource identifier, why is versioning not a first-class citizen on the web? Why can't you query a website for a version is what the question comes down to, it seems. I think Tim might have a good idea.

Yeah, why not? I mean, there has been, you know, a lot of work on metadata, a lot of work on provenance and things. In fact, there are existing protocols you can use if you want to expose that sort of thing. For example, you can put a Link rel="meta" in the HTTP header, and you can then point to a metadata file which gives information about the provenance of the thing. There are some areas where provenance is really, really important, like in scientific data, or in museums, or in court histories and things like that, so you'll find areas where people have gone into this in great detail. Look at the output of the Provenance Working Group if you want to see standards for writing that sort of metadata. Why doesn't it happen across the web? I think partly because the needs for provenance, and the types of versioning that go on, are kind of different, and if we tried to force them into the same model it would be a disadvantage. But all the building blocks are there to allow people to find the provenance, where things have come from, and previous versions and things. Yes, I could go on more with it; it would be a rat hole.

There are some questions related to security. Apparently Crockford made comments that the HTML5 spec is not taking security into account; what are people's thoughts on this? And also, when we expose more low-level features such as WebGL, that increases the attack surface, since you're closer to the hardware. Is that a problem, or is that okay?

It's hard to answer the question without knowing exactly what was said a few years ago; we would probably have to ask Doug. But to channel him a little bit: there is an argument in security engineering about a chosen-protocol attack, which is to say that the more cruft you add on, eventually the combinatorial effects of the permutations that you've left around to combine are things that will bite you in the ass. This is true; this is absolutely the case. And so your choices are to attempt to have a simpler system, or to have a system with defense in depth. And if I have my druthers, it will always be a system with defense in depth, because no matter how simple a system is, you can get it wrong, and the best research we have about human frailty suggests that all of us get something wrong all the time. Unless your name is Dan Bernstein, in which case you get a gold star. Did you write qmail? No? Okay, you don't get a gold star. So for the rest of us, we have to figure out how we're going to get by, and I believe our best tools for getting by involve not
trusting ourselves very far. They involve turning off the greatest amount of privilege possible; they involve giving ourselves tools to say no more often than we do, so that we are put in constraints that generate reasonable conversations among actors we can reason about: big boxes that look like "I'm talking to a secured thing over there and it's a privileged conversation", versus lots of little fine-grained things that I have to reason about at every moment. Little fine-grained things you have to reason about are those permutations incarnate. This is probably why I am not a huge proponent of Secure ECMAScript or many of the other initiatives like locked-down JavaScript. I think you should put it in a box, and you should make that box as strong as you can, and you should put it in many different boxes; you should sink them to the bottom of the ocean and throw away the keys in a different ocean if at all possible. But you should never attempt to build one lock that will keep everyone out, because one lock that will keep everyone out only requires one attack to succeed; it doesn't require getting the deepest ocean trawler you can and then figuring out 12 different locks.

Yeah, so I agree with that; I sort of have a different angle on it. I think people's perception, if you just look at the platform, is basically that every time you add a new feature you add new surface area, and the combinatorial explosion, etc. That feels true. I think one way to mitigate that is to not constantly add new special-case capabilities to the platform. So instead of adding ten offline caching features, each with its own internal magic that has to work, if you instead expose one caching primitive, and all the additional features that the platform provides are built on top of that caching primitive, then you can lock down the caching primitive, and everything else is built on top. As an example: maybe there are some security issues in AppCache, so you might think, well, if we have AppCache and also the service worker, then oh my god, now we have two security holes. But what you should do instead is add the service worker and then rebuild AppCache on top of the service worker, and now there's only one security hole. And we should increasingly say: there are all these HTML tags, and instead of having each HTML tag make its own individual calls into random C++ code, as much as possible they should all be built on the same DOM primitives that we secure one time. As we add more and more features, if at all possible the new features should be built in terms of existing primitives that we already know are secure, and that we have already spent a lot of energy securing.

In case it wasn't obvious before, I was talking about sandboxing. One of the things that we've spent tons and tons of energy on in the Chrome team, and the IE team has too, to their credit, is putting C++ code in a box: a box that, if it blows up, does not harm anyone else. And we do that all the way down the chain, from the JavaScript VM, to the GL commands that get sent across WebGL (we actually put those in a buffer, and they're all verified before we send them out to the GPU), to rewriting all those GPU commands in such a way that they're never actually hitting the other side directly. We are putting an air gap, effectively, between your code and the actual machine, so that you never actually run a thing that says "please go get a thing from the network"; it actually calls an API which sends an IPC from one process to another, from a process with very low permissions to a process with slightly more permissions, to go do that thing on your behalf. And that's how you reason about it: you winnow it down to a very small set of things which you can actually go and validate, and which you can start to talk about, and you put those things in very large boxes where they can go and live their own lives but have very small, very tightly constrained conversations with each other, and you make very secure boxes that talk to each other over very, very small channels.

In case I wasn't clear, I think we're both talking about basically the same thing, but my perspective is mostly a response to the perception that every time the platform gets a new feature, it has to, by definition, increase complexity. I'm just saying: if you write a JavaScript library, you have not increased the security surface area of the platform; so if you write new features as if you were writing a JavaScript library, then you have not increased the security surface area of the platform.

The Moderator thing is down, but I remember Tom asking a question about that: some people still hold standards bodies in disdain; is that a reasonable thing for these people to do? Are things changing, or going in the right direction? I'm looking at Dan; I think he might have a thing to say, since he's on the panel now.

Yes, they should absolutely be held in disdain. I think one of the things that we talked about earlier is the Community Groups; that's one way in which W3C is trying to get faster, get more responsive to the community. I think all standards groups need to be held to account by the community of practice to which they are delivering, and I think Community Groups and things like this, and multiple touch points, Twitter, anything, can be used in that way to get more and more feedback: a faster feedback loop between implementers, between practitioners, and the people that are writing the standards. And I see that happening more and more these days. So to answer the question: if it deserves disdain, you should hold them accountable.

Before we started, when W3C was not there, I held the ISO system in disdain, because when I was working for a small company I couldn't afford to buy the standards, and they certainly weren't available on the web (but then the web wasn't there). But also they were made by
a vote between nations, and I thought... but the IETF I held in huge regard, so I started going to IETF meetings, and I tried to do a lot of the web standardization in the IETF; we had an HTML working group there, in fact. There were two things. The reason we made W3C was not because we had any disdain for the IETF, which is a great organization, and largely we modeled W3C on it. One difference was that, for the markup stuff, those people just weren't in the IETF; if you got something reviewed in the IETF and it was all about markup, you just didn't get the right people looking at it. So to a certain extent we had to put together a different group of people for the HTML stuff; some of them had come out of other markup communities, but also out of the X Consortium. The idea was we had to hire people, and the X Consortium had people who wrote code, and it had to produce new versions, because the manufacturers needed this stuff to go, and the demand was: yes, we need new versions of the web coming out every now and again; we need somebody who's got release authority on that. And so when the consortium started, it was a bunch of staff writing specs, and the working groups, the editorial review groups, were just groups from the manufacturers that came and reviewed the stuff we were doing. Then they became working groups, because folks said, look, if you're doing effectively standards, you have to work like a standards body, which we rejected violently on some famous occasion, and they said, you need a process. And it was so much nicer working without a process. So it turns out that when you build a standards body, you can try to be fast, and you can try to be fair, and you can try to be good, and you obviously try to do all those things at once, and there's always a trade-off. If you tweak a standards body, you end up making it a little less fair, or a little less fast, or a little less good. And so the W3C, you can see, is in turmoil, constantly being pulled in all these directions, with people trying to cut corners, people working their nuts off trying to make sure that they put extra time in so they can be fair enough to everybody, while still producing something really good as soon as possible. So if you hear somebody say, oh, W3C, it takes you so long: go look at it, look at why it takes a long time. Why? Because you get bright people in a room trying to make a good design. Yeah, that takes a while. And if you think that W3C could be better, well, go and join the Community Group which is actually revising its process document; you can go meta on it, because the idea is that it's constantly revising its own process.

Yeah, I'm sort of on that same note, although I don't want to be too happy, because there are definitely things that could be improved, and I don't want to make it sound like everything's perfect. But I think there are a couple of constraints specifically on the interoperable web that people need to keep in mind when thinking about how long things take. I know a lot of Node people are like: oh my god, those stodgy W3C guys take so long to do stuff; look at us zipping away, so amazing. There are two constraints that the web has that they do not have, which are the main reason why it takes so long. Number one, there are many interoperable browsers, and as users of the web platform, as developers on the web platform, we would really like it if all the browsers agreed on the features that were being added, so that once we want to go use them, they are actually there. You could imagine that Chrome would be able to move quicker, just like Apple can move quicker in iOS, but then when you want to go use the feature, it would not be found in your favorite browser, or your least favorite browser, probably. So there actually is a necessary process, which is unpleasant, because your least favorite browser is probably the least helpful in the committee process, but you really do want them to eventually say: yes, I agree, I will implement that. You really do want a process where Internet Explorer and Safari and Opera (some days they're still around; no, they're Blink now) and Firefox and Chrome can all get together in a room and say, yes, I will implement that. Node does not have this problem; it just implements it, or doesn't implement it. That's one. And the second thing is that in the browser, it's actually really, really hard to remove things that people have agreed to. Certainly it is possible to remove things that are experimental, that have not really made it through the process; that's fine, things change all the time. But once something is in a standard, unlike in Node, where they just rewrote their entire stream system and everyone was like "sounds good, 0.10, great", in the browser, if the browser ships a feature, it's really, really hard to remove. So in addition to getting everyone to agree, everyone has to agree and feel confident that it's a feature that they will want to support in the future, or at least one that is not hostile to future features they may want to implement. As you can imagine, if you try to put yourself in this room, it will probably take a while. That doesn't mean that there are not things we can do about the process. I agree with Tim in general that you need to think about the context that platforms that billions of people use are designed in, and then think about what the breakneck pace is for that process, what the fastest you can possibly do is. And it's not, you know, some guy sitting in a bedroom hacking Node; there's some lag that comes from serving billions of people.

Oh well, I'll disagree up front then, and then agree violently. The disagreement is on
what it means to take something away, you know, the idea that once something has been a consensus, it's difficult to take away. Right now in the Blink community we're debating whether or not we can remove WebKit's prefixed CSS properties, which were never standardized, and which are used on less than 0.03% of all page views. Is that too many? That's a really freaking huge number of page views, right? Taken in aggregate, 0.03% of the web is a freaking enormous number. Are we willing to break those people? And do we even know how bad it will be? We have no way of knowing. We're talking about what it means to actually move this process as fast as possible. So, for instance, we're talking about taking away XSLT support from HTML; we'd like to turn it off. It's a gigantic, enormous burden; it's binary size; it's hurting everybody who loads a renderer for Blink; it's slow; it's big; it's libxslt; we'd like not to have it anymore. But some content depends on it. How much is too much? These are really important questions that you get yourself into, and that's the upper bound on the rate of progress. But I will say that the one thing you can do is also go ship something, and this does not make anyone in here, I assume, very happy, but it turns out that the fastest way to change everyone's expectations about what's possible is to make it real in the world. So getting something done in standards requires hubris, and then a little bit of emo about the humility that you're going to have to take on later to maintain this stuff. You actually need to move faster than is reasonable, and hopefully not so fast that you tune other people out, and you don't know how big the mess is that you're going to leave behind; you just hope that it's not bigger than it is today.

All right, some closing notes from Dan on this topic. I just wanted to make one more point, which is something that I think we, the community, sometimes overlook: W3C has to worry about patents and IPR too, intellectual property rights, which can slow down the process on particular things. Everything that comes out of W3C should be royalty-free, should be implementable royalty-free, and we want to make sure, as much as possible, that anything that comes out of W3C you can implement without having to worry that somebody is going to come along and sue you by saying, you know, you've infringed my patent. So there's a process (I won't go into detail, because it's extremely boring) that needs to be followed in order to do that, and a lot of what goes on in W3C is in service of that, actually, and it's not necessarily visible, because people are not getting sued. Actually, there are a couple of high-profile cases recently where people have been sued, but it's very rare, and so it's one of those things you don't notice because it's not happening. If you're paying attention to the patent space in technology, you know that there's an awful lot of litigation happening around patents, around innovation and new technology these days, especially on mobile devices and in other areas like health, and that's really holding back innovation. I think one of the things that W3C does well is to try and keep that out of the web, but that takes time and energy, and sometimes things get held up because of it: because some company participating in the standard throws what's called a patent exclusion in at the last minute, and says, no, no, no, we're actually not willing to license our intellectual property under the W3C patent policy, and that stops things. I'm participating in something right now around the Push API, in one of the working groups in W3C, where we're going through exactly that process.

So, yeah: touch. Touch and pointer. Basically that process dragged on. The TL;DR of the process is that there's an email that gets sent out that says: hey, representative to the W3C, this thing is about to become, or is moving along in, the spec process; do you want to assert patent rights? Most of the time nobody says yes, but occasionally someone says yes, and occasionally people are very mean, and/or bad actors, and they wait until the process has already gone through everything, everybody's happy with the technical stuff, and then they say: boom, patent exclusion. So basically the reason why touch has taken so long to standardize is that they went through the whole process of standardizing what Safari put on the web itself, as a thing that everyone else wanted to then copy, and then Safari said: no, actually, even though we put it in a web browser, this is not a thing that we are going to allow you to standardize; so even though there's a lot of web content that we're telling people to use, sorry, you can't do it. So then Microsoft had the Pointer API, which has been moving along, but basically everybody had to start from scratch; everyone had to say, oh, I guess we're not going to be able to standardize this thing, which we didn't know about in the beginning, until Apple appeared and did the exclusion. So sometimes things take long because of patents. Yes, everybody else... I'm being impolitic.

There's a good tie-in to the next question, and I think we only have another 10 minutes or so. Do you think the open web platform is good for people, and should it be a basic human right? Those were the questions from Boas. And I think a nice tie-in to that is, if it were a human right, Omar asks, how will the web change once everyone is on it? Because currently only about one sixth of the population or so is on the web; what if the rest get connected, what will happen? Dan, you're so happy to be here.

Well, I'm on record linking... what is it, which article of the UN Declaration of Human Rights is it? The freedom of expression one. There's a UN Declaration of Human Rights, and one of the articles, Article 7 or
something like that, is roughly speaking freedom of expression, which is roughly speaking equivalent to freedom of speech, and is recognized as a universal human right by that declaration. And I think the web is linked to that. You know, I've talked about the right to link before; I know that's a bit controversial, but I think the web is an expression of, or an implementation of, the freedom of expression, and the freedom of association, and all kinds of other fundamental human rights. I don't know if that's an answer. How does it change when we have a resulting utopia? Right, yes, we finally get there, okay.

Human right? I think people who have been pushing so long for water and healthcare and vaccination to be human rights (and water was relatively recently officially added to the UN list) feel, you know, hello... A lot of people look at having the web, having a smartphone and being on the web, in African countries, as being a luxury. That is a fundamental misunderstanding, because actually, when we talk to people on the ground, they tell me about cases where people have not had water but have had some old computer and some internet connection, and have gotten a job using that internet connection, translating from English into their native language, which, in the story they told, they managed to teach themselves by having a copy of the Bible in their native language and a copy of the Bible in English. So with an internet connection, they managed to get a job, and also, by acting as a translator, managed to make a link between the English-speaking world and the people speaking that particular language. And that brings in money, and money can bring in the PVC pipe, which brings in the water. So in fact, it's not obvious that you should not think about providing people with connectivity until they've got water and until they've got vaccination; maybe, in fact, one of the important things you can do for healthcare is to get them connectivity. So one of the things the World Wide Web Foundation is doing is looking at this: it's talking to lots of other organizations and trying to figure out how all these things connect. Nowadays, of course, it's not just in developing countries; it's in developed (I always put "developed" in quotes) countries that people are now suddenly realizing: wait a moment, who can turn off my internet? Because after it got turned off in Egypt, and now with the NSA revelations, people realize that people can turn off their internet, and worse, people can spy on them, watch them use the internet, see who their friends are, wait until they reveal themselves as being an enemy of this particular administration, and then round up all their friends and put them all in jail without a murmur. In the countries where that happens, it happens using very sophisticated technology, using man-in-the-middle attacks and fake certificates produced by governments deliberately spying on their citizens. But nobody, I think, after Snowden, thinks that this is something that only happens in those bad developing countries anymore. So now there's this question about what rights we have. If you, as a policeman, try to spy on me, who's spying on you? Who will guard the guards? I think no country at the moment has come up with a good answer to that. I think most people realize that law enforcement has to have the power to be able to spy on people, but they haven't got a good story about how that power is going to be controlled, and I think we're going to have to have some really serious discussion about it in all nations. What I'd like to do is use the 25th anniversary of the web. The web is now 24 years old; the first memo I wrote about it was in 1989, so that'll be 25 years ago in March. So what a bunch of organizations hopefully will do is get everybody asking this question: what is the web we want? What sort of a web do we want? If we want to be spied on, what's the deal, you know? And I hope what will come out of that is a charter, like the folks in Brazil have been trying to put together a charter for the internet, which says: no, actually, as an internet user I have the right not to be spied on except under certain very specific circumstances, and not to be blocked except under very specific circumstances; and those will end up getting added to the list of human rights. I used to think that it was something where we could take our time; now I realize it's really, really urgent.

I'm not a utopian. I don't believe that technology necessarily makes people's lives better; technology has demonstrably made many people's lives worse. So the question needs to be formulated with a different phrasing. It needs, I think, to be put in the context of what makes it possible for human lives to flourish: what makes it possible for you to do better, by the set of values and morals that you hold dear, tomorrow versus today; to encourage yourself to be the better version of you, with regard to the people around you and your society. And that, generally speaking, is not a straight line. I think the smartest person in the room is not in the room. Clay Shirky wrote an essay for Foreign Policy, excuse me, the Foreign Affairs journal, just before the quote-unquote Twitter revolution in Egypt and the uprisings all over the Middle East, in 2011 I believe, and he basically put it this way: there are kind of two different theses about whether or not social media, or the internet, make our lives better. One of them is the sort of instrumental idea of social change via technology, and the other one is sort of the environmental idea of social change through technology. One says: this is a wedge, a thing that actually creates change of its own; the more of it you have, the more change you have. The other says: it is a thing that enables people to get what they want as soon as they can talk to each other; it is a thing that enables an environment in which people can act. It turns out that technology is a great enabler for control. Look at the Great Firewall of China; look at our ability now to censor and filter our communications with each other, and to self-censor. What are you not saying online, now that you know that the NSA is listening? It is a huge detriment to us all to have enabled the ability for governments to create the fear within ourselves not to say what it is we actually think. So what is it that we should do now? I believe that it is up to us to figure out how to create an environment in which technology creates better opportunities for people, and not to treat it like some sort of instrument by the blunt application of which we will have created a better world. It actually requires care; we have to be thoughtful.

I think that is a good wrap-up. Thanks, everyone, for attending. Please stick around and finish the pizza and the soda and the beer, and come again next time, because it will be great again. Thanks to the panelists.