All right. Hi everybody, and welcome to the JavaScript SEO office hours for this week. I see that we have a few people in the Hangout with me, and we have a bunch of questions that were submitted, so we have quite a bit to go through. Is there anyone in the Hangout who wants to ask a question before I dive into the questions submitted on YouTube?

"I have a question to start with. We have one problem, since about two months. We have added two sitemap index XMLs to our Search Console: two sitemap indexes with sub-sitemaps. We have had a look on our side and we can't see any difficulty with this. The point is, how can we check with Google whether the sitemap is not good, or whether it is something else? Google Search Console shows us no trouble information, but it only reads two sitemaps from the second sitemap index, out of roughly 600 sitemaps."

You mean 600 URLs from the sitemaps; you don't have 600 sitemaps, right?

"No, no, we have 600 sitemaps."

How many pages does your website have?

"Roughly, let me not lie, over one million. We have 3D parts with configurations."

Right. Why do you have 600 sitemaps?

"These are catalogs: sitemaps with parts from the catalogs. For example, we have parts from M.I.N., and they make things like screws and engines and so on, and you can come to us and download 3D files for those parts to use in your CAD models."

So why is there a separate sitemap for just that? Why is that not part of the general sitemap? I get the point that some websites split their sitemaps, but not like that for new URLs. That doesn't sound like...

"I'm afraid we need to split because of the 50,000 URL limit per sitemap, so we need to make new sitemaps."

I know there is a maximum number of URLs; let me double-check what we can do. Sitemaps are a little outside of what I'm normally dealing with. So with the sitemap question I'm not 100 percent sure, because that's a little outside my main area of expertise. But generally speaking, if I understand correctly, the problem is that we are not seeing all of your nested sitemaps, or that we are not indexing all the URLs from the sitemaps?

"The problem is the second sitemap index. Let me explain: the second sitemap index has over 600 sitemaps in it."

Hold on. So you have a sitemap XML, and in that sitemap XML you're specifying other sitemaps, right?

"Yes."

And now we're talking about one that is specified in there, like you have...

"No, no, we have two of these. We have one sitemap index with sitemaps, and the second sitemap index has the problem that only the second and the fifth sitemap in that index get read. Everything else is ignored, and I don't understand why it reads the second and the fifth element and not the first and so on."

That should not happen, generally speaking. I would bring that up either in the webmaster forum or in the general SEO office hours with John, because he might know something that I don't know about this kind of thing.

"Okay, okay. Good."

Awesome. Sorry that I wasn't the right person to answer this one.

"No problem. Thank you."

You're welcome. Now I think I'll take one from YouTube: "We are in the process of replacing the HTML text content of our pages with JavaScript fillText commands on an HTML5 canvas."
That's not a good idea. "Are there any plans to index text and images in canvases in the near future?" I can't predict the future, and I can't make announcements of that sort. We don't have any plans ready for public announcement to index text or images in canvases.

"What are the recommendations for exposing this text content in a way the indexer can actually read it, without violating cloaking guidelines?" Honestly, for accessibility reasons to begin with, you should not do this. I know a bunch of projects that tried to do this successfully and nicely and fancily; they are all not around anymore, because it turns out to be quite hard to do. You lose what your browser does for you for free: you can't use CSS, you can't use any of the browser-given resize or responsive paradigms, you basically have to calculate everything yourself. That is a pain both for people using screen readers and for bots, and it is usually not very performant either, unless you're really using something like WebGL acceleration. I would refrain from doing it unless you have a very important use case where you absolutely have to, and then you bite the bullet that you are not getting indexed. But honestly, I would not do that.

"To avoid impacting page load times, we would like to populate this HTML strictly after the visual text is painted to the canvas. What signals can we send or receive to ensure that text is ready before the indexer snapshots the page?" There's no such thing. Just build a normal HTML page; trust me, you will be better off. So that's that: canvas and image rendering of text, I would not do that.

Now more people are joining, that's lovely. "On April 8 you advised against noscript in favor of native lazy loading or, for instance, the IntersectionObserver, which is the Google-suggested way. But the former has browser compatibility issues, and the latter will not show images for users without JavaScript."

First things first: I didn't advise strictly against using noscript. That was something that was blogged about, but it's not what I said. I said I would not recommend relying on it 100 percent and forever, because you don't know whether we might deprecate it at some point. We don't have plans to deprecate it either; I'm just saying it's a fallback solution that I would not exactly recommend as your only strategy.

Native lazy loading: yes, that is not compatible with all browsers, a bunch of browsers don't support it, but I think it is a nice progressive enhancement for the people who are using a browser that supports it, and more browser support will follow in the future, I'm pretty sure, because it's kind of an obvious thing to do. Those who are on an older browser might not get it, but they're not losing anything; they just don't get the best experience. You have to draw a line somewhere, if you understand your user base well enough to say that most of your users are on a browser that supports it, which I would argue is actually the case for most websites today, to be honest.
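As an aside, a minimal sketch of that progressive enhancement might look like this (the URL, dimensions and alt text are placeholders, not from the discussion):

```html
<!-- Browsers that support browser-level lazy loading defer the fetch until the
     image approaches the viewport; older browsers ignore the unknown attribute
     and simply load the image right away, so nothing breaks for them. -->
<img src="/images/product-photo.jpg" loading="lazy"
     width="800" height="600" alt="Product photo">
```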
You can do both: the noscript fallback on top of the JavaScript way that we can index, because that doesn't hurt you either. The noscript fallback is for users who are not running JavaScript, and they get the images nonetheless; crawlers that understand JavaScript get the lazy loading that works via, for instance, an IntersectionObserver; and browsers or crawlers that don't run JavaScript get the noscript version. I think that's fine to do. I'm just saying you want to re-evaluate it in the future; it's not a bulletproof thing forever, unless you are using the way that works for Chromium-based browsers, because I think that's what Bing uses as well: they use Edge, and Edge is now a Chromium-based browser, and we are using Chromium to render. So we see the content, and I'm pretty sure Bing will see the content too; you would have to test with their tools to make sure. That is not going away, whereas the noscript fallback, if that is your only strategy, might go away at some point in the future, "might" being the keyword. I'm not saying we are about to deprecate it; I'm saying it's not a future-proof thing for SEO purposes. It is still future-proof for your users, so go with it, but you have to evaluate what works for your user base and for your use case.

"Martin, I have a follow-up question on that; I also typed it here. Compared to the method you just described, using JavaScript and falling back to noscript, there is another method, also a progressive enhancement, the hybrid one documented on web.dev: you use native lazy loading first, but you don't put the src in directly, you put it into data-src, and then you use JavaScript to detect whether the browser supports native lazy loading; if not, you fall back to a JavaScript lazy loader. So basically native lazy loading first, falling back to JavaScript loading. My first question is, how do we specify a placeholder with this kind of fallback? If you use JavaScript lazy loading directly, the placeholder is the img src, but with this kind of progressive enhancement, how do we specify a placeholder? And second, compared to direct JavaScript lazy loading with a noscript fallback, which one is safer?"

There's no simple answer to this. You have to make up your mind: look at the different approaches and decide what works for your use case. If you know that pretty much everyone is on a browser that supports JavaScript, and the no-JavaScript case is something you can reasonably ignore, or are already ignoring in the rest of your development, then I would say using either the JavaScript approach with the IntersectionObserver or this hybrid approach that web.dev documented is a perfectly fine way to go about it. You can always also specify the noscript version; that never hurts. You should not rely on the noscript version to fix all your SEO problems for the next hundred years, but for now it works, even for Google Search, which just means you have multiple ways of giving us the image.
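For illustration only, here is a rough sketch of the kind of hybrid setup being discussed. The class names, attributes and file names are made up, and the fallback branch uses a plain IntersectionObserver rather than the specific lazy-loading library the web.dev article pulls in:

```html
<img class="lazy" loading="lazy" data-src="/images/photo.jpg" alt="Example photo">
<noscript><img src="/images/photo.jpg" alt="Example photo"></noscript>

<script>
  if ('loading' in HTMLImageElement.prototype) {
    // Browser-level lazy loading is supported: hand the real URL to the browser.
    document.querySelectorAll('img.lazy').forEach(function (img) {
      img.src = img.dataset.src;
    });
  } else {
    // Not supported: lazy-load the images with an IntersectionObserver instead.
    var observer = new IntersectionObserver(function (entries, obs) {
      entries.forEach(function (entry) {
        if (entry.isIntersecting) {
          entry.target.src = entry.target.dataset.src;
          obs.unobserve(entry.target);
        }
      });
    });
    document.querySelectorAll('img.lazy').forEach(function (img) {
      observer.observe(img);
    });
  }
</script>
```

A small low-resolution placeholder could additionally sit in src from the start, which is one possible answer to the placeholder question raised above.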
We don't really worry about that; it's not a problem to have a noscript fallback. So if you want to be super foolproof and make sure that it always displays images, you can use one of the lazy loading strategies plus the noscript fallback, for instance the approach that web.dev took. As a placeholder, I'm guessing you can just specify an image source on top of the data source; I guess that's fine, but I'm not sure about it, so you would have to ask whoever wrote that article. Actually, it is clear from the page itself: the authors, Houssein, Addy or Mathias, would be the people to ask about the specific approach they documented there. But yes, noscript will be foolproof and is a way of making sure it also works when JavaScript simply fails.

"Okay, thank you so much. Just a follow-up question, not really related: a lot of developers worry about browsers that don't support JavaScript, but how many browsers actually don't support JavaScript nowadays?"

That's a good question, and the answer is that it's the wrong question. The thing is, yes, pretty much every browser supports JavaScript, but there are privacy extensions that some people use that very deliberately disable pretty much everything unless the user specifically opts in, so those users might not see the content immediately. There can also always be transmission issues. If you're on a mobile phone, the HTML can start rendering as it arrives over the wire, but the JavaScript has to wait until everything has come down; if my JavaScript is half a megabyte and the connection is cut in the middle of it, the JavaScript won't execute and I won't see anything on my phone. That's the case you might want to prepare for, but only if you have data telling you that you need to prepare for it. If you have, for instance, telemetry on the errors you're getting, and there are errors supporting the hypothesis that JavaScript loading failed for a bunch of your users, you might want to consider that a case for noscript. But generally speaking, pretty much every browser has JavaScript and it normally works; there are just cases where it might not.

"Okay, thank you so much." You're welcome.

"When the static HTML title and the rendered JavaScript title in the metadata differ, will Google always use the rendered JavaScript metadata?" Fun fact: usually yes, but we might also just consider it a bad title if the title does not seem to be relevant to the actual page content, and then we might rewrite it. The same goes if the query suggests that we might want to rewrite the title. But generally speaking, the rendered JavaScript title should take precedence, unless something goes wrong in rendering, which is unlikely.

Do we have questions from the Hangout before I continue with the submitted questions?

"I have another one. Since HTTP/2 is supported by most browsers nowadays and most websites are using it, is it still important to limit the number of requests for external JavaScript and CSS, given that they download in parallel anyway? For example, if I have 100 external JavaScript files and they are defer or async, will that still be a problem for page speed?"

That's a good question. I think there is still a maximum amount of multiplexing that can happen over HTTP/2; I'm not sure what that limit is exactly.
But I'm not sure you can just say, okay, I'll get everything over HTTP/2 in parallel. Some browsers and some connections might still fall back to HTTP/1.1 for whatever reason, so I would be a little careful with that. I would still try to minimize the number of resources that you're loading, which is this weird thing where you have to find the balance between minimizing the number of resources and making sure you're not forcing the entire application bundle down on every request. So it's a dance between splitting the bundles reasonably and minimizing the number of external resources.

"Okay, so from what I've heard, it's still safer to try not to make too many requests."

It definitely is safer, until we are very, very surely in HTTP/2 land. Also, Googlebot will still fetch over HTTP/1.1, at least for now. "Okay, thank you." You're welcome.

"I run site audits for clients, and one point is to check JavaScript navigation, but I thought Google can follow JavaScript navigation. Are there any cases where it can't? What's the proper way to implement it so that it's not a problem?" Is the person who asked this question in this Hangout, maybe? No? Okay. Generally speaking, if you use JavaScript to generate your navigation, that should be fine, as long as you are generating proper links, and proper links are anchor tags with a proper URL in an href. If your navigation for some reason is a dropdown, and by dropdown I specifically mean something like an HTML select element, we are not going to interact with that. If your navigation is a bunch of span elements or li elements with an onclick handler, we're not going to see that either. But that's not specific to JavaScript navigation; that's just generally how links work. So make sure we see the content rendered in our testing tools: you can use Google Search Console, the Rich Results Test, or the Mobile-Friendly Test to check that the content is in the DOM, and with that you should be fine. That's pretty much it.

Oh, that's another good question that I saw earlier: "Many sites enjoy using full-page hero images that require a scroll to see the rest of the content. All data is in the DOM without scrolling, so the Google render, the screenshot you get, looks poor. Can this be an indexing handicap for the content you need to scroll to, similar to hidden content?" No, that's not a problem. If it's in the DOM, that's fine; don't worry about it. Just make sure that it's in the DOM, that's what's really important.

Do we have questions from the live audience?

"I have a question; maybe a bit of a crawling problem with Googlebot and JavaScript. About five months ago we changed our URL schema from URLs like ?q=... to /folder/ style URLs. From time to time we have the problem that Google crawls new URLs with the wrong, that is the old, URL schema; the latest case is from one week ago. Is the problem that Google caches the JavaScript too long, and if yes, how long does Googlebot cache JavaScript?"

That is one possibility. It is possible that we are seeing the old URLs because the JavaScript we used wasn't updated; we try to cache as long as possible, and we might cache longer than you specify in the cache header. The best thing you can do is version your JavaScript assets.
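To illustrate what that can look like (the file name and hash here are made up; most bundlers can emit content-hashed file names for you):

```html
<!-- A fixed file name may be served from our cache, or an intermediate cache,
     long after its contents changed: -->
<script src="/js/app.js" defer></script>

<!-- A content-hashed file name changes whenever the code changes, so the new
     URL has to be fetched instead of a stale cached copy being reused: -->
<script src="/js/app.8f31cc7e.js" defer></script>
```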
"Yes, we do this already, and we can see that it works."

Then that shouldn't be a problem. There are certain situations where we might hit the old version, but it's very unlikely. What can also happen is that we're simply seeing somewhere else on the web still linking to the old URLs.

"Okay. And the second question: if I look at such a link, it tells me the canonical is that old link, and this is totally wrong; it shouldn't get this canonical. We have set things up so that the canonical points to the new URL."

That is great; however, that's not quite how it works. The canonical that you give us is a hint. There are situations where we might decide that it's a nice hint, but we think another URL has a good reason to become the canonical, the leader in this case. We basically cluster the URLs and then choose the leader from that cluster. We say, okay, all of these URLs point to the same thing, and then we look at signals, and one of those signals is the canonical that you picked. But very often webmasters pick the canonical incorrectly, so we can't just use it and be merry with it, because we would break half the internet in Google Search that way. So we also look at other things, like how many inbound links these URLs have, how often we crawl them, and what kind of content we see on them, and the canonical can change over time. If the old URLs are dying out, I wouldn't worry too much about it, especially if your server setup is redirecting to the right URL anyway. It can be a little bit of a pain to track reports in Search Console, because reporting uses the canonical; I'm aware of that, but unfortunately I don't have an easy answer for it. If you have a sample URL and you post it in the webmaster forum and point me to the thread, or post it here in the chat or in the YouTube thread for the questions, I can take a look at it, but I can't do much about it; I can maybe at least explain why it happens. If the old URLs are disappearing from the internet, you're submitting the new URLs via sitemap, and you're making sure that pretty much everything redirects to the new URL as the target, then eventually we'll see that the old URL we picked as the canonical is no longer a good candidate, and we may switch over to the canonical that you're giving us. It shouldn't cause a problem; it is a little inconvenient, especially in reporting, I'm aware.

"Okay, thank you again." You're welcome.

So let me pick another question from here; a bunch of them aren't actually JavaScript related, but that's fine. "Hi Martin, how does Googlebot handle spoiler boxes?" What does the spoiler box look like? Ah, like this: there's a link with an href pointing to some ID, and then there's an onclick handler that does a bunch of things with jQuery. Oh, that's an easy question: we don't click on links. It looks like you're not loading different content; you're just moving things around and making them visible or invisible. So if the content is in the DOM, which is how it looks to me, this will be mostly fine. If it's hidden content, we might think it's not that important, and if the spoiler box is your main content, you do not want to hide it, neither from the user nor from the bot.
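As an illustration, a spoiler box along these lines keeps its text in the initial DOM, so it can be picked up even though Googlebot won't click the toggle; the markup and class names here are hypothetical:

```html
<button class="spoiler-toggle" aria-expanded="false">Show spoiler</button>
<div class="spoiler" hidden>
  This text is already in the DOM when the page loads; the click handler only
  toggles its visibility and does not fetch any new content.
</div>

<script>
  document.querySelector('.spoiler-toggle').addEventListener('click', function () {
    var box = document.querySelector('.spoiler');
    box.hidden = !box.hidden;
    this.setAttribute('aria-expanded', String(!box.hidden));
  });
</script>
```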
If it's hidden, I think we will still consider it; we might not highlight it in the snippet, but we will see the content if it's in the DOM. If it's in the DOM and merely hidden, we will see the content. But we won't click on the thing, so we won't ever see the spoiler box open, which should not be a problem. Right, more questions from the audience?

"Yeah, I've got one. It's a bit performance and speed related, particularly with something like an SPA: the initial load might be much slower than navigating through the routes afterwards, once it hydrates. I assume Google would consider that first load, because that's what is important from Google's perspective, rather than the speed someone experiences once they're using the app. Would that be a roughly right assumption?"

That sounds pretty much right, in the sense of: yes, sure, the moment a new user comes in, the website will be slow, and then it will be faster on subsequent navigations. That's the same kind of situation you run into if you use a service worker. If your service worker caches everything on the first load, then everything after that is fast, which is fantastic for the user, except that you still need to worry about the initial load time, because that's what a first-time user sees. And first-time users are normally the ones I worry about most from a user experience point of view. If I'm doing a user study, the first visits are what I worry about most, because those are the most volatile people: they come to your website for the first time, think "this is not loading," and then they're gone, and all the things your service worker or single-page application did in the background are lost. So for page speed you definitely want to have a close look at the first-load experience. "Awesome."

Also, that reminds me: someone recently asked me somewhere to explain a service worker in layman's terms, and I'll try; if the person who asked that is in here or sees this recording, you'll be happy. A service worker, very fundamentally, is an intermediary between your application and the network. Imagine a normal website without a service worker: when it sees an image it needs to load, it goes to the network and fetches the image from the network. There is also a cache that the browser controls, but that might or might not kick in. The website says, "there's an image I need to load," and the browser might go, "you don't have to go to the internet, I happen to have it here," which is nice, but you don't really have much control over that cache. You only control it via your server headers, and if there's a proxy in between, it might strip the headers; there's a bunch of things that can go wrong with caching. Now, a service worker is basically a little program, literally a JavaScript program, that you can specify for your site. I can say: I own example.com, and I want to install this little service worker, so that whenever the network is somehow involved, this little program gets woken up and gets to decide what to do with the situation.
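Installing such a service worker comes down to one registration call on the page; the script path here is a placeholder:

```js
// Register the service worker for this origin; from then on it can intercept
// the page's network requests.
if ('serviceWorker' in navigator) {
  navigator.serviceWorker.register('/sw.js')
    .then(function (registration) {
      console.log('Service worker registered, scope:', registration.scope);
    })
    .catch(function (error) {
      console.error('Service worker registration failed:', error);
    });
}
```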
So basically, if I'm the service worker and the website is over there, the website goes, "oh, I have an image to fetch," and that rings a little bell. I wake up: "oh, you need to fetch an image, all right," and now it's the decision and the job of the service worker to figure out what to do. It can do a bunch of things. It can just say, "oh, you need that something-something-header.png? Sure," and go off to the network and fetch it. That's the dumbest service worker possible, because it only does what the browser would do anyway. But it also has options, for instance to have its own little cache, which is basically a storage room. The service worker gets woken up, "get me that image," and it goes, "all right, cool, in that case I'll get it, make a copy, put that copy in my storage room, and here you have it." The next time the website asks for the image, the service worker has more choices, because now the image is in its storage room. It can say, "this might be old and there might be a newer version, but this is fine, it's just the company logo, here you go," right away, without even considering the network. Then it can optionally ask the network whether to update its copied version of that image, or maybe only do that once a week, or whatever: as the service worker, I have the full flexibility. You program how your application behaves.

The biggest advantage is that, without making any changes to the website itself, if my service worker stores everything it sees in its cache, and maybe when I install the service worker I even give it a list of things to put into that cache beforehand, then the first time I visit the website, sure, it has to go to the network and fetch the image, but my service worker also fetches everything on the site and puts it into the cache, and the next time I ask for something I can get it out of the cache right away. That's a lot faster than actually going to the network. And the other big advantage: if I'm offline and my service worker is programmed to check its cache before it even tries to go online, then it doesn't matter that I'm offline, because the service worker already has everything it needs in its cache, so we can function independently of the network. We can also say: the website asks for this image, I serve it from the cache and try to update it from the network; oh, I'm offline, okay, then I don't update it from the network. So you get a middleman that you control, and you control it by writing a little piece of JavaScript, which is your service worker. That's roughly it in layman's terms; I hope that made sense.

And service workers are really, really flexible, because I get to program the service worker to be whatever I need it to be. I once built a service worker where I ran into the situation that the API used POST requests only, and POST requests can't be cached, which is bad. So what I did instead was basically rewrite the front-end side of things to make GET requests, and the service worker in the middle would take the GET requests, turn them into the actual POST requests, and then cache the results, because we can cache GET requests. Well, hold on, we made a POST request, but I basically knew that if it's an API POST request I wanted to cache it anyway, even though for some API calls I couldn't. So I got to program a service worker that very specifically cached things from an API that was, by default, uncacheable, which is quite cool.
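For the general idea, here is a minimal sketch of a "storage room" service worker along the lines of that explanation: cache first, falling back to the network and keeping a copy. The cache name and the pre-cached URLs are made up:

```js
// sw.js: a hypothetical minimal cache-first service worker
var CACHE_NAME = 'site-cache-v1';

self.addEventListener('install', function (event) {
  // Put a few known assets into the storage room up front.
  event.waitUntil(
    caches.open(CACHE_NAME).then(function (cache) {
      return cache.addAll(['/', '/styles.css', '/logo.png']);
    })
  );
});

self.addEventListener('fetch', function (event) {
  // The little bell rings: decide how to answer this request.
  event.respondWith(
    caches.match(event.request).then(function (cached) {
      if (cached) {
        return cached; // already in the storage room, skip the network
      }
      return fetch(event.request).then(function (response) {
        // Keep a copy for next time (only GET requests are cached here),
        // then hand the original response back to the page.
        if (event.request.method === 'GET') {
          var copy = response.clone();
          caches.open(CACHE_NAME).then(function (cache) {
            cache.put(event.request, copy);
          });
        }
        return response;
      });
    })
  );
});
```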
If you don't have a service worker installed, or the service worker fails for some reason, you still make a request straight to the backend and get the data back; it just takes longer. So service workers are a really cool, progressive-enhancement way of adding niceness for the users whose service worker installs successfully, which is kind of nice. Any further questions from you all in the call? Oh yes, actually there's one in the chat.

"We've seen JavaScript play a major role in how websites are built, and JavaScript frameworks have adopted it and taken advantage of it. Most of the concerns now revolve around HTML and JavaScript rendering. What do you foresee as the next challenges for the upcoming years regarding the use of JavaScript on websites?"

Honestly: making things faster. JavaScript still has this really not-great property of inviting you to build a bunch of stuff that will not perform fantastically well, for two reasons. One reason is that it can't be parsed progressively. HTML you can start rendering as it arrives; you don't have to wait until the entire HTML has arrived on your end. The same with CSS: you can progressively build the CSS tree from the data you get from the network. Whereas with JavaScript, you have to wait until the entire blob has come over. And we still don't have easy-to-deploy, easy-to-build ways of doing server-side rendering, which would let you build web applications with all the flexibility that JavaScript provides while still giving users a fast experience. That is not easy to achieve right now and will hopefully get easier in the future. Then the whole question of how to split bundles still requires a bunch of manual work from developers; hopefully we can make that easier and make the default thing work. So performance is going to continue to be a big avenue for change, or playground for change, I hope.

"Do you think there will be attempts to make JavaScript render progressively, to technically make that possible?"

I don't think there is a technical possibility for that. With an imperative language, as JavaScript is, it's not really possible. It's one of the downsides: it gives you a lot of flexibility, which is fantastic, but it also means you're not able to start rendering as things happen, because you need the entire context to be able to parse and execute JavaScript. So I'm not sure we're going to see progressively rendering JavaScript. I'm not sure where WebAssembly goes, but as far as I'm aware, WebAssembly has the same problem: you can only execute instructions once you have the compiled bundle. Maybe we'll find a way of getting that to work; that would be surprising and really cool, and we're in future territory here, so I don't know, maybe none of what I just said makes sense and tomorrow someone finds a way to do the thing I thought was impossible. But tooling is definitely also a big thing. A lot of websites that are built with JavaScript these days, I would argue, don't really have to be built with JavaScript; it's just the tool.
"Just a novelty?" No, I'm not saying that it's just a novelty that people like to play with. It is a way that allows you to build a lot of things very flexibly, and then you might be tempted to use it for everything, like a novelty, or you might have real reasons for it. I remember building a very dynamic web application where there was no way around it: we did VR in the browser, and there's no way in heaven we could do that reasonably without JavaScript. So it was a dynamic web application, but it also had an e-commerce kind of system attached to it, and that needed to be built and maintained somehow. We were a very small team, so the only reasonable way forward was to say: we can't afford to run two completely different systems, we can't afford to maintain a CMS and this application, so the only reasonable way forward is to integrate this into our existing system, which is a JavaScript web application. So we ended up with a Vue.js-powered blog. But because I was aware of the situation, I said: as soon as possible we need to figure out how to at least server-side render the static bits, and find a way to keep the static things static, without dragging in a CMS or a completely different set of tools that people have to deal with to publish something on the blog. That's where server-side rendering came in, and I remember that in 2016 or 2017 server-side rendering of Vue.js applications wasn't an obvious thing; it wasn't easy, and the same was true for every other framework back in those days. And that's how you end up in situations where you know it's not great, but it's the only thing you can reasonably do at that point.

So I would hope that the tooling ecosystem explores more ways of giving you a well-lit path to success where that kind of thing is considered from the get-go, because most of the frameworks that came out roughly between 2014 and 2016 were pretty much client-side rendering first, and everything else was an afterthought. I would like to see us steering into clearer waters, where it's easier for developers to build things that render progressively by default, no matter how, and at this point that would fundamentally break down to server-side rendering, or server-side rendering plus hydration. Server-side rendering with hydration is a great thing, but it's not the default for most frameworks. What makes me hopeful for the future is that higher-level frameworks like Next.js, Nuxt.js, or Angular Universal are getting a lot of traction recently, and that more and more people are jumping on what they call the JAMstack, or onto static site generators like Gatsby or Eleventy, where you can build things with the technology you have at hand, use the same technology to build something much more dynamic, and still compile or pre-render it into HTML.

"I always like to think: we are dealing with this now, but what's going to come ahead? The web technology we use for websites has evolved; JavaScript has been around almost as long as HTML, and then we had things like ActionScript and Flash, which came in and disappeared again. So I'm just wondering what's potentially going to come afterwards,
and whether JavaScript is here to stay: is it just something we're going to deal with more natively going forward, more in the back end, more in the front end? These are just things I try to anticipate, to think about the problem ahead of time."

I think this is not a fad. I think this is a fundamental shift that has been going on for at least 15 years at this point. When we started with Ajax around 2005 or so, it became clear that the web is a fantastic application platform. It used to be a document platform; that's how it was originally built and designed, to exchange documents and make them linkable to each other. Since roughly 2004 or 2005 it has become clear that we can build applications on it, and if I look at my daily work, I notice that a lot of it happens in the browser. Google Meet, which we are using right now, is in the browser; Google Meet is not a website, maybe the landing page is, but what we are using right now is an application. Gmail is not a website, that's an application, and I do things in it that I would do in an application. Google Spreadsheets, Google Docs, so many different things; there are video editing tools; there's so much going on. Facebook, Twitter, all the social media things; social media is actually an interesting case, because you could argue it's a website, but it's also more than a website. So I think the paradigm shift from a document platform to an application platform is powered and driven by JavaScript, and it has been happening for at least the last 15 years, maybe even longer. I don't think it's going to go away anytime soon, and I don't think we will fully replace JavaScript with something else anytime soon either. So I would say: prepare yourself for JavaScript to stay around. It's just that we are learning where the limitations are, and where the real world hits the hopefulness of JavaScript developers, the attitude of "I can do this with JavaScript, I don't have to deal with servers," and then it turns out that if you do deal with the server, you get a lot of benefits from that.

I do also hope that we continue working to bring things into a more declarative state. For instance, I know we are working on things like HTML modules, and on things like declarative shadow DOM, and hopefully eventually we might get declarative web components, where we can build things with a little less reliance on JavaScript and have more of the web platform take care of it, "web platform" meaning that the browser does the thing for you. But I don't know; this can go either way. Either we find out that this really needs to happen in the web platform, in the browsers, or we find out that it should not be done in the browsers and we need to keep doing it in JavaScript land, because that's easier to deal with. But JavaScript itself is not a fad, and using JavaScript to build web applications is not a short-term fad. This will stay with us for a longer time, for sure.

"Yeah, thanks for that. I didn't want to monopolize the conversation so much." No worries, no worries.

"Can I ask just one question before you finish?" Sure. "I actually posted my question in the comments, but you didn't pick it; I posted it only about five minutes before the meeting was supposed to start,
so I'm not really sure whether it's JavaScript related, but I've been trying to get some kind of answer from anyone at Google for some time now, and I'm getting desperate. In the company where I work, one of the main portals recently had its front page drop out of the Google results. It's actually not dropped from the index, but it doesn't show in search results, and I'm not sure what the reason could be, because when I check it in Search Console it says the page is indexed, yet when I look at the search results data for this page, it shows no appearances in search results for the last 23 days."

The problem with that question, and I can see it in front of me right now, and the reason why I didn't pick it, and why I would guess no one would pick it in an office hours, is that from the question itself I don't know; it can be anything. It can be a manual action, it can be a display issue while we're actually showing it in search results, it can be anything. Without a URL I can't do much.

"Maybe I can send you the URL?"

We only take information through public channels, but if you're using the webmaster forum there will be ways of getting the URL to us to double-check whether it's a problem on our side. The problem here really is that I can't help you individually, because that would be unfair to everyone else whom I can't help; if I had to help everyone who has similar problems, I would not get anything else done. That's the reason why we're not offering private support. What I can do, if you give me a URL, is check whether it's a problem on our end that affects other sites as well, and if it's a bug on our end, we would fix it. But I cannot tell you what you need to do to fix this problem, unfortunately. The best place for this question really is the webmaster forum, because other users might have an idea, and you can get the URL privately to what we call product experts, fantastic users who dedicate their time to helping other users struggling with things like this. You can also feel free to send me the URL separately, or you can share it now and then I can maybe have a look.

"All right, okay." Maybe I can have a look at this and see. And it's only the landing page, the home page? Other pages are not affected? Okay, that's an interesting one. "https://www..." Just making sure that I copy and paste it correctly, because Hangouts changes the links. As far as I can tell it has plenty of impressions, so if your Search Console somehow shows you that this page does not have any impressions from search results, that would make me wonder whether there is a bug in Search Console.

"Well, when anybody from the company, or even from the country, types in the keyword this page was most indexed for, 'uterine', you don't see this page as the first search result; it's actually a section of the site, uterine.hr/vst, which is new, that shows up instead. So if you try to type it in, you'll see that
it won't be the first result in the Google results."

I mean, it's not the landing page that is the highest-ranking result, but that's a ranking question now; this is no longer an indexing issue or some JavaScript issue. We have the content, it does show up, and your site shows up multiple times, just with sub-pages of it rather than the front page, which might mean that we think the front page is not a good hit for this specific query.

"Well, that started happening, like I said, 23 days ago, and this was a page that was among the best ranked in the Google index, and within about two days it fell from, I don't know, 300,000 results per day to zero."

I think that is possible. But the first thing is: you say zero, and that's not something I can see here. I see you have impressions, so people are seeing it in search results and clicking on it; it can't be zero. So whatever tool you're using that says this does not have any impressions... "I'm using Google Search Console; I can send you..." Interesting, because that sounds like there is a problem with the reporting on this page rather than with the actual page, because you do have impressions on it. "Well, I don't think I have impressions for the home page exactly." You do. "I do?" Yes. This might be a reporting issue, if anything. "Is it possible to send..." I don't think you can send screenshots here, so again, let me reiterate: the best place for this is the webmaster forum. "Okay. Well, I'm happy that you gave me some answers at all." Sure, not a problem, that's what I'm here for, even though this isn't JavaScript related. It seems to be fine from what I can see in our tools: it does have impressions, it does have clicks from Search, so there's nothing that inherently prevents it from showing up.

"Could I ask a JavaScript-related question?" Sure. "Hi, you look wonderful, it's good to see you." Thank you. "We have a site that is executing a third-party script via our JavaScript. The bit of a problem is that when this script fires, it pulls in a lot of additional resources, fonts and the like, and it appears to be causing layout shifts. What would be the best way to execute these third-party scripts in a way that they're not hurting time to interactive, and perhaps to limit the amount of resources Googlebot has to dedicate to the third-party script and everything it requests?"

The best way, if the third-party script works with it, is to give it the defer attribute, so that it is only loaded at the very end of everything and can't interfere with the rest. Also move it as far down as possible on the page, so that it gets discovered as late as possible, and that should do it. The thing with defer is that if your application code somehow uses something from the third-party script, that can get tricky. "It wouldn't be there to catch the metrics; I think that's the issue we're seeing right now, they need it earlier. Would it be effective to figure out whether we could split that main bundle into the parts the analytics needs and then the actual application?" Yeah, that could be an option. You're welcome, Jamie, thanks for popping by.
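For example, a deferred third-party tag placed near the end of the body might look like this (the URL is a placeholder):

```html
<!-- With defer, the script downloads in parallel but only executes after the
     document has been parsed, so it is less likely to block rendering or to
     compete with your own critical code. -->
<script src="https://thirdparty.example.com/widget.js" defer></script>
```

Whether defer is safe depends, as noted above, on whether your own code needs that script earlier.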
Time for one last question. "I have another question, yes, about native lazy loading. Is there any specific condition that has to be met to make native lazy loading work? I see a lot of examples where web pages use native lazy loading, and when I test them on desktop it doesn't work, but when I use a mobile simulator it does. I have an example here, like this one: when I use desktop and look at the Network tab, the images start loading before DOMContentLoaded, even before the DOM is constructed, but if you use the mobile simulator in DevTools, you see that they only load when the viewport is approaching them. Do you have any idea what could prevent lazy loading from happening on the desktop site?"

I'm guessing either this is a browser bug, or they do something like specifying that they want these images preloaded, or maybe there's a service worker involved that preloads them on desktop and doesn't do that on mobile. I'm not sure; I'm not familiar with the website. But generally it should work on desktop. Or should it? Let me actually check whether I have a very simple test page for this... actually, no, I don't, not for native lazy loading; I need to try that. I would ask either the Google Chrome developers account or submit a bug in the Chromium bug tracker to get a feeling for it, and for that I would highly recommend building a very minimal reproduction test case where you're basically just loading a bunch of images that are large enough to be out of the viewport, with native lazy loading. If that doesn't work on desktop, that sounds to me like a Chromium bug.

"I see. So if that is a Chromium bug, then every native lazy loading implementation would behave like that, not just the websites I saw?"

Pretty much, but it depends on the bug. If there is something specific that needs to happen to trigger the bug, then no, not every website will be affected. If you can reproduce it with a minimal test case, it's more likely to be a browser bug; the harder it becomes to reproduce, the more likely it is to be a problem of that specific website, and maybe of others doing similar things. "That's a whole investigation; I never tried it in a lab environment, but I saw several web pages behave like that." So I would try it with the reproduction case, and if you can reproduce it on pretty much every website, then it's definitely a browser bug. "Thank you." You're welcome.

So, I am looking at "university three png.jpeg"; this is a weird website, png.jpeg, anyway. Right. In which case, I think I have mostly exhausted the YouTube questions, and I think we are a little over time, but that's fine; it's not like I have to be anywhere, it's lockdown anyway. So, in that case, thank you very much, everyone, for joining. The next session in this time slot and time zone will happen in two weeks, on the Wednesday again, and next week on Wednesday I'll do a more APAC-friendly one in a different time zone. So if you have been getting up very, very early or staying up very, very late for these office hours, there will be a better-suited version for you next week as well. Thank you so much, keep submitting questions on the YouTube threads for the upcoming office hours, and thanks for hanging out with me. Have a fantastic day, stay safe, and stay healthy.
bye bye bye bye