...how can I do? Part one, about our AdSense revenue: honestly, this is the webmaster hangout, so we may not be able to answer that part very well. But regarding traffic dropping when you change your domain: at least for the search traffic aspect, I can say that the backend you use doesn't really matter as long as Google can properly crawl it. Other traffic depends on whether people can access the site properly, whether it's mobile friendly, whether it's fast enough or taking too long to load, and so on. So for this question, you can either try the tools that already exist, like the mobile friendly test and the Lighthouse tool, to get an idea of how your website is performing, or, if you're still not convinced, you can try the webmaster forum.

Alright, so we'll start taking live questions. Go ahead, any of you.

Actually, my friend has a website. He created it maybe six or seven months back, and he runs it part time. The website has been indexed by Google, but the mobile usability count in Search Console is dropping. Why is that? The "valid" number was maybe 2,000, and it's now dropping to 150 or 140.

There are a lot of reasons for that. If 1,000 pages were valid and now only 150 are, one possibility is that when those pages were recrawled, Google had trouble accessing them. There's a URL Inspection tool in Search Console; if you use it, you can understand very clearly whether there are mobile issues or something like that. So if any of the pages drop out of the report, put them in the URL Inspection tool and check; you will usually see the issue very clearly. A very common case, even for pages that are mobile friendly, is that resources like CSS and JavaScript files are blocked, by robots.txt or by firewalls. You can see that in the tool. Without the CSS files, the elements can't be aligned properly, so the page isn't understood as mobile friendly.

There's a doubt here. If we go to the URL Inspection tool and run the live test, we get a live screenshot showing load issues and blocked resources. But if we check the same URL in the mobile friendly test tool, it comes back fine, and in the old fetch-and-render tool it crawled fine too; only the new mobile test shows a problem. I don't understand why the resources are reported as blocked.

So, generally, don't think of Googlebot as a single browser fetching everything in one go; resources can fail in many ways, and it isn't necessarily that they are actually blocked. Often it's response time: the network fetch for a resource might time out in one instance and load fine in another. If enough resources time out in some instances, the page can fail the mobile usability check, because these reports work with thresholds. So blocked or slow resources can make pages drop out of the report even when nothing on the page itself changed.
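As an aside, the robots.txt part of that check is easy to script yourself. Here is a minimal sketch in Python (the domain and resource URLs are placeholders, and Python's built-in parser is simpler than Google's own robots.txt matcher, so treat it only as a first pass before confirming in the URL Inspection tool):

```python
# Minimal sketch: check whether CSS/JS resources are disallowed for
# Googlebot by robots.txt. All URLs below are placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

resources = [
    "https://example.com/assets/site.css",
    "https://example.com/assets/app.js",
]
for url in resources:
    status = "allowed" if rp.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{status}: {url}")
```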
We'll take another question. Ashish, let's proceed to the next question.

I have some questions on this specific topic. As I understand, you were just talking about page-load issues, and we are most probably facing this issue with our website. We have been testing with both the indexed version and the live test, and a lot of resources, in terms of JS, CSS and images, are not getting loaded for the bots. We wanted to ask what exactly the issue can be, because the error is not described very well: it only says "other error" and nothing more, so we are not able to figure out why Google cannot load those specific resources.

I would generally suggest two tools. One is the Chrome User Experience Report, and alongside it the Chrome network tools, where you can check exactly what's happening with each resource and how it's loading. Two, I would run a Lighthouse evaluation and check what's happening, because as far as I understand, most of the time these are temporary errors caused by network timeouts and server load issues. Another thing to look at is host load: how much load is your server handling right now, and is it regularly going above the threshold? When that happens, Google generally tries to compensate by not crawling so many URLs. But again, try the Chrome User Experience Report and Lighthouse and check whether something is going wrong, or whether it's an issue on our side. It could be, but mostly I would say it's probably not.

The thing is, we are using Akamai, so I don't think hosting is that kind of problem. Our infra and network teams are also trying to figure out what exactly the issue is; yesterday we had a meeting with the Akamai team and they said everything is fine. And in terms of loading those resources, it is happening with all the URLs, not just one of them.

Okay. If it's happening across the site, I'm sure you've already checked robots.txt.

Yeah, there's nothing wrong there.

How about rules in your firewall or something that blocks certain IP ranges?

Not exactly for Googlebot, but yes, we are blocking robots that are spoofing as Googlebot, something like that.

Maybe you can test it: remove the block for a small section of your website and check whether it's still happening there.

Okay. Is there any other way to test, apart from the Chrome report and the Lighthouse thing?

I think those are the best options, because pretty much everything is aggregated into Lighthouse; a lot of the tools are aggregated there. You can in fact simply run the Chrome network tools yourself as a user, since that also exists within the browser. Or you can use the mobile friendly test tool: when you click the other section, it has a JavaScript console that shows errors. It's slightly vague there too, though; sometimes it may not show the exact resource. So I would say checking several of these together is better.
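On the firewall point that came up above: Google's documented way to tell real Googlebot from a crawler spoofing its user agent is a reverse DNS lookup followed by a forward confirmation. A rough sketch of that check, assuming Python and an IP address taken from your own access logs:

```python
# Verify a claimed Googlebot IP: reverse-resolve it, check the hostname,
# then forward-resolve the hostname back to the same IP.
import socket

def is_real_googlebot(ip: str) -> bool:
    try:
        host, _, _ = socket.gethostbyaddr(ip)  # reverse DNS
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward confirmation: hostname must resolve back to the same IP.
        return ip in socket.gethostbyname_ex(host)[2]
    except OSError:
        return False

# Example with a placeholder address seen in logs:
print(is_real_googlebot("66.249.66.1"))
```

A firewall rule that matches on the user agent string alone will catch spoofers and, if it misfires, real Googlebot too; the DNS round trip is what separates the two.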
And is this mobile friendly test and the new Search Console inspection tool the same thing, or are they different in terms of discoverability when showing the errors?

From Google's perspective, the tests that are run are the same. Of course, there could be some differences between, say, the old Search Console and the structured data tool; there were a lot of data issues there. But the live tests, I think, are the same.

Okay, because I see fewer blocked or non-loading resources when I check through the mobile friendly test, but when I check through Search Console, it shows me more resources as blocked or somehow not loading. Precisely, if I check through the mobile friendly test tool, it shows only one blocked resource from my own website; the others blocked by robots are some AdSense thing or similar, I assume.

I'm not sure, but I'm assuming the mobile friendly test tool may have a much higher threshold, where it waits a little longer so that resources get loaded, while the Search Console one could be taking less time, and if the resources take too long it marks them as errors. That could be one of the reasons, but I can always check.

All right, sure, thank you. We'll take other questions. Hi, how are you? I'm good, how are you? I'm okay.

My question is regarding my website, which is based on an API: all the links on my website are fetched through the API. Is there any way, other than a sitemap or fetching through Google, for Google to take all my API links and crawl them?

Generally Google does it automatically, but like you mentioned, a sitemap is the best method. You can submit a sitemap, either XML or even feeds; you don't need to submit only XML sitemaps.

Actually, while submitting the links, it only fetches a few. If I submit 5,000 active links, it fetches only 14 or 15. Could it be because my PageRank is not high?

So, if someone submits, say, 5,000 links in a sitemap like you mentioned, the crawling doesn't really depend on PageRank. It depends, one, on whether it's easy to crawl your website; obviously crawling happens in batches, it doesn't happen all at once. The other factor is crawl budget, as in how much Google thinks it can crawl your website per day without putting strain on it, because if Google crawled all your pages many times at once, maybe your website couldn't take the load.

Sorry to interrupt: there are no links on my website as such. Like I said, it's all on the API; I'm fetching all the data from another website through a script.

Your voice is breaking.

There is no link as such on my website. All the data comes through the API, and my domain is prefixed to those URLs.

I'm not understanding; can you give an example?

Like mydomain.com/product, and the API links that get fetched; all the data is from outside. There is no link visible on my website, but it is an active website: when the URL is searched, it shows on my website.

Okay, so on your home page there are no links to products, but the pages exist; there is a search box, but no links on your home page. You can still submit a sitemap; there is nothing wrong with that. Ideally, one way is to have a proper navigation structure, but if that is not possible for you, you can still submit a sitemap with the exact same links that you are pulling through the API.
You can automatically generate that sitemap.

And doing that, today I submitted 5,000 links through the sitemap, and it indexed only 14 links.

You submitted it today? So again, it does not depend on PageRank, but crawling, even for a normal website, happens in batches, so all your pages won't get crawled at once, only slowly over time. I don't have a particular timeline, but slowly, over time, your pages should get crawled.

So I have to wait for the sitemap's 5,000 links to be crawled? Because I think you can submit only 5,000 links for my website.

Yeah, you shouldn't have to wait too long. Generally a lot of people submit sitemaps and I think it's under a couple of weeks, but I'm not sure about the exact timeline, because it's different for each site, depending on how easy it is to crawl the website, host load, and all of those things. But your website should get crawled; in the next few weeks those pages should get crawled. Another thing, though: just because those pages get crawled doesn't mean they will get indexed. For indexing, like you mentioned, there are 5,000 links; some of them could be the same product with different URLs, for example the same product in different colors, and then probably only one of those URLs will be indexed, and the others will be marked as duplicates of that one canonical URL.

One thing more: is there any technical... sorry, you were muted halfway through. Can you hear me? Yes, okay, go ahead.

Just one more question. If I do not submit a sitemap or feeds through the console, and a visitor on my website sees a page through the API via the search box, does Google record that page and submit it automatically to the search engine? Is there any way like that?

That depends. The page only opens for a few minutes, right, and then the visitor goes away? Then maybe not, because there needs to be some way for Google to discover a particular page; only when Google can discover a page can it crawl it and then index it. A lot of sites have URLs created out of searches. If that search link only appears for a minute or so, then no, Google probably won't be able to discover those pages. But some websites create links through their search results: for example, they have a custom search option and create links through the search results, through which Google can go and access the actual page behind that search link. That way, yes, Google can discover new content on your website even if you don't submit it through sitemaps. But if the link is temporary, then I don't see it happening.

Thank you very much.
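Since the pages here come from an API rather than from on-page links, the sitemap itself can be generated from the same API response. A minimal sketch, assuming Python and placeholder URLs; note that the sitemaps protocol allows up to 50,000 URLs per file, so 5,000 fits comfortably in one:

```python
# Minimal sketch: build an XML sitemap from a list of product URLs,
# e.g. the same list the site pulls through its API.
from xml.etree.ElementTree import Element, SubElement, ElementTree

def write_sitemap(urls, path="sitemap.xml"):
    urlset = Element("urlset",
                     xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for u in urls:
        entry = SubElement(urlset, "url")
        SubElement(entry, "loc").text = u
    ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

# Placeholder URLs; in practice these would come from the product API.
write_sitemap([
    "https://mydomain.com/product/1001",
    "https://mydomain.com/product/1002",
])
```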
Let's take another question, and then we'll move on to the comment section. Anyone? Okay, nobody is asking, so Ashish, I'll ask.

First thing: people say it doesn't matter whether you put content on your main website or on a blog, because you basically treat them as one most of the time. We are trying to compete in the international market, and one of my competitors is posting content on their main website as well as on their blog subdomain. As a result, when I want to rank for certain keywords, there are four listings from them coming up at the very top of Google; count it as five, because the first listing comes up as a featured snippet, and then there are two listings from the website and two listings from the blog. And I assume Google somehow treats subdomains... (I have no clue what's happening; guys, can you mute yourselves if you're not asking a question?)

So the problem we are facing is that whatever we do, we are not able to get above, let's say, position four, the reason being that the first four results go to the same website, covering the same content. I had assumed Google takes into consideration that a subdomain can rank when it covers different topics, not the same topics. But here they are proactively targeting the same sort of content across all of their pages, with one strategy for the main website and another for the subdomain, and somehow Google thinks the content is different in some manner, so both rank at the top. I'm just wondering how Google looks at these kinds of situations.

Okay, I will try to summarize your question, because there was an interruption in between and I think I missed part of it. You mean to say there is a website targeting a certain query space, and a subdomain on that same website also targeting the same query space, and they are ranking at the top of that query space with both their main domain and their subdomain.

Honestly, let me take an example: a car-reviewing website that has a subdomain for, say, Mercedes, and on the main site and the subdomain they are both reviewing the latest Mercedes AMG, and the two reviews cover different aspects of the car. For example, one is looking at ride quality, and the other is looking at ride quality too but from a different angle, in terms of suspension and so on; both are targeting that "Mercedes AMG ride quality" query. I honestly don't see an issue: the content is relevant, and it's not like they are churning it out just for the sake of churning it out. In that particular instance I don't see an issue, because it's legitimate content; they have portioned out certain parts of their site to focus on certain things, and it's working out in their favor.

Obviously, there are also people who try to do this just for the sake of gaming the system. It's a very thin line that Google tries to handle through its algorithms: identify whether the content is relevant, how well it answers users' queries, and whether the content was built just for the sake of manipulating ranking or is actually beneficial to users. In some cases a site has built up a good reputation, and the content on both the main domain and the subdomains is very relevant. So I wouldn't say Google gives specific preference to subdomains or subdirectories; everything is looked at on a page-by-page basis, as far as I understand.
Most of the content is looked at on a page-by-page basis. I'm sure there are some site-level algorithms: if there's bad content across the site, sometimes there could be site-level effects, but mostly it's page by page, and if the content is good enough, I don't see why two pages from the same website, one on the main domain and one on a subdomain, shouldn't both rank. That said, Google also tries to show a diverse set of results, so there's a fine balance, but almost everything here is algorithmic; there's no manual intervention. This is very, very subjective stuff, so from your side I can only say: try to look at it from a different perspective, in terms of why they're ranking for that query and how you can do better, and not just in terms of content; you can also work on marketing and those kinds of things.

Hi, I'm just connecting to the same answer you just gave. Doesn't Google sometimes see subdomains as a different website for the same site? Not always, but sometimes subdomains are treated as subfolders of the site, and sometimes as a different version of the website altogether.

In some cases, sure. For example, blogspot.com is a website, and a lot of people have abc.blogspot.com, which is a subdomain; as far as I understand, that's common knowledge. So I want to understand: do you see an issue with how Google is seeing the subdomains?

Yeah, it has been noticed a couple of times that when sites have subdomains, there is some sort of algorithm at work where sometimes Google thinks, okay, this particular subdomain is part of this website, so let's treat it as one website, but sometimes a particular subdomain is treated as a different entity altogether, apart from the main domain. There are a couple of examples of sites, and I don't want to take names, where they had a main website offering services, and they created a subdomain for the sole purpose of getting links; they were just adding links and stuff like that, and they were ranking pretty well. My understanding, from checking backlinks in tools like Ahrefs, was that the indexation was happening in such a way that the subdomain was treated as a different entity altogether, although it passed on the same level of authority. I'm sure Google has domain-authority-like parameters as well, where the website itself is treated as an authority apart from the individual web pages, and that authority also passes through the subdomain. So is something like that going on?

Let me try to answer; I only caught parts of your question, so bear with me. As far as I know, we do not have any domain authority signal per se. There is no "domain authority" in the way that we have PageRank, or how we treat other signals like speed, HTTPS, and links; as far as I understand, there is no domain authority signal that helps a website. But on your question of how subdomains are sometimes treated as part of a website and sometimes as different: that is probably because that's what the webmaster is intending.
Take the previous example, where there is a main website that talks about cars and a subdomain that talks about specific aspects of those cars. All of it sits under one domain, one niche, and the content on the subdomain is very relevant to the content on the main domain, which is why they are probably linking to each other: if someone is writing specifically about ride quality on the subdomain, they can simply link out, saying "for more information on other aspects of this car, take a look at the main review," which happens to be on the main domain. I don't see an issue there, because there is a legitimate use case. On the other hand, there could be websites where there is no actual relation and they are doing it just for the sake of passing links, in which case, if the content is not relevant, the value of those links won't really be all that much when it comes to ranking; the content also needs to be relevant. And if you are creating subdomains just for getting external links, and those external links don't comply with the webmaster guidelines, the link schemes policy or the other guidelines we have for links, then technically it's bad. Of course, there could be cases where we don't catch everything, but just because a site has no manual action doesn't mean we are not catching its links: sometimes, algorithmically, we try not to count links that come from bad domains or spammy websites. So it's not that everything is either manual or goes unpunished. I hope that answers your question.

Yeah, it does; thank you so much.

We'll take a question from the Google Plus community. We have a question from David. His site has been hit by the "medic" update; he is trying to recover his rankings but doesn't see much improvement. He is looking at what's ranking in Search Console and trying to make sure his pages are the best resource for answers for those particular queries. I don't know anything about any core update that's upcoming; in fact, we have a lot of updates that roll out on a daily basis, but if there is a core update, it should normally be announced on the Twitter Search Liaison account. But his particular question was how to recover from the medic update. (We have a free chat... I have no clue what's happening. Okay, there's an echo, so please mute yourselves, guys.)

Regarding the medic update question, I would say it depends on your website. If you feel there are a lot of pages on your website where you can improve content, then I would look at an overhaul of the website, not just tweaking things a little here and there: titles, making better snippets, adding structured data, and so on. If that doesn't change anything, I would actually take a bigger step and ask: what are the queries going forward this year and next year, and how can I make overall, holistic changes to my website so it ranks better, instead of just making smaller tweaks? If you feel you've been hit by an algorithm, that probably means Google doesn't feel a lot of the content on your website is very relevant, so making smaller tweaks may not really move the needle much. You probably have to make larger changes across the website, for a lot more pages.
Like you mentioned in the question, you can look at the bigger theme of those queries and try to create content along those lines, so that you actually move the needle instead of making smaller tweaks.

Okay, we can take Ashish on this. You said you need to improve the quality of your overall website; is that also applicable if my website is doing well overall but one section is not doing so well? To improve that specific section, do I need to rework only that section, or do I still need to work on the better half as well?

Again, this is very, very subjective. The answer I gave previously was because he was seeing a drop across the website, across all of the content. But if you think one section of your website is lagging, say you are a banking website and the loans section is doing well but the commerce section is not, maybe you can just look at the content for that aspect, because why worry about things that are already doing well? You can still change them to target a future audience, but if you feel one section is not doing well, take it in steps: first tweak certain things for some articles where you think you are not doing well, and if that works out and you see improvement, you can put your foot down there and keep improving. If you don't see improvement, then step back and see what else you can change site-wide.

I have one more question, regarding 404 errors on a site. (Sorry, I didn't understand. Are you audible? Yes. By the way, this is also a Telugu hangout, so if you want to speak in Telugu, that's okay. Okay, I'll explain in Telugu.)

On our site we created a page, say example.com/page.html, and due to some reasons we renamed it from one .html name to another. The problem is that the first URL was already indexed, and now Search Console shows it under "not found" (404) errors, because that URL is no longer available on the server: we changed the URL and the old name was never redirected. Do these 404 errors cause an issue for the site?

That is Search Console data, and Search Console keeps that data for many days, so the entries stay in the report. Also, you can always have links pointing to a page: even if you delete the page from your server, links to it can still exist elsewhere, so the 404 will show up in your error report. But it doesn't matter, because clearly that page is intentionally not on the website anymore. The information in the report matters when a page that needs to be there doesn't come up: then you have to worry, because if you wrote the content and you're seeing a 404 for it, there is some problem. A 404 for a page you deliberately removed or renamed is expected.

But we didn't create any new page; we just renamed the existing URL, and the old URL's index entry is still there.

Yes, if you rename the URL, the address is different, so the old one gets reported. (Hi Vishnu, let me finish this one.) It's very simple: take the new address, put it in the URL Inspection tool, and check; it will tell you whether the new URL is indexed. Sometimes the Search Console report lags, too: the "last crawled" information can be two or three days old, while the change on the server happened maybe five or six hours ago. Search Console aggregates links from sitemaps, feeds, and links on pages, and for a 404 URL it also shows where the link was found, in the "linked from" section. So check that: if the invalid URL is being linked from your own site, that is a problem you should fix; if the link only exists outside, you can ignore it.
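For completeness: if the page was renamed rather than genuinely removed, the usual fix is a server-side 301 redirect from the old .html address to the new one, so the indexed URL carries over instead of piling up 404 reports. A minimal sketch using Python/Flask with placeholder paths (on Apache or Nginx the equivalent is a one-line rewrite rule):

```python
# Minimal sketch: permanently redirect a renamed page.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/old-page.html")
def old_page():
    # 301 tells Google the move is permanent, so it updates the index
    # to the new URL instead of reporting the old one as a 404.
    return redirect("/new-page.html", code=301)

if __name__ == "__main__":
    app.run()
```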
And one more thing: on mobile it shows a problem, but for the same page on desktop there is no problem. Is there anything like separate desktop indexing and mobile indexing now?

Vishnu, you had a question? Vishnu? Okay, it doesn't seem like anyone has questions; let me check if there are any questions in the community.

Yeah, I have one question about a problem I'm facing right now. I uploaded a PDF file into the public_html section, under a docs folder, using the file manager. (Am I audible? Yes, but I didn't get the question; can you repeat it?) What I did was upload a PDF into the public_html section, under a docs folder, in the file manager. I got the URL and submitted it into Search Console. Search Console detects it, but when I click "request indexing," it shows me there is a problem and says to try back later. I have been trying to get that URL indexed for a long time now, almost a month, but it gives the same error; it is indexing the other URLs of the website. The URL structure for normal pages is mywebsite.com/anything, and the PDF is at mywebsite.com/docs/filename.pdf; that PDF file is somehow not getting indexed. My hosting provider doesn't have an answer; they checked everything. I've contacted Cloudflare as well, and Cloudflare doesn't have an answer either. I'm just wondering if you know why this is happening.

So, if that page opens properly in a browser, then at least from a live perspective it is working well. If it works live but Google is unable to access or fetch it, then it's most probably an access issue, and as far as I understand, an access issue can be one of multiple things. One would be robots: that particular docs section could be blocked in the robots.txt file. Another could be that it's taking too long for Google to access the URL, in terms of timeouts: the URL is valid, but for Google, or maybe for certain other IPs, there could be a firewall blocking things or running extra checks, which makes Google take a lot more time or leaves it unable to access that content. I would start by checking that. Generally, for most crawl or access issues, the Search Console URL Inspection tool is your best bet: it gives you a clear idea of whether Google can access the page or not. If Google can't access it, then generally that's an issue either with your robots.txt file or with your hosting service provider, where something is blocking Google from reaching the content. Once Google is able to access it, if it's still not ranking well, those kinds of issues you can take up with Google; but whether Google can access it at all is mostly a question for the hosting provider, or whoever manages your servers and sets the access rules, firewalls, and IP range blocks.
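One quick way to rule the timeout case in or out is to fetch the PDF directly and time the response, ideally from a machine outside your own network, since the firewall behaviour under discussion may only apply to outside IPs. A minimal sketch, assuming Python with the third-party requests library and a placeholder URL:

```python
# Minimal sketch: time a direct fetch of the unindexed PDF to spot
# slow responses that could make a crawler give up.
import time
import requests

url = "https://mywebsite.com/docs/file.pdf"  # placeholder

start = time.monotonic()
resp = requests.get(url, timeout=30)
elapsed = time.monotonic() - start

print(resp.status_code, resp.headers.get("Content-Type"), f"{elapsed:.1f}s")
if elapsed > 5:
    print("Slow response; an impatient crawler may time out here.")
```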
Okay, okay. I see some questions in the chat section, and it looks like Ramesh has already responded there. Mr. Salotra, if you're... yeah, we have a question from Rakesh: how can we optimize content to get featured in the Google featured snippet?

With featured snippets there are no set guidelines that guarantee you'll get into the featured snippet; in fact, it's the same for regular rich results, the AMP section, and so on. But as long as your content is good and you have applied the right markup, you have a chance. You can test your markup in the Structured Data Testing Tool. Make sure the markup is valid, and make sure it's supported by Google, because on schema.org you will see vocabulary for hundreds of different types of markup, but Google only supports a limited set of rich results. And at the same time your content must be good and relevant, because a featured snippet is pretty much Google telling users "this is a great answer," so the content itself has to be very good. As long as you meet those three criteria, you're giving yourself the best shot.

I have a response: one of my pages got the featured snippet position, but I also have similar pages with the same markup.

Again, like I said, it's not a guarantee; it's also about the content itself. Even if the technical aspects are great, if Google doesn't have enough confidence that your page is a great answer for that particular query, maybe it won't be shown as a featured snippet.
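To make the "valid and supported markup" advice concrete, here is a sketch that emits JSON-LD for one of the rich-result types Google documents (FAQPage, chosen purely as an example; the question and answer text are placeholders). The printed output is what would sit in a script tag of type application/ld+json on the page and then go through the testing tool. Note that rich-result markup helps with rich results generally; it is not a stated requirement for featured snippets specifically.

```python
# Minimal sketch: emit FAQPage JSON-LD for embedding in a page's HTML.
import json

markup = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [{
        "@type": "Question",
        "name": "How long does delivery take?",  # placeholder copy
        "acceptedAnswer": {
            "@type": "Answer",
            "text": "Orders usually arrive within 3 to 5 business days.",
        },
    }],
}

print(json.dumps(markup, indent=2))
```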
Hi Ashish, I have one question. (Hi, yes; what's your name? You can speak in Telugu.) My site got a manual action in July. After that we cleaned up the backlinks, and the manual action was revoked in September. While the manual action was in place, the rankings didn't go down, but from December the rankings started to drop. What could be the reason? On the server side everything looks fine; the pages are growing, and even the traffic has only dropped a little. But the main keyword rankings have all dropped.

Obviously, if you study the website properly you can come up with a specific idea, but I can give general suggestions. What was the reason for the manual action: hacked content?

No, no, it was backlinks. The manual action message said the backlinks were bad, so we removed the backlinks, and then the manual action was revoked.

Generally, when a manual action is about backlinks, a ranking drop will definitely be seen afterwards, because those backlinks were propping you up temporarily; once they no longer count, the rankings drop. So if you saw drops starting in December, it may have been a gradual settling: the high traffic you had before came from those backlinks, and technically you shouldn't have kept that high traffic, so what you see now may be your actual traffic. That's a guesstimate; I'm assuming there is no problem with the traffic itself and no problem with the content, but the traffic that the bad backlinks brought goes away with them, and that loss is completely normal.

The other thing you can do is check whether competitors are performing better for the same queries. Look at the queries and your traffic in the Search Console performance report: in the performance metrics, what are the queries, and what are the impressions and clicks?

The main keywords have dropped; for example, financial and loan keywords, the main keywords, have dropped, and we're trying to improve. Impressions have dropped and clicks have dropped, but compared to impressions, clicks have dropped more.

If impressions have dropped, your positions have dropped, and users aren't finding you, so clicks drop with them. Ranking fluctuations are very common; if there are no other problems with the site, fluctuations are very common, because people keep searching and Google keeps looking for better, more relevant content. So rather than worrying about the dropped traffic, do some general testing: see which query spaces you are ranking low in and what content would rank well there, and make your content better. And of course there are other things too, like general branding.

Can user experience, the usability of the website, be a big factor? When we try to compete with others, I can say technology-wise we are good, but just looking at our user experience, the layouts and all those things, I feel we're lagging; or maybe it's our content. Is usability something that makes a site more valuable in terms of rankings, purely for search ranking?

User experience, just in terms of how the design is and so on: it matters, but I wouldn't say too much, because at the end of the day, as long as it's good enough for most people, that's fine. Take page speed: if there are five websites that all load in under a second, at different speeds, I wouldn't say the fastest-loading one has a big advantage over a page that also loads in under a second, because all of them are loading pretty fast; at that point it's not really a big ranking boost. In a similar way, I wouldn't put too much search-ranking, SEO-related weight on usability. It's definitely very important, but it only comes after the main things, like content and people linking to you.

Okay. The major reason for asking is that the primary content of one page is loading through Ajax, so I assume maybe it's not visible in the first rendered view.

Then it's less of a content thing and more of a technical thing, and that is very, very important, because the first step is having Google properly see and understand your content. In fact, just yesterday our Googler Martin released a blog post on dynamic rendering and how you can make sure Google and other search engines access your content properly, so that the search engines understand all the content on your website well enough and you're not losing out on the technical side. Please go through that blog post to make sure you don't have any issues there; once Google understands your content properly, everything else falls under ranking. I don't know if that answers your question.

I have already gone through that specific blog post. Martin is mostly talking about full JS-framework websites, but we mostly have server-side rendering, with part of the content coming from the client side, so I don't know if that approach applies.

He's given multiple options; you can also use a renderer like Rendertron to pre-render, and so on. You can pick whichever option is best for you, because there are multiple options, and apart from pure client-side rendering, one of the others should work best for you.

All right, thank you.
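As a sketch of the dynamic rendering idea from that blog post: the server spots crawler user agents and routes them to a pre-rendered, fully populated HTML snapshot, while regular visitors get the normal Ajax-driven page. This is an illustrative Python/Flask outline, with render_static_snapshot() as a stand-in stub for whatever pre-renderer (Rendertron, Puppeteer, a cache) is actually plugged in:

```python
# Minimal sketch of dynamic rendering: serve pre-rendered HTML to known
# crawlers, and the normal client-rendered page to everyone else.
from flask import Flask, request

app = Flask(__name__)

BOT_MARKERS = ("googlebot", "bingbot", "baiduspider", "yandexbot")

def wants_prerender(user_agent: str) -> bool:
    ua = user_agent.lower()
    return any(marker in ua for marker in BOT_MARKERS)

def render_static_snapshot(product_id: str) -> str:
    # Stand-in for a real pre-renderer such as Rendertron: in practice this
    # would return cached HTML with the Ajax content already rendered in.
    return f"<html><body><h1>Product {product_id}</h1>...</body></html>"

@app.route("/product/<product_id>")
def product(product_id):
    if wants_prerender(request.headers.get("User-Agent", "")):
        return render_static_snapshot(product_id)
    # Regular visitors get the shell page whose primary content the
    # browser then fills in via Ajax.
    return "<html><body><div id='app'>loading...</div></body></html>"
```

Combining this with the reverse DNS check from earlier avoids serving snapshots to crawlers that merely claim to be Googlebot.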
And is there any way to test whether Google, after we make all the changes, crawls and renders all the content, including the parts coming from JS?

The mobile friendly test tool is a great way to check whether your JavaScript files are being properly processed, because both the screenshot and the console that you see in the mobile friendly tool show whether Google is rendering your website properly or not.

All right, so I assume I'd have to check the rendered HTML part, basically, because the screenshot only shows me the first fold, not the whole page.

Yeah, it only shows you the first fold. I think the URL Inspection tool is also a good option; can you check that? I haven't used it for that particular purpose, but I'm fairly sure it shows the rendering aspect too. That's again first fold only, I believe; I'll check back on this. But even if the screenshot shows only the first fold, you can still look at the JavaScript console to see whether there are files that are not loading or rendering the way you want.

Okay, thank you.

Guys, we've run well out of time, so thank you for joining this hangout, and I hope to do more of these bilingual hangouts going forward. Thank you.