Page Disappearing From Top Results

Q. (0:35) This is about pages that rank at around position 8, 10 or 12 and then suddenly disappear from the search results. Is it that Google is still trying to understand the overall website quality, or could there be some other technical issue? When the pages go live they appear in the search results, and once they get indexed they are placed at around the 8th, 10th, 12th or 15th position, so very close to or on the first page.

  • (01:12) If you’re creating something new, that’s one thing, but if you’re just updating something existing, like a page where you’re just updating the prices, then you don’t need to make a new page for every price change; you just update those prices. On the other hand, if you’re providing new information, then that feels like something that should live on a separate URL, where people can go directly to that specific piece of information. The advantage of keeping the same URL is that over time it builds up a little bit more value, and people understand that it’s actually the place to go for this piece of information. For example, if you created a new page every day for the new price, then when people search for “what is the current price for this product”, they’re going to find several of these pages, and it’s going to be unclear to us which one of them we should show. On the other hand, if you have one page where you just update the current price, then we know that, for the price, this is the page.

Traffic and Engagement and Its Impact on Rankings

Q. (6:57) We recently added a page to our site that is consistently driving significant traffic at levels we’ve never seen before, so it’s really through the roof. Can a single page with extremely high engagement and traffic have an influence on the domain as a whole? Do these signals trickle down to other pages on the site and play a positive role at the domain level?

  • (07:23) I don’t think we would use engagement as a factor. But it is the case that, usually, pages within a website are interlinked with the rest of the website, and through those internal links we do forward some of the signals across the website. So if we see that a page is a really good page and we would like to show it in search a lot, and maybe it also has various external links pointing to it, then that gives us a lot of additional context about that page, and we can forward some of that to the rest of the website. So usually that’s a good thing.

Can Core Web Vitals for Lower-Value Pages Drag the Site Down?

Q. (8:37) We prioritise our high-search-traffic pages for product improvements, like anyone else would do. Can a subset of pages with poor LCP or CLS, say only the video pages on the site, which aren’t the main, secondary or even tertiary search-traffic-driving pages, impact the rest of the site’s overall Core Web Vitals score? What I mean is: can a group of bad pages with little search traffic, in the grand scheme of things, actually drag the overall score of the site down? And do we need to prioritise those bad pages even though they aren’t high-traffic pages?

  • (09:14) Usually, that wouldn’t be a problem. There are two aspects there. On the one hand, for the Core Web Vitals we look at a sample of the traffic to those pages, which is collected through the Chrome User Experience Report functionality. I believe that’s documented on the Chrome side somewhere. It’s essentially a portion of the traffic to your website. That means that, for the most part, the things we will look at the most are the pages that get the most visits. So if you have random pages on the site that nobody ever looks at and they are slow, then those wouldn’t be dragging your site down.
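
For reference, the field data being described here can be queried directly from the Chrome UX Report (CrUX) API. Below is a minimal sketch, assuming you have a CrUX API key stored in an environment variable and that the (hypothetical) page gets enough real-user traffic to be included in the dataset:

```python
# Query the Chrome UX Report (CrUX) API for field Core Web Vitals data.
# Assumes CRUX_API_KEY holds a valid API key for the Chrome UX Report API.
import os
import requests

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"
API_KEY = os.environ["CRUX_API_KEY"]  # assumption: key provided via environment

def get_p75_metrics(page_url: str) -> dict:
    """Return the 75th-percentile LCP and CLS for a page, if CrUX has data for it."""
    payload = {
        "url": page_url,
        "metrics": ["largest_contentful_paint", "cumulative_layout_shift"],
    }
    resp = requests.post(CRUX_ENDPOINT, params={"key": API_KEY}, json=payload, timeout=10)
    resp.raise_for_status()  # pages with too little traffic come back as "data not found"
    metrics = resp.json()["record"]["metrics"]
    return {name: data["percentiles"]["p75"] for name, data in metrics.items()}

if __name__ == "__main__":
    # hypothetical low-traffic video page
    print(get_p75_metrics("https://www.example.com/videos/some-video-page/"))
```

Pages that get very few real-user visits generally have no field data at all in CrUX, which mirrors the point above: low-traffic pages contribute little to the site’s overall Core Web Vitals assessment.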

Internal Links Will Play a Role in Indexing Priority

Q. (16:39) We found that many category pages don’t get indexed any faster than other, more specific pages like product pages, even though these category pages are in a prominent place, close to the home page. We’re wondering if the theory is correct?

  • (17:09) I think the difficult part there is that “linked closer to the home page” is a general rule of thumb, but it doesn’t have to be the case, because we have a lot of systems in play to try to figure out how often we should recrawl a page. That depends on various factors: it depends on how well the page is linked within the website, but also on what we expect will happen with the page, how often we think it will change, or how often we think it will change significantly enough that it’s worthwhile to recrawl and re-index it.

Flat Hierarchy vs. URL Hierarchy

Q. (18:13) Can you comment on a flat hierarchy versus a strict kind of URL hierarchy? Because with a flat structure there is no such thing as being “closer” to the home page.

  • (18:58) We don’t care so much about the folder structure. We essentially focus on the internal linking, and in particular on the links from the home page, not the links to the home page. So from that point of view, if you have a URL structure that doesn’t have any subdirectories at all, we still see the structure based on the internal linking. A lot of the time the architecture of the website is visible in the URL structure, but it doesn’t have to be the case.
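
To make the “structure comes from internal links, not folders” point concrete, here is a purely illustrative sketch (not anything Google has published) that computes click depth from the home page over a hypothetical internal-link graph. The URLs are completely flat, yet the pages still end up at different depths based only on how they are linked:

```python
# Illustrative only: derive "click depth" from the home page using internal links,
# ignoring the URL folder structure entirely. The link graph below is hypothetical.
from collections import deque

# A site with a completely flat URL structure (no subdirectories).
internal_links = {
    "/": ["/category-shoes", "/category-bags"],
    "/category-shoes": ["/red-running-shoe", "/blue-running-shoe"],
    "/category-bags": ["/leather-tote"],
    "/red-running-shoe": [],
    "/blue-running-shoe": ["/blue-running-shoe-review"],
    "/leather-tote": [],
    "/blue-running-shoe-review": [],
}

def click_depth(links: dict[str, list[str]], home: str = "/") -> dict[str, int]:
    """Breadth-first search from the home page; depth = minimum number of clicks."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:  # first time we reach it = shortest path
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

print(click_depth(internal_links))
# e.g. {'/': 0, '/category-shoes': 1, ..., '/blue-running-shoe-review': 3}
```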

Website Quality

Q. (22:28) How do you improve the perceived quality of a whole website on Google’s side?

  • (22:53) I think we’ve given some ideas of things that you can focus on with the reviews updates that we’ve done for product reviews, and some of that might apply. But I don’t think there is one solution to improving the overall quality of any larger website, and especially on an e-commerce site I imagine that’s quite tricky. There are sometimes things like working to improve the quality of the reviews that people leave, if it’s user-generated reviews, and making sure that you’re highlighting those user reviews.

Crawl Statistics Report

Q. (25:47) We have looked at the Crawl Stats report in Search Console and have been trying to identify whether there might be some issue on the technical side with Google crawling our website. What are some of the signals or things to look for that would point us to whether Google is struggling to crawl something, or whether Googlebot is distracted by files it’s trying to crawl and index that are irrelevant to us?

  • (26:34) The crawl stats report will not be that useful in this case. You are looking at an aggregate view of the crawling of your website, and usually that makes more sense if you have something like a couple of hundred thousand pages. Then you can look at it and say that, on average, the crawling is slow. Whereas if you have a website that’s maybe around 100 pages or so, then essentially, even if the crawling is slow, we can still get those 100 pages once a day or, worst case, maybe once a week. It’s not going to be a technical issue with regards to crawling. It’s essentially more a matter of our systems understanding that the website offers something unique and valuable that we need to have indexed. So it’s less of an issue on the crawling side, and more on the indexing side.

Google Search Console Not Matching Up to Real-Time Search

Q. (30:09) On 1st March, my website’s home page was gone from the search results, completely gone, which seems like something on Google’s side. But the interesting thing is that in Google Search Console, for every keyword it was ranking for before 1st March, Search Console says I’m still ranking in the first position. Yet a significant amount of the impressions and clicks have gone, about 90%, while the rankings and the CTR are the same. For about one week I have tried everything, but nothing has worked for me. Google Search Console still says I am ranking in the first position.

  • (32:06) Try to figure out whether it’s a technical issue or not. One way you could find out more there is to use the URL Inspection tool; it might show that the page is indexed but just not ranking, at least when you search. And the thing with the performance report in Search Console, especially the position number there, is that it is based on what people actually saw.
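
If you want to script the check suggested here, Search Console’s URL Inspection API reports the indexing state Google has for a specific URL in a verified property. A minimal sketch, assuming you already have an OAuth 2.0 access token with the Search Console (webmasters.readonly) scope in an environment variable and using a hypothetical property:

```python
# Check a URL's indexing status via the Search Console URL Inspection API.
# Assumes ACCESS_TOKEN is a valid OAuth 2.0 token with the
# https://www.googleapis.com/auth/webmasters.readonly scope.
import os
import requests

INSPECT_ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"
ACCESS_TOKEN = os.environ["ACCESS_TOKEN"]  # assumption: token obtained beforehand

def inspect_url(page_url: str, property_url: str) -> dict:
    """Return the index-status result for page_url within the given Search Console property."""
    resp = requests.post(
        INSPECT_ENDPOINT,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={"inspectionUrl": page_url, "siteUrl": property_url},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["inspectionResult"]["indexStatusResult"]

if __name__ == "__main__":
    # hypothetical home page and property
    result = inspect_url("https://www.example.com/", "https://www.example.com/")
    print(result.get("verdict"), result.get("coverageState"))
```

If the home page shows as indexed here but still doesn’t appear when you search, that points away from a crawling or indexing problem, which is the distinction being drawn in the answer above.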

Service and Recipe Website

Q. (41:12) On a service website, I have different FAQs for the different city pages for my services. Do I have to create separate pages for the FAQs, or can I just add them to the same city pages?

  • (41:35) From our point of view, you can do whatever you want. If you make a separate page with the FAQ markup on it and that separate page is shown in the search results, then we can show the FAQ rich result for it. If that separate page is not shown, then we wouldn’t be able to show it. And I think the same applies to the recipe website example, where the site owner felt that most recipe websites in their country are not providing very useful information and is trying to change that by providing more useful information, to the point of adding FAQs to every recipe.
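
For context, the “FAQ markup” being discussed is FAQPage structured data, and it needs to sit on the page you want to be eligible for the rich result. A minimal sketch that generates the JSON-LD script tag for a hypothetical city service page (the questions and answers are made up):

```python
# Generate FAQPage JSON-LD for a city service page (questions/answers are hypothetical).
import json

faqs = [
    ("Do you offer same-day service in Boston?", "Yes, for bookings made before noon."),
    ("Which neighbourhoods do you cover?", "All neighbourhoods within the city limits."),
]

faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": question,
            "acceptedAnswer": {"@type": "Answer", "text": answer},
        }
        for question, answer in faqs
    ],
}

# Embed this on the city page itself, i.e. the page you want shown in the search results.
print(f'<script type="application/ld+json">{json.dumps(faq_schema, indent=2)}</script>')
```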

Shortcut Handling

Q. (48:42) On our website we have a lot of shortcuts; the English version, for example, will be “EG”. How does Google handle that?

  • (48:58) We don’t do anything special with those kinds of things. We essentially treat them as tokens on a page, and a token is essentially a word or a phrase on a page. We would probably recognise that there are known synonyms for some of these and understand that a little bit, but we wouldn’t do anything specific there, in the sense of having a glossary of what an abbreviation means and handling it in a particular way. This sometimes plays a role with regards to structured data elements that do have a visible effect: on schema.org the requirements are sometimes not the same as in Google Search. Schema.org will have some required properties and some optional properties, and it validates based on those. In Google Search we sometimes have a stricter set of requirements, which we have documented in our Help Center as well. So from our point of view, if Google wouldn’t show it in the search results, then we would not show it in the testing tool; and if the requirements are different, Google’s requirements are stricter and you don’t follow those guidelines, then we would also flag that.
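
As a hedged illustration of that gap between schema.org validation and Google’s stricter rich result requirements (as I understand the documented Product rich result guidelines, Google expects at least one of review, aggregateRating or offers), the hypothetical snippet below is valid schema.org markup but would be flagged by Google’s testing tools as missing a required field:

```python
# Hypothetical Product markup: valid against schema.org, but missing the fields
# Google's Product rich result guidelines treat as required (one of review,
# aggregateRating, or offers), so Google's testing tools would flag it.
import json

product_schema = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Blue Running Shoe",  # schema.org validation is satisfied with this alone
    "description": "Lightweight everyday running shoe.",
    # missing: "offers", "review", or "aggregateRating" -> flagged by Google,
    # even though a plain schema.org validator considers the markup fine
}

print(f'<script type="application/ld+json">{json.dumps(product_schema, indent=2)}</script>')
```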
