April 2022 - Premium eCommerce marketing services

WebMaster Hangout – Live from March 18, 2022

Changing Page Title and Descriptions

Q. (1:01) A few days ago we optimised a page title and description, and after that we saw the title and description change when using a site: search in Google. After a while, the title and description reverted to the original ones. In this case, does Google think the former title and description are better than the ones we optimised, or could there be other possible reasons for this?

  • (1:36) I would not necessarily assume that if Google changes it to something, then Google thinks that title is better and you should use it too. It is more that our systems have selected a different title. And usually, the title is picked on a per-page basis.
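For context, the title and description being discussed live in the page's head section. A minimal sketch (the product and domain are invented for illustration, and as noted above, Google may still rewrite the title per query):

```html
<head>
  <!-- Invented example; Google may still choose a different title in results -->
  <title>Handmade Leather Wallets | Example Shop</title>
  <meta name="description"
        content="Browse handmade leather wallets with free shipping on orders over $50.">
</head>
```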

Best Practice to Have Content Display

Q. (4:29) Every franchisee has a microsite under the domain, like domain/branch1, with their corresponding URLs. What would be the best practice to have the content displayed well? It is really hard to have 100 branches and 100 different sets of content. Is there a problem from the perspective of Google? What does Google think about it?

  • (05:48) In our guidelines, doorway pages are usually more about essentially swapping out one word on a page, where the rest of the content is the same: you have a specific service, you create pages for every city nearby, or every street or every region nearby, and you end up with hundreds of these pages that are essentially just driving traffic to the same business. With franchises, that's probably a lesser issue, because these are essentially separate businesses. It is important that they really are separate businesses, though.

Decrease in Getting Articles Indexed

Q. (09:25) I work in a content creation agency, and we work with different blogs. The agency has been running for more than 10 years, and we've never had problems with getting our articles indexed. But for the last six months, all of the blogs, especially the new ones, are having big problems getting their articles indexed. Recently, we've been working on organic traffic, and if we're not getting our articles indexed, then we can't work on that or optimise any content. I was just wondering if there's any specific reason, maybe there's been some sort of change, or if we just have to wait and see what happens? Because we have done a lot of technical revisions of the sitemaps and everything, but we have just noticed a decrease in articles getting indexed.

  • (10:14) It is hard to say in general without being able to look at some of these example sites. If you have examples of pages that are not super fresh, a couple of weeks old and still not getting indexed, I would love to get some of those examples. In general, though, with a lot of these questions that come up around "my content is not being indexed", from a technical point of view a lot of these sites are really good: they're doing the right things with sitemaps, and the internal linking is set up well. It is more that on our side, from a quality point of view, it feels like the bar is slowly going up, and more people are creating content that is okay technically but not from a quality point of view.

SEO Tool in Duplicate Content

Q. (15:40) We have had a news publishing website since 2009. We post articles related to recipes, health, fitness and things like that. We have articles that our SEO tool flags as duplicate content: we tend to recreate another version of a recipe, or tweak it around, maybe sugar-free or salt-free, and everything related to that. What the SEO tool suggested is to remove them, because none of the duplicate content is being ranked or indexed by Google. What is the solution for this?

  • (16:53) SEO tools make assumptions with regard to what Google will do and what will happen. Sometimes those assumptions are okay, and sometimes they are not correct. This kind of feedback from SEO tools is useful in that it is still something you can take a look at and make a judgment call on. You might choose to say, I'm ignoring the tool in this case, and I'm following its guidance in a different case. Even if a very popular SEO tool tells you that you should disavow these links and delete this content, always use your judgment first before blindly following it.

Ranking Service Pages to Get More Leads

Q. (23:16) I'm currently working on a website that is based in India, and we get leads from all over India. We can provide services all over the world, but first I want to rank my service pages to get more leads from the USA. Can you help me understand what I can do to rank above my competitors?

  • (24:16) If you’re going from a country-specific website to something more global then it helps to make sure that from a technical point of view, your website is available for that. Using a generic top-level domain instead of a country-specific top-level domain can help. Any time when you go from a country-level website to a global-level website, the competition changes completely.

Getting the Best Approach for Client Credits

Q. (27:18) We are working with an eCommerce client, and it is an open-source online store management system. Their blog is WordPress. The main URL is example.com, whereas the blog is blog.example.com. What would be the best approach for this client so that the main site gets credit from the blog?

  • (28:06) Some SEOs have very strong opinions about subdomains and subdirectories and would probably want to put this all on the same domain; from our point of view, you could do it like this as well. This setup would be fine. If you did want to move it into the same domain, then practically speaking, that usually means you have to do some technical tricks, where essentially you proxy one subdomain as a subdirectory somewhere else, and you have to make sure that all of the links work.

Describing Products Using Alt Text

Q. (37:22) Should I write alt text for products for an e-Commerce site since there is already text beneath that describes the product?

  • (37:35) The alt text is meant as a replacement or description of the image. That is particularly useful for people who cannot look at individual images, who use things like screen readers. It also helps search engines to understand what the image is about.
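A minimal sketch of what that looks like in practice (the product, file name and wording are invented): the alt text stands in for the image itself, while the surrounding text describes the product.

```html
<!-- Invented product markup: the alt text describes the image,
     the paragraph below describes the product -->
<img src="/images/wallet-brown.jpg"
     alt="Brown leather bi-fold wallet shown open, with six card slots">
<p>Our best-selling bi-fold wallet, made from full-grain leather.</p>
```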

Using Alt Tags for Longer Text With Decoration

Q.  (40:07) Would you use alt tags for images that use only decoration within the longer text? How would you treat those mostly stock images?

  • (40:28) From an SEO point of view, the alt text helps us to understand the image better for image search. If you do not care about this image for image search then that is fine. You would focus more on the accessibility aspect there rather than the pure SEO aspect. It is not the case that we would say a textual webpage has more value. It is just well, we see the alt text and we apply it to the image.

Added Links in an Underscore Cell

Q. (44:09) One of my technical team members has added links with the attribute target="_blank". How are Google bots able to crawl these? Do they understand that there are links added to this particular node?

  • (46:15) I think we just ignore it, because it matters more from a browser point of view what happens. The target attribute refers to how that link should be opened: if you have a frame on a page, then it will open that link in that frame. What we focus on is the href value. If there is an href value given there, then essentially that link goes to that page, and we ignore the target attribute.
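A hypothetical link of the kind being asked about (URL and anchor text invented): Google follows the href value, while the target attribute only tells the browser where to open the link.

```html
<!-- Invented example: Googlebot follows the href; target="_blank" only
     tells the browser to open the link in a new tab or window -->
<a href="https://example.com/guide" target="_blank" rel="noopener">Read the guide</a>
```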

Home Page Disappearance and Ranking

Q. (50:33) For some queries, only my home page is getting ranked, and all of the other pages are ignored and suddenly disappear from the results. How is Google treating this?

  • (51:19) Sometimes we think the home page is a better match for that specific query and it could be that some of the information is on the home page itself. The more detailed page is seen as not such a good page from our point of view. It is something where you can experiment with removing some of that information from the home page.

Sign up for our Webmaster Hangouts today!

GET IN CONTACT TODAY AND LET OUR TEAM OF ECOMMERCE SPECIALISTS SET YOU ON THE ROAD TO ACHIEVING ELITE DIGITAL EXPERIENCES AND GROWTH 

WebMaster Hangout – Live from March 11, 2022

Page Disappearing From Top Results

Q. (0:35) It is related to pages which rank at position 8, 10 or 12 and suddenly disappear from the search results. Is it that Google is still trying to understand the overall website quality, or could there be some other technical issue? Because when the pages go live, they appear in the results, and when they get indexed they are placed at around position 8, 10, 12 or 15, so very close or near to the first page.


Traffic and Engagement and Its Impact on Rankings

Q. (6:57) We recently added a page for our site that is consistently driving significant traffic at levels we’ve never seen before. So it’s really through the roof. Can a single page with extremely high engagement and traffic have an influence on the domains as a whole? Do these signals trickle to other pages on the site and play a positive role at that domain level?

  • (07:23) I don't think we would use engagement as a factor. But it is the case that, usually, pages within a website are interlinked with the rest of the website. And through those internal links, across the website, we do forward some of the signals. So if we see that a page is a really good page and we would like to show it in search a lot, maybe it also has various external links going there, then that gives us a lot of additional context about that page. And we can forward some of that to the rest of the website. So usually that's a good thing.

Core Web Vitals for Lower Value Pages Drag the Site Down?

Q. (8:37) We prioritise our high-search pages for product improvements like anyone else would do. Can a subset of pages with poor LCP or CLS, say only the video pages on the site that aren't the main or secondary or even tertiary search-traffic-driving pages, impact the rest of the site's overall Core Web Vitals score? What I mean by this is, can a group of bad pages with little search traffic in the grand scheme of things actually drag the overall score of the site down? And do we need to prioritise those bad pages even though they aren't high-traffic pages?

  • (09:14) Usually, that wouldn't be a problem. There are two aspects there. On the one hand, for the Core Web Vitals, we look at a sample of the traffic to those pages, which is done through, I don't know, the Chrome User Experience Report functionality. I believe that's documented on the Chrome side somewhere. It's essentially a portion of the traffic to your website. That means that, for the most part, the things that we will look at the most are the pages that get the most visits. So if you have random pages on the site that nobody ever looks at and they are slow, then those wouldn't be dragging your site down.

Internal Links Will Play a Role in Indexing Priority

Q. (16:39) We found that many category pages didn't get indexed faster than other specific pages like product pages, even though these category pages are in a prominent place, close to the home page. I'm wondering if the theory is correct?

  • (17:09) I think a difficult part there is that "linked closer to the home page" is a general rule of thumb, but it doesn't have to be the case. We have a lot of systems in play to try to figure out how often we should recrawl a page, and that depends on various factors. It depends on how well it's linked within the website, but also on what we expect will happen with this page: how often do we think it will change, or how often do we think it will change significantly enough that it's worthwhile to recrawl and re-index it.

Flat Hierarchy vs. URL Hierarchy

Q. (18:13) Can you comment on a flat hierarchy versus a strict URL hierarchy? Because with a flat structure, there is no such thing as "closer to the home page".

  • (18:58) We don't care so much about the folder structure. We essentially focus on the internal linking, and specifically on links from the home page, not links to the home page. So from that point of view, if you have a URL structure that doesn't have any subdirectories at all, we still see the structure based on the internal linking. A lot of times, the architecture of the website is visible in the URL structure, but it doesn't have to be the case.

Website Quality

Q. (22:28) How do you improve the perceived quality of a whole website on Google's side?

  • (22:53) I think we've given some types of things that you can focus on with the reviews updates that we've done for product reviews. Some of that might apply. But I don't think there is one solution to improving the overall quality of any larger website, and especially on an e-commerce site, I imagine that's quite tricky. There are sometimes things like working to improve the quality of the reviews that people leave, if it's user-generated reviews, for example by making sure that you're highlighting the best user reviews.

Crawl Statistics Report

Q. (25:47) We have looked at the crawl stats reports on the Search Console and have been trying to identify if there might be some issue on the technical side with Google crawling our website. What are some of the signals or things to identify that will point us to if Google is struggling to crawl something or if Googlebot is distracted by irrelevant files and things that it’s trying to index that are irrelevant to us?

  • (26:34) Crawl reports will not be useful in that case. You are looking at an aggregate view of the crawling of your website. And usually, that makes more sense if you have something like, I don’t know, a couple hundred thousand pages. Then you can look at that and say, on average, the crawling is slow. Whereas if you have a website that’s maybe around 100 pages or so, then essentially, even if the crawling is slow, then those 100 pages, we can still get that, like once a day, worst case, maybe once a week. It’s not going to be a technical issue with regards to crawling. It’s essentially more a matter of understanding that the website offers something unique and valuable that we need to have indexed. So less of an issue about the crawling side, and more about the indexing side.

Google Search Console Not Matching Up to Real Time Search

Q. (30:09) On 1st March, my website's home page was completely gone from the search results. The home page was just not in the search results. But the interesting thing is, in Google Search Console, for every keyword that was ranking before 1st March, Search Console says I'm still ranking in the first position. But a significant amount of impressions and clicks is gone, about 90%, while rankings and the CTR are the same. For about one week I tried everything, but nothing worked for me. Google Search Console is still saying I am ranking in the first position.

  • (32:06) Try to figure out whether it's a technical issue or not. One way you could find out more there is to use the URL Inspection tool. It may be that the page is indexed but not ranking, at least when you search. And the thing with the performance report in Search Console, especially the position number there, is that it is based on what people actually saw.

Service and Recipe Website

Q. (41:12) On the service website, I have different FAQs based on different city pages for my services. Do I have to create separate pages for the FAQs, or can I just add them to the same city pages? Or, from your point of view, can we do whatever we want?

  • (41:35) If you make a separate page with the FAQ markup on it and that separate page is shown in the search results, then we can show the FAQ rich result for it. If that separate page is not shown, then we wouldn't be able to show it. And I think the same applies to the recipe website example, where the site owner feels that most recipe websites in their country are not providing very useful information and wants to change that by providing more useful information, down to adding FAQs to every recipe.
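As a sketch of what "FAQ markup" on such a page could look like, here is a minimal JSON-LD FAQPage snippet; the city, question and answer text are invented for illustration:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "Do you offer this service in Springfield?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Yes, we cover Springfield and the surrounding area."
    }
  }]
}
</script>
```

Per the answer above, the rich result can only appear for the page that actually carries this markup and is shown in the results.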

Shortcut Handlings

Q.  (48:42) On our website, we have a lot of shortcuts like the English version will be EG, for example. How does Google handle that?

  • (48:58) We don't do anything special with those kinds of things. We essentially treat them as tokens on a page, and a token is essentially a word or a phrase on a page. We would probably recognise that there are known synonyms for some of these and understand that a little bit, but we wouldn't do anything specific there, in the sense of having a glossary of what an abbreviation means and handling it in a specific way. This sometimes plays a role with regard to elements that do have a visible effect: on schema.org, the requirements are sometimes not the same as in Google Search. Schema.org has some required properties and some optional properties, and its validator checks based on those. In Google Search we sometimes have a stricter set of requirements, which we have documented in our Help Center as well. So from our point of view, if Google doesn't show an element in the search results, then we would not show it in the testing tool; and if the requirements are different, Google's requirements are stricter, and you don't follow those guidelines, then we would also flag that.


WebMaster Hangout – Live from March 04, 2022

Content Project

Q. If I have content that is updated daily, like cryptocurrency rates or similar, what is the best way to handle it? Is it to create a new article every day, or to update existing articles daily so the URL stays the same?

  • (01:12) If you’re creating something new or if you are just updating something existing like if you have a page where you’re just updating the prices then you don’t need to make a new page for every price change that you make. You just update those prices. On the other hand, if you’re providing information then that feels like something that should live on a separate URL where people can go directly to that specific place of information. The advantage of keeping the same URL is that over time it builds a little bit more value and then people understand that it’s actually the place to go for this piece of information. For example, every day for a new price you would create a new page then if people search for, what is the current price for this product, then they’re going to find some of these, but it’s going to be unclear to us which one of them they should show. On the other hand, if you have one page, where you just update the current price, then we know, for the price, this is the page.

Heading Hierarchy

Q. Can Google separate the heading from the main content?

  • (05:44) Usually, that should just work now. In HTML5 there are also elements that you can use for the header and footer, and within those elements you can have your headings again. So even from a semantic point of view, you can set that up properly. It should be possible, and I don't think it would cause any problems on Google's side either way. What usually happens with these kinds of common elements is that we can recognise them across the site, because they're more or less repeated across the site, and then we can try to de-emphasise them when it comes to Search, because we realise it's the same text on all of these pages. We will essentially just pick one of these pages to rank for the text in the footer section.
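The HTML5 elements mentioned can be sketched like this (site name, headings and footer text are invented placeholders):

```html
<body>
  <header>
    <!-- site-wide header, repeated across pages -->
    <a href="/">Example Shop</a>
  </header>
  <main>
    <h1>Page-specific heading</h1>
    <p>Main content of the page.</p>
  </main>
  <footer>
    <!-- repeated footer text; per the answer above, search engines can
         recognise this as boilerplate and de-emphasise it -->
    <p>About us | Contact | Terms</p>
  </footer>
</body>
```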

Testing Tools

Q. Google Search Console is throwing an error about a required structured data element, but when I check the same page on validator.schema.org, it does not show any warnings or errors. So the first question is: is that the right tool to check the AMP implementation of a web page? And if so, there is a contradiction. What should be the next step?

  • (08:12) The testing tools are for slightly different purposes; that's probably why you're seeing that difference. The testing tool on schema.org is more about understanding schema.org markup in general, and it validates based on the requirements that schema.org has. The testing tool in Search Console is focused purely on what we can pull out of the structured data and use to show in a search feature; it is really focused on the Search part of that story, and within Search we only use a small part of the schema.org markup, sometimes with slightly different requirements, in that maybe we require a specific element more than the base schema.org markup would.

Internal Links in the Footer Section

Q. We have internal links in the footer section, generated by a plug-in. Is that likely to be seen as problematic by Google because the links are not contextual?

  • (09:49) For the most part, that wouldn't cause any problems. I would see these more as just links on those pages; they're normal internal links. The thing I would watch out for: if you have a larger website and essentially every page is linked with every other page, there's no real context there, so it's hard for us to understand what the overall structure is and which of these pages are more important. Because if you're linking to everything, then it's like everything is not important. Whether or not the links are in the footer is, from my point of view, irrelevant, and if they're generated by a plug-in or added manually, I don't think that matters either.

Page Speed

Q. We did one experiment on our website, where we improved the page speed: we moved from somewhere close to 30 to around 45 or 50 on our [INAUDIBLE] PageSpeed Insights score. And in the next couple of days, we saw a massive improvement in our ranks. So I just wonder, is there a possibility that this correlation is real, or could there be other external factors that are impacting things, given that we saw a jump so quickly, within two days?

  • (11:05) The speed aspect is something that we have to pick up through the Chrome User Experience Report data and that takes a while to collect and to be aggregated. That’s not something you would see within a couple of days.

Sitemap

Q. When Googlebot crawls the sitemaps on our server, does it first crawl the sitemaps submitted in the Search Console backend, or does it go directly to our server?

  • (12:11) We don't crawl them all at the same time, and it's not that there's a specific order. It's more that for individual sitemap files we try to figure out how often they change and how often they give us useful information, and based on that we will crawl individual files at different rates. So it can happen that one sitemap in your Search Console account is crawled very frequently, one that you submit directly is also crawled frequently, and maybe another one in Search Console is crawled very infrequently. It doesn't depend on where you submit it.

Not Showing Up for Branded Keyword

Q. We've got two sites: a global site and an Australian site. The Australian site had been ranking for our branded term in the number one position for two or three months, and suddenly, in the last week, it was replaced by the global .com website for a couple of days. I just wanted to understand why that could be the case.

  • (18:50) It's hard to know without looking at the sites. But if these are two sites that are essentially part of the same bigger group, we can switch between which one we would show for a ranking like that. With hreflang annotations, you can give us a little bit more information on how you want us to treat that pair of pages, and geotargeting can help a little bit. But it can still happen that we show a global version of a page in a country where you actually also have a local version, perhaps just because the global version is so much stronger than the local version.
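The annotations mentioned are presumably hreflang link elements. A hypothetical pair for a global and an Australian version might look like this (both URLs invented):

```html
<!-- Invented example: each version lists itself and its alternate,
     telling Google which page targets which audience -->
<link rel="alternate" hreflang="en-au" href="https://example.com.au/page">
<link rel="alternate" hreflang="en"    href="https://example.com/page">
```

Both pages would carry the same pair of annotations so the relationship is declared in both directions.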

Category Pages Update

Q. We run an eCommerce website and we are now at a stage where we want to make major updates to our category pages. In one draft we want to get rid of the product listings, meaning the listing with the faceted search where you can filter for the products you are looking for. My question is: if we remove the whole product listing from category pages, would we be at a disadvantage in the ranking? First of all, our competitors have these kinds of product listings; second, my guess is this is such an established element on eCommerce pages that users expect something like it.

  • (23:05) From an SEO point of view, there are different things you would want to watch out for, which you probably will: mainly that we can still find all of the individual products, and that we have clean links to them. But if you're just redesigning this kind of category page and making it look more like an informational page, I wouldn't expect any problems with that. From Google's point of view, you're just changing the design.

Breadcrumb Setup

Q. If you have structured data for a breadcrumb, is internal linking still important for SEO?

  • (25:27) It’s something where internal linking is supercritical for SEO. I think it’s one of the biggest things that you can do on a website to kind of guide Google and guide visitors to the pages that you think are important. And what you think is important is totally up to you. You can decide to make things important where you earn the most money, or you can make things important where you’re the strongest competitor, or maybe you’re the weakest competitor. With internal linking, you can kind of focus things on those directions and those parts of your site. And that’s not something that you can just replace with structured data.
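For reference, breadcrumb structured data only describes the trail; as the answer stresses, it does not replace the crawlable links themselves. A minimal sketch (all URLs and names invented):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Wallets",
      "item": "https://example.com/wallets/" }
  ]
}
</script>
<!-- The crawlable internal links still need to exist on the page: -->
<nav><a href="/">Home</a> › <a href="/wallets/">Wallets</a></nav>
```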

Multiple Product Schemas

Q. For product listings, can we implement multiple Product schemas on the product listing page?

  • (30:01) From Google's point of view, I don't think you should be doing that, at least as of the last time I checked the policies around structured data, because for Product structured data we really want it to apply to the primary element of the page. And if you have multiple products on a page, no single one of them is the primary element of the page. So you should not use multiple Product structured data elements on a category page or something like that.
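By contrast, on a product detail page a single Product block describing the page's primary element is the intended pattern. A minimal sketch with invented values:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Brown leather bi-fold wallet",
  "image": "https://example.com/images/wallet-brown.jpg",
  "offers": {
    "@type": "Offer",
    "price": "49.00",
    "priceCurrency": "USD"
  }
}
</script>
```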

Sitemap Renovate

Q. We have a really huge website with millions of URLs, and right now the sitemaps are being renovated. Our IT team is considering storing the new sitemap files in our cloud service, which means moving from example.com/sitemaps to cloud.com/sitemaps. We are wondering: is it a problem if we store the sitemaps in the cloud? And if that's not a problem, shall we also create a permanent redirect for the old URLs, like example.com/sitemap, or how should we plan the move?

  • (47:13) It's definitely possible to host the sitemap file somewhere else. There are two ways that you can do that. One is if you have both of those domains verified in Search Console; then that works. The other is if you submit it with the robots.txt file, where you specify "Sitemap:" and then the URL of the sitemap. That can also point to a different domain. So if you have a separate server where you're creating sitemap files, or if you have a staging setup that crawls and checks the files and then creates a sitemap file somewhere else, that would all work. I would also redirect the old sitemap file to the new location, just to be clean, but probably even if you just delete the old sitemap URL and make sure to submit the new one properly, that should just work.
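The robots.txt mechanism mentioned can be sketched like this (hostnames invented for illustration); the Sitemap line is allowed to point at a different host than the one serving the robots.txt file:

```text
# robots.txt served from https://example.com/robots.txt
User-agent: *
Allow: /

# The sitemap may live on another host, e.g. a cloud bucket:
Sitemap: https://cloud.example.com/sitemaps/sitemap-index.xml
```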
