Google’s Removal tool
Q. The removal tool doesn’t interfere with the crawling or indexing of a page
- (04:41): The removal tool in Google Search Console essentially just hides the page in the search results. While the removal is active, Google will recrawl and re-index that page normally. If you add a noindex meta tag to those pages, Google will usually notice it within the roughly six months the removal lasts, and the page will then drop out of the index naturally.
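For reference, the noindex meta tag John mentions is a single line placed in the page’s `<head>`; the snippet below is a generic illustration, not something from the hangout itself:

```html
<!-- Tells search engines not to index this page;
     Google drops it from the index on the next recrawl -->
<meta name="robots" content="noindex">
```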
Q. When the removal tool is implemented, Google still uses the removed pages to assess the website, but only to a certain extent
- (09:36): Pages where the removal tool was used are still indexed if they don’t have a noindex tag or any other kind of block; Google just doesn’t show them in the search results. For individual pages, that’s not going to skew Google’s overall assessment of the quality of the website. But if the removal tool is applied to a significant part of the website, that could affect Google’s understanding of the site’s overall quality.
Internal Link Equity
Q. The value of links pointing to noindex pages gets lost
- (10:45): In some situations, Google keeps freshly processed noindex pages in its systems: it understands there’s a noindex on them, so it doesn’t use them for anything, but it can still process them. For the most part, however, once Google recognises that noindex is a persistent state on a page, it won’t do anything with that page. Google ignores it completely, the links pointing to the noindex page go nowhere, and the links get dropped.
Broken media attachments
Q. Media attachment bugs don’t really affect Google’s impression of the website. The relevance of the website matters more
- (14:12): Media attachment bugs happen sometimes, but Google doesn’t see them as a huge problem, since such pages don’t tend to be shown in search anyway. John says that getting rid of these kinds of pages makes sense, but it’s more a matter of keeping the website clean, strong and relevant. Google often indexes a lot of not-so-valuable pages, but that doesn’t worsen its view of the website’s quality.
Duplicate pages
Q. Having pages that are kind of the same but actually different makes your job harder. Stick to creating one stronger page instead
- (20:10): The example comes from the person asking the question: they run a website promoting Apple repair services, with separate pages for different iPhone models. The pages say almost the same things, just for different models. John points out that this person is “competing with himself”: if someone searches for “iPhone screen repair”, all of those pages compete with each other, and each has to rank for that keyword on its own. John suggests it’s better to put all the information about fixing the screens of different iPhone models on one really strong page. He adds that this consolidation advice applies mainly where competition is strong; if the website is one of a kind, keeping the multiple pages might be better.
Schema Markup Plugin
Q. Before implementing third-party functionality on your website, it’s always good to check it against Google’s policies
- (23:00): The person asking the question uses schema markup for Google My Business reviews on their website, with a plugin that displays them as a widget in the footer. They want to know whether having that on all pages would count as duplication. John argues that even if it’s not a duplicate per se, the problem is that the reviews are not collected directly on the website itself, which may be against Google’s policies. Before using any third-party plugin, it’s always better to check that it doesn’t break any rules.
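For context, review structured data normally describes reviews collected on the site itself, which is why a widget pulling in Google My Business reviews raises policy questions. A minimal JSON-LD sketch of the usual on-site pattern (all names and values here are illustrative, not from the hangout):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "review": {
    "@type": "Review",
    "reviewRating": { "@type": "Rating", "ratingValue": "4", "bestRating": "5" },
    "author": { "@type": "Person", "name": "Jane Doe" }
  }
}
</script>
```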
Website Speed
Q. The website speed shown in Google Search Console matters less than the real experience of people accessing the website from the country where it is hosted
- (24:04): There can be a difference between the average response time shown in Google Search Console and the average response time measured by a tool accessing the website locally, because Googlebot crawls mainly from the US. The good news is that for ranking purposes, Google pays more attention to the response times experienced by real users in the country where the website is hosted than to what the US-based crawlers see.
Content Silos
Q. Content silos are a great move to let your users understand your website better
- (26:01): Content silos are not primarily an SEO move; they’re more something done for users’ convenience. If it’s clear to users that the site is really strong on a specific topic, it becomes much easier for them to understand the context of the individual things on the website, and indirectly Google understands things a bit better for SEO as well. John argues that thinking like “these internal links come from this theme page and go to that theme page, which should match exactly, and then Google will rank our pages better” shouldn’t be the primary focus here.
Localisation
Q. There is no clear number on how many localisations a website should optimally have
- (31:16): There is no such thing as “too many localisations”, but having many versions of the same page in a way “dilutes” the website: each page can end up less strong overall, and it becomes harder for Google to understand what the website’s strengths are. As a result, each individual version might rank worse than it could have. Having a version for every country in the world is not a great idea, but neither is having an international website with only one version; every website has its own optimal point depending on the site itself and its users.
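One common way to tell Google which localised versions exist is hreflang annotations in each page’s `<head>`; a minimal sketch (the URLs are placeholders):

```html
<!-- Each localised version lists itself and its alternates -->
<link rel="alternate" hreflang="en-gb" href="https://example.com/en-gb/">
<link rel="alternate" hreflang="de-de" href="https://example.com/de-de/">
<!-- Fallback for users whose language/region isn't covered -->
<link rel="alternate" hreflang="x-default" href="https://example.com/">
```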
Interstitials
Q. Intrusive interstitials are still not welcome, but the way they appear on the website makes a difference
- (34:58): Google treats intrusive interstitials as a factor that can affect rankings, because they are a page element and part of the page experience. However, John says: “this is essentially focused on that moment when a user comes to your website. So if you’re using interstitials as something in between the normal flow of when a user goes to your website, then from our point of view that’s less of an issue. But really, if someone comes to your website and the first thing that they see is this big interstitial, and they can’t actually find the content that they were essentially promised in search, then from our point of view, that looks bad”.
Disavow Files
Q. There is no fixed time for deleting and processing disavow files
- (44:21): Google picks up disavow files immediately, but “unblocking” the links takes time. Over some period, as the links are recrawled, Google recognises that the disavow file is gone and takes that into account. John suggests updating the existing disavow file to what you want, and expecting Google to pick it up over time, rather than deleting the file, waiting until it’s processed as gone, and then uploading a new one.
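For reference, a disavow file is a plain-text UTF-8 file uploaded through Search Console’s disavow links tool; the entries below are purely illustrative:

```text
# Lines starting with # are comments
# Disavow every link from an entire domain:
domain:spammy-links.example
# Disavow a single linking URL:
https://another.example/paid-links.html
```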
Q. Disavow files that were previously created from one account in the Search Console can be accessed and updated from a new account.
- (45:38): Even if, for some reason, a new Search Console account has to be set up for a verified website, the new account can still access, download and update previously created disavow files.
Search Console and Lighthouse
Q. Google Search Console is a field test; Google Lighthouse is a lab test
- (46:45): Google Search Console and Google Lighthouse might show different numbers for a website, which can seem confusing. John explains that Search Console is essentially a field test, showing what website users’ experience is actually like, while Lighthouse is a lab test, showing what could potentially be improved: things like iterative debugging and optimisation opportunities that might not directly show up for users. That’s why the indicators in these two tools might not match, and it’s good to consider both.
Domains and subdomains
Q. For a website that hosts a lot of user-generated content, it’s safer to keep users on subdomains or to move the website’s own content to a separate domain
- (52:16): Sometimes a website that hosts many users who create their own blogs and content wants to rank for some target keywords on its own. However, user-generated content can be low quality, spammy or otherwise problematic, and that may stop the website from ranking. If one user’s blog, for example, contains a lot of spam, Google may treat it as “the whole website has a lot of spam”. So it’s better to host users on subdomains rather than on the main domain, or even to move the website’s own blogs and content to another domain, isolating it from user-created content.
Uncommon Downloads
Q. The uncommon downloads problem can come from hosting something unique, almost per-user files
- (56:17): If a website hosts something like a software tool that users can sign up for and download, uncommon download warnings might keep popping up quite often. That’s because each user gets a zip file or executable generated specifically for them, so Google can’t scan it in advance and tell whether it’s malware. The website owner can submit a review request to get the file double-checked, which may resolve the issue.
John points out that this is one of the reasons it’s good to keep users isolated from the main website domain, or to host the main website on another domain: the uncommon downloads problem might be triggered by files uploaded by one user, yet it can affect everything hosted on the same domain.
Sign up for our Webmaster Hangouts today!