Why are customer reviews important for eCommerce, and how can they be managed efficiently?

There is a general understanding among eCommerce business owners that reviews are important. However, the true significance of customer reviews is often overlooked in favour of other priorities that seem, at first glance, more essential to the business. Year on year, more consumers read online reviews while searching for products and services. According to the 2022 State of Reviews report from LION’s partner REVIEWS.io, 94% of users say that reviews left by previous customers influence their purchase decisions. Moreover, 62% of respondents say that reviews significantly impact them, and only 6% report no impact.

Where to find eCommerce business, product and service reviews?

Although eCommerce customers use many channels on the internet, three stand out as the most influential for review management:

  • Google. Google remains the first place 75% of consumers look when searching for new businesses. Google Seller Ratings and Google My Business help to build trust at the first point of contact for paid and organic channels. Google Seller Ratings improves the performance of Google paid marketing by increasing an ad’s click-through rate, thus lowering your cost-per-click (CPC) in SEM. Google My Business and, when properly integrated through data markup, Google 5-star ratings for individual products and services (see the markup sketch after this list) help businesses stand out in organic SERP results and capture top-of-funnel traffic before competitors.
  • Social Media. Only 36% of customers go to social media to directly search for products and services, less than half the share that Google commands. Nevertheless, the speed at which information spreads and the amount of time users dedicate to these platforms every day make social media one of the most critical sources of client reviews.
  • Review sites and marketplaces. For some businesses, Yelp and other specialised review sites, along with Amazon and similar marketplaces, can be a core source of customer reviews that cannot be neglected.
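
For illustration, star ratings typically reach Google through schema.org structured data embedded in product pages. Below is a minimal, hypothetical JSON-LD sketch of the kind of rating markup referred to above; the product name and rating values are placeholders, and review platforms such as REVIEWS.io usually generate this markup for you.

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Example Running Shoe",
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "128"
      }
    }
    </script>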

Trends in eCommerce customer review management

New requirements for trust

Years of fake review generation have finally borne fruit: eCommerce prospects now frequently ask whether companies can fake reviews, and they question a picture-perfect wall of 5-star ratings. They inspect reviews’ relevance, authenticity, recency and consistency through a critical lens, which is borne out by the 81% of respondents who claim that reviews must be recent and contain relevant information to carry significant influence. Hence, customers expect balanced ratings and quality reviews from verified sources, with factual insight into a business, its products and its services.

Average rating matters

Even if it is only part of the bigger picture, 68% of respondents say that before engaging with a business, they prefer companies with an average rating of 4 stars. In contrast, only 3% of consumers will consider companies with an average rating of 1 or 2 stars. At the same time, 68% at least somewhat agree that a high rating can be trusted only if a significant number of reviews support it.

Reviews before price

With the massive spread of online shopping, and of eCommerce businesses responding to that demand, shoppers’ behaviour is becoming more sophisticated. Reviews have become the most influential aspect of the decision-making process when choosing an online store, cited by 40% of respondents, overtaking even price at 27%, delivery time at 20% and free returns at 13%. Interpreting these numbers, retailers with higher prices have an opportunity to outsell those with lower prices for the same goods simply by having better reviews.

Fewer purchases now proceed solely on the strength of company-initiated product messaging in marketing channels; instead, more people rely on the experience of others. Reviews increase the probability of unknown brands being discovered by customers and competing with top brands in their categories. At the same time, the competitiveness of the eCommerce market means customers no longer have to endure poor experiences, which amplifies the importance of client reviews.

A company’s response to feedback

If anything in eCommerce feedback management is as important as past client experiences wrapped in words and images, it is the company’s response, especially the response to negative feedback: the question “Do you read replies to negative reviews?” was answered “Yes” by 90% of the eCommerce users approached. Most merchants seem to understand the importance of feedback, with 62% claiming that they respond to all or most of the reviews they receive, in contrast to the 15% who say they never or rarely respond to online reviews.

Negative reviews first

Research shows that the first thing e-shoppers now do when studying reviews is filter for 1-star ratings to check possible cons and evaluate the risks. In the past, an unsatisfied customer could typically influence only people in their inner circle; today, a negative eCommerce review placed immediately alongside the goods and services descriptions can abruptly change the intention of any user who lands on the page. Thus, responding to negative customer feedback promptly and adequately increases the positive impact on the client’s decision-making process even more.

Review collection strategy

By nature, people are most eager to share their opinions at the extremes of perception – when an experience exceeds or falls below expectations. An average customer with an intermediate level of satisfaction with a product or service is therefore usually not inclined to leave a review without encouragement. For instance, over half of respondents admit to leaving online reviews four times a year or less, and 26% have never left a review at all. At the same time, only 5% of consumers say they never leave reviews after a positive experience. Thus, eCommerce businesses should focus on an effective review collection strategy that includes a 360-degree view and engagement motivation at the final customer journey stages.

Review collection systems

According to 81% of businesses that participated in the study, review collection systems provide a profitable return on investment.

REVIEWS.io provides tools for collecting and managing company and product reviews, user-generated content and other reputation management technologies. The system integrates with all popular eCommerce solutions, including Shopify, Google, WooCommerce, Klaviyo, Magento and many more. REVIEWS.io is trusted by more than 8,200 brands, such as Cake Vaay, BoxRaw and Bloom & Wild, helping businesses grow through customer trust and advocacy.

GET IN CONTACT TODAY AND LET OUR TEAM OF ECOMMERCE SPECIALISTS SET YOU ON THE ROAD TO ACHIEVING ELITE DIGITAL EXPERIENCES AND GROWTH

Contact Us

Article by

Asselya Sekerova – Marketing & Project Director

6 Tips for Google Merchant Center

Introduction

ALAN KENT: (00:07) Google Merchant Center is a great way to share data about your eCommerce business with Google. Hi. My name is Alan Kent, and I’m a developer advocate at Google. In this episode, I’m going to share six tips on how to get the most out of Merchant Center for your presence in search results. The most common use for Merchant Center is to upload product data via structured feeds. Because feeds are designed to be read by computers, data is extracted more reliably than by Googlebot crawling your site and extracting data from web page markup. If you’re familiar with structured data, you may wonder whether to embed structured data in web pages or provide a feed to Merchant Center. Google’s recommendation is to do both. Google may cross-check feed data against your website, so product structured data in web pages is still recommended even if you also provide Merchant Center feeds. If you have physical stores, you can also share inventory location data with Google. This can then be used by Google when answering queries for products “near me”.
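
As a rough illustration of what a structured product feed looks like, here is a minimal, hypothetical Merchant Center feed item in the RSS/XML format; the SKU, URL and values are placeholders, and the full attribute list is in the Merchant Center documentation.

    <?xml version="1.0"?>
    <rss version="2.0" xmlns:g="http://base.google.com/ns/1.0">
      <channel>
        <title>Example product feed</title>
        <item>
          <g:id>SKU123</g:id>
          <g:title>Example Running Shoe</g:title>
          <g:link>https://www.example.com/products/sku123</g:link>
          <g:price>89.99 AUD</g:price>
          <g:availability>in_stock</g:availability>
          <g:condition>new</g:condition>
        </item>
      </channel>
    </rss>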

Tip 1. Ensure products are indexed

(01:50) The Googlebot web crawler attempts to locate all products on your site by following links between pages. Googlebot, however, may miss pages in some circumstances. For example, you may have some products only reachable from on-site search results. Google typically does not enter search terms into the on-site search box to discover new pages. If you have a product page and are unsure if it is indexed, you can use the URL Inspection tool. This will report what Google Search knows about your page. You can also use a site: query with the URL as a search term to search for that specific URL. In a previous episode, I described creating a Sitemap file to list the important pages to index on your site. The Sitemap file is used by the Googlebot crawler to find pages on your site without relying solely on links between pages. But there is another way. Creating a Merchant Center product feed will help Google discover all the product pages on your website. These product page URLs are shared with the Googlebot crawler to potentially use as starting points for crawls of additional pages. It is, however, important to note that this and some other Merchant Center features are not available in all countries. Please refer to the Merchant Center Help Center for an up-to-date list of the countries in which each feature is available.
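
For reference, a sitemap file of the kind mentioned above is a small XML document listing the URLs you want crawled. A minimal sketch with placeholder URLs:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/products/example-product</loc>
        <lastmod>2022-07-01</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/products/another-product</loc>
      </url>
    </urlset>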

Tip 2. Check your prices are correct in the Search results

(03:26) The second tip is to check the accuracy of the product pricing data used by Google. If Google incorrectly extracts pricing data from your product pages, it may show your original price instead of your discounted price in search results. To check whether Google is extracting price data accurately, quickly test a sample of results. You can search for a product page and, if rich results appear, check the price shown. Search using a site: query for your product page URL to return the web page as a search result. To accurately provide product information such as list price, discounts, and net price, it is recommended to add structured data to your web pages and provide Merchant Center with structured feeds of your product data. This will help Google correctly interpret the pricing shown on product pages.
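
As a sketch of the pricing markup this tip recommends, here is a minimal, hypothetical JSON-LD snippet for a product page; the name, price and currency are placeholders.

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Example Kettle",
      "offers": {
        "@type": "Offer",
        "price": "39.99",
        "priceCurrency": "AUD",
        "availability": "https://schema.org/InStock"
      }
    }
    </script>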

Tip 3. Minimise price and availability lag

(04:24) Tip number 3 is to minimise inconsistencies in pricing and availability data between your website and Google’s understanding of your site due to timing lags. For example, Google crawls web pages on your site according to its own schedule, so changes on your site may not be noticed until the next Googlebot crawl. Merchant Center, on the other hand, can be updated on a more consistent schedule, such as once a day or even once an hour. These delays can result in Merchant Center and search indexes lagging behind site changes, such as when a product goes out of stock. I described how to check Google’s understanding of your pricing data in the previous tip, using a site: query. In addition, Merchant Center may identify products whose pricing data differs from your website due to delays in processing. This can negatively impact your products’ search results until the discrepancy is resolved. Merchant Center also allows you to download all pricing data in bulk if you want to do a more exhaustive reconciliation of pricing data in Merchant Center against your website. To reduce lag, you can request that Merchant Center process your feeds more frequently. This can reduce the time lag between product data changing on your website and Google becoming aware of it. Another approach is to enable automated item updates in Merchant Center. This causes Merchant Center to automatically update collected pricing and stock-level data based on web page contents when discrepancies are detected. This is based on the assumption that your website updates in real time when pricing or availability changes.

Tip 4. Ensure your products are eligible for rich product results

(06:18) Tip number 4 is to check that your products are getting rich results treatment in search results. Rich results are displayed at Google’s discretion but rely on Google having rich product data. To check if your product pages are receiving rich results presentation treatment, you can use a site: query to search for a specific web page. If it is not found, the page may not be indexed. You can also use the Google Search URL Inspection tool to verify whether Google is indexing your product page. To get the special rich product presentation format, it is recommended to provide structured data in your product pages and a product feed to Merchant Center. This will help ensure that Google correctly understands how to extract the product data from your product pages needed for rich product results. Also, check for error messages in Google Search Console and Merchant Center.

Tip 5. Share your product inventory data

(07:18) Tip number 5 is to ensure, if you have physical stores, that your products are found when users add phrases such as “near me” to their queries. To test whether locality data is being processed correctly, you may need to be physically near one of your stores and then search for your product with “near me” or similar appended. Register your physical store locations in your Google Business Profile, and then provide a local inventory feed to Merchant Center. The local inventory feed includes product identifiers and store codes, so Google knows where your inventory is physically located. You might also like to check out Pointy from Google. Pointy is a device that connects to your in-store point-of-sale system and automatically informs Google of inventory data from your physical store.
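
To illustrate the store-code idea above, here is a rough, hypothetical local inventory feed item in RSS/XML form; the identifiers and values are placeholders, and attribute names should be checked against the current Merchant Center documentation.

    <?xml version="1.0"?>
    <rss version="2.0" xmlns:g="http://base.google.com/ns/1.0">
      <channel>
        <title>Example local inventory feed</title>
        <item>
          <g:id>SKU123</g:id>
          <g:store_code>SYD-01</g:store_code>
          <g:quantity>4</g:quantity>
          <g:price>89.99 AUD</g:price>
          <g:availability>in_stock</g:availability>
        </item>
      </channel>
    </rss>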

Tip 6. Sign up for the Shopping tab

(08:15) The final tip relates to the Shopping tab. You may find your products are available in search results but do not appear on the Shopping tab. The easiest way to see if your products are present is to go to the Shopping tab and search for them. To be eligible for the Shopping tab, provide product data feeds via Merchant Center and opt in to Surfaces Across Google. Structured data on product pages alone is not sufficient for inclusion in Shopping tab search results.

Conclusion

(08:45) This is the final episode in the current series on improving the presence of your commerce website in search results. If you have topics you would like to see included in a future series, please leave a comment. If you have found the series useful and want to see more similar content, make sure to Like and Subscribe. Google Search Central publishes new content every week. Until next time, take care.

Sign up for eCommerce Essentials today!

GET IN CONTACT TODAY AND LET OUR TEAM OF ECOMMERCE SPECIALISTS SET YOU ON THE ROAD TO ACHIEVING ELITE DIGITAL EXPERIENCES AND GROWTH 

Is your business suffering from the September slump?

Seasonality is an unavoidable challenge for any business. Whether your business sells products geared toward Winter or Summer pursuits, or a product that is in demand all year round, we all have to weather ups and downs throughout the calendar year.

The majority of eCommerce clients’ peak season is unsurprisingly Oct-Dec, with Black Friday, Cyber Monday and the holiday gift-giving period boosting conversion rates and driving up revenue. However, this often means dealing with a much softer market in September. The IMRG Online Retail Index noted a 12.5% drop in online sales YoY in 2021, and we can see a similar trend across the majority of clients this year.

On average, across our accounts, we see a drop in conversion rate of a full percentage point or more compared to August, which has affected performance across a wide range of industries; however, CTRs are up by 25% on average, suggesting consumers are in a “browsing, not buying” phase.

Considering the current economic climate, with consumers seeing a constant barrage of news around supply chain issues, rising inflation rates, and reports of an impending recession, this drop in performance is, of course, a concern to many. However, it’s not all bad news.

“Early data from Morning Consult, a global intelligence company, finds that people plan to spend about the same amount on gifts as they did last year.” Inflation rates and concerns around cost saving, however, mean they will be in the market for deals and discounts.

With this in mind, here are a few tips from the LION team on how to weather the storm and win in the holiday season:

  1. Capitalise on any low-cost traffic to the site now. Consumers who are visiting your site have put you in their consideration set and may come back to purchase in the following months. Invest in owned channels like SEO, Email and CRO, make the most of the visitors you already have, and look at how you can expand this audience.
  2. Start planning for sales and promotions now, and talk to the team about the best way to market these. You might want to consider adding retargeting to your strategy to let people know about discounts or flesh out your email strategy to capture low-hanging fruit. Think creatively about how you will stand out from the crowd during Black Friday and other upcoming holiday sales periods.
  3. Consider your ROAS thresholds carefully. While we don’t recommend going dark during this time, don’t spend at the cost of margin to the business when the money can be better used later in the year.
  4. Leverage new formats like YouTube shopping and awareness channels to bring new customers to the brand.

Reach out to the team at LION for advice and strategy tips that are personalised to your business.

GET IN CONTACT TODAY AND LET OUR TEAM OF ECOMMERCE SPECIALISTS SET YOU ON THE ROAD TO ACHIEVING ELITE DIGITAL EXPERIENCES AND GROWTH

Contact Us

Article by

Leonidas Comino – Founder & CEO

Leo is a Deloitte award-winning and Forbes-published digital business builder with over a decade of success in the industry working with market-leading brands.

Like what we do? Come work with us

SEOs & web developers: Why we need to talk about audits

Do devs listen to SEOs’ recommendations?

(1:13) MARTIN: So, Bartosz, there is a thing that I want to talk to you about, and that is that I do hear, and I do observe that SEOs are often struggling actually to get developers to do stuff. Is that an experience that you share? It’s like you make recommendations, you give them input, and then it just doesn’t happen? Or is that not something that you would say is a particular problem?

BARTOSZ: I think that you might have touched on a very sensitive topic in the industry, and I know where you’re coming from with that. So, in general, this is a problem. Some agencies solved this problem and kind of moved past it. We cannot complain at all about our relationship with devs. At the same time, there are so many ways things are done in the web development and SEO space that don’t help.

Why we shouldn’t throw PDF reports over the fence

(2:10) BARTOSZ: For example, PDF audits are one of the things, just to name, I think, the elephant in the room. So if you’re going to create a PDF audit explaining how to fix Core Web Vitals without knowing their stack, not knowing their technology, not knowing their resources. How many devs are there in the team? Is it a Shopify-based platform, or is it a custom CMS? In our experience, when you create a PDF audit and the dev runs into a problem, they’ll skip that item because there’s no fallback for what has to be done. So this can be unpacked in so many ways.

MARTIN: But I know exactly what you mean. So I come from a developer background, and I have worked with SEOs on both sides: as a Google developer advocate, basically helping SEOs to do the right things and identify and solve problems, and also from the perspective of the developer working in the team. And the thing with the PDF report really struck a chord with me. Because I remember being a developer. I had so many different things on my plate already. And then, out of the blue, in the middle of a sprint, someone from the SEO department descended upon me and said, “Martin, here is a PDF with all the things that are wrong. Bye.” And then they just flew off. And I’m like, uh, OK. It says we are using JavaScript, which is very accurate, because what we are actually building right now is a VR application that runs on your phone in the browser. You have to use JavaScript to do that. And the recommendation is to not use JavaScript? That’s not really a thing we can do, because then we don’t have VR capabilities, and that’s our product. We kind of have to build our product with the technologies that enable its features. So a lot of these things are so unhelpful and so not reflective of the circumstances in which I, as a developer, work.

Why do SEOs advise against JavaScript?

(4:21) BARTOSZ: So, you work for Google, so I can ask you. Before we get into how to solve this problem, let me ask you a question. Is Google OK with JavaScript?

MARTIN: Yeah. We are OK with JavaScript.

BARTOSZ: So if I have a news website that’s 100% CSR, you’re going to be OK with that?

MARTIN: We’re going to be OK with it. It might sometimes take a little longer than you would like, because we have to actually render everything. And if your rendering is particularly badly designed, that might take us a while. But in general, if you are doing things right, following the best practices and making sure that you test your setup, we would be fine with that, yeah.

BARTOSZ: So I don’t want to argue with that statement. Obviously, this is not that kind of video. What I’m trying to say is that there are so many complexities on all these fronts. So we have clients coming in with a news website that’s 100% JavaScript. And there is this kind of demon in the industry where all the SEOs would say that JavaScript is evil, that JavaScript is so bad. On the other side, there is Google saying we’re OK with JavaScript. And then there is the reality, where there are a lot of websites packed with JavaScript, and Google is not picking them up properly. They’re not getting rendered, or they have all these other problems. So then we hop on a call with the dev team of our client, and they’re like, but Martin said that JavaScript is OK. Why do you want us to do server-side rendering? How do you answer? You know what I mean. This makes us look like SEO wizards. So these are problems that require a lot of knowledge. We don’t struggle with them as much anymore, because we have so much research, we have our experiments, whatever. But if you think about bringing a new SEO to a level where he or she can handle those questions, it is going to take years. And for us, it slows down growth. At the same time, asking all of the SEO agencies out there to be as advanced as we are, when we only do technical SEO and specialise in JavaScript SEO, is difficult. So this is a crack, and I don’t have any solutions, and I’m not blaming Google or anything like that. I’m just saying that this is a change that’s happening, but it requires time. So there are some, maybe more than some, moments when it basically requires goodwill from both ends. If devs want to understand it, and we want to explain it, we’ll make it happen.

MARTIN: I think there are lots of touchpoints where you can actually create this understanding and this cooperation. Because, as you said, there are lots of complexities and lots of background and lots of considerations. So if you are asking me, and this is also tricky for us Googlers, a question like, is JavaScript OK? Then, in general, yes. Is it the best idea? No, not necessarily. If you can do it without JavaScript, do it without JavaScript. Server-side rendering is a recommendation that we give out as well. We need to make that more prominent in our docs; I’ve taken that point. But I really like the point where you said, oh, SEOs have this challenge that when you get a new SEO to join your team, they need to spend so much time on actually getting the knowledge that they need to work. Developers have the exact same challenge. Because the entire industry, the entire ecosystem, just keeps moving and keeps changing. So someone who becomes a developer today sees everyone else working with so many tools and so many things. And there is a tendency to skip the understanding. Because most developers who have been around a couple of years have started with some tool and learned the things that this tool does well and the things that this tool does not do so well. And then they might be like, oh, you’re building a news website. I think in that case, with the interactivity that you described to me, we might be able to do this better with server-side rendering. Whereas if you want a highly engaging social network, you might actually want to use client-side rendering for all the interactivity that is embedded, and that is not necessarily impairing your performance in search. So they learn these tools, and they learn the trade-offs, and then they make better decisions as they grow. But then people come in and might skip the entire learning process and go, oh, everyone uses this framework, so I’ll build everything in this framework. Because if everyone uses it, it must be fantastic, without understanding the decision-making process behind it. And I think the problem is then exacerbated when an SEO who cargo-cults recommendations that they read or heard somewhere, without understanding the background and the complexities they encompass, talks to a developer who does the same thing. Then there must be a clash. Because now they are running into territory where they think they know what they are doing when they actually don’t. Would you say that might be a challenge that we are seeing playing out?

Complexities and differences between technical SEO and content marketing

(10:25) BARTOSZ: So let me unpack that one by one. There are a few statements within that. So, for example, the way you described the news website: I have known you for a few years, so I know you’re not going to take this the wrong way, but most of our clients wouldn’t understand what you said. So if you’re talking to the key stakeholders, maybe not the CEO, but the CMO, someone who’s making that decision along the line, you will have this conversation. This is one of the problems we ran into back in the days when you would hop on a call with five people from our client’s company: we would start talking to the dev, and we would lose everyone else. So simplifying as much as possible just has to happen, so that everyone is included in the conversation. But secondly, what you mentioned about dev teams being so dynamic: this is also what SEO looks like. Maybe some SEO agencies didn’t realize it; I don’t think too many did. So if someone comes to us with a question, can you do JavaScript SEO, web performance, and a little bit of content marketing? This is extremely difficult to pack into one agency and do all of it well. So I think that we slowly need to normalize using a technical SEO agency for one thing and a content marketing agency for something else, and just trying to branch out, so that everyone understands their goals. And then onboarding that one person is easier, because that junior SEO only has to learn JavaScript SEO, web performance, crawl budget, whatever, and understand those technical aspects. At the same time, some agencies want those people to also do link-building and all these other aspects. So just like with devs, it got so complex. Sometimes I’m looking at job offers for devs, and I’m like, what does this even mean?

MARTIN: Basically, one job offer is an entire IT department. I love it.

BARTOSZ: Basically, you need to divide organizations into those that are aware in the web space and those that are not. If someone is aware of how SEO works, of what the difference between CSR and SSR looks like (and I’m assuming a lot of even high-level people in some organizations are; Germany, for example, is pushing a lot of people in management positions to know a lot about development, which I love; talking to companies from Germany, most of the time they’re just so aware of that). Some other companies would come to you and say, so if we’re going to fix this problem with rendering or with web performance, how much traffic can we expect, and when? And that’s the main topic of discussion. This is something I have to answer daily, two or three times a day. There are so many ways to answer this question; I never do it the same way. But anyhow, this shows me that maybe they need a little bit more help understanding what has to be done and why it’s done. Sometimes it’s just beyond our scope of work, let’s call it that, and we cannot push them.

Web performance metrics and reaching stakeholders’ needs and wants

(13:35) BARTOSZ: So now that we have that, let’s assume we have someone who is aware of, or willing to learn about, why we’re doing this. That “why” is kind of important here. Because if they only do it for traffic, and that’s the only KPI they look at, it’s very difficult not to skip a lot of important metrics. If you look at it that way, you can have a ton of traffic, and it can still be a terrible website in theory.

MARTIN: So are you saying that sometimes you literally have to rethink an organization’s KPIs?

BARTOSZ: Yeah. Very often. Just to give you an example: we have a call with a massive company, and they would be asking us, what do we have to do to rank for the term “houses”? And just this question lets you unpack so many problems within the whole organization. And then we usually don’t want to offend anyone. You don’t want to get their ego involved. But at the same time, you want to explain it. So that’s one part. Let’s assume we have the stakeholders sold on the idea of what we want to do. They understand it. That’s amazing. That’s usually when things start to go well.

MARTIN: That is great. Because that also unlocks the possibility to basically have them on board with whatever the dev team will be doing about it. These things have basically been invisible to me. But as a developer, I just noticed that the key stakeholders in the company who have an influence on the dev team told me to do one thing, and then SEOs or marketing told me to do another thing. And then usually I picked the organization’s goals, because that’s what I was measured by. So you are saying that by bringing in the key stakeholders and making sure that they understand what they need to look for, and adjusting their KPIs, you unlock the key to actually getting the development team on board with what you are trying to accomplish?

BARTOSZ: Yes and no. So usually, as a technical SEO agency, we don’t really struggle that much to get devs on board. This is not that big of a deal for us. The problem is for the stakeholders to understand what we want to do with them. Because sometimes it’s like: this is this technical SEO agency, and this is our dev team; let them have fun with this project. And almost literally, that’s how it might look in some cases. And this is usually a problem. But if we know, OK, we want to get here, and this is the umbrella term: we want to have amazing web performance, we want to get rendered and indexed quickly, whatever, and they know that, then I can’t even imagine how this could go wrong, because the whole organization is growing in one direction. And this is our Holy Grail, and this happens very, very often.

MARTIN: How do you get that to happen? Because I have been in so many organizations where that did not happen.

BARTOSZ: This is a very simple answer: we did it wrong so many times. We tried for years. When we started back in, I think, 2013, ’14-ish, my team and I wanted to focus only on the technical aspects. People would make fun of us. They would be like, OK, there are white hat SEOs, and there are people who have traffic, and all of these kinds of amazing jokes going our way. I could create a stand-up show for you with just the feedback we would get from the SEO community back in the day for moving to the technical side of things only.

MARTIN: Bonus episode right there.

Meeting with stakeholders, finding problems, and SEOs listening to devs to find solutions

(17:22) BARTOSZ: This was very weird for a second. Usually, we start with stakeholders. I’m going to condense this really quickly. We talk to the stakeholders. We hop on a call before creating any offers or anything, and talk about: what’s the KPI? What’s the problem? What are the challenges? Why are we even doing this? Why is it so important? Why do you want to fix it? Because if traffic is the only metric, we will still work with them, but we know how this might go. So we start with that. Then, after the call, we look into their website. We create a statement of work. We tell them, OK, this is what we’re going to do. This is the list of problems we’re seeing with your website. This is how we want to fix it and prioritize it. So in the first month, we solve all of the most terrible aspects, like 404s or, I don’t know, 10 seconds to load a page, whatever. And with that, it’s extremely transparent. Because we tell them, OK, this project is going to take four months. And, spoiler alert, we’re going to hop into a PM tool, like Jira or Trello, with your dev team, and we’re going to make it happen.

MARTIN: OK. So you meet the dev team where they are anyway.

BARTOSZ: And we adjust to whatever solution they go with. So if they work in sprints, we try to join that. We had to kind of morph into this team that joins them without any interruptions. This is the only way. We are aware that, in a medium or large company, the dev team is seriously the most important part of the business. So then we tell them, OK, this has to be fixed. But we have to understand their tech stack. We have to understand all these boring aspects (boring, but we loved them). But usually, during that call with stakeholders, they don’t really want to talk about it, or they don’t know. Very often, stakeholders have no idea what kind of tech stack they are running.

MARTIN: To be fair, they should not have to, right? Unless it’s the CTO, I don’t think the CEO needs to know which tech stack they’re on, as long as they know what their core business is, how it works, and what the vision and mission of the organization are. I think that’s exactly what you have a development team for: to define these things based on requirements coming from elsewhere.

BARTOSZ: That’s one more conversation we have to schedule. But let’s move forward. Then we hop into Jira, Trello, whatever. We give them tasks. And they come back to us: we cannot really do that; we have a custom solution around this one that doesn’t allow whatever. And again, we have a team that’s extremely technical. This is something we have been building for a few years. And they hop on a call. Sometimes, when they really don’t get it, and this is an edge case, we write a snippet of code just to show them how to optimize the CSS or whatever. But most of the time, we just go and talk to them. And devs, in my opinion, are very hardworking people. They would tell us: we cannot do this; we have so many limitations. And we try to work within those limitations. If we hit a wall, we go back to the stakeholders and say, maybe we could try this, and devs appreciate that. Because it’s not only them coming to stakeholders for budget, for more resources, whatever. We come in and say, OK, guys, this is going to be difficult with so many places where you cut corners.

MARTIN: So you would say that you would also have to somehow support developers to get the right resources and to get the right environment to work in sometimes? That’s interesting.

BARTOSZ: Sometimes. Sometimes. This is going to sound funny. But sometimes we have clients who have 50 devs, and not in a country where they would be cheaper, but 50 devs in a very high-earning city somewhere in the world. And the client would listen to us rather than to them, because they are like, oh, we’re paying your invoice; we want to get the most bang for our buck. They would tell us openly that they really want to move this project forward, and that’s why they would change things around in the dev team. And I guess you, of all people, must know that. Sometimes when you work in a team, you come to your manager and say, OK, this is a problem, every day. They won’t listen to you until someone from the outside comes in and says, like, dude.

MARTIN: I’ve been there. I’ve done that. I’ve been on both sides of this. I’ve been the consultant that came in and basically just sat down with the development team, listened to them for a day, and then presented what I heard to the stakeholders. And they’re like, oh, these are really valuable insights. And I always thought, I’m billing you for this, but you could have just listened to your developers.

BARTOSZ: Exactly. And I guess every single dev listening to or watching this video series right now has a story of this kind.

MARTIN: I’ve been on the developer side of that, too. It was like, hey, we need to do this. Oh, I don’t think so. And then I was like, OK. And then the consultant came in and said the same thing.

BARTOSZ: And also, just to elaborate on this story: usually once a quarter, we have someone reaching out to us, usually dev teams we know, saying, “Partners, we need an audit around Core Web Vitals. But don’t go too deep. We just want them to hear what we told them from someone else.” And they’re willing to spend the budget just to have a backup document that says: we need to fix this.

Always ask questions

(23:36) MARTIN: But that is so smart. I really like that. Because so many developers are like, ugh, I don’t want to work with these people because they just tell them what we told them already, and they charge them for it. Why don’t developers more often leverage these external voices like they do in your case, where it’s like, Bartosz, I need this audit. I know the result of the audit, but please tell them what we already told them because they don’t listen. That is smart. I like that.

BARTOSZ: If there’s any SEO agency listening to this, or any SEO frustrated with dev teams: 100% honest, we don’t struggle with devs. Talk to them openly, speak their language, and you might have to get a little bit technical, or maybe just have one or two people on your team who can get the message across.

MARTIN: Oh, and just to add on to your last point, and that goes to all the SEOs out there listening, struggling with developers: lose your ego. It is not a problem to ask us developers questions. If you think that you need to know everything, no, you don’t. You’re not a developer. It’s OK. Developers don’t know about SEO. You don’t have to know everything about development. So if you don’t understand why they can’t do what you ask them to do, remember that developers are intrinsically motivated to solve problems. That’s what they love. That’s what they want to do. So if you give them something to solve, they’ll be excited to solve it. And then they hit the limitations, the limits of what they can do in the tech stack and the environment that they are in. And then they tell you, I can’t do that because of XYZ. If you don’t understand XYZ, that is OK. Ask clarifying questions until you get there. Because, and I think, Bartosz, you said that very nicely, you may have to simplify the message that comes from the developers so that other people in the organization you work with, who are not developers, understand why it doesn’t work.

BARTOSZ: Let me just build on that. And this is something that all the SEOs are going to love. We had this vision years back that we had to learn all of the frameworks, that we had to know front end, back end, whatever. It was so stressful. I was trying to know it all. But this is something to leave to devs. What we have to do as technical SEOs is have an in-depth, massive understanding of how rendering works, how a browser works, how Google renders and what they render, and the rules around that. Chrome DevTools has to be your go-to place. You need to understand what’s happening, and once you understand that, you add a little bit of documentation from different frameworks, from different technologies. But don’t learn how to write all of this code. Obviously, basics are OK, but just basics. This is what they pay you to do; they don’t want you to know what they know. And what you do have to know is complex enough: understanding how a browser works step by step is something you can do for the rest of your life and never run out of things to learn. Now, we were talking about ego from the SEO standpoint; just to add the other side. If you own the company, or if you’re on a dev team, go through a lot of agencies. Talk to them. Hop on a call with them. Ask them questions. See if you understand what they’re trying to do. If you’re talking to an agency that tells you nothing about the scope of work, what they’re going to do, and how they’re going to do it, that should raise a red flag. If you take your car to be fixed, and they won’t tell you what they did, I would be worried about driving that car. So basically, go through that. Talk to as many people as possible. As soon as you feel the vibe, “OK, these guys understand what we want to accomplish”, get the conversation going.

MARTIN: And the same is absolutely true the other way around. Developers really don’t like it when someone comes in like, hey, you need to do this, and then they ask why and don’t get a proper answer. You can do the same thing with developers. If you say, hey, I want you to implement the canonicals, and they say, we can’t do that, then ask them why. If they say our solution doesn’t allow it, then ask, why does it not allow it? Does it not allow you to add anything to the head? Oh, it does, but not the canonicals. Why not the canonical? It might be just a knowledge gap on their side, or it might be an actual hard technical limitation of the environment and the stack and the platform they’re working with. But they need to be able to explain this to you in simple terms. If they are like, oh, it’s algorithmically impossible, that’s just a developer’s way of saying, bugger off. Ask questions. Don’t think, “oh, they said something that I don’t understand, so I must stop questioning here”. No. If they can’t express it in simple terms so that you understand what they mean, they haven’t done the work themselves either. So hold them accountable, but be ready to be held accountable, too.
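
For context, the canonical implementation discussed here usually amounts to a single link element in the page head. A minimal sketch with a placeholder URL:

    <head>
      <!-- Tells search engines which URL is the preferred version of this page. -->
      <link rel="canonical" href="https://www.example.com/products/blue-widget">
    </head>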

BARTOSZ: Yeah. Just wanted to hop on. Very, very often, we don’t know the stuff. We have five calls per day with five different stacks, some we never heard about. So if we don’t know something, we’re very open about it. We had a WAMP PWA recently, and I was like, I have never heard of WAMP. We just went and read the documentation, came back, and scheduled a call. Now we can talk in the same language. Again, this is an ego thing. Don’t assume that you need to know everything. I’m very open about things I don’t know. There are tons of them.

MARTIN: I mean, even I was asked questions recently at a Core Web Vitals session that I didn’t know the answer to. I won’t give you a random non-answer or try to muddle my way through. I just say, I don’t know, but I can check.

BARTOSZ: Yeah. And one thing that I feel is a deal-breaker between devs and SEOs a lot of the time, and we were guilty of this back in the day as well: devs will ask you a question like, why shouldn’t we just point canonicals at the page that’s most important, so we sculpt, we push the most important page with different canonicals, because this is just like a link? Both SEOs and devs would have all these ideas on how to cheat the search engine. As an SEO, you have to explain that step by step. But if we’re talking about SEOs and devs, we need to leave all of the conspiracy theories and urban legends behind. Just fall back on documentation, and that’s it. Because as soon as you open the door to “if we point some canonicals here, or if we do this, or if we do that, some things might happen; we heard about it; we tested that”, this is going to put you in the shoes of those snake oil salesmen. So be technical. If you’re talking to devs, be technical. Drop all of that. Even if you deeply believe it is the case, I think it opens the door to what we as SEOs want to run away from.

MARTIN: Another thing: if you are not very technical, that is perfectly OK. But then don’t try to find solutions, because you are in territory where you are not necessarily experienced. Instead, present the problem, and then work with the development team to solve it. Don’t try to come up with a solution yourself, because it’s likely that your solution will not work in the tech stack or the environment that your development team works in. And if you get attached to that solution, you’re like, but why can’t we just do it like this? It might not get through to the development team the right way, and it might feel like you are obsessed with something rather than actually trying to solve the problem. And solving the problem is what development teams need to do. So you want to be on their side, and it’s OK to go the way together: to research options, to experiment with things together. But trying to come up with a turnkey solution for development teams usually backfires.

The start of technical SEO and web developers working together

(32:30) BARTOSZ: Just one thing that I hope is going to clear the air a little bit. Technical SEO is fairly new; it’s maybe three, four, or five years old. Obviously, some will argue with that: “I was doing technical SEO in 1993.” But it’s fairly new in the sense that it is only now getting so popular, so needed; it’s almost essential. So this is a brand new field. And if you look at it that way and drop all of the history, this gets really exciting. Because I would assume that devs and SEOs will only get closer over the next few years, because we’re all seeing the need for technical SEO.

MARTIN: I think that would clear the air. And for all of those on the SEO side who are scared and confused now: don’t be. You get to choose whether you want to get into technical SEO or not. Technical SEO is a field of its own. It is complex. It is big. It is broad. It is new. It is fresh. But content is still an important field, and all the other aspects of SEO will not become irrelevant. It will continue to be a broad field. You get to pick your niche. But if you want to go technical, do it right. I think that’s a very, very nice way of looking at it.

BARTOSZ: And go technical.

MARTIN: Go technical. Awesome. Thank you so much, Bartosz, for joining me in this conversation. I think this was really interesting and insightful. And I hope to see more from you guys and also to hear what the community is saying about these things as well looking forward. Stay safe, stay healthy, everybody.

Sign up for SEO & Web Developers today!

GET IN CONTACT TODAY AND LET OUR TEAM OF ECOMMERCE SPECIALISTS SET YOU ON THE ROAD TO ACHIEVING ELITE DIGITAL EXPERIENCES AND GROWTH 

What Happens When you Stop Doing SEO?

The 2020s are shaping up to be quite the outlier from decades past. In the last two years, we saw significant growth in and focus on digital channels off the back of at-home pandemic buying; now, with consumer confidence dipping, we’re seeing growth slow, and marketers are increasingly focused on the effectiveness of their channel mix and may be considering where they can consolidate or reduce spend.

Business owners and marketers alike frequently wonder where SEO should sit – is it eCommerce, marketing or IT? Should it be a sustained marketing cost once we are happy with our visibility for core search terms? When is SEO’s job done?

SEO can sit anywhere in an organisation, but it makes the most sense for it to sit close to content, technical implementation and website changes, and new product launches. SEO is the art of being the least imperfect player in the search results, so there is always work to do. Following core category content optimisation and technical audit fixes, research can uncover opportunities to develop content earlier in the journey, capturing more users for paid search and email audiences to nurture into customers down the line. In this way, the SEO job is never done, and its absence, whether in activity or advisory, can see good growth come undone.

HERE ARE A FEW WATCH OUTS WORTH CONSIDERING IF SEO IS NEGLECTED:

  1. You may lose keyword growth momentum – Google values freshness, and algorithm updates happen all the time. If you’re not pruning and cultivating a healthy website and fresh content, you may fall out of favour and see rankings you worked hard on decline.
  2. Competitors can outperform your website by continuing optimisation work – as we said before, SEO is about being the least imperfect, so if you’re not investing time and effort, you can expect competitors who are to overtake you.
  3. You may take a significant hit to your organic revenue – if you lose crucial Page 1 keywords to a competitor, their brand may be considered over yours, and this can affect your bottom line, as organic search commonly generates 35-60% of a company’s revenue.
  4. All websites aren’t created equally, and neither are their budgets – unlike paid search, it’s difficult to gauge how much your competitors are investing in SEO. Content and link velocity, alongside internal team growth, are a good way to compare yourself to your competitors. A good SEO partner should be able to provide you with this view and help you outsmart your competitors where you can’t outspend them.

Get in touch for an obligation-free discussion with our growth strategists to find out how we can help your company take the LION’s share of the market online. We have achieved great results in visibility, visitation and revenue growth, which you can find in the case studies section of our website.

GET IN CONTACT TODAY AND LET OUR TEAM OF ECOMMERCE SPECIALISTS SET YOU ON THE ROAD TO ACHIEVING ELITE DIGITAL EXPERIENCES AND GROWTH

Contact Us

Article by

Leonidas Comino – Founder & CEO

Leo is a Deloitte award-winning and Forbes-published digital business builder with over a decade of success in the industry working with market-leading brands.

Like what we do? Come work with us

Shopify Announces Launch of YouTube Shopping

Shopify announced the launch of YouTube Shopping this week, outlining benefits including:

  • Customers can buy products right when they discover them
  • Instantly sync products to YouTube
  • Create authentic live shopping experiences
  • Sell on YouTube, manage on Shopify

What does this mean for our clients?

There are some eligibility restrictions for this product at the moment. You must already have 1,000 subscribers to your YouTube channel and at least 4,000 hours of annual watch time. This means that, as a brand, you will need an already well-established YouTube channel, or you will need to start working with content creators who have one.

Consider content creators who align with your brand or category and research their channels and content. There are specific websites and agencies that can help source content creators for a fee, including theright.fit and hypeauditor.com

YOUTUBE FOR ACTION WITH PRODUCT FEEDS

For clients who don’t meet the eligibility requirements but still want to explore video for retail, there is another option. YouTube for Action campaigns allow us to promote videos on the YouTube network and attach a product feed through Google Merchant Center, creating a virtual shopfront for the viewer with easy “shop now” functionality.

This powerful format allows brands to generate both awareness of and engagement with their brand, whilst also driving bottom-line sales. It can be managed through your Google Ads account, allowing you to optimise towards the same conversions and use the same audience signals as your other Google campaigns.

What is YouTube for Action?

Previously named TrueView for Action, this product allows advertisers to buy video ads on the YouTube network that are optimised towards a performance goal rather than pure reach or video views.

You can optimise towards:

  • Website traffic
  • Leads
  • Sales/Purchases

And you have the option to choose your bid strategy based on:

  • Cost per View
  • Cost per Action
  • Maximise Conversions
  • Cost per thousand impressions

Who can I target?

YouTube and Google’s shared data provide a wealth of information to help us build audience segments that will fit your brand and services. The options include but are not limited to:

  • Demographic targeting: Age, gender, location – based on signed-in user data
  • Affinity audiences: Pre-defined interest/hobby and behavioural segments based on users’ browsing history
  • In-market audiences: Users deemed to be “in-market” for a product or service based on their search behaviour and browsing history
  • Life events: Based on what a user is actively researching or planning, e.g. graduation, retirement, etc.
  • Topics: Align your ads with video content on similar themes across the YouTube network
  • Placement: Align your ads with specific YouTube channels, specific websites, or content on those channels/websites
  • Keyword: Similarly to search, build portfolios of keywords to target specific themes on YouTube

The team at LION will work with you to select and define the right audiences to test and optimise to get the best results.

What content should I use?

As with any piece of content, there is no right or wrong answer, and what works for some brands may not work for others. Your video should align with your brand tone of voice and guidelines.

Think about what action you want the users to take and ensure the video aligns with this, e.g. if you want users to buy a specific product, show the product in the video and talk about its benefits. Testing multiple types of video content is the best way to learn about what your potential customers like and do not like.

What do I need to get started?

  1. At least one video uploaded to YouTube (we recommend 30 seconds in length)
  2. A Google Merchant Center account & a Google Ads account
  3. A testing budget of at least $1,000

YOU CAN CHAT WITH THE TEAM AT LION DIGITAL AND WE CAN HELP YOU TO SELECT AND DEFINE THE RIGHT AUDIENCES TO TEST AND OPTIMISE TO GET THE BEST RESULTS

LION stands for Leaders In Our Niche. We pride ourselves on being true specialists in each eCommerce marketing channel. LION Digital has a team of certified experts and department heads with 10+ years of experience in eCommerce and SEM. We follow an ROI-focused approach to paid search backed by seamless coordination and detailed reporting, helping our clients meet their goals.

GET IN CONTACT TODAY AND LET OUR TEAM OF ECOMMERCE SPECIALISTS SET YOU ON THE ROAD TO ACHIEVING ELITE DIGITAL EXPERIENCES AND GROWTH

Contact Us

Article by

Leonidas Comino – Founder & CEO

Leo is a Deloitte award-winning and Forbes-published digital business builder with over a decade of success in the industry working with market-leading brands.

Like what we do? Come work with us

WEBMASTER HANGOUT – LIVE FROM JULY 01, 2022

Which number is correct, PageSpeed Insights or Search Console?

Q: (00:30) Starting off, I have one topic that has come up repeatedly recently, and I thought I would try to answer it in the form of a question while we’re at it here. So, first of all, when I check my page speed insight score on my website, I see a simple number. Why doesn’t this match what I see in Search Console and the Core Web Vitals report? Which one of these numbers is correct?

  • (01:02) I think maybe, first of all, to get the obvious answer out of the door: there is no correct number when it comes to speed, when it comes to understanding how your website is performing for your users. In PageSpeed Insights, by default, I believe we show a single number that is a score from 0 to 100, something like that, which is based on a number of assumptions, where we assume that different things are a little bit faster or slower for users, and based on that, we calculate a score. In Search Console, we have the Core Web Vitals information based on three numbers: loading speed, interactivity, and visual stability. And these numbers are slightly different because it’s three numbers, not just one. But, also, there’s a big difference in the way these numbers are determined. Namely, there’s a difference between so-called field data and lab data. Field data is what users actually see when they go to your website, and this is what we use in Google Search Console; that’s what we use for Search, as well. Whereas lab data is a theoretical view of your website, where our systems have certain assumptions – they think, well, the average user is probably like this, using this kind of device, with this kind of a connection, perhaps – and based on those assumptions, we estimate what those numbers might be for an average user. And you can imagine those estimations will never be 100% correct. Similarly, the data that users have seen will change over time, as well, where some users might have a really fast connection or a fast device, and everything goes really fast when they visit your website, and others might not have that. And because of that, this variation can always result in different numbers. Our recommendation is generally to use the field data – the data you would see in Search Console – as a way of understanding the current situation for your website, and then to use the lab data, namely the individual tests that you can run yourself directly, to optimise your website and try to improve things. And when you are pretty happy with the lab data you’re getting with the new version of your website, then over time, you can collect the field data, which happens automatically, and double-check that users see it as being faster or more responsive, as well. So, in short, again: there is no absolutely correct number when it comes to any of these metrics. There is no absolutely correct answer where you’d say this is what it should be. Instead, there are different assumptions and ways of collecting data, and each is subtly different.
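
For reference, the field-versus-lab distinction is visible in the public PageSpeed Insights API, which returns both datasets in a single response (the endpoint and field names below are from the v5 API; the URL queried is illustrative):

```
# Ask PageSpeed Insights for a report on a page
curl "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=https://example.com/"

# In the JSON response:
#   "loadingExperience" -> field data: real Chrome user measurements,
#                          the same kind of data Search Console reports
#   "lighthouseResult"  -> lab data: a single simulated page load under
#                          an assumed device and connection
```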

How can our JavaScript site get indexed better?

Q: (04:20) So, first up, we have a few custom pages using Next.js without a robots.txt or a sitemap file. Simplified, theoretically, Googlebot can reach all of these pages, but why is only the homepage getting indexed? There are no errors or warnings in Search Console. Why doesn’t Googlebot find the other pages?

  • (04:40) So, maybe taking a step back, Next.js is a JavaScript framework, meaning the whole page is generated with JavaScript. But as a general answer, as well, for all of these questions like, why is Google not indexing everything? It’s important first to say that Googlebot will never index everything across a website. I don’t think it happens to any kind of non-trivial-sized website where Google would completely index everything. From a practical point of view, it’s impossible to index everything across the web. So that kind of assumption that the ideal situation is everything is indexed, I would leave that aside and say you want Googlebot to focus on the important pages. The other thing, though, which became a little bit clearer when, I think, the person contacted me on Twitter and gave me a little bit more information about their website, was that the way that the website was generating links to the other pages was in a way that Google was not able to pick up. So, in particular, with JavaScript, you can take any element on an HTML page and say, if someone clicks on this, then execute this piece of JavaScript. And that piece of JavaScript can be used to navigate to a different page, for example. And Googlebot does not click on all elements to see what happens. Instead, we go off and look for normal HTML links, which is the kind of traditional way you would link to individual pages on a website. And, with this framework, it didn’t generate these normal HTML links. So we could not recognise that there’s more to crawl and more pages to look at. And this is something that you can fix in how you implement your JavaScript site. We have a tonne of information on the Search Developer Documentation site around JavaScript and SEO, particularly on the topic of links because that comes up now and then. There are many creative ways to create links, and Googlebot needs to find those HTML links to make them work. Additionally, we have a bunch of videos on our YouTube channel. And if you’re watching this, you must be on the YouTube channel since nobody is here. If you’re watching this on the YouTube channel, go out and check out those JavaScript SEO videos on our channel to get a sense of what else you could watch out for when it comes to JavaScript-based websites. We can process most kinds of JavaScript-based websites normally, but some things you still have to watch out for, like these links.
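
A minimal sketch of the difference described above (the router call is a hypothetical client-side navigation helper, not a specific framework API):

```
<!-- Googlebot discovers and follows normal HTML links like this one -->
<a href="/products/blue-widget">Blue widget</a>

<!-- Googlebot does not click on elements, so the target page is never discovered -->
<span onclick="router.navigate('/products/blue-widget')">Blue widget</span>
```

Frameworks that render real `<a href>` elements – even when they intercept the click for client-side routing – avoid this problem entirely.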

Does it affect my SEO score negatively if I link to HTTP pages?

Q: (07:35) Next up, does it affect my SEO score negatively if my page links to an external insecure website?

  • (07:44) So on HTTP, not HTTPS. So, first off, we don’t have a notion of an SEO score, so you don’t have to worry about any kind of SEO score. But, regardless, I understand the question as: is it bad if I link to an HTTP page instead of an HTTPS page? And, from our point of view, it’s perfectly fine. If these pages are on HTTP, then that’s what you would link to. That’s what users would expect to find. There’s nothing against linking to sites like that, and there’s no downside for your website, so there’s no need to avoid linking to HTTP pages just because they’re kind of old or crusty and not as cool as HTTPS. I would not worry about that.

Q: (08:39) With semantic and voice search, is it better to use proper grammar or write how people actually speak? For example, it’s grammatically correct to write, “more than X years,” but people actually say, “over X years,” or to write a list beginning with, “such as X, Y, and Z,” but people actually say, “like X, Y, and Z.”

  • (09:04) Good question. So the simple answer is, you can write however you want. There’s nothing holding you back from just writing naturally. And essentially, our systems try to work with the natural content found on your pages. So if we can crawl and index those pages with your content, we’ll try to work with that. And there’s nothing special that you need to do there. The one thing I would watch out for, with regards to how you write your content, is just to make sure that you’re writing for your audience. So, for example, if you have some very technical content, but you want to reach people who are non-technical, then write in the non-technical language and not in a way that is understandable to people who are deep into that kind of technical information. So kind of the, I would guess, the traditional marketing approach of writing for your audience. And our systems usually are able to deal with that perfectly fine.

Should I delete my disavow file?

Q: (10:20) Next up, a question about links and disavows. Over the last 15 years, I’ve disavowed over 11,000 links in total. I never bought a link or did anything disallowed, like link sharing. The links that I disavowed may have been from hacked sites or from nonsense, auto-generated content. Since Google now claims that it has better tools to not factor these types of hacked or spammy links into its algorithms, should I just delete my disavow file? Is there any risk, upside, or downside to just deleting it?

  • (10:54) So this is a good question. It comes up now and then. And disavowing links is always kind of one of those tricky topics because it feels like Google is probably not telling you the complete information. But, from our point of view, we do work hard to avoid taking this kind of link into account. And we do that because we know that the disavow links tool is a niche tool, and SEOs know about it, but the average person who runs a website doesn’t know about it. And all those links you mentioned are the links that any website gets over the years. And our systems understand that these are not things you’re trying to do to game our algorithms. So, from that point of view, if you’re sure that there’s nothing around a manual action that you had to resolve with regards to these links, I would just delete the disavow file and move on with life and leave all of that aside. I would personally download it and make a copy so that you have a record of what you deleted. But, otherwise, if you’re sure these are just the normal, crusty things from the internet, I would delete it and move on. There’s much more to spend your time on when it comes to websites than just disavowing these random things that happen to any website on the web.
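
For anyone who does decide to keep (or trim) a disavow file before deleting it, the format is plain text with one entry per line and `#` for comments – the domains and URLs below are illustrative:

```
# Disavow an entire domain
domain:spammy-link-network.example

# Disavow a single URL
https://hacked-site.example/old-page.html
```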

Can I add structured data with Google Tag Manager?

Q: (12:30) Adding schema markup with Google Tag Manager – is that good or bad for SEO? Does it affect ranking?

  • (12:33) So, first of all, you can add structured data with Google Tag Manager. That’s an option. Google Tag Manager is a small piece of JavaScript you add to your pages, which you then configure separately, and it can modify your pages slightly using JavaScript. For the most part, we’re able to process this normally, and structured data generated like that can be counted just like any other structured data on your web pages. From our point of view, structured data – at least the types that we have documented – is primarily used to help generate what we call rich results, which are these fancy search results with a little bit more information, a little bit more colour or detail around your pages. And if you add your structured data with Tag Manager, that’s perfectly fine. From a practical point of view, I prefer to have the structured data on the page or your server so that you know exactly what is happening. It makes it a little bit easier to debug things. It makes it easier to test things. So trying it out with Tag Manager is, I think, legitimate – it’s an easy way to try things out. But, in the long run, I would make sure that your structured data is on your site directly, so that it’s easier to process for anyone who comes by to process your structured data, and easier for you to track, debug, and maintain over time, so that you don’t have to check all of these different separate sources.
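
As a point of comparison, this is roughly what structured data served directly in the page source looks like – a minimal JSON-LD sketch with illustrative values:

```
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Blue Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "120"
  }
}
</script>
```

Served this way, the markup is visible in the raw HTML, which makes it easier to test and debug than markup injected at runtime by Tag Manager.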

Is it better to block by robots.txt or with the robots meta tag?

Q: (14:20) Simplifying a question a little bit: which is better, blocking with robots.txt or using the robots meta tag on the page? How do we best prevent crawling?

  • (14:32) So this also comes up from time to time. We did a podcast episode about this recently, as well, so I would check that out. The podcasts are also on the YouTube channel, so you can click around a little bit, and you’ll probably find them quickly. In practice, there is a subtle difference here, and if you’re in SEO and you’ve worked with search engines, you probably understand it already. But for people who are new to the area, it’s sometimes unclear exactly where these lines are. With robots.txt, the first one mentioned in the question, you can essentially block crawling – you can prevent Googlebot from even looking at your pages. And with the robots meta tag, you can do things like blocking indexing: Googlebot looks at your pages and sees the robots meta tag. In practice, both of these result in your pages not appearing in the search results, but they’re subtly different. If we can’t crawl, we don’t know what we’re missing. And it might be that we say, well, there are many references to this page; maybe it is useful for something, we just don’t know. And then that URL could appear in the search results without any of its content, because we can’t look at it. Whereas with the robots meta tag, if we can look at the page, then we can look at the meta tag and see if there’s a noindex there, for example. Then we stop indexing that page and drop it completely from the search results. So if you’re trying to block crawling, then definitely, robots.txt is the way to go. If you just don’t want the page to appear in the search results, I would pick whichever is easier for you to implement. On some sites, it’s easier to set a checkbox saying I don’t want this page found in Search, and that adds a noindex meta tag. For others, maybe editing the robots.txt file is easier. It kind of depends on what you have there.
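
To make the distinction concrete, here are minimal sketches of both mechanisms (paths are illustrative):

```
# robots.txt – blocks crawling: Googlebot never fetches these URLs,
# but they can still appear in results as bare URLs without content
User-agent: *
Disallow: /private/
```

```
<!-- robots meta tag – allows crawling, blocks indexing: the page is
     fetched, the noindex is seen, and the page is dropped from results -->
<meta name="robots" content="noindex">
```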

Q: (16:38) Are there any negative implications to having duplicate URLs with different attributes in your XML sitemaps? For example, one URL in one sitemap with an hreflang annotation and the same URL in another sitemap without that annotation.

  • (16:55) So maybe, first of all, from our point of view, this is perfectly fine. This happens now and then. Some people have hreflang annotations separated out into their own sitemap files, and then they have a normal sitemap file for everything, and there is some overlap there. From our point of view, we process these sitemap files as we can, and we take all of that information into account. There is no downside to having the same URL in multiple sitemap files. The only thing I would watch out for is that you don’t have conflicting information in these sitemap files. So, for example, if with the hreflang annotations you’re saying, oh, this page is for Germany, and then in the other sitemap file you’re saying, well, actually this page is also for France or in French, then our systems might be like, well, what is happening here? We don’t know what to do with this mix of annotations, and then we may pick one or the other. Similarly, if you say this page was last changed 20 years ago – which doesn’t make much sense, but say you do – and in the other sitemap file you say, well, actually, it was five minutes ago, then our systems might look at that and say, well, one of you is wrong; we don’t know which one. Maybe we’ll follow one or the other; maybe we’ll just ignore that last modification date completely. So that’s the thing to watch out for. But otherwise, if a URL is just mentioned in multiple sitemap files and the information is either consistent or works together – in that maybe one has the last modification date and the other has the hreflang annotations – that’s perfectly fine.
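
For reference, this is roughly what a sitemap entry with hreflang annotations looks like (URLs and language codes are illustrative; note the extra xhtml namespace declaration):

```
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.com/de/page</loc>
    <xhtml:link rel="alternate" hreflang="de" href="https://example.com/de/page"/>
    <xhtml:link rel="alternate" hreflang="fr" href="https://example.com/fr/page"/>
  </url>
</urlset>
```

If the same URL also appears in a second, plain sitemap, that is fine – just keep any lastmod or hreflang information consistent between the two files.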

How can I block embedded video pages from getting indexed?

Q: (19:00) I’m in charge of a video replay platform, and simplified, our embeds are sometimes indexed individually. How can we prevent that?

  • (19:10) So by embeds – I looked at the website, and basically, these are iframes that include a simplified HTML page with a video player embedded. And, from a technical point of view, if a page has iframe content, then we see those two HTML pages. And it is possible that our systems index both HTML pages because they are separate: one is included in the other, but they could theoretically stand on their own, as well. And there’s one way to prevent that, which is a reasonably new combination of robots meta tags: the indexifembedded robots meta tag together with a noindex robots meta tag. On the embedded version – the HTML file with the video directly in it – you would add the combination of noindex plus indexifembedded robots meta tags. And that would mean that, if we find that page individually, we would see, oh, there’s a noindex; we don’t have to index this. But the indexifembedded essentially tells us that if we find this page with the video embedded within the general website, then we can index that video content. Which means that the individual HTML page would not be indexed, but the page that embeds the video would be indexed normally, video content included. So that’s the setup I would use there. And this is a fairly new robots meta tag, so it’s not something everyone needs, because this combination of iframe or embedded content is kind of rare. But, for some sites, it just makes sense to do it like that.
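
A minimal sketch of the combination described above, placed on the embeddable player page itself (it can equally be served as an X-Robots-Tag HTTP header):

```
<!-- On the standalone embed page: don't index this page on its own,
     but allow its content to be indexed when embedded via iframe -->
<meta name="robots" content="noindex, indexifembedded">
```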

Q: (21:15) Another question about HTTPS, maybe. I have a question around preloading SSL via HSTS. We are running into an issue when implementing HSTS and adding our site to the Google Chrome preload list. The question goes on with a lot of details, but what should we watch out for?

  • (21:40) So maybe to take a step back: when you have HTTPS pages and an HTTP version, usually you would redirect from the HTTP version to HTTPS. And the HTTPS version would then be the secure version, because that has all of the properties of secure URLs, whereas the HTTP version would be the open one, a little bit vulnerable. And if you have this redirect, theoretically, an attacker could take that into account and mess with that redirect. With HSTS, you’re telling the browser that once it has seen this redirect, it should always expect that redirect, and it shouldn’t even try the HTTP version of that URL. For users, that has the advantage that nobody even goes to the HTTP version of that page anymore, making it a little more secure. And the preload list for Google Chrome is a static list that is included, I believe, in Chrome, probably in all of the updates – or I don’t know if it’s downloaded separately, not completely sure. But, essentially, this is a list of all of the sites where we have confirmed that HSTS is set up properly and that the redirect to the secure page exists, so that no user ever needs to go to the HTTP version of the page, which makes it a little bit more secure. From a practical point of view, this difference is very minimal, and I would expect that most sites on the internet just use HTTPS without worrying about the preload list. Setting up HSTS is always a good practice, and it’s something that you can do on your server; as soon as a user sees that header, their Chrome keeps it in mind automatically anyway. So from a general point of view, I think using the preload list is a good idea if you can do that. But if there are practical reasons why that isn’t feasible or possible, then, looking purely at the SEO side of things, I would not worry about it. When it comes to SEO, for Google, what matters is essentially the URL that is picked as the canonical. And, for that, it doesn’t need HSTS; it doesn’t need the preload list. That does not affect how we pick the canonical at all. Rather, for the canonical, the important part is that we see the redirect from HTTP to HTTPS, and we can get confirmation within your website – through the sitemap file, the internal linking, all of that – that the HTTPS version is the one that should be used in Search. And if we use the HTTPS version in Search, that automatically gets all of those subtle ranking bonuses. The preload list and HSTS are not necessary there. So that’s the part I would focus on.
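
For context, a typical HSTS setup is the server-side redirect plus one response header on the HTTPS site; the `preload` token is what the Chrome preload list requires before a site can be submitted (values are illustrative):

```
# 1. Server-side redirect from HTTP to HTTPS
#    (this redirect is what matters for Google's canonical selection)
#    e.g. HTTP/1.1 301 Moved Permanently -> https://example.com/...

# 2. HSTS response header served on the HTTPS version
Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
```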

How can I analyse why my site dropped in ranking for its brand name?

Q: (25:05) I don’t really have a great answer, but I think it’s important to at least mention: what are the possible steps for investigation if a website owner finds their website is not ranking for their brand term anymore, they’ve checked everything, and it doesn’t seem to be related to any of the usual things?

  • (25:24) So, from my point of view, I would primarily focus on the Search Console or Search Central Help Community and post all of your details there. Because this is where all of those escalations go, and the product experts in the Help forum can take a look at that. And they can give you a little bit more information. They can also give you their personal opinion on some of these topics, which might not match 100% what Google would say, but maybe they’re a little bit more practical. For example – probably not relevant to this site – you might post something and say, well, my site is technically correct, and post all of your details. And one of the product experts looks at it and says it might be technically correct, but it’s still a terrible website; you need to get your act together and create better content. From our point of view, we would focus on technical correctness, and you need someone to give you that, I don’t know, personal feedback. But anyway, in the Help forums, if you post the details of your website with everything that you’ve seen, the product experts are often able to take a look and give you some advice on, specifically, your website and the situation it’s in. And if they’re not able to figure out what is happening there, they also have the ability to escalate these kinds of topics to the community manager of the Help forums. And the community manager can bring things back to the Google Search team. So if there are things that are really weird – and now and then, something really weird does happen with regards to Search; it’s a complex computer system, and anything can break – the community managers and the product experts can bring that back to the Search team. And the Search team can look to see if there is something that we need to fix, or something that we need to tell the site owner, or whether this is just the way that it is, which, sometimes, it is. But that’s generally the direction I would go for these questions. The other thing subtly mentioned here is that the site does not rank for its brand name. One of the things to watch out for, especially with regards to brand names, is that it can happen that you say something is your brand name, but it’s not a recognised term for users. For example – I don’t know – you might call your website bestcomputermouse.com. And, for you, that might be what you call your business or your website: Best Computer Mouse. But when a user goes to Google and enters “best computer mouse,” that doesn’t necessarily mean they want to go directly to your website. It might be that they’re looking for a computer mouse. And, in cases like that, there might be a mismatch between what we show in the search results and what you think should be shown for those queries, if it’s something more of a generic term. And these kinds of things also play into search results overall. The product experts see these all the time, as well. And they can recognise that and say, actually, just because you call your website bestcomputermouse.com – and I hope that site doesn’t exist – doesn’t necessarily mean it will always show on top of the search results when someone enters that query. But that’s something to watch out for. In general, though, I would go to the Help forums here and include all of the information you know that might play a role.
So include everything, even if there was a manual action involved and you’re kind of, I don’t know, ashamed of that – which is normal. All of this information helps the product experts better understand your situation and give you something actionable to take as a next step, or to understand the situation a little bit better. So the more information you can give them from the beginning, the more likely they’ll be able to help you with your problem.

Sign up for our Webmaster Hangouts today!

GET IN CONTACT TODAY AND LET OUR TEAM OF ECOMMERCE SPECIALISTS SET YOU ON THE ROAD TO ACHIEVING ELITE DIGITAL EXPERIENCES AND GROWTH 

SEO & web developers: friends or foes?

Is SEO just another constraint?

(0:59) MARTIN: So you are saying that for you as a web developer, SEO is just another constraint. What do you mean by that?

SURMA: I mean, as far as web development goes – and I am pretending to be a representative of all web developers here, which I’m clearly not – the usual struggle involves stuff like: how many browsers do I support, or how far back in browser versions do I go? Do I support IE11? Do I not support IE11? How do I polyfill certain things? Do I polyfill them, or do I use progressive enhancement? What implications do both of these choices have? Do we make the design fully responsive? Do we do a separate mobile site? There are all these choices that I already have to make. And then frameworks come along and are like: we’re just going to do most of the stuff on the client side because we want to write a single-page app. And then you either have to say, do I set up something for server-side rendering or static rendering at build time, or do I go all-in on the single-page app, where everything happens on the client? What do search engines think about that? And then search engines come in like, well, you should be doing this, and you should not be doing that because that gets penalised. And actually, we don’t even understand what you’re doing here because search engines are running Chrome 12 or something. It’s yet another constraint that I have to balance and make decisions on – whether following their restrictions is more important than following my business restrictions, my marketing restrictions, or my project management restrictions. And I feel like some people are frustrated about that sometimes.

MARTIN: I remember when I was a web developer, there was also this entire “user first – no, mobile first – no, content first – no, this first – no, that first.” That’s probably going in the same direction, and I understand the frustration. And I see that there are lots of requirements, and sometimes these requirements might even contradict each other. But I think as developers, we should understand what SEOs are trying to help us with and what search engines, separately from what we are building and doing, are actually trying to accomplish. Then we would see that all of these requirements are important, but maybe some of them are more important than others, and they are important at different stages, I would say. So, for instance, you mentioned responsive design versus having separate mobile versions, right? I would say that’s a decision that you need to make relatively early on in the technology process. Whereas “should I use this feature, should I polyfill this feature, or should I not use it because I need to support an old browser that doesn’t support it and the polyfill is tricky” – that’s something that probably happens a little later in development, right?

SURMA: Yeah, I think I agree with that. It depends on how much flexibility you’re given as a developer. I think we all may or may not have lived through the experience of working with a designer who insists on a pixel-perfect design, which is just not something that works on the web, but sometimes you’re given a task, and your job is to complete it and not have an opinion. I don’t want to go down the “it depends” route. But in the end, whatever we end up talking about, we probably won’t find a definitive answer. Context matters, and everyone has different constraints. And I think that’s really what it’s about: you need to be aware of the constraints that you have and make decisions in the context of your situation.

SEO – friend, not foe

(04:41) SURMA: You mentioned something that I find quite interesting. You said SEOs are trying to help us with something because often they’re kind of like villains, almost like the people who just try to get you to the top of the rankings, whether you deserve it or not. But in the end, I feel like there is help going on. Both search engines, as well as the people that want you to do well in the search results, actually are trying to make your site better in the end. Like no search engine actually wants to give you results that are bad. That just doesn’t make sense. In the end, search engines are trying to put the best results on top. And if an SEO helps you get on top, then ideally, what that means is your site has gotten better. 

MARTIN: Yes, exactly. And I love that you’re saying, like, oh yeah, you have to look at the context; you have to understand the constraints. And that’s actually something that a good SEO will help you with, because if you look at it from a purely SEO standpoint, depending on what you’re building and how you’re building it, you might have different priorities. So, for instance, say this is a test version of a landing page. We just want to see if the customer segment is even interested in what we are potentially building later on, and you don’t want to build for the bin, right? You don’t want to build something and then, later on, find out it doesn’t actually work because there’s no interest in it. So for these things, SEO might be relatively important, because you definitely want people to find it so that you get enough data to make decisions later on. But you might not be so constrained in terms of, oh yeah, this has to be client-side versus server-side; we don’t really have to make this super fast. We just have to get this in front of people, especially through search engines, so that we get some meaningful data to make decisions later on. Versus you’re building and improving on an existing product, and that should be longer-lived.

Building better websites for everyone

(6:33) MARTIN: So, a good SEO helps you understand what kind of requirements you should take into account. SEO is a gigantic field, and a good SEO should pick the bits and pieces that actually matter for your specific context. So you said, oh, we want to build a single-page application. Maybe. Maybe you do, maybe you don’t. Maybe it’s fine to do client-side rendering, but maybe consider doing some bits and pieces of server-side rendering, because you reap some performance benefits there. And that also influences SEO because, as you say, search engines want to find the good things. So making your site better includes making it faster but also making it accessible, because if you think about it, search engines are computers interacting with your website, working through your website and trying to understand what your website says. So they have basic accessibility needs. They don’t necessarily interact with things. They don’t click on stuff. And yet they need to work with your content, so it should be accessible to them. And SEOs will point these things out.

SURMA: That’s really interesting that you bring that up, because I was just thinking about both performance – loading performance, for example – and accessibility. So, on the one hand, it’s kind of accepted that loading performance is important. But now, for example, we have Core Web Vitals. And one of their core statements is that they don’t want to just put a number on anything that’s measurable; they want to measure things that are important to user experience. And so the Core Web Vitals that we currently have – which is just three metrics: LCP, CLS, and FID – all of these are statistically correlated with users enjoying a site more or staying on a site longer. And that means if you optimise for those, you actually will get something out of it: you will get users that stay longer. And now that Search is looking into those, it means optimising for those metrics not only potentially gets you higher in the rankings but also means the people who do see your site will most likely stay longer or engage with it more, because we know that these metrics correlate with user behaviour. And I think that’s a really interesting approach, where, in the end, search engines are actually helping you do the right thing. And now I’m wondering – accessibility is something which we keep talking about, and we know it’s important, and yet it feels like it always falls off the truck. In many projects, it’s an afterthought, even though many people know it needs to be considered from the very beginning of a project, because it’s hard to shoehorn in at the end. It needs to be something that works from the start. Has any search engine ever done anything in this space to help developers be better with accessibility?

MARTIN: We are more or less going in that direction – not necessarily from a purely accessibility standpoint, but as search engines need to semantically understand what a site is about, we don’t just take the text as plain text. We basically try to figure out: oh, so this is a section, this is a section, this is the section that is most important on the page, this is just an explainer for the previous section, and so on and so forth. For that, we need the semantics that HTML gives us. And these semantics are often also important for accessibility, because people need to be able to navigate your content differently – maybe with a keyboard, maybe with a screen reader. And for that, the semantics on the page need to be in place from the get-go, basically. So in that direction, having better semantics does help search engines better understand your content and, as a byproduct, also helps people with additional navigational needs better navigate your content. So you could say search engines are a little involved in terms of accessibility. That does not cover accessibility as a whole – there is so much more to accessibility than just that – but at least the core of semantics on the web is taken care of here.
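
A small sketch of the kind of semantic structure that helps both crawlers and assistive technology in the way described above (the element content is illustrative):

```
<main>
  <h1>Page topic</h1>
  <section>
    <h2>Most important section</h2>
    <p>Primary content…</p>
  </section>
  <aside>
    <p>An explainer for the section above.</p>
  </aside>
</main>
```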

Keeping up with web development trends is important for SEOs

(10:37) MARTIN: Another thing that I really found interesting is where you say, oh, you know, SEOs are often seen as just coming with all of these additional constraints and requirements. What is there that they could do differently that you would think would help you and other developers understand where they’re coming from or have a meaningful discussion about these things and turn that into a positive, constructive input?

SURMA: I don’t know if this is the answer you’re looking for, but one thing I have seen is that some SEOs need to put a bit more effort into staying up to date on what is and is not good guidance – or, more specifically, what search engines are and are not capable of processing correctly. I know that you have been fighting the “no, no, JavaScript is fine” fight for a long time now, but I think to this day, there are still some SEOs out there who insist that anything that is in JavaScript is invisible to the search engine. In general, I think it goes back to the trade-off thing, where web developers need to realise that SEOs are trying to help you be better, and SEOs need to realise that they can’t give advice as an “either you do this or you’re screwed” kind of approach. It’s a trade-off. You can say that this is one way you can make a site better, this is another way, and this is yet another thing you can do. And all of these things will accumulate to make your site better, ideally resulting in a higher rank. But it’s not an all-or-nothing approach. Sometimes certain constraints just outweigh other constraints, and you then make a decision to go with plan A rather than plan B, or stick with what you currently have. We have recently seen a lot of shifts from purely client-side apps to this hybrid approach, where the app is rendered on the server side or even at build time but then turns into a single-page app once loaded onto the client. And that has upsides, and it has downsides. We know that statically rendered content is very good for your first render – your largest contentful paint time goes down. But now we have this additional blob of JavaScript state that is somehow inserted into the document, and then often a full dynamic client-side re-render happens, which can create an iffy user experience at times. And all these things are working for or against you in certain aspects. And I think that’s just something that SEOs need to be mindful of as well: the developer cannot just follow everything that they say, because they’re not the only deciding force on a project. I’m not saying that all SEOs behave like this, of course, because I’m honestly quite inexperienced in working with an SEO directly. But just based on stories that I hear and people that I see on Twitter – it’s all a trade-off. And I think people just need to realise that, in 90% of cases, everyone is trying to do the best they can and do their job well. Just keep that in mind, and then probably find a solution that works for both or is a compromise.

Change your perspective

(13:57) MARTIN: Yeah. No, that makes perfect sense. And I wish that both SEOs and developers would look at it from a different perspective. Both SEOs and developers want to build something that people are using, right? You don’t want to build something that no one uses. That’s neither going to pay your bills for very long, nor is it going to make you happy – unlike seeing, oh yeah, we built something that helps many people. That was true for me: when I was a developer, I wanted to build things that have an impact, and that means they need to be used by someone. And if we are building something that we genuinely are convinced is a good thing, then that should be reflected by search engines agreeing and saying, oh yeah, this is a good solution to this problem or challenge that people might face, and thus wanting to showcase your solution, basically. But for that, there needs to be something that search engines can grasp and understand and look at and put into their index accordingly. So basically, they need to understand what the page is about, what it offers users, is it fast, is it slow, is it mobile-friendly, all these kinds of things. And SEOs are then the ones who help – because you as a developer are focused on making it work in all the browsers it needs to work in, making it fast, using all the best practices, tree shaking, bundle splitting, all that kind of stuff. And then SEOs come in and help you make sure that search engines understand what you’re building and can access what you’re building, and that you are following the few best practices that you might not necessarily be aware of yet. But you are right: for that, SEOs need to follow up-to-date best-practice guidance, and not all of them do. At the beginning of 2021, I ran a poll in one of our virtual events, asking if people were aware that Googlebot is now using an evergreen Chromium – so we are updating the Chromium instance that is used to render pages. And I think like 60% of the people were like, oh, I didn’t know that, even though we announced it in 2019 at Google I/O in May.

SURMA: How was that?

MARTIN: That was amazing. I mean, launching this has been a great initiative. But I’m surprised – I think we have gotten developers to notice it, but not necessarily all SEOs have. And there are things that are not necessarily easy, or not even your job as a developer, where SEOs can really help you or at least make the right connections for you. For instance, I know you built Squoosh.app, right?

SURMA: Well, not just me, but I was part of the team that built it.

MARTIN: Right. You were part of the team that built Squoosh.app. And I think Squoosh.app is awesome. For those who don’t know it, it’s a web app that allows you to experiment with different image compression settings and then get back the image that you put into the application – all in your browser, without having to install anything. And you basically get the best settings for the best gains in terms of file size, right? That’s roughly what it does.

SURMA: Yeah. It’s an image compressor, and you can fiddle with all the settings and try out the next-generation codecs that are now coming to the web. But yeah, you have more control than in any other image compression app that I know.

MARTIN: And it’s really, really cool, and I really admire the work that the engineers put into this – that all the developers put into this – to make it work so smoothly, so quickly, so nicely. It implements lots of best practices. But for a search engine, if you were to sell that as a product, this might not be very good. And that’s because, if you look at it, it’s an interface that allows me to drag and drop an image into it, and then it does a bunch of stuff in terms of user interface controls to fine-tune settings. But if I were a robot accessing that page, it’s a bunch of HTML controls, but not that much content, right?

SURMA: Agreed

MARTIN: So would you want to have to sit down and figure out how you would describe this? You probably don’t want to do all that work by yourself. You want to focus on building cool stuff with the new image algorithms and fine-tuning how to make the settings work better, or more accessible, or easier to understand, right? That’s where you want to focus.

SURMA: Yeah. And I think I actually would like to get help from someone who knows. Whether this site – like, I wouldn’t have been able to say. I think our loading performance is excellent, because we spent lots of time making it good and trying to pioneer some techniques. But I wouldn’t have been able to tell you whether it gets a good ranking from a search bot or a bad ranking, to be honest. I mean, the name is unique enough that it’s very Google-able, so I think even if it didn’t do so well, people would probably find it. But in the end, it’s actually a very interesting example, because you’re completely right. The opening page, because it’s about images, mostly consists of images. The only text we have on the landing page is the name, the file size of the demo images, and the licensing link. So there’s not much going on for a bot to understand what the site does – especially because, for something this specific, there’s not even much to do with semantic markup, as you said. Right, OK, cool, there’s an image and an input tag. You can drag and drop an image. But even the drag and drop is actually only communicated via the user interface, not via the markup. So yeah, that’s a really interesting example. I would have no idea how to optimise it. I would have probably said: meta description tag? I don’t know. And then John Mueller told me that apparently, we don’t pay attention to the meta description tag anymore.

MARTIN: Well, we do. It’s the keywords that we don’t.

SURMA: Oh, the keywords are the one. OK, I take that back. Yeah, exactly. So I think you’re right that it’s very easy for developers to guess what is good for SEO and what is bad, and it’s better to actually get input from someone who has put in the time to learn what is actually going on and keeps up to date with the most recent updates. As you say, people apparently don’t even know that Googlebot is now evergreen Chrome, which is amazing. So there are probably a lot of SEOs who go around saying, no, no, no, you can’t use Shadow DOM or something like that, even though the JavaScript actually works. I agree. Get someone who knows.

Making things on the web is a team sport

(20:26) SURMA: I mean, I’ve been saying that even as a very enthusiastic experimenter and web developer, one single person cannot really understand and use the entire web platform. It’s now so incredibly widespread in the areas that it covers. You can do Web Audio, WebAssembly, WebUSB, MIDI, and all these things. You will not have experience in all of them. And some of these, like WebGL, are huge rabbit holes to fall into. So pick some stuff, get good at it, and for the things you don’t know, get help – because otherwise, you’re going to work on half-knowledge that will very likely end up being counterproductive to what you’re trying to achieve.

Sign up for SEO & Web Developers today!

GET IN CONTACT TODAY AND LET OUR TEAM OF ECOMMERCE SPECIALISTS SET YOU ON THE ROAD TO ACHIEVING ELITE DIGITAL EXPERIENCES AND GROWTH 

WebMaster Hangout – Live from June 03, 2022

Can I use two HTTP result codes on a page?

Q: (01:22) All right, so the first question I have on my list here is: it’s theoretically possible to have two different HTTP result codes on a page, but what will Google do with those two codes? Will Google even see them? And if yes, what will Google do? For example, a 503 plus a 302.

  • (01:41) So I wasn’t aware of this. But, of course, with HTTP result codes, you can include lots of different things. Google will look at the first HTTP result code and essentially process that. And you can theoretically still have two or more HTTP result codes in a chain if there are redirects leading to some final page. So, for example, you could have a redirect from one page to another page – that’s one result code – and then, on that other page, you could serve a different result code. A 301 redirect to a 404 page is an example of that, and it happens every now and then. And from our point of view, in those chain situations where we can follow the redirect to get a final result, we will essentially just focus on that final result. If that final result has content, then that’s something we might be able to use for canonicalisation. If that final result is an error page, then it’s an error page. And that’s fine for us too.
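
One quick way to see such a chain yourself is to follow the redirects and print each status code (the URL is illustrative):

```
# -s: silent, -I: headers only, -L: follow redirects;
# prints one HTTP status line per hop in the chain
curl -sIL http://example.com/old-page | grep -i "^HTTP"
# Example output for a 301 pointing at a missing page:
#   HTTP/1.1 301 Moved Permanently
#   HTTP/1.1 404 Not Found
```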

Does using a CDN improve rankings if my site is already fast in my main country?

Q: (02:50) Does putting a website behind a CDN improve ranking? We get the majority of our traffic from a specific country. We hosted our website on a server located in that country. Do you suggest putting our entire website behind a CDN to improve page speed for users globally, or is that not required in our case?

  • (03:12) So obviously, you can do a lot of these things. I don’t think it would have a big effect on Google at all with regards to SEO. The only effect where I could imagine something might happen is what users end up seeing. And as you mentioned, if the majority of your users are already seeing a very fast website because your server is located there, then you’re kind of doing the right thing. But of course, if users in other locations are seeing a very slow result – because perhaps the connection to your country is not that great – then that’s something where you might have some opportunities to improve. And you could see that as an opportunity in the sense that, of course, if your website is really slow for other users, then they’re going to be less likely to keep coming to your website, because it’s really annoying to get there. Whereas, if your website is pretty fast for other users, then at least they have the opportunity to see a reasonably fast website, which could be your website. So from that point of view, if there’s something that you can do to improve things globally for your website, I think that’s a good idea. I don’t think it’s critical. It’s not something that matters in terms of SEO, in that Google has to see it very quickly as well, or anything like that. But it is something that you can do to grow your website past just your current country. Maybe one thing I should clarify: if Google’s crawling is really, really slow, then, of course, that can affect how much we can crawl and index from the website. So that could be an aspect to look into. In the majority of websites that I’ve looked at, I haven’t really seen this being a problem for any website that isn’t millions and millions of pages large. So from that point of view, you can double-check how fast Google is crawling in Search Console, in the crawl stats. And if that looks reasonable, even if it’s not super fast, then I wouldn’t really worry about it.

Should I disallow API requests to reduce crawling?

Q: (05:20) Our site is a live stream shopping platform. Our site currently spends about 20% of the crawl budget on the API subdomain and another 20% on image thumbnails of videos. Neither of these subdomains has content which is part of our SEO strategy. Should we disallow these subdomains from crawling, or how are the API endpoints discovered or used?

  • (05:49) So maybe the last question there first. In many cases, API endpoints end up being used by JavaScript on your website, and we will render your pages. And if they access an API that is on your website, then we’ll try to load the content from that API and use that for the rendering of the page. And depending on how your API is set up and how your JavaScript is set up, it might be hard for us to cache those API results, which means that maybe we crawl a lot of these API requests to try to get a rendered version of your pages so that we can use those for indexing. So that’s usually where these get discovered. And that’s something you can help with by making sure that the API results can be cached well – that you don’t inject any timestamps into URLs, for example, when you’re using JavaScript for the API, all of those things. If you don’t care about the content that’s returned by these API endpoints, then, of course, you can block the whole subdomain from being crawled with the robots.txt file, and that will essentially block all of those API requests from happening. So you first of all need to figure out: are these API results actually part of the primary content, or important, critical content that I want to have indexed by Google?
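
If those API responses genuinely aren’t needed for rendering the content you want indexed, the block itself is a two-liner – robots.txt applies per host, so it lives on the API subdomain (the hostname is illustrative):

```
# https://api.example.com/robots.txt
# Blocks all crawling of the API subdomain
User-agent: *
Disallow: /
```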

Q: (08:05) Is it appropriate to use a nofollow attribute on internal links to avoid unnecessary crawler requests to URLs which we don’t wish to be crawled or indexed?

  • (08:18) So obviously, you can do this. It’s something where, I think, for the most part, it makes very little sense to use nofollow on internal links. But if that’s something that you want to do, go for it. In most cases, I would try to do something like using rel=canonical to point at the URLs that you do want to have indexed, or using robots.txt for things that you really don’t want to have crawled. So try to figure out: is it more of a subtle thing, where you have something that you prefer to have indexed – then use rel=canonical for that. Or is it something where you say, actually, when Googlebot accesses these URLs, it causes problems for my server: it causes a large load, it makes everything really slow, it’s expensive, or what have you. For those cases, I would just disallow the crawling of those URLs, and try to keep it on a basic level there. And with rel=canonical, obviously, we’ll first have to crawl the page to see the rel=canonical. But over time, we will focus on the canonical that you’ve defined, and we’ll use that one primarily for crawling and indexing.
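
For the “subtle preference” case, the hint is a single element in the head of the duplicate URL (the URL is illustrative):

```
<!-- On a parameterised or duplicate URL, pointing at the preferred version -->
<link rel="canonical" href="https://example.com/products/blue-widget">
```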

Why don’t site:-query result counts match Search Console counts?

Q: (09:35) Why don’t the search results of a site: query, which returns a giant number of results, match what Search Console and the index data show for the same domain?

  • (09:55) Yeah, so this is a question that comes up every now and then. I think we’ve done a separate video on it as well, so I would check that out – we’ve been talking about this for a long time. Essentially, what happens is that there are slightly different optimisations that we do for site: queries, in that we just want to give you a number as quickly as possible, and that can be a very rough approximation. A site: query is usually not something that the average user does, so we’ll try to give you a result as quickly as possible, and sometimes that number can be off. If you want a more exact count of the URLs that are actually indexed for your website, I would definitely use Search Console. That’s really the place where we give you the numbers as directly and as clearly as possible. Those numbers will also fluctuate a little bit over time. They can fluctuate depending on the data centre sometimes. They go up and down a little bit as we crawl new things and figure out which ones we keep, all of those things. But overall, the number in Search Console – in, I think, the indexing report – is really the number of URLs that we have indexed for your website. I would not use the “about” number in the search results for any diagnostic purposes. It’s really meant as a very, very rough approximation.

What’s the difference between JavaScript and HTTP redirects?

Q: (11:25) OK, now a question about redirects again – about the differences between JavaScript redirects versus 301 HTTP status code redirects, and which one I would suggest for short links.

  • (11:43) So, in general, when it comes to redirects, if there’s a server-side redirect where you can give us a result code as quickly as possible, that is strongly preferred. The reason it’s strongly preferred is just that it can be processed immediately. Any request that goes to your server for one of those URLs – we’ll see that redirect, we’ll see the link to the new location, and we can follow that right away. Whereas, if you use JavaScript to generate a redirect, then we first have to render the JavaScript and see what the JavaScript does, and then we’ll see, oh, there’s actually a redirect here, and then we’ll go off and follow that. So if at all possible, I would recommend using a server-side redirect for any kind of redirect that you’re doing on your website. If you can’t do a server-side redirect, then sometimes you have to make do, and a JavaScript redirect will also get processed – it just takes a little bit longer. The meta refresh type of redirect is another option. It also takes a little bit longer, because we have to figure that out on the page. But server-side redirects are great. And there are different server-side redirect types: there’s 301 and 302, and there are 307 and 308, something along those lines. Essentially, the differences there are whether it’s a permanent redirect or a temporary redirect. A permanent redirect tells us that we should focus on the destination page. A temporary redirect tells us we should focus on the current page that is redirecting and keep going back to that one. And the difference between 301 and 302 on the one hand, and 307 and 308 on the other, is more of a technical difference with regards to the different request types. So if you enter a URL in your browser, you do what’s called a GET request for that URL, whereas if you submit a form or use specific types of API requests, that can be a POST request. The 301 and 302 type redirects allow the method to be changed to GET when the redirect is followed, whereas 307 and 308 preserve POST requests. So if you have an API on your website that uses POST requests, or if you have forms where you suspect someone might be submitting something to a URL that you’re redirecting, then you would use those other types. But for the most part, it’s usually 301 or 302.
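
As a sketch, both redirect types are one-liners in common server configs – here in nginx syntax with illustrative paths:

```
# Permanent redirect: signals consolidate on the destination page
location = /old-page {
    return 301 /new-page;
}

# Temporary redirect: Google keeps focusing on the original URL
location = /campaign {
    return 302 /seasonal-landing;
}
```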

Should I keep old, obsolete content on my site, or remove it?

Q: (14:25) I have a website for games. After a certain time, a game might shut down. Should we delete non-existing games or keep them in an archive? What’s the best option so that we don’t get any penalty? We want to keep information about the games – videos, screenshots, et cetera.

  • (14:42) So essentially, this is totally up to you. You can remove the content for old things if you want to. You can move it to an archive section. You can make those old pages noindex so that people can still go there when they’re visiting your website. There are lots of different variations. The main thing you’ll probably want to do, if you want to keep that content, is move it into an archive section, as you mentioned. The idea behind an archive section is that it tends to be less directly visible within your website. That makes it easy for users and for us to recognise: this is the primary content – the current games or current content that you have – and over here is an archive section where you can go in and dig for the old things. And the effect is that it’s a lot easier for us to focus on your current live content and to recognise that this archive section, which is separated out, is more something that we can go off and index, but it’s not really what you want to be found for. So that’s the main thing I would focus on. And then whether or not you make the archive content noindex after a certain time, or for other reasons, is totally up to you.

Q: (16:02) Is there any strategy by which desired pages can appear as a site link in Google Search results?

  • (16:08) So site links are the additional results that are sometimes shown below a search result, usually just a one-line link to a different part of the website. There is no meta tag or structured data that you can use to force a site link to be shown. It’s much more that our systems try to figure out what is actually related or relevant for users when they’re looking at this one web page. For that, our recommendation is essentially to have a good website structure and clear internal links, so that it’s easy for us to recognise which pages are related, and to have clear titles that we can use and show as a site link. Even then, there’s no guarantee that any of this will be shown, but it helps us figure out what is related, and if we do think it makes sense to show a site link, it’ll be a lot easier for us to actually choose one based on that information.

Our site embeds PDFs with iframes, should we OCR the text?

Q: (17:12) A more technical one here. Our website uses iframes and a script to embed PDF files onto our pages. Is there any advantage to taking the OCR text of the PDF and pasting it somewhere into the document’s HTML for SEO purposes, or will Google simply parse the PDF contents with the same weight and relevance to index the content?

  • (17:40) Yeah. So I’m just momentarily thrown off, because it sounds like you want to take the text of the PDF and kind of hide it in the HTML for SEO purposes, and that’s something I would definitely not recommend doing. If you want the content to be indexable, make it visible on the page. So that’s the first thing I would say. With regards to PDFs, we do try to take the text out of PDFs and index that for the PDFs themselves. From a practical point of view, one of the first steps is that we convert a PDF into an HTML page and try to index it like an HTML page. So essentially, what you’re doing is iframing an indirect HTML page. When it comes to iframes, we can take that content into account for indexing within the primary page, but it can also happen that we index the PDF separately anyway. So from that point of view, it’s really hard to say exactly what will happen. I would turn the question around and ask: what do you want to happen? If you want your normal web pages to be indexed with the content of the PDF file, make that content immediately visible on the HTML page. So instead of embedding the PDF as the primary piece of content, make the HTML content the primary piece and link to the PDF file. Then there’s the question of whether you want those PDFs indexed separately or not. Sometimes you do, and if so, linking to them is great. If you don’t, using robots.txt to block them from being crawled is fine, or you can use the noindex X-Robots-Tag HTTP header. That’s a little bit more complicated, because you have to serve it as a header for the PDF files if you want them available in the iframe but not actually indexed; a sketch of that follows below.
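
For that last point, here is a hedged sketch of serving a PDF with the noindex X-Robots-Tag header, so it stays viewable inside the iframe but is kept out of the index; the routes and file paths are hypothetical.

```typescript
// Sketch: serve a PDF so it remains usable in an iframe but is excluded from
// the index via the X-Robots-Tag response header (paths hypothetical).
import express from "express";
import path from "path";

const app = express();

app.get("/files/whitepaper.pdf", (_req, res) => {
  res.setHeader("X-Robots-Tag", "noindex");
  res.sendFile(path.join(__dirname, "pdfs", "whitepaper.pdf"));
});

app.listen(3000);
```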

Q: (20:02) We want to mask links to external websites to prevent passing our link juice. We think the PRG (POST-Redirect-GET) approach is a possible solution. What do you think? Is that solution overkill, or is there a simpler solution out there?

  • (20:17) So the PRG pattern is a complicated way of essentially making a POST request to the server, which then redirects somewhere else to the external content, so that Google never finds the link. From my point of view, this is super overkill. There’s absolutely no reason to do this unless there’s really a technical reason that you absolutely need to block the crawling of those URLs. I would either just link to those pages normally or use rel=nofollow on those links; see the sketch below. There’s absolutely no reason to go through this weird POST-redirect pattern. It just causes a lot of server overhead, and it makes it really hard to cache that request and take users to the right place. So I would just use a nofollow on those links if you don’t want them followed. The other thing is that blocking all of your external links rarely makes any sense in the first place. Instead, I would make sure that you’re taking part in the web as it is, which means that you link to other sites naturally and they link to you naturally, rather than trying to keep Googlebot locked into your specific website. I don’t think that really makes any sense.
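
By way of comparison, the simple alternative is just an ordinary anchor with a nofollow hint; a one-line sketch, with a hypothetical URL:

```typescript
// Sketch: a plain anchor with rel="nofollow" asks Google not to pass signals
// through the link; no POST round trip is needed (URL hypothetical).
function renderExternalLink(href: string, text: string): string {
  return `<a href="${href}" rel="nofollow">${text}</a>`;
}

console.log(renderExternalLink("https://partner.example.com/", "Partner site"));
```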

Does it matter which server platform we use, for SEO?

Q: (21:47) For Google, does it matter if our website is powered by WordPress, WooCommerce, Shopify, or any other service? A lot of marketing agencies suggest using specific platforms because it helps with SEO. Is that true?

  • (22:02) That’s absolutely not true. There is absolutely nothing in our systems, at least as far as I’m aware, that would give any kind of preferential treatment to any specific platform. With pretty much all of these platforms, you can structure your pages and your website however you want, and we will look at the website as we find it: the content you present, the way that content is presented, and the way things are linked internally. We will process it like any HTML page. As far as I know, our systems don’t even react to the underlying back-end structure of your website or do anything special with it. It might be that certain agencies have a lot of experience with one of these platforms and can help you make really good websites with it, which is perfectly legitimate and could be a good reason to choose that platform. But it’s not the case that any particular platform has an inherent advantage when it comes to SEO. You can make reasonable websites with pretty much all of these platforms, and they can all appear well in search.

Does Google crawl URLs in structured data markup?

Q: (23:24) Does Google crawl URLs located in structured data markup, or does Google just store the data?

  • (23:31) So, for the most part, when we look at HTML pages, if we see something that looks like a link, we might go off and try that URL out as well. If we find a URL in JavaScript, we can try to pick it up and use it. If we find a link in a text file on a site, we can try to crawl that and use it. But these aren’t really normal links. So if you want Google to go off and crawl that URL, I would recommend making sure there’s a natural HTML link to it, with clear anchor text that gives some information about the destination page. If you don’t want Google to crawl that specific URL, then maybe block it with robots.txt, or use a rel=canonical on that page pointing to your preferred version, anything like that. Those are the directions I would go. I would not blindly assume that just because a URL is in structured data it will not be found, nor would I blindly assume that it will be found. It might be found; it might not. I would instead focus on what you want to happen. If you want it seen as a link, make it a link. If you don’t want it crawled or indexed, block crawling or indexing. That’s all totally up to you; the sketch below makes the distinction concrete.
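
Here is a hedged sketch of that distinction: a URL that appears only inside JSON-LD structured data may or may not be fetched, while a normal HTML anchor is a clear crawl signal. All names and URLs below are hypothetical.

```typescript
// Sketch: the same URL in structured data versus a normal HTML link
// (all values hypothetical). Only the anchor is a reliable crawl signal.
const productJsonLd = {
  "@context": "https://schema.org",
  "@type": "Product",
  name: "Example Game",
  // Google may or may not pick up a URL it only sees here.
  url: "https://www.example.com/games/example-game",
};

const jsonLdTag = `<script type="application/ld+json">${JSON.stringify(productJsonLd)}</script>`;

// A plain anchor with descriptive anchor text is the clear way to get a URL crawled.
const htmlLink = `<a href="https://www.example.com/games/example-game">Example Game</a>`;

console.log(jsonLdTag, htmlLink);
```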

Sign up for our Webmaster Hangouts today!

GET IN CONTACT TODAY AND LET OUR TEAM OF ECOMMERCE SPECIALISTS SET YOU ON THE ROAD TO ACHIEVING ELITE DIGITAL EXPERIENCES AND GROWTH 

Are You Ready for Google Analytics 4?

With all the changes in the digital marketing landscape over the past decade, a more sophisticated way to collect and organise user data was much needed. In the fall of 2020, Google introduced an updated analytics platform called Google Analytics 4 (GA4), a version that, so far, has worked in parallel with its predecessor, Universal Analytics (UA). However, Google recently announced that UA will sunset on July 1, 2023, and that its premium version, 360 Universal Analytics, will stop processing data on October 1, 2023. The premium features of 360 Universal Analytics will be rolled into GA4 as well.

Getting used to new software takes time, especially in this case, considering that Google Analytics 4 presents an entirely different interface and configuration from UA. This is almost certainly why Google made the announcement so far in advance: to give businesses still using UA time to migrate and get used to the latest version. It is also worth noting that GA4 does not import any of the historical data you’ve tracked in Universal Analytics, which is another good reason to start migrating now; the sooner GA4 is collecting data, the longer the continuous history you’ll have for reporting.

Some of the main features of Google Analytics 4

Event-based data model

Probably the most significant change in the software is the event-based model that Google Analytics 4 introduces, which offers flexibility in the way data is collected while also providing a new set of reports built on that model.

With Universal Analytics, businesses relied on “sessions”, a more fragmented model that grouped hits into time-bound containers and only worked with specific, predefined information categories, making custom data much more challenging to capture. Now that everything can be an event, there is a broader opportunity to understand and compare client behaviour through custom data across various platforms; a small sketch follows below.
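
As a small illustration of the event model, a single interaction can be sent with its own parameters via gtag.js. This sketch assumes the standard GA4 tag is already installed on the page; the event name and parameter values are examples only.

```typescript
// Sketch: in GA4, every interaction is an event with its own parameters.
// gtag() is provided globally by the GA4 tag snippet; declared here for TypeScript.
declare function gtag(...args: unknown[]): void;

// A standard eCommerce-style event (names and values are examples only).
gtag("event", "add_to_wishlist", {
  currency: "AUD",
  value: 129.95,
  items: [{ item_id: "SKU_123", item_name: "Trail Runner Shoes" }],
});
```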

Operation across platforms

Before GA4, businesses needed different tools to analyse website and app data separately, which made it difficult to get a global picture of their user traffic. GA4 adds a new kind of property that merges app and web data for reporting and analysis.

Thanks to this update, if you have users coming to you through different platforms, you can now use a single set of data to know which marketing channels are acquiring more visitors and conversions.

No need to rely on cookies

As mentioned at the beginning of this article, a lot has changed in the last decade regarding digital marketing; this includes an ever-growing emphasis on user privacy.

Big tech companies, such as Apple, have started to adopt privacy-first policies, which is why Safari began blocking all third-party cookies in 2020. So it comes as no surprise that Google has also announced that it will do the same in Chrome in 2023.

With GA4, Google is moving away from a cookie-dependent model, no longer needing to store IP addresses, and aiming to be more compliant with today’s international privacy standards.

Audience Triggers

This is a useful feature that lets brands set conditions for a user to move from one audience group to another (for example, once they’ve bought from a specific product category). You can then personalise the ad experience, offering complementary or similar products across display, video and discovery placements to bring them back to shop with you again.

More sophisticated insights

GA4 promises a more modern way of collecting and organising data, but the most important thing for businesses is how to use that data. Machine learning has been applied in Google Analytics 4 to generate predictive insights about user behaviour and conversions, such as purchase and churn probability, which are pivotal to improving your marketing.

Integrations

GA4 brings deeper integrations with other Google products, such as Google Ads, allowing you to optimise marketing campaigns by using your data to build custom audiences that are more relevant to your marketing objectives, and to use Google Optimize for A/B testing.

In summary, Google Analytics 4 combines features designed to understand client behaviour in more detail than Universal Analytics previously allowed, whilst prioritising user privacy. It also brings a very friendly interface, with drag-and-drop functionality to help build reports, reminiscent of Adobe Analytics Workspace.

[Screenshots: Adobe Analytics Workspace and the GA4 drag-and-drop report builder]

You can chat with the team at LION Digital, and we can help you get set up on GA4.

We had a good chat with a colleague at our first Shopify Plus Partner meetup who was developing a Shopify Plus site for their client. They noted that the GA4 setup they had to do was quite complex and time-consuming, as event tracking, including eCommerce tracking, needed to be configured, and Data Studio reports needed to be rebuilt. It took him a good three hours, and he was keen not to repeat the exercise. Thankfully, we’ve got a bunch of skilled specialists who can help you set up GA4, connect it to our Digital ROI Dashboard to give you the insights you need, and look at your Channel Action Plans.

GET IN CONTACT TODAY AND LET OUR TEAM OF ECOMMERCE SPECIALISTS SET YOU ON THE ROAD TO ACHIEVING ELITE DIGITAL EXPERIENCES AND GROWTH

Contact Us

Article by

Dimas Ibarra –
Digital Marketing Executive