SEO & web developers: Why we need to talk about audits

Do devs listen to SEOs’ recommendations?

(1:13) MARTIN: So, Bartosz, there is a thing that I want to talk to you about, and that is that I do hear, and I do observe that SEOs are often struggling actually to get developers to do stuff. Is that an experience that you share? It’s like you make recommendations, you give them input, and then it just doesn’t happen? Or is that not something that you would say is a particular problem?

BARTOSZ: I think that you might have touched on a very sensitive topic in the industry, and I know where you’re coming from with that. So, in general, this is a problem. Some agencies solved this problem and kind of went past it. We cannot complain at all about our relationship with devs. At the same time, there are so many things about the way web development and SEO are done that don’t help.

Why we shouldn’t throw PDF reports over the fence

(2:10) BARTOSZ: For example, PDF audits are one of the things, just to name, I think, the elephant in the room. You’re going to create a PDF audit explaining how to fix Core Web Vitals without knowing their stack, their technology, their resources. How many devs are there in the team? Is it a Shopify-based platform, or is it a custom CMS? In our experience, when you create a PDF audit and the dev runs into a problem, they’ll skip it, because there’s no fallback for what has to be done. So this can be unpacked in so many ways.

MARTIN: I know exactly what you mean. I come from a developer background, and I have worked with SEOs on both sides: as a Google developer advocate, basically helping SEOs to do the right things and identify and solve problems, and also from the perspective of the developer working in the team. And the thing with the PDF report really struck a chord with me. Because I remember being a developer. I had so many different things on my plate already. And then, out of the blue, in the middle of a sprint, someone from the SEO department descended upon me and said, “Martin, here is a PDF with all the things that are wrong. Bye.” And then they just flew off. And I’m like, uh, OK. It says we are using JavaScript, which is very accurate, because what we are actually building right now is a VR application that runs on your phone in the browser. You have to use JavaScript to do that. And the recommendation is to not use JavaScript? That’s not really a thing we can do, because then we don’t have VR capabilities. And that’s our product, so we kind of have to build it with the technologies that enable its features. So a lot of these things are so unhelpful and so not reflective of the circumstances in which I, as a developer, work.

Why do SEOs advise against JavaScript?

(4:21) BARTOSZ: So you work for Google, so I can ask you. Before we get into how to solve this problem, let me ask you a question. Is Google OK with JavaScript?

MARTIN: Yeah. We are OK with JavaScript.

BARTOSZ: So if I have a news website that’s 100% CSR, you’re going to be OK with that?

MARTIN: We’re going to be OK with it. It might sometimes take a little longer than you would like, because we have to actually render everything. And if your rendering is specifically badly designed, that might take us a while. But in general, if you are doing things right, following the best practices and making sure that you test your setup, we would be fine with that, yeah.

BARTOSZ: So I don’t want to argue with that statement. Obviously, this is not that kind of video. What I’m trying to say is there are so many complexities on all these fronts. We have clients coming in with a news website that’s 100% JavaScript. And there is this kind of demon in the industry, where all the SEOs say that JavaScript is evil, that JavaScript is so bad. On the other side, there is Google saying we’re OK with JavaScript. And then there is the reality, where there are a lot of websites packed with JavaScript, and Google is not picking them up properly. They’re not getting rendered, for all these other problems. So then we hop on a call with our client’s dev team, and they’re like, but Martin said that JavaScript is OK. Why do you want us to do server-side rendering? How do you answer that? You know what I mean. This makes us look like SEO wizards. These are the problems that require a lot of knowledge. We don’t struggle with them as much anymore, because we have so much research, we have our experiments, whatever. But if you think about getting a new hire to a level where he or she can handle those questions, it is going to take years. And for us, it slows down growth. At the same time, asking all of the SEO agencies out there to be as advanced as we are, when we only do technical SEO and specialize in JavaScript SEO, is difficult. So this is a crack, and I don’t have any solutions, and I’m not blaming Google or anything like that. I’m just saying that this is a change that’s happening, but it requires time. So there are some, maybe more than some, moments when it basically requires goodwill from both ends. If devs want to understand it, and we want to explain it, we’ll make it happen.
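The CSR-versus-SSR tension they keep circling back to can be sketched in a few lines. This is a deliberately minimal illustration, not production code: the markup and article data are made up, and real frameworks handle this with far more machinery.

```typescript
// Illustrative sketch of the CSR vs. SSR difference discussed above.
interface Article {
  title: string;
  body: string;
}

// Client-side rendering: the server ships an empty shell. The content
// only exists after JavaScript runs, so a crawler has to render the
// page before it can see any text.
function csrShell(): string {
  return '<html><body><div id="app"></div><script src="/bundle.js"></script></body></html>';
}

// Server-side rendering: the content is already in the initial HTML
// response, visible without executing any JavaScript.
function ssrPage(article: Article): string {
  return `<html><body><article><h1>${article.title}</h1><p>${article.body}</p></article></body></html>`;
}

const article: Article = { title: "Breaking news", body: "Full story text." };
console.log(csrShell().includes("Breaking news")); // false: the shell carries no content
console.log(ssrPage(article).includes("Breaking news")); // true: content is in the HTML itself
```

This is why "Google is OK with JavaScript" and "we recommend server-side rendering" can both be true: the CSR shell is indexable, but only after an extra rendering step that the SSR response never needs.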

MARTIN: I think there are lots of touchpoints where you can actually create this understanding and this cooperation. Because, as you said, there are lots of complexities and lots of background and lots of considerations. This is also tricky for us Googlers, because if you’re asking me a question like “is JavaScript OK?”, then, in general, yes. Is it the best idea? No, not necessarily. If you can do it without JavaScript, do it without JavaScript. Server-side rendering is a recommendation that we give out as well. We need to make that more prominent in our docs; I’ve taken that point. But I really like the point where you said, oh, SEOs have this challenge that when a new SEO joins your team, they need to spend so much time actually getting the knowledge they need to work. Developers have the exact same challenge. Because the entire industry, the entire ecosystem, just keeps moving and keeps changing. So someone who becomes a developer today sees everyone else working with so many tools and so many things, and there is a tendency to skip the understanding. Most developers who have been around a couple of years started with some tool, learned the things that this tool does well and the things that it does not do so well. And then they might be like, oh, you’re building a news website. I think in that case, with the interactivity that you described to me, we might be able to do this better with server-side rendering. Whereas if you want a highly engaging social network, you might actually want to use client-side rendering for all the interactivity that is embedded, and that is not necessarily impairing your performance in search. So they learn these tools, and they learn the trade-offs, and then they make better decisions as they grow. But then people come in and might skip the entire learning process and go, oh, everyone uses this framework, so I’ll build everything in this framework. Because if everyone uses it, it must be fantastic, without understanding the decision-making process behind it. And I think the problem is then exacerbated when an SEO who cargo-cults recommendations that they read or heard somewhere, without understanding the background and the complexities they encompass, talks to a developer who does the same thing. Then there must be a clash. Because now they are running into territory where they think they know what they are doing when they actually don’t. Would you say that might be a challenge that we are seeing playing out?

Complexities and differences between technical SEO and content marketing

(10:25) BARTOSZ: So let me unpack that one by one; there are a few statements within that. For example, the way you described the news website: we have known each other for a few years, and I know you’re not going to take this the wrong way, but most of our clients wouldn’t understand what you said. If you’re talking to the key stakeholders, maybe not the CEO, but the CMO, someone who’s making that decision, you will have this conversation. This is one of the mistakes we were making back in the day: you would hop on a call with five people from our client’s company, we would start talking to the devs, and we would lose everyone else. So simplifying that as much as possible just has to happen, so that everyone is included in the conversation. But secondly, what you mentioned about dev teams being so dynamic: this is also what SEO looks like, even if maybe some SEO agencies haven’t realized it. So someone comes to us with a question like, can you do JavaScript SEO, web performance, and a little bit of content marketing? It is extremely difficult to pack all of those things into one agency and do them all well. So I think we slowly need to normalize using a technical SEO agency for one thing and a content marketing agency for something else, and just branching out so that everyone understands their goals. Then onboarding that one person is easier, because that junior SEO only has to learn, like, JavaScript SEO, web performance, crawl budget, and understand those technical aspects. At the same time, some agencies want those people to also do link-building and all these other aspects. So just like with devs, it got so complex. Sometimes I’m looking at job offers for devs, and I’m like, what does this even mean?

MARTIN: Basically, one job offer is an entire IT department. I love it.

BARTOSZ: Basically, you need to divide organizations into those that are aware in the web space and those that are not. Some are aware of how SEO works, of what the difference between CSR and SSR is, and that includes a lot of high-level people in some organizations. For example, Germany is pushing a lot of people in management positions to know a lot about development, which I love. Talking to companies from Germany, most of the time they’re just so aware of that. Some other companies would come to you and say, so if we’re going to fix this problem with rendering or with web performance, how much traffic can we expect, and when? And that’s the main topic of discussion. This is something I have to answer daily, two or three times a day. There are so many ways to answer this question; I never do it the same way. But anyhow, this shows me that maybe they need a little bit more help understanding what has to be done and why. Sometimes it’s just beyond our scope of work, let’s call it that, and we cannot push them.

Web performance metrics and reaching stakeholders’ needs and wants

(13:35) BARTOSZ: So now that we have that, let’s assume we have someone that’s aware of, or willing to learn about, why we’re doing this. That “why” is kind of important here. Because if they only do it for traffic, and that’s the only KPI they look at, it’s sometimes very difficult not to skip a lot of important metrics. If you look at it that way, you can have a ton of traffic and, in theory, still have a terrible website.

MARTIN: So are you saying that sometimes you literally have to rethink an organization’s KPIs?

BARTOSZ: Yeah, very often. Just to give you an example: we have a call with a massive company, and they’re asking us, what do we have to do to rank for the term “houses”? Just this question lets you unpack so many problems across the whole organization. We usually don’t want to offend anyone; you don’t want to get their ego involved. But at the same time, you want to explain it. So that’s one part. Let’s assume we have the stakeholders sold on the idea of what we want to do, and they understand it. That’s amazing. That’s usually when things start to go well.

MARTIN: That is great. Because that also unlocks the possibility to basically have them on board with whatever the dev team will be doing about it. These things have basically been invisible to me. As a developer, I just noticed that the key stakeholders in the company who have an influence on the dev team told me to do one thing, and then SEOs or marketing told me to do another thing. And then usually I picked the organization’s goals, because that’s what I was measured by. So you are saying that by bringing in the key stakeholders, making sure that they understand what they need to look for, and adjusting their KPIs, you unlock the key to actually getting the development team on board with what you are trying to accomplish?

BARTOSZ: Yes and no. Usually, as a technical SEO agency, we don’t struggle that much to get devs on board. That is not that big of a deal for us. The problem is for the stakeholders to understand what we want to do with them. Because sometimes it’s like: this is the technical SEO agency, and this is our dev team; let them have fun with this project. Almost literally, that’s how it might look in some cases. And that is usually a problem. But if we know, OK, we want to get here, and this is the umbrella term: we want to have amazing web performance, we want to get rendered and indexed quickly, whatever, and they know that, then I can’t even imagine how this could go wrong, because the whole organization is growing in one direction. This is our Holy Grail, and it happens very, very often.

MARTIN: How do you get that to happen? Because I have been in so many organizations where that did not happen.

BARTOSZ: This is a very simple answer: we did it wrong so many times. We tried for years. When we started back in, I think, 2013 or ’14, my team and I wanted to focus only on the technical aspects. People would make fun of us. They would say, OK, there are white hat SEOs, and there are people who have traffic, and all of these kinds of amazing jokes going our way. I could create a stand-up show for you with just the feedback we would get from the SEO community back in the day for basically moving to the technical side of things only.

MARTIN: Bonus episode right there.

Meeting with stakeholders, finding problems, and SEOs listening to devs to find solutions

(17:22) BARTOSZ: That was very weird for a second. Usually, we start with stakeholders. I’m going to condense this really quickly. We hop on a call with the stakeholders before creating any offers or anything, and we talk about: what’s the KPI? What’s the problem? What are the challenges? Why are we even doing this? Why is it so important? Why do you want to fix it? Because if traffic is the only metric, we will still work with them, but we know how that might go. So we start with that. Then, after the call, we look into their website and create a statement of work. We tell them, OK, this is what we’re going to do. This is the list of problems we’re seeing with your website, and this is how we want to fix and prioritize them. So in the first month, we solve the most terrible aspects, like 404s or, I don’t know, 10 seconds to load a page, whatever. And with that, it’s extremely transparent, because we tell them, OK, this project is going to take four months. And, spoiler alert, we’re going to hop into a PM tool, like Jira or Trello, with your dev team, and we’re going to make it happen.

MARTIN: OK. So you meet the dev team where they are anyway.

BARTOSZ: And we adjust to whatever solution they go with. So if they work in sprints, we try to join that. We had to kind of morph into a team that joins them without any interruptions. This is the only way. We are aware that, in a medium or large company, the dev team is seriously the most important part of the business. So then we tell them, OK, this has to be fixed. But we have to understand their tech stack. We have to understand all these boring aspects (boring to them; we loved them). But usually, during that call with stakeholders, they don’t really want to talk about it, or they don’t know. Very often, stakeholders have no idea what kind of tech stack they are running.

MARTIN: To be fair, they shouldn’t have to, right? Unless it’s, like, the CTO, I don’t think the CEO needs to know which tech stack they’re on, as long as they know what their core business is, how it works, and what the vision and mission of the organization are. I think that’s exactly what you have a development team for: to define these things based on requirements coming from elsewhere.

BARTOSZ: That’s one more conversation that we have to schedule. But let’s move forward. Then we hop into Jira, Trello, whatever. We give them tasks, and they come back to us: we cannot really do that, we have a custom solution around this one that doesn’t allow it, whatever. Again, we have a team that’s extremely technical; this is something we have been building for a few years. And they hop on a call. Sometimes, when the devs really don’t get it, and this is an edge case, we write a snippet of code just to show them how to optimize the CSS or whatever. But most of the time, we just go and talk to them. Devs, in my opinion, are very hardworking people. And they would tell us: we cannot do this, we have so many limitations. And we try to work within those limitations. If we hit a wall, we go back to the stakeholders and say, maybe we could try this, and devs appreciate that. Because for once it’s not only them coming to stakeholders for budget, for more resources, whatever. We’re going to come in and say, OK, guys, this is going to be difficult with so many places where you cut corners.
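As a sketch of the kind of CSS-optimization snippet Bartosz describes handing to a dev team: inline the critical styles so the first paint doesn’t block on a separate stylesheet request. The file name and CSS rule here are made up for illustration; this is one common pattern, not a universal prescription.

```typescript
// Inline critical CSS into an HTML document, assuming the markup
// contains a closing </head> tag. Illustrative only.
function inlineCriticalCss(html: string, criticalCss: string): string {
  // Inline the above-the-fold rules, then load the full stylesheet
  // without render-blocking: it starts as media="print" and is
  // switched to "all" once it finishes loading.
  const injected =
    `<style>${criticalCss}</style>` +
    `<link rel="stylesheet" href="/full.css" media="print" onload="this.media='all'">`;
  return html.replace("</head>", `${injected}</head>`);
}

const page = "<html><head><title>Shop</title></head><body></body></html>";
console.log(inlineCriticalCss(page, "h1{font-size:2rem}"));
```

The media="print" swap is a widely used trick for deferring non-critical CSS; whether it is appropriate depends on the client’s stack, which is exactly why such snippets are a conversation starter rather than a drop-in fix.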

MARTIN: So you would say that you would also have to somehow support developers to get the right resources and to get the right environment to work in sometimes? That’s interesting.

BARTOSZ: Sometimes. Sometimes. This is going to sound funny. But sometimes we have clients who have 50 devs, and not in a country where they would be cheaper, but 50 devs in a very high-earning city somewhere in the world. And the stakeholders would listen to us rather than to them, because they’re like, oh, we’re paying your invoice, we want to get the most bang for our buck. Because of that, and they would tell us this openly, they really want to move the project forward, and that’s why they would change things around in the dev team. And I guess you, of all people, know this. Sometimes when you work in a team, you come to your manager and you say, OK, this is a problem, every day. They won’t listen to you until someone from the outside comes in and says, like, dude.

MARTIN: I’ve been there. I’ve done that. I’ve been on both sides of this. I’ve been the consultant that came in and basically just sat down with the development team, listened to them for a day, and then presented what I heard to the stakeholders. And they’re like, oh, these are really valuable insights. And I always thought, I’m billing you for this, but you could have just listened to your developers.

BARTOSZ: Exactly. And I guess every single dev listening to or watching this video series right now is going to have a story of this kind.

MARTIN: I’ve been on the developer side of that, too. It was like, hey, we need to do this. Oh, I don’t think so. And then I was like, OK. And then the consultant came in and said the same thing.

BARTOSZ: And also, just to elaborate on this story: usually once a quarter we have someone reaching out to us, usually dev teams we know, saying, “Partners, we need an audit around Core Web Vitals. But don’t go too deep.” They just want the stakeholders to hear from someone else what they already told them. And they’re willing to spend the budget just to have a backup document that says: we need to fix this.

Always ask questions

(23:36) MARTIN: But that is so smart. I really like that. Because so many developers are like, ugh, I don’t want to work with these people, because they just tell the stakeholders what we told them already, and they charge for it. Why don’t developers more often leverage these external voices, like they do in your case, where it’s like: Bartosz, I need this audit. I know the result of the audit, but please tell them what we already told them, because they don’t listen. That is smart. I like that.

BARTOSZ: If there’s any SEO agency listening to this, or any SEO frustrated with dev teams: 100% honest, we don’t struggle with devs. Talk to them openly, speak their language. You might have to get a little bit technical, or maybe just have one or two people on your team who can get the message across.

MARTIN: Oh, and just to add on to your last point, and this goes to all the SEOs out there listening who are struggling with developers: lose your ego. It is not a problem to ask us developers questions. If you think that you need to know everything: no, you don’t. You’re not a developer. It’s OK. Developers don’t know about SEO; you don’t have to know everything about development. So if you don’t understand why they can’t do what you ask them to do, remember that developers are intrinsically motivated to solve problems. That’s what they love. That’s what they want to do. So if you give them something to solve, they’ll be excited to solve it. And then they hit the limits of what they can do in the tech stack and the environment they are in, and they tell you, I can’t do that because of XYZ. If you don’t understand XYZ, that is OK. Ask clarifying questions until you get there. Because, and I think, Bartosz, you said that very nicely, you may have to simplify the message that comes from the developers so that other people in the organization who are not developers understand why it doesn’t work.

BARTOSZ: Let me just build on that, and this is something that all the SEOs are going to love. We had this vision years back that we had to learn all of the frameworks, that we had to know front end, back end, and whatever. It was so stressful. I was trying to know it all. But this is something to leave to the devs. What we have to do as technical SEOs is have an in-depth, massive understanding of how rendering works, how a browser works, how Google renders and what they render, the rules around that. Chrome DevTools has to be your go-to place. You need to understand what’s happening, and once you understand that, you add a little bit of documentation from different frameworks, from different technologies. But don’t learn how to write all of this code. Obviously, basics are OK, but basics. This is what they pay you to do; they don’t want you to know what the devs know. And this is complex enough: understanding how a browser works step by step is something you can do for the rest of your life and never have enough of learning. Now, we were talking about ego from the SEO standpoint. If you own the company, or if you’re on a dev team, go through a lot of agencies. Talk to them. Hop on calls with them. Ask them questions. See if you understand what they’re trying to do. If you’re talking to an agency that tells you nothing about the scope of work, what they’re going to do, and how they’re going to do it, that should raise a red flag. If you take your car to have it fixed, and they won’t tell you what they did, I would be worried about driving that car. So go through that. Talk to as many people as you can. As soon as you feel the vibe of “OK, these guys understand what we want to accomplish,” get the conversation going.

MARTIN: And the same is absolutely true the other way around. Developers really don’t like it when someone comes in like, hey, you need to do this, and then, when they ask why, they don’t get a proper answer. You can do the same thing with developers. If you say, hey, I want you to implement the canonicals, and they say, we can’t do that, then ask them why. If they say their solution doesn’t allow it, ask: why does it not allow it? Does it not allow you to add anything to the head? Oh, it does, but not the canonicals. Why not the canonicals? It might be just a knowledge gap on their side, or it might be an actual hard technical limitation of the environment, the stack, and the platform they’re working with. But they need to be able to explain this to you in simple terms. If they say, oh, it’s algorithmically impossible, that’s just a developer’s way of saying, bugger off. Ask questions. Don’t think, “oh, they said something that I don’t understand, so I must stop questioning here.” No. If they can’t express it in simple terms so that you understand what they mean, they haven’t done the work themselves either. So hold them accountable, but be ready to be held accountable, too.
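For context on why "implement the canonicals" is usually a small ask: it typically amounts to emitting one extra tag in the page head. The helper below is hypothetical, not any specific CMS API, and the names and URL are invented for illustration.

```typescript
// Hypothetical template helper: render a <head> section with an
// optional canonical link tag. Not any specific CMS or framework API.
function renderHead(title: string, canonicalUrl?: string): string {
  // A canonical tag is a single <link> element pointing at the
  // preferred URL for this content.
  const canonical = canonicalUrl
    ? `<link rel="canonical" href="${canonicalUrl}">`
    : "";
  return `<head><title>${title}</title>${canonical}</head>`;
}

console.log(renderHead("Running shoes", "https://example.com/shoes"));
// → <head><title>Running shoes</title><link rel="canonical" href="https://example.com/shoes"></head>
```

Knowing that the end result is this small makes the "why" questions above concrete: if the platform can write a title into the head, the useful question is what specifically stops it from writing one more link element.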

BARTOSZ: Yeah. Just wanted to hop on that. Very, very often, we don’t know the stack. We have five calls per day with five different stacks, some we have never heard about. So if we don’t know something, we’re very open about it. We had a WAMP PWA recently, and I had never heard of WAMP. We just went and read the documentation, came back, and scheduled a call. Now we can talk in the same language. Again, this is an ego thing. Don’t assume that you need to know everything. I’m very open about the things I don’t know; there are tons of them.

MARTIN: I mean, even I was asked questions recently at a Core Web Vitals session that I didn’t know the answer to. I won’t give you a random non-answer or try to muddle my way through. I just say, I don’t know, but I can check.

BARTOSZ: Yeah. And one thing that I feel is a deal-breaker between devs and SEOs a lot of the time, and back in the day we were guilty of it as well: devs will ask you a question like, why shouldn’t we just point canonicals to the page that’s most important, so we sculpt, we push the most important page with different canonicals, because this is just like a link? Both SEOs and devs have all these ideas on how to cheat the search engine. As an SEO, you have to explain that step by step. But if we’re talking about SEOs and devs, we need to leave all of the conspiracy theories and urban legends behind. Just fall back on documentation, and that’s it. Because as soon as you open the door to “if we point some canonicals here, or if we do this, or if we do that, some things might happen; we heard about it; we tested that,” this is going to put you in the shoes of those snake oil salesmen. So be technical. If you’re talking to devs, be technical. Drop all of those stories. Even if you deeply believe this is the case, I think it opens the door to what we as SEOs want to run away from.

MARTIN: Another thing: if you are not very technical, that is perfectly OK. But then don’t try to find solutions, because you are in territory where you are not necessarily experienced. If you are not technical, just present the problem, and then work with the development team to solve it. Don’t try to come up with a solution, because it’s likely that your solution will not work in the tech stack or the environment that your development team works in. And if you get attached to that solution, you’re like, but why can’t we just do it like this? It might not get through to the development team the right way, and it might feel like you are just obsessed with something rather than actually trying to solve the problem. And solving the problem is what development teams need to do. So you want to be on their side, and it’s OK to go down the road together: to research options, to experiment with things together. But trying to come up with a turnkey solution for development teams usually backfires.

The start of technical SEO and web developers working together

(32:30) BARTOSZ: Just one thing that I hope is going to clear the air a little bit. Technical SEO is fairly new; it’s maybe three, four, or five years old. Obviously, some will argue with that, because “I was doing technical SEO in 1993.” But it’s fairly new in the sense that it is only now getting so popular, so needed, almost essential. So this is a brand new field. And if you look at it that way and drop all of the history, this gets really exciting. Because I would assume that devs and SEOs will only get closer over the next few years, because we’re all seeing the need for technical SEO.

MARTIN: I think that does clear the air. And for all of those on the SEO side who are scared and confused now: don’t be. You get to choose if you want to get into technical SEO or not. Technical SEO is a field of its own. It is complex. It is big. It is broad. It is new. It is fresh. But content is still an important field, and all the other aspects of SEO will not become irrelevant. SEO will continue to be a broad field. You get to pick your niche. But if you want to go technical, do it right. I think that’s a very, very nice way of looking at it.

BARTOSZ: And go technical.

MARTIN: Go technical. Awesome. Thank you so much, Bartosz, for joining me in this conversation. I think this was really interesting and insightful. I hope to see more from you, and I’m also looking forward to hearing what the community is saying about these things. Stay safe, stay healthy, everybody.

Sign up for SEO & Web Developers today!

GET IN CONTACT TODAY AND LET OUR TEAM OF ECOMMERCE SPECIALISTS SET YOU ON THE ROAD TO ACHIEVING ELITE DIGITAL EXPERIENCES AND GROWTH 

What Happens When you Stop Doing SEO?

The 2020s are shaping up to be quite the outlier from decades past. In the last two years, we saw significant growth and focus on digital channels off the back of at-home pandemic buying; now with consumer confidence dipping, we’re seeing growth slow and marketers are increasingly focused on the effectiveness of their channel mix and may be considering where they can consolidate or reduce spend.

Business owners and marketers alike frequently wonder where SEO should sit – is it eCommerce, marketing or IT? Should it be a sustained marketing cost once we are happy with our visibility for core search terms? When is SEO’s job done?

SEO can sit anywhere in an organisation, but it makes the most sense for it to sit close to content, technical implementation and website changes, and new product launches. SEO is the art of being the least imperfect player in the search results, so there is always work to do. Following core category content optimisation and technical audit fixes, research can uncover opportunities to develop content earlier in the journey, capturing more users for paid search and email audiences and nurturing them into customers down the line. In this way, the SEO job is never done, and its absence, in either activity or advisory, can see good growth come undone.

HERE ARE A FEW WATCH OUTS WORTH CONSIDERING IF SEO IS NEGLECTED:

  1. You may lose keyword growth momentum – Google values freshness, and algorithm updates happen all the time. If you’re not pruning and cultivating a healthy website and fresh content, you may fall out of favour and see rankings you worked hard on decline.
  2. Competitors can outperform your website by continuing optimisation work – as we said before, SEO is about being the least imperfect, so if you’re not investing time and effort, you can expect competitors who are to overtake you.
  3. You may take a significant hit to your organic revenue – if you lose crucial Page 1 keywords to a competitor, their brand may be chosen over yours, and this can affect your bottom line, as organic search commonly generates 35-60% of a company’s revenue.
  4. All websites aren’t created equally and neither are their budgets – unlike paid search, it’s difficult to gauge how much your competitors are investing in SEO. Content and link velocity, alongside internal team growth, are a good way to compare yourself to your competitors. A good SEO partner should be able to provide you with this view and help you outsmart your competitors where you can’t outspend them.

Get in touch for an obligation-free discussion with our growth strategists to find out how we can make your company take the LION’s share of the market online. We have achieved great results in the form of visibility, visitation and revenue growth that you can find on the case studies section of our website. 


Contact Us

Article by

Leonidas Comino – Founder & CEO

Leo is a Deloitte award-winning and Forbes-published digital business builder with over a decade of success in the industry, working with market-leading brands.

Like what we do? Come work with us

Shopify Announces Launch of YouTube Shopping

Shopify announced the launch of YouTube Shopping this week, outlining benefits including:

  • Customers can buy products right when they discover them
  • Instantly sync products to YouTube
  • Create authentic live shopping experiences
  • Sell on YouTube, manage on Shopify

What does this mean for our clients?

There are some eligibility restrictions for this product at the moment. You must already have 1000 subscribers to your YouTube channel and at least 4000 hours of annual watch time. This means as a brand, you will need to have an already well-established YouTube channel or look to start working with content creators who do.

Consider content creators who align with your brand or category and research their channels and content. There are specific websites and agencies that can help source content creators for a fee, including theright.fit and hypeauditor.com

YOUTUBE FOR ACTION WITH PRODUCT FEEDS

For clients who don’t meet the eligibility requirements but still want to explore video for retail, there is another option. YouTube for action campaigns allow us to promote videos on the YouTube network and attach a product feed through Merchant Centre, creating a virtual shopfront for the viewer with easy “shop now” functionality.

This powerful format allows brands to generate both awareness and engagement with their brand, whilst also driving bottom line sales. This can be managed through your Google Ads account allowing you to optimise towards the same conversions and use the same audience signals as your other Google campaigns.

What is YouTube for Action?

Previously named TrueView for Action, this product allows users to buy video ads on the YouTube network which are optimised towards a performance goal rather than pure reach or video views.

You can optimise towards:

  • Website traffic
  • Leads
  • Sales/Purchases

And you have the option to choose your bid strategy based on:

  • Cost per View
  • Cost per Action
  • Maximise Conversions
  • Cost per thousand impressions

Who can I target?

YouTube and Google’s shared data provide a wealth of information to help us build audience segments that will fit your brand and services. The options include but are not limited to:

  • Demographic targeting: Age, gender, location –  based on signed-in user data
  • Affinity audiences: Pre-defined interest/hobby and behavioural data based on users browsing history
  • In-market audiences: Users deemed to be “in-market” for a product or service based on their searching behaviour and browsing history
  • Life-Events: Based on what a user is actively researching or planning, e.g. graduation, retirement etc
  • Topics: Align your ads with video content on similar themes across the YouTube network
  • Placement: Align your content to specific YouTube channels, specific websites, or content on channels/websites.
  • Keyword: Similar to search, build portfolios of keywords to target specific themes on YouTube

The team at LION will work with you to select and define the right audiences to test and optimise to get the best results.

What content should I use?

Like any piece of content, there is no right or wrong answer, and what works for some brands may not for others. Your video should align with your brand tone of voice and guidelines. 

Think about what action you want the users to take and ensure the video aligns with this, e.g. if you want users to buy a specific product, show the product in the video and talk about its benefits. Testing multiple types of video content is the best way to learn about what your potential customers like and do not like.

What do I need to get started?

  1. At least one video uploaded to YouTube (we recommend 30 seconds in length)
  2. A Google Merchant Centre account & Google Ads account
  3. A testing budget of at least $1,000

YOU CAN CHAT WITH THE TEAM AT LION DIGITAL AND WE CAN HELP YOU TO SELECT AND DEFINE THE RIGHT AUDIENCES TO TEST AND OPTIMISE TO GET THE BEST RESULTS

LION stands for Leaders In Our Niche. We pride ourselves on being true specialists in each eCommerce marketing channel. LION Digital has a team of certified experts, led by a head of department with 10+ years of experience in eCommerce and SEM. We follow an ROI-focused approach in paid search, backed by seamless coordination and detailed reporting, helping our clients meet their goals.


WEBMASTER HANGOUT – LIVE FROM JULY 01, 2022

Which number is correct, Page Speed Insights or Search Console?

Q: (00:30) Starting off, I have one topic that has come up repeatedly recently, and I thought I would try to answer it in the form of a question while we’re at it here. So, first of all, when I check my page speed insight score on my website, I see a simple number. Why doesn’t this match what I see in Search Console and the Core Web Vitals report? Which one of these numbers is correct?

  • (01:02) I think maybe, first of all, to get the obvious answer out of the door, there is no correct number when it comes to speed, or to understanding how your website is performing for your users. In PageSpeed Insights, by default, I believe we show a single number that is a score from 0 to 100, something like that, which is based on a number of assumptions where we assume that different things are a little bit faster or slower for users. And based on that, we calculate a score. In Search Console, we have the Core Web Vitals information based on three numbers: loading, interactivity, and visual stability. And these numbers are slightly different because it’s three numbers, not just one. But, also, there’s a big difference in the way these numbers are determined. Namely, there’s a difference between so-called field data and lab data. Field data is what users see when they go to your website. And this is what we use in Google Search Console. That’s what we use for Search, as well, whereas lab data is a theoretical view of your website, where our systems have certain assumptions where they think, well, the average user is probably like this, using this kind of device, and with this kind of a connection, perhaps. And based on those assumptions, we will estimate what those numbers might be for an average user. And you can imagine those estimations will never be 100% correct. And similarly, the data that users have seen will change over time, as well, where some users might have a really fast connection or a fast device, and everything goes really fast when they visit your website, and others might not have that. And because of that, this variation can always result in different numbers.
Our recommendation is generally to use the field data, the data you would see in Search Console, as a way of understanding the current situation for your website, and then to use the lab data, namely, the individual tests that you can run yourself directly, to optimise your website and try to improve things. And when you are pretty happy with the lab data you’re getting with your new version of your website, then over time, you can collect the field data, which happens automatically, and double-check that users see it as being faster or more responsive, as well. So, in short, again, there is no absolutely correct number when it comes to any of these metrics. There is no absolutely correct answer where you’d say this is what it should be. But instead, there are different assumptions and ways of collecting data, and each is subtly different.

How can our JavaScript site get indexed better?

Q: (04:20) So, first up, we have a few custom pages using Next.js without a robots.txt or a sitemap file. Simplified, theoretically, Googlebot can reach all of these pages, but why is only the homepage getting indexed? There are no errors or warnings in Search Console. Why doesn’t Googlebot find the other pages?

  • (04:40) So, maybe taking a step back, Next.js is a JavaScript framework, meaning the whole page is generated with JavaScript. But as a general answer, as well, for all of these questions like, why is Google not indexing everything? It’s important first to say that Googlebot will never index everything across a website. I don’t think it happens to any kind of non-trivial-sized website where Google would completely index everything. From a practical point of view, it’s impossible to index everything across the web. So that kind of assumption that the ideal situation is everything is indexed, I would leave that aside and say you want Googlebot to focus on the important pages. The other thing, though, which became a little bit clearer when, I think, the person contacted me on Twitter and gave me a little bit more information about their website, was that the way that the website was generating links to the other pages was in a way that Google was not able to pick up. So, in particular, with JavaScript, you can take any element on an HTML page and say, if someone clicks on this, then execute this piece of JavaScript. And that piece of JavaScript can be used to navigate to a different page, for example. And Googlebot does not click on all elements to see what happens. Instead, we go off and look for normal HTML links, which is the kind of traditional way you would link to individual pages on a website. And, with this framework, it didn’t generate these normal HTML links. So we could not recognise that there’s more to crawl and more pages to look at. And this is something that you can fix in how you implement your JavaScript site. We have a tonne of information on the Search Developer Documentation site around JavaScript and SEO, particularly on the topic of links because that comes up now and then. There are many creative ways to create links, and Googlebot needs to find those HTML links to make them work. 
Additionally, we have a bunch of videos on our YouTube channel. And if you’re watching this, you must be on the YouTube channel since nobody is here. If you’re watching this on the YouTube channel, go out and check out those JavaScript SEO videos on our channel to get a sense of what else you could watch out for when it comes to JavaScript-based websites. We can process most kinds of JavaScript-based websites normally, but some things you still have to watch out for, like these links.
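To illustrate the difference described above (the URL and markup here are hypothetical examples, not taken from the site in question):

```html
<!-- Googlebot does not click elements, so a JavaScript-only "link" like this
     may never be discovered as a navigation target -->
<span onclick="window.location.href='/products/example-page'">Example page</span>

<!-- A normal HTML link with an href attribute is what Googlebot looks for -->
<a href="/products/example-page">Example page</a>
```

Frameworks that render the second form (a real anchor with an `href`), even when they intercept the click for client-side routing, give Googlebot something it can follow.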

Does it affect my SEO score negatively if I link to HTTP pages?

Q: (07:35) Next up, does it affect my SEO score negatively if my page is linking to an external insecure website?

  • (07:44) So on HTTP, not HTTPS. So, first off, we don’t have a notion of an SEO score. So you don’t have to worry about any kind of SEO score. But, regardless, I understand the question to be: is it bad if I link to an HTTP page instead of an HTTPS page? And, from our point of view, it’s perfectly fine. If these pages are on HTTP, then that’s what you would link to. That’s kind of what users would expect to find. There’s nothing against linking to sites like that. There is no need to avoid linking to HTTP pages just because they’re kind of old or crusty and not as cool as HTTPS. I would not worry about that.

Q: (08:39) With semantic and voice search, is it better to use proper grammar or write how people actually speak? For example, it’s grammatically correct to write, “more than X years,” but people actually say, “over X years,” or write a list beginning with, “such as X, Y, and Z,” but people actually say, “like X, Y, and Z.”

  • (09:04) Good question. So the simple answer is, you can write however you want. There’s nothing holding you back from just writing naturally. And essentially, our systems try to work with the natural content found on your pages. So if we can crawl and index those pages with your content, we’ll try to work with that. And there’s nothing special that you need to do there. The one thing I would watch out for, with regards to how you write your content, is just to make sure that you’re writing for your audience. So, for example, if you have some very technical content, but you want to reach people who are non-technical, then write in the non-technical language and not in a way that is understandable to people who are deep into that kind of technical information. So kind of the, I would guess, the traditional marketing approach of writing for your audience. And our systems usually are able to deal with that perfectly fine.

Should I delete my disavow file?

Q: (10:20) Next up, a question about links and disavows. Over the last 15 years, I’ve disavowed over 11,000 links in total. I never bought a link or did anything unallowed, like sharing. The links that I disavowed may have been from hacked sites or from nonsense, auto-generated content. Since Google now claims that they have better tools to not factor these types of hacked or spammy links into their algorithms, should I just delete my disavow file? Is there any risk or upside, or downside to just deleting it?

  • (10:54) So this is a good question. It comes up now and then. And disavowing links is always kind of one of those tricky topics because it feels like Google is probably not telling you the complete information. But, from our point of view, we do work hard to avoid taking this kind of link into account. And we do that because we know that the disavow links tool is a niche tool, and SEOs know about it, but the average person who runs a website doesn’t know about it. And all those links you mentioned are the links that any website gets over the years. And our systems understand that these are not things you’re trying to do to game our algorithms. So, from that point of view, if you’re sure that there’s nothing around a manual action that you had to resolve with regards to these links, I would just delete the disavow file and move on with life and leave all of that aside. I would personally download it and make a copy so that you have a record of what you deleted. But, otherwise, if you’re sure these are just the normal, crusty things from the internet, I would delete it and move on. There’s much more to spend your time on when it comes to websites than just disavowing these random things that happen to any website on the web.

Can I add structured data with Google Tag Manager?

Q: (12:30) Adding schema markup with Google Tag Manager is that good or bad for SEO? Does it affect ranking?

  • (12:33) So, first of all, you can add structured data with Google Tag Manager. That’s an option. Google Tag Manager is a piece of JavaScript you add to your pages, configured separately, and it can modify your pages slightly using JavaScript. For the most part, we’re able to process this normally. And the structured data it injects can generally be counted, just like any other structured data on your web pages. And, from our point of view, structured data, at least the types that we have documented, is primarily used to help generate rich results, as we call them, which are these fancy search results with a little bit more information, a little bit more colour or detail around your pages. And if you add your structured data with Tag Manager, that’s perfectly fine. From a practical point of view, I prefer to have the structured data on the page or generated by your server so that you know exactly what is happening. It makes it a little bit easier to debug things. It makes it easier to test things. So trying it out with Tag Manager, from my point of view, is legitimate. It’s an easy way to try things out. But, in the long run, I would try to make sure that your structured data is on your site directly, just to make sure that it’s easier to process for anyone who comes by to process your structured data, and it’s easier for you to track and debug and maintain over time, as well, so that you don’t have to check all of these different separate sources.
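As a minimal sketch of what placing structured data directly in the page markup might look like (the product name and values below are made up for illustration):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Wireless Mouse",
  "offers": {
    "@type": "Offer",
    "price": "49.00",
    "priceCurrency": "AUD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

Serving this in the initial HTML, rather than injecting it via Tag Manager, makes it visible in a plain view-source and easier to check in testing tools.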

Is it better to block by robots.txt or with the robots meta tag?

Q: (14:20) Simplifying a question a little bit, which is better, blocking with robots.txt or using the robots meta tag on the page? How do we best prevent crawling? 

  • (14:32) So this also comes up from time to time. We did a podcast episode recently about this, as well. So I would check that out. The podcasts are also on the YouTube channel, so you can click around a little bit, and you’ll probably find them quickly. In practice, there is a subtle difference here where, if you’re in SEO and you’ve worked with search engines, then probably you understand that already. But for people who are new to the area, it’s sometimes unclear exactly where these lines are. With robots.txt, which is the first one you mentioned in the question, you can essentially block crawling. So you can prevent Googlebot from even looking at your pages. And with the robots meta tag, you can do things like blocking indexing when Googlebot looks at your pages and sees that robots meta tag. In practice, both of these result in your pages not appearing in the search results, but they’re subtly different. So if we can’t crawl, we don’t know what we’re missing. And it might be that we say, well, there are many references to this page. Maybe it is useful for something. We just don’t know. And then that URL could appear in the search results without any of its content because we can’t look at it. Whereas with the robots meta tag, if we can look at the page, then we can look at the meta tag and see if there’s a noindex there, for example. Then we stop indexing that page and drop it completely from the search results. So if you’re trying to block crawling, then definitely, robots.txt is the way to go. If you just don’t want the page to appear in the search results, I would pick whichever is easier for you to implement. On some sites, it’s easier to set a checkbox saying that I don’t want this page found in Search, and then it adds a noindex meta tag. For others, maybe editing the robots.txt file is easier. It kind of depends on what you have there.
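The two mechanisms discussed above look like this in practice (the `/private/` path is a hypothetical example):

```
# robots.txt — blocks crawling; the URL can still appear in results without content
User-agent: *
Disallow: /private/
```

```html
<!-- robots meta tag — the page must remain crawlable so Googlebot can see this;
     it blocks indexing and drops the page from the search results -->
<meta name="robots" content="noindex">
```

Note that combining the two on the same URL is self-defeating: if robots.txt blocks crawling, Googlebot never sees the noindex tag.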

Q: (16:38) Are there any negative implications to having duplicate URLs with different attributes in your XML sitemaps? For example, one URL in one sitemap with an hreflang annotation and the same URL in another sitemap without that annotation.

  • (16:55) So maybe, first of all, from our point of view, this is perfectly fine. This happens now and then. Some people have hreflang annotations in separate sitemap files, and then they have a normal sitemap file for everything. And there is some overlap there. From our point of view, we process these sitemap files as we can, and we take all of that information into account. There is no downside to having the same URL in multiple sitemap files. The only thing I would watch out for is that you don’t have conflicting information in these sitemap files. So, for example, if with the hreflang annotations you’re saying, oh, this page is for Germany, and then in the other sitemap file you’re saying, well, actually, this page is also for France or in French, then our systems might be like, well, what is happening here? We don’t know what to do with this kind of mix of annotations. And then we may pick one or the other. Similarly, if you say this page was last changed 20 years ago, which doesn’t make much sense, but say you say 20 years, and in the other sitemap file you say, well, actually, it was five minutes ago, then our systems might look at that and say, well, one of you is wrong. We don’t know which one. Maybe we’ll follow one or the other. Maybe we’ll just ignore that last modification date completely. So that’s kind of the thing to watch out for. But otherwise, if it’s just mentioned in multiple sitemap files and the information is either consistent or kind of works together, in that maybe one has the last modification date and the other has the hreflang annotations, that’s perfectly fine.
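As a sketch, a sitemap entry carrying hreflang annotations might look like this (example.com and the date are placeholders; the `xhtml` namespace must be declared on the `<urlset>` element):

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.com/de/page</loc>
    <lastmod>2022-07-01</lastmod>
    <xhtml:link rel="alternate" hreflang="de" href="https://example.com/de/page"/>
    <xhtml:link rel="alternate" hreflang="fr" href="https://example.com/fr/page"/>
  </url>
</urlset>
```

If the same URL also appears in a second sitemap, keep any `<lastmod>` value there consistent with this one, for the reasons given above.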

How can I block embedded video pages from getting indexed?

Q: (19:00) I’m in charge of a video replay platform, and simplified, our embeds are sometimes indexed individually. How can we prevent that?

  • (19:10) So by embeds, I looked at the website, and basically, these are iframes that include a simplified HTML page with a video player embedded. And, from a technical point of view, if a page has iframe content, then we see those two HTML pages. And it is possible that our systems indexed both HTML pages because they are separate. One is included in the other, but they could theoretically stand on their own, as well. And there’s one way to prevent that, which is a reasonably new combination with robots meta tags that you can do, which is with the indexifembedded robots meta tag and a noindex robots meta tag. And, on the embedded version, so the HTML file with the video directly in it– you would add the combination of noindex plus indexifembedded robots meta tags. And that would mean that, if we find that page individually, we would see, oh, there’s a noindex. We don’t have to index this. But with the indexifembedded, it essentially tells us that, well, actually, if we find this page with the video embedded within the general website, then we can index that video content, which means that the individual HTML page would not be indexed. But the HTML page embedded with the video information would be indexed normally. So that’s kind of the setup that I would use there. And this is a fairly new robots meta tag, so it’s something that not everyone needs. Because this combination of iframe content or embedded content is kind of rare. But, for some sites, it just makes sense to do it like that.
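The combination described above can be set either as a meta tag in the embedded HTML page or as an HTTP response header:

```html
<!-- On the standalone embed page: don't index it on its own,
     but allow its content to be indexed when embedded in another page -->
<meta name="robots" content="noindex, indexifembedded">
```

Alternatively, the same pair of directives can be sent as a header: `X-Robots-Tag: noindex, indexifembedded`.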

Q: (21:15) Another question about HTTPS, maybe. I have a question around preloading SSL via HSTS. We are running into an issue when implementing HSTS via the Google Chrome preload list. And the question kind of goes on with a lot of details. But what should we watch for?

  • (21:40) So maybe take a step back when you have HTTPS pages and an HTTP version. Usually, you would redirect from the HTTP version to HTTPS. And the HTTPS version would then be the secure version because that has all of the properties of the secure URLs. And the HTTP version, of course, would be the open one or a little bit vulnerable. And if you have this redirect, theoretically, an attacker could take that into account and kind of mess with that redirect. And with HSTS, you’re telling the browser that once they’ve seen this redirect, it should always expect that redirect, and it shouldn’t even try the HTTP version of that URL. And, for users, that has the advantage that nobody even goes to the HTTP version of that page anymore, making it a little more secure. And the pre-load list for Google Chrome is a static list that is included, I believe, in Chrome probably in all of the updates, or I don’t know if it’s downloaded separately. Not completely sure. But, essentially, this is a list of all of these sites where we have confirmed that HSTS is set up properly and that redirect to the secure page exists there so that no user ever needs to go to the HTTP version of the page, which makes it a little bit more secure. From a practical point of view, this difference is very minimal. And I would expect that most sites on the internet just use HTTPS without worrying about the pre-load list. Setting up HSTS is always a good practice, but it’s something that you can do on your server. And as soon as the user sees that, their Chrome version keeps that in mind automatically anyway. So from a general point of view, I think using the pre-load list is a good idea if you can do that. But if there are practical reasons why that isn’t feasible or not possible, then, from my point of view, I would not worry about only looking at the SEO side of things. When it comes to SEO, for Google, what matters is essentially the URL that is picked as the canonical. 
And, for that, it doesn’t need HSTS. It doesn’t need the pre-load list. That does not affect at all on how we pick the canonical. But rather, for the canonical, the important part is that we see that redirect from HTTP to HTTPS. And we can kind of get a confirmation within your website, through the sitemap file, the internal linking, all of that, that the HTTPS version is the one that should be used in Search. And if we use the HTTPS version in Search, that automatically gets all of those subtle ranking bonuses from Search. And the pre-load list and HSTS are not necessary there. So that’s kind of the part that I would focus on there.
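For reference, HSTS is set with a single response header on the HTTPS version of the site; the `preload` directive, together with a `max-age` of at least one year and `includeSubDomains`, is what the Chrome preload list submission expects:

```
Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
```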

How can I analyse why my site dropped in ranking for its brand name?

Q: (25:05) I don’t really have a great answer, but I think it’s important to at least mention: what are the possible steps for investigation if a website owner finds their website is not ranking for its brand term anymore, they have checked all of the usual things, and none of them seem to be the cause?

  • (25:24) So, from my point of view, I would primarily focus on the Search Console or the Search Central Help Community and post all of your details there. Because this is where all of those escalations go, and the product experts in the Help forum can take a look at that. And they can give you a little bit more information. They can also give you their personal opinion on some of these topics, which might not match 100% what Google would say, but maybe they’re a little bit more practical, where, for example, probably not relevant to this site, but you might post something and say, well, my site is technically correct, and post all of your details. And one of the product experts looks at it and says it might be technically correct, but it’s still a terrible website. You need to get your act together and create better content. And, from our point of view, we would focus on technical correctness. And you need someone to give you that, I don’t know, personal feedback. But anyway, in the Help forums, if you post the details of your website with everything that you’ve seen, the product experts are often able to take a look and give you some advice on, specifically, your website and the situation that it’s in. And if they’re not able to figure out what is happening there, they also have the ability to escalate these kinds of topics to the community manager of the Help forums. And the community manager can also bring things back to the Google Search team. So if there are things that are really weird, and now and then something really weird does happen with regards to Search. It’s a complex computer system. Anything can break. But the community managers and the product experts can bring that back to the Search team. And they can look to see if there is something that we need to fix, or is there something that we need to tell the site owner, or is this kind of just the way that it is, which, sometimes, it is.
But that’s generally the direction I would go for these questions. The other thing subtly mentioned here is that I think the site does not rank for its brand name. One of the things to watch out for, especially with regards to brand names, is that it can happen that you say something is your brand name, but it’s not a recognised term for users. For example, you might call your website bestcomputermouse.com. And, for you, that might be what you call your business or what you call your website. Best Computer Mouse. But when a user goes to Google and enters “best computer mouse,” that doesn’t necessarily mean they want to go directly to your website. It might be that they’re looking for a computer mouse. And, in cases like that, there might be a mismatch between what we show in the search results and what you would like to have shown for those queries if it’s something more of a generic term. And these kinds of things also play into search results overall. The product experts see these all the time, as well. And they can recognise that and say, actually, just because you call your website bestcomputermouse.com (I hope that site doesn’t exist), that doesn’t necessarily mean it will always show on top of the search results when someone enters that query. But that’s kind of something to watch out for. But, in general, I would go to the Help forums here and include all of the information you know that might play a role here. So if there was a manual action involved and you’re kind of, I don’t know, ashamed of that, which is kind of normal, all of this information helps the product experts better understand your situation and give you something actionable that you can take as a next step, or to understand the situation a little bit better. So the more information you can give them from the beginning, the more likely they’ll be able to help you with your problem.

Sign up for our Webmaster Hangouts today!


SEO & web developers: friends or foes?

Is SEO just another constraint?

(0:59) MARTIN: So you are saying that for you as a web developer, SEO is just another constraint. What do you mean by that?

SURMA: I mean, as far as web development goes, I am pretending to be a representative of all web developers here, which I’m clearly not. But the usual struggle involves stuff like, how many browsers do I support, or how far back in the browser versions do I go? Do I support IE11? Do I not support IE11? How do I polyfill certain things? Do I polyfill them, or do I use progressive enhancement? What kind of implications do both of these choices have? Do we actually make the design fully responsive? Do we do a separate mobile site? Like, there are all these choices that I already have to make. And then frameworks come along and are like, we’re just going to do most of the stuff on the client side because we want to write a single-page app. And then you either have to say, do I set up something for server-side rendering or static rendering at build time, or do I go all-in on the single-page app, and just everything happens on the client? What do search engines think about that? And then search engines come in like, well, you should be doing this, and you should not be doing that because that gets penalised. And actually, we don’t even understand what you’re doing here because search engines are running Chrome 12 or something. And it’s just like it’s yet another constraint that I have to balance and make decisions on, whether following their restrictions is more important than following my business restrictions, my marketing restrictions, or my project management restrictions. And I feel like some people are frustrated about that sometimes.

MARTIN: I remember when I was a web developer, there was also this entire user first, no, mobile-first, no, content first, no, this first, no, that first. That’s probably also going in the same direction, and I understand the frustration. And I see that there are lots of requirements, and sometimes these requirements might even contradict each other. But I think as developers, we should understand what SEOs are trying to help us with and what search engines, separately from what we are building and doing, are actually trying to accomplish. And then we would see that all of these requirements are important, but maybe some of them are more important than others, and they are important at different stages, I would say. So, for instance, you mentioned responsive design versus having separate versions, right? I would say that’s a decision that you need to make relatively early on in the technology process. Whereas should I use this feature, should I polyfill this feature, should I not use this feature because I need to support an old browser that doesn’t support it and the polyfill is tricky, that’s something that probably happens a little later in development, right?

SURMA: Yeah, I think I agree with that. It depends on how much flexibility you’re given as a developer. I think we all may or may not have lived through the experience of working with a designer who insists on a pixel-perfect design, which is just not something that works on the web, but sometimes, you’re given a task, and your job is to complete it and not have an opinion. I don’t want to go down the “it depends” route. But in the end, whatever we end up talking about, we probably won’t find a definitive answer. Like, context matters, and everyone has different constraints. And I think that’s really what it’s about: you need to just be aware of the constraints that you have and make decisions in the context of your situation.

SEO – friend, not foe

(04:41) SURMA: You mentioned something that I find quite interesting. You said SEOs are trying to help us with something because often they’re kind of like villains, almost like the people who just try to get you to the top of the rankings, whether you deserve it or not. But in the end, I feel like there is help going on. Both search engines, as well as the people that want you to do well in the search results, actually are trying to make your site better in the end. Like no search engine actually wants to give you results that are bad. That just doesn’t make sense. In the end, search engines are trying to put the best results on top. And if an SEO helps you get on top, then ideally, what that means is your site has gotten better. 

MARTIN: Yes, exactly. And I love that you are saying, like, oh yeah, you have to look at the context. You have to understand the constraints. And that’s actually something that a good SEO will help you with, because if you look at it from a purely SEO standpoint, depending on what you’re building and how you’re building it, you might have different priorities. So, for instance, say this is a test version of a landing page. We just want to see if the customer segment is even interested in what we are potentially building later on, and you don’t want to build for the bin, right? You don’t want to build something that, later on, you find out doesn’t actually work because there’s no interest in it. So then, for these things, SEO might be relatively important because you definitely want people to find it, so that you get enough data to make decisions later on. But you might not be so constrained in terms of, oh yeah, this has to be client-side versus server-side. We don’t really have to make this super fast. We just have to get this into people’s faces, especially through search engines, so that we get some meaningful data to make decisions later on. Versus you’re building and improving on an existing product, and that should be built for the long term.

Building better websites for everyone

(6:33) MARTIN: So, a good SEO helps you understand what kind of requirements you should take into account. SEO is a gigantic field, and they should pick the bits and pieces that actually matter for your specific context. So you said, oh, we want to build a single-page application. Maybe. Maybe you do, maybe you don’t. Maybe it’s fine to do client-side rendering, but maybe consider doing some bits and pieces of server-side rendering because you reap some performance benefits there. And that also influences SEO because, as you say, search engines want to find the good things. So making your site better includes making it faster but also making it accessible, because if you think about it, search engines are computers interacting with your website, working through your website and trying to understand what your website says. So they have basic accessibility needs. They don’t necessarily interact with things. They don’t click on stuff. And yet they need to work with your content. So it should be accessible to them. And SEOs will point these things out.

SURMA: That’s really interesting that you bring that up because I was just thinking about both performance, like loading performance, for example, and accessibility. So, on the one hand, it’s kind of accepted that loading performance is important. But now, for example, we have Core Web Vitals. And one of their core statements is that they don’t just want to put a number on a metric or something that’s measurable. They want to measure things that are important to user experience. And so the Core Web Vitals that we currently have, which is just three metrics, LCP, CLS, and FID, right, all of these are statistically correlated with users enjoying the site more or staying on your site longer. And that means if you optimise for those, you actually will get something out of that. You will get users that stay longer. And now that Search is looking into those, it means optimising for those metrics not only gets you higher in the rankings potentially, but also the people that do see your site will most likely stay longer or engage with it more, because we know that these metrics correlate with user behaviour. And I think that’s a really interesting approach, wherein, in the end, search engines are actually helping you do the right thing. And now I’m wondering, and I don’t even know, like, accessibility is something which we keep talking about, and we know it’s important. And yet it feels like it always falls off the truck. In many projects, it’s an afterthought, and many people know that it needs to be considered from the very beginning of a project because it’s hard to shoehorn in at the end. It needs to be something that works from the start. Has any search engine ever done anything in this space to help developers be better with accessibility?

MARTIN: We are more or less going in that direction, not necessarily from a purely accessibility standpoint, but as search engines need to semantically understand what the site is about, we don’t just take the text as plain text. We basically try to figure out, oh, so this is a section, this is a section, this is the section that is most important on the page. This is just an explainer for the previous section, and so on and so forth. For that, we need the semantics that HTML gives us. And these semantics are often also important for accessibility, because people need to be able to navigate your content differently, maybe with a keyboard, maybe with a screen reader. And for that, the semantics on the page need to be in place from the get-go, basically. So in that direction, having better semantics does help search engines better understand your content and, as a byproduct, also helps people who have additional navigational needs better navigate your content. So you could say search engines are a little involved in terms of accessibility. That does not cover accessibility as a whole. There is so much more to accessibility than just that. But at least the core of the semantics on the web is taken care of here.
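As a small illustration of the kind of semantics Martin describes: headings and sectioning elements expose a page's structure to both crawlers and assistive technology. The content here is placeholder, purely for illustration.

```html
<!-- Illustrative only: headings and sectioning elements expose the
     structure that both search engines and screen readers rely on. -->
<main>
  <h1>Image compression basics</h1>
  <section>
    <h2>Why file size matters</h2>
    <p>Primary explanatory content goes here.</p>
  </section>
  <section>
    <h2>Choosing a codec</h2>
    <p>A supporting explainer for the section above.</p>
  </section>
</main>
```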

Keeping up with web development trends is important for SEOs

(10:37) MARTIN: Another thing that I really found interesting is where you say, oh, you know, SEOs are often seen as just coming with all of these additional constraints and requirements. What is there that they could do differently that you would think would help you and other developers understand where they’re coming from or have a meaningful discussion about these things and turn that into a positive, constructive input?

SURMA: I don’t know if this is the answer you’re looking for, but one thing I have seen is that some SEOs need to put a bit more effort into being up to date on what is and is not good guidance, or more specifically, what search engines are and are not capable of processing correctly. I know that you have been fighting the “no, no, JavaScript is fine” fight for a long time now, but I think to this day, there are still some SEOs out there who insist that anything that is in JavaScript is invisible to the search engine. In general, I think it goes back to the trade-off thing, where web developers need to realise that SEOs are trying to help you be better, and SEOs need to realise that they can’t give advice as an “either you do this, or you’re screwed” kind of approach. Like, it’s a trade-off. You can say that this is one way you can make a site better. This is another way, and this is yet another thing you can do. And all of these things will accumulate to make your site better, ideally resulting in a higher rank. But it’s not an all-or-nothing approach, I think. Sometimes certain constraints just outweigh other constraints, and you then make a decision to go with plan A rather than plan B, or stick with what you’ve currently got. We have recently seen a lot of shifts from purely client-side rendering to this hybrid approach, where the app is rendered on the server-side or even at build time but then turns into a single-page app once loaded onto the client, and that has upsides, and it has downsides. Like, we know that statically rendered content is very good for your first render; your largest loading time, that all goes down. But now we have this additional blob of JavaScript state that is somehow inserted into the document, and then often, a full dynamic client-side re-render happens, which can create an iffy user experience at times. And all these things are working for or against you in certain aspects. And I think that’s just something that SEOs need to be mindful of as well: the developer cannot just follow everything that they say, because they’re not the only deciding force on a project. I’m not saying that all SEOs behave like this, of course, because I’m honestly quite inexperienced in working with an SEO directly. But just based on stories that I hear and people that I see on Twitter, it’s all a trade-off. And I think people just need to realise that in 90% of cases, everyone is trying to do the best they can and do their job well. Just keep that in mind, and then probably find a solution that works for both, or is a compromise.

Change your perspective

(13:57) MARTIN: Yeah. No, that makes perfect sense. And I wish that both SEOs and developers would look at it from a different perspective. Like, both SEOs and developers want to build something that people are using, right? You don’t want to build something that no one uses. That’s neither going to pay your bills for very long, nor is it going to make you happy; you want to see, like, oh yeah, we built something that helps many people. That’s true for me. When I was a developer, I wanted to build things that have an impact, and that means they need to be used by someone. And if we are building something that we genuinely are convinced is a good thing, then that should be reflected by the fact that search engines would agree and say, like, oh yeah, this is a good solution to this problem or this challenge that people might face, and thus want to showcase your solution, basically. But for that, there needs to be something that search engines can grasp and understand and look at and put into their index accordingly. So basically, they need to understand what the page is about, what it offers the users, is it fast, is it slow, is it mobile-friendly, all these kinds of things. And SEOs are then the ones who help you there, because you as a developer are focused on making it work in all the browsers that it needs to work in, making it fast, using all the best practices, using tree shaking, bundle splitting, all that kind of stuff. And then SEOs come in and help you make sure that search engines understand what you’re building and can access what you’re building, and that you are following the few best practices that you might not necessarily be aware of yet. But you are right. For that, SEOs need to follow up-to-date best practice guidance, and not all of them do. At the beginning of 2021, I ran a poll in one of the virtual events, asking if people were aware that Googlebot is now using an evergreen Chromium. So we are updating the Chromium instance that is used to render pages. And I think like 60% of the people were like, oh, I didn’t know that, even though we announced it at Google I/O in May 2019.

SURMA: How was that?

MARTIN: That was amazing. I mean, launching this has been a great initiative. But I’m surprised that, I think, we have gotten developers to notice that, but not necessarily all SEOs have noticed. And it’s things that are not necessarily easy, or not even your job as a developer, where SEOs can really help you or at least make the right connections for you. For instance, I know you built squoosh.app, right?

SURMA: Well, not just me, but I was part of the team that built it.

MARTIN: Right. You were part of the team who built squoosh.app. And I think squoosh.app is awesome. For those who don’t know it, it’s a web app that allows you to experiment with different image compression settings on an image that you put into the application, right in your browser. It’s all working from the browser. You don’t have to install anything. And you basically get the best settings for the best gains in terms of file size, right? That’s roughly what it does.

SURMA: Yeah. It’s an image compressor, and you can fiddle with all the settings and can try out the next generation codecs now that are coming to the web. But yeah, you have more control than I think any other image compression app that I know.

MARTIN: And it’s really, really cool, and I really admire the engineering work that went into this, that all the developers put into this, to make it work so smoothly, so quickly, so nicely. It implements lots of best practices. But for a search engine, if you were to sell that as a product, this might not be very good. And that’s because if you look at it, it’s an interface that allows me to drag and drop an image into it, and then it does a bunch of stuff in terms of user interface controls to fine-tune settings. But if I was a robot accessing that page, it’s a bunch of HTML controls, but not that much content, right?

SURMA: Agreed

MARTIN: So you would have to sit down and figure out how you would describe this, and you probably don’t want to do all that work by yourself. You want to focus on building cool stuff with the new image algorithms and fine-tuning how to make the settings work better, or more accessible, or easier to understand, right? That’s where you want to focus.

SURMA: Yeah. And I think I actually would like to get help from someone who knows, because I wouldn’t have been able to say myself. Like, I think our loading performance is excellent because we spent lots of time on making it good and trying to pioneer some techniques. But I wouldn’t have been able to tell you whether it gets a good ranking from a search bot or a bad ranking, to be honest. I mean, the name is unique enough that it’s very Google-able, so I think even if it didn’t do so well, people would probably find it. But in the end, it’s actually a very interesting example because you’re completely right. The opening page, because it’s about images, mostly consists of images. The only text we have on the landing page is the name, the file size of the demo images, and the licensing link. So there’s not much going on for a bot to understand what the site does, especially because, for something this specific, there’s not even much you can do with semantic markup, as you said. Right, OK, cool, there’s an image and an input tag. You can drag and drop an image. But even the drag and drop is actually only communicated via the user interface, not via the markup. And so yeah, that’s a really interesting example. Like, I would have no idea how to optimise it. I would have probably said, like, the meta description tag, I don’t know. And then John Mueller told me that apparently, we don’t pay attention to the meta description tag anymore.

MARTIN: Well, we do. It’s the keywords that we don’t.

SURMA: Oh, the keywords are the ones. OK, I take that back. Yeah, exactly. So I think you’re right that it’s very easy for developers to just guess what is good for SEO and what is bad, and better to actually get input from someone who put in the time to learn what is actually going on and keeps up to date with the most recent updates. As you say, people apparently don’t even know that Googlebot is now an evergreen Chrome, which is amazing. So there are probably a lot of SEOs who go around saying, like, no, no, no, you can’t use Shadow DOM or something like that, even though the JavaScript actually works. I agree. Get someone who knows.

Making things on the web is a team sport

(20:26) SURMA: I mean, I’ve been saying that even as a very enthusiastic experimenter and web developer, one single person cannot really understand and use the entire web platform. It’s now so incredibly widespread in the areas it covers. So you can do Web Audio, WebAssembly, WebUSB, Web MIDI, and all these things. Like, you will not have experience in all of these things. And some of these, like WebGL itself, are huge rabbit holes to fall into. So, pick some stuff. Get good at it. And for things you don’t know, get help, because otherwise, you’re going to work on half-knowledge that will very likely end up being counterproductive for what you’re trying to achieve.


Webmaster Hangout – Live from June 03, 2022

Can I use two HTTP result codes on a page?

Q: (01:22) All right, so the first question I have on my list here is: it’s theoretically possible to have two different HTTP result codes on a page, but what will Google do with those two codes? Will Google even see them? And if yes, what will Google do? For example, a 503 plus a 302.

  • (01:41) So I wasn’t aware of this. But, of course, with the HTTP result codes, you can include lots of different things. Google will look at the first HTTP result code and essentially process that. And you can theoretically still have two HTTP result codes or more there if they are redirects leading to some final page. So, for example, you could have a redirect from one page to another page. That’s one result code. And then, on that other page, you could serve a different result code. So that could be a 301 redirect to a 404 page is kind of an example that happens every now and then. And from our point of view, in those chain situations where we can follow the redirect to get a final result, we will essentially just focus on that final result. And if that final result has content, then that’s something we might be able to use for canonicalization. If that final result is an error page, then it’s an error page. And that’s fine for us too.
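The chain-following behaviour described here can be sketched as a small resolver: keep following redirect codes until a final, non-redirect result is reached, and process only that. The function and the `responses` table are illustrative only, not how Googlebot is actually implemented.

```python
# Hypothetical sketch of resolving a redirect chain down to the one
# final result code that actually gets processed.

REDIRECT_CODES = {301, 302, 307, 308}

def resolve_chain(url, responses, max_hops=10):
    """Follow redirects until a non-redirect status is reached.

    `responses` maps a URL to a (status_code, location_or_none) pair.
    Returns the final (url, status_code).
    """
    for _ in range(max_hops):
        status, location = responses[url]
        if status in REDIRECT_CODES and location:
            url = location  # keep following the chain
        else:
            return url, status  # only the final result matters
    raise RuntimeError("too many redirects")

# The example from the answer: a 301 redirect leading to a 404 page.
responses = {
    "/old-page": (301, "/new-page"),
    "/new-page": (404, None),
}
```

Run against the example table, the chain resolves to the 404, which matches the answer: the final result is an error page, and that is what gets used.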

Does using a CDN improve rankings if my site is already fast in my main country?

Q: (02:50) Does putting a website behind a CDN improve ranking? We get the majority of our traffic from a specific country. We hosted our website on a server located in that country. Do you suggest putting our entire website behind a CDN to improve page speed for users globally, or is that not required in our case?

  • (03:12) So obviously, you can do a lot of these things. I don’t think it would have a big effect on Google at all with regards to SEO. The only effect where I could imagine that something might happen is what users end up seeing. And, as you mentioned, if the majority of your users are already seeing a very fast website because your server is located there, then you’re kind of doing the right thing. But of course, if users in other locations are seeing a very slow result, because perhaps the connection to your country is not that great, then that’s something where you might have some opportunities to improve. And you could see that as an opportunity in the sense that, of course, if your website is really slow for other users, then it’s less likely that they’ll start going to your website more, because it’s really annoying to get there. Whereas, if your website is pretty fast for other users, then at least they have an opportunity to see a reasonably fast website, which could be your website. So from that point of view, if there’s something that you can do to improve things globally for your website, I think that’s a good idea. I don’t think it’s critical. It’s not something that matters in terms of SEO, in that Google has to see it very quickly as well or anything like that. But it is something that you can do to kind of grow your website past just your current country. Maybe one thing I should clarify: if Google’s crawling is really, really slow, then, of course, that can affect how much we can crawl and index from the website. So that could be an aspect to look into. In the majority of websites that I’ve looked at, I haven’t really seen this being a problem for any website that isn’t millions and millions of pages large. So from that point of view, you can double-check how fast Google is crawling in the Search Console crawl stats. And if that looks reasonable, even if it’s not super fast, then I wouldn’t really worry about it.

Should I disallow API requests to reduce crawling?

Q: (05:20) Our site is a live stream shopping platform. Our site currently spends about 20% of the crawl budget on the API subdomain and another 20% on image thumbnails of videos. Neither of these subdomains has content which is part of our SEO strategy. Should we disallow these subdomains from crawling, or how are the API endpoints discovered or used?

  • (05:49) So maybe the last question there first. In many cases, API endpoints end up being used by JavaScript on your website, and we will render your pages. And if they access an API that is on your website, then we’ll try to load the content from that API and use that for the rendering of the page. And depending on how your API is set up and how your JavaScript is set up, it might be that it’s hard for us to cache those API results, which means that maybe we crawl a lot of these API requests to try to get a rendered version of your pages so that we can use those for indexing. So that’s usually the place where this is discovered. And that’s something you can help with by making sure that the API results can be cached well, that you don’t inject any timestamps into URLs, for example, when you’re using JavaScript for the API, all of those things. If you don’t care about the content that’s returned by these API endpoints, then, of course, you can block the whole subdomain from being crawled with the robots.txt file. And that will essentially block all of those API requests from happening. So that’s something where you first of all need to figure out: are these API results actually part of the primary content, or important critical content, that I want to have indexed by Google?
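If the API content is not needed for indexing, the blanket block described here could look like the following. Note that robots.txt applies per host, so it would need to be served on the API subdomain itself; the hostname is made up for illustration.

```
# Hypothetical robots.txt served at https://api.example.com/robots.txt
# Blocks all crawling of the API subdomain.
User-agent: *
Disallow: /
```

As the answer notes, only do this if your rendered pages do not depend on those API responses, since blocked endpoints cannot be fetched during rendering.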

Q: (08:05) Is it appropriate to use a no-follow attribute on internal links to avoid unnecessary crawler requests to URLs which we don’t wish to be crawled or indexed?

  • (08:18) So obviously, you can do this. It’s something where I think, for the most part, it makes very little sense to use nofollow on internal links. But if that’s something that you want to do, go for it. In most cases, I would try to do something like using the rel=canonical to point at the URLs that you do want to have indexed, or using the robots.txt for things that you really don’t want to have crawled. So try to figure out: is it more like a subtle thing, where you have something that you prefer to have indexed? Then use rel=canonical for that. Or is it something where you say, actually, when Googlebot accesses these URLs, it causes problems for my server. It causes a large load. It makes everything really slow. It’s expensive, or what have you. For those cases, I would just disallow the crawling of those URLs. And try to keep it kind of on a basic level there. And with the rel=canonical, obviously, we’ll first have to crawl that page to see the rel=canonical. But over time, we will focus on the canonical that you’ve defined. And we’ll use that one primarily for crawling and indexing.
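For the “subtle preference” case described here, the hint is a rel=canonical in the head of the page you would rather not have indexed, pointing at the preferred URL. The URLs below are hypothetical.

```html
<!-- Hypothetical example: a parameterised listing page hinting that the
     clean category URL is the version to index. URLs are made up. -->
<link rel="canonical" href="https://www.example.com/category/shoes/">
```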

Why don’t site:-query result counts match Search Console counts?

Q: (09:35) Why don’t the search results of a site: query, which returns some giant number of results, match what Search Console and the index data have for the same domain?

  • (09:55) Yeah, so this is a question that comes up every now and then. I think we’ve done a separate video on it as well, so I would check that out. I think we’ve talked about this for a long time already. Essentially, what happens there is that there are slightly different optimisations that we do for site queries, in the sense that we just want to give you a number as quickly as possible. And that can be a very rough approximation. And that’s something where, when you do a site query, that’s usually not something that the average user does. So we’ll try to give you a result as quickly as possible. And sometimes, that can be off. If you want a more exact number of the URLs that are actually indexed for your website, I would definitely use Search Console. That’s really the place where we give you the numbers as directly as possible, as clearly as possible. And those numbers will also fluctuate a little bit over time. They can fluctuate depending on the data centre sometimes. They go up and down a little bit as we crawl new things, and we kind of have to figure out which ones we keep, all of those things. But overall, the number in Search Console, in the indexing report I think, is really the number of URLs that we have indexed for your website. I would not use the “about” number in the search results for any diagnostic purposes. It’s really meant as a very, very rough approximation.

What’s the difference between JavaScript and HTTP redirects?

Q: (11:25) OK, now a question about redirects again, about the differences between JavaScript redirects versus 301 HTTP status code redirects, and which one I would suggest for short links.

  • (11:43) So, in general, when it comes to redirects, if there’s a server-side redirect where you can give us a result code as quickly as possible, that is strongly preferred. The reason that it is strongly preferred is just that it can be processed immediately. So any request that goes to your server to one of those URLs, we’ll see that redirect URL. We will see the link to the new location. We can follow that right away. Whereas, if you use JavaScript to generate a redirect, then we first have to render the JavaScript and see what the JavaScript does. And then we’ll see, oh, there’s actually a redirect here. And then we’ll go off and follow that. So if at all possible, I would recommend using a server-side redirect for any kind of redirect that you’re doing on your website. If you can’t do a server-side redirect, then sometimes you have to make do. And a JavaScript redirect will also get processed. It just takes a little bit longer. The meta refresh type redirect is another option that you can use. It also takes a little bit longer because we have to figure that out on the page. But server-side redirects are great. And there are different server-side redirect types. So there’s 301 and 302. And I think, what is it, 306? There’s 307 and 308, something along those lines. Essentially, the differences there are whether or not it’s a permanent redirect or a temporary redirect. A permanent redirect tells us that we should focus on the destination page. A temporary redirect tells us we should focus on the current page that is redirecting and kind of keep going back to that one. And the difference between the 301, 302, and the 307, and I forgot what the other one was, is more of a technical difference with regards to the different request types. So if you enter a URL in your browser, then you do what’s called a GET request for that URL, whereas if you send something to a form or use specific types of API requests, then that can be a POST request. 
And the 301, 302 type redirects would only redirect the normal browser requests and not the forms and the API requests. So if you have an API on your website that uses POST requests, or if you have forms where you suspect someone might be submitting something to a URL that you’re redirecting them, then obviously, you would use the other types. But for the most part, it’s usually 301 or 302.
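The distinctions above can be summarised in a small lookup table: permanent versus temporary, and whether the redirect preserves the request method (307/308) or may be refetched as a plain GET (301/302). The helper name is ours, purely for illustration.

```python
# Rough summary of the redirect status codes discussed above.
# "permanent" tells a crawler to focus on the destination page;
# "preserves_method" means a POST stays a POST when redirected.

REDIRECTS = {
    301: {"permanent": True,  "preserves_method": False},
    302: {"permanent": False, "preserves_method": False},
    307: {"permanent": False, "preserves_method": True},
    308: {"permanent": True,  "preserves_method": True},
}

def pick_redirect(permanent, preserves_method):
    """Return the status code matching the desired semantics."""
    for code, props in REDIRECTS.items():
        if (props["permanent"] == permanent
                and props["preserves_method"] == preserves_method):
            return code
```

So a permanent redirect of normal browser navigation is a 301, while a temporary redirect that must keep POST bodies intact (forms, API calls) is a 307.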

Should I keep old, obsolete content on my site, or remove it?

Q: (14:25) I have a website for games. After a certain time, a game might shut down. Should we delete the non-existing games or keep them in an archive? What’s the best option so that we don’t get any penalty? We want to keep information about the game, like videos, screenshots, et cetera.

  • (14:42) So essentially, this is totally up to you. It’s something where you can remove the content of old things if you want to. You can move them to an archive section. You can make those old pages noindex so that people can still go there when they’re visiting your website. There are lots of different variations there. The main thing you will probably want to do, if you want to keep that content, is move it into an archive section, as you mentioned. The idea behind an archive section is that it tends to be less directly visible within your website. That means it’s easy for users and for us to recognise: this is the primary content, like the current games or current content that you have. And over here is an archive section where you can go in, and you can dig for the old things. And the effect there is that it’s a lot easier for us to focus on your current live content and to recognise that this archive section, which is kind of separated out, is more something that we can go off and index, but it’s not really what you want to be found for. So that’s kind of the main thing I would focus on there. And then whether or not you make the archived content noindex after a certain time, or for other reasons, that’s totally up to you.
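If you do decide to take archived pages out of the index while keeping them reachable for visitors, the usual mechanism is a robots meta tag, for example:

```html
<!-- Hypothetical sketch: keep an archived game page reachable for
     visitors but ask search engines not to index it. -->
<head>
  <meta name="robots" content="noindex">
</head>
```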

Q: (16:02) Is there any strategy by which desired pages can appear as a site link in Google Search results?

  • (16:08) So site links are the additional results that are sometimes shown below a search result, where it’s usually just a one-line link to a different part of the website. And there is no meta tag or structured data that you can use to enforce a site link to be shown. It’s a lot more that our systems try to figure out: what is actually related or relevant for users when they’re looking at this one web page? And for that, our recommendation is essentially to have a good website structure, to have clear internal links so that it’s easy for us to recognise which pages are related to those pages, and to have clear titles that we can use and show as a site link. And with that, there’s no guarantee that any of this will be shown like that. But it helps us to figure out what is related. And if we do think it makes sense to show a site link, then it’ll be a lot easier for us to actually choose one based on that information.

Our site embeds PDFs with iframes, should we OCR the text?

Q: (17:12) More technical one here. Our website uses iframes and a script to embed PDF files onto our pages. Is there any advantage to taking the OCR text of the PDF and pasting it somewhere into the document’s HTML for SEO purposes, or will Google simply parse the PDF contents with the same weight and relevance and index the content?

  • (17:40) Yeah. So I’m just momentarily thrown off because it sounds like you want to take the text of the PDF and just kind of hide it in the HTML for SEO purposes. And that’s something I would definitely not recommend doing. If you want to have the content indexable, then make it visible on the page. So that’s the first thing I would say. With regards to PDFs, we do try to take the text out of the PDFs and index that for the PDFs themselves. From a practical point of view, what happens with a PDF is that, as one of the first steps, we convert it into an HTML page, and we try to index it like an HTML page. So essentially, what you’re doing is framing an indirect HTML page. And when it comes to iframes, we can take that content into account for indexing within the primary page, but it can also happen that we index the PDF separately anyway. So from that point of view, it’s really hard to say exactly what will happen. I would turn the question around and frame it as: what do you want to have happen? If you want your normal web pages to be indexed with the content of the PDF file, then make that content immediately visible on the HTML page. So instead of embedding the PDF as the primary piece of content, make the HTML content the primary piece and link to the PDF file. Then there’s the question of whether you want those PDFs indexed separately or not. Sometimes you do, and if so, linking to them is great. If you don’t want them indexed separately, then using robots.txt to block them from being crawled is also fine. You can also use the noindex X-Robots-Tag HTTP header. That’s a little bit more complicated, because you have to serve it as a header for the PDF files if you want those PDF files available in the iframe but not actually indexed.
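As a hedged sketch of that last option: if your PDFs happen to be served by Apache, an X-Robots-Tag header like the following (a standard, documented pattern; adjust to your own server) keeps the files loadable in iframes while asking Google not to index them separately:

```apache
# Serve a noindex X-Robots-Tag header for all PDFs, so they can still be
# embedded in iframes but won't appear as separate results in Search.
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```

The same header can be set from nginx or from application code; the key point is that, unlike a robots meta tag, it works for non-HTML files such as PDFs.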

Q: (20:02) We want to mask links to external websites to prevent the passing of our link juice. We think the PRG approach is a possible solution. What do you think? Is this solution overkill, or is there a simpler solution out there?

  • (20:17) So the PRG (POST-Redirect-GET) pattern is a complicated way of essentially making a POST request to the server, which then redirects somewhere else to the external content, so Google will never find that link. From my point of view, this is super overkill. There’s absolutely no reason to do this unless there’s really a technical reason that you absolutely need to block the crawling of those URLs. I would either just link to those pages normally or use rel=nofollow on those links. There’s absolutely no reason to go through this weird POST-redirect pattern; it just causes a lot of server overhead, and it makes it really hard to cache that request and take users to the right place. So I would just use a nofollow on those links if you don’t want to have them followed. The other thing is that blocking all of your external links rarely makes any sense in the first place. Instead, I would make sure that you’re taking part in the web as it is, which means that you link to other sites naturally and they link to you naturally, as a normal part of the web, rather than trying to keep Googlebot locked into your specific website. I don’t think that really makes any sense.
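For illustration (the URL and anchor text are placeholders), the simple alternative to the PRG pattern is a one-attribute change on the link itself:

```html
<!-- A normal external link, marked nofollow so it passes no link signals -->
<a href="https://example.com/partner-page" rel="nofollow">Partner page</a>
```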

Does it matter which server platform we use, for SEO?

Q: (21:47) For Google, does it matter if our website is powered by WordPress, WooCommerce, Shopify, or any other service? A lot of marketing agencies suggest using specific platforms because it helps with SEO. Is that true?

  • (22:02) That’s absolutely not true. So there is absolutely nothing in our systems, at least as far as I’m aware, that would give any kind of preferential treatment to any specific platform. And with pretty much all of these platforms, you can structure your pages and structure your website however you want. And with that, we will look at the website as we find it there. We will look at the content that you present, the way the content is presented, and the way things are linked internally. And we will process that like any HTML page. As far as I know, our systems don’t even react to the underlying structure of the back end of your website and do anything special with that. So from that point of view, it might be that certain agencies have a lot of experience with one of these platforms, and they can help you to make really good websites with that platform, which is perfectly legitimate and could be a good reason to say I will go with this platform or not. But it’s not the case that any particular platform has an inherent advantage when it comes to SEO. You can, with pretty much all of these platforms, make reasonable websites. They can all appear well in search as well.

Does Google crawl URLs in structured data markup?

Q: (23:24) Does Google crawl URLs located in structured data markup, or does Google just store the data?

  • (23:31) So, for the most part, when we look at HTML pages, if we see something that looks like a link, we might go off and try that URL out as well. If we find a URL in JavaScript, we can try to pick that up and use it. If we find a link in a text file on a site, we can try to crawl that and use it. But these aren’t really normal links. So I would recommend: if you want Google to go off and crawl that URL, make sure there’s a natural HTML link to that URL, with clear anchor text that gives some information about the destination page. If you don’t want Google to crawl that specific URL, then block it with robots.txt, or use a rel=canonical on that page pointing to your preferred version, anything like that. So those are the directions I would go. I would not blindly assume that just because it’s in structured data it will not be found, nor would I blindly assume that it will be found. It might be found; it might not. I would instead focus on what you want to have happen. If you want it seen as a link, then make it a link. If you don’t want it crawled or indexed, then block crawling or indexing. That’s all totally up to you.
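To make the distinction concrete (the URLs and schema.org properties here are hypothetical examples), a real anchor is what reliably counts as a link, while a URL inside structured data may or may not get picked up:

```html
<!-- If you want Google to treat the URL as a link, give it a real anchor
     with descriptive text: -->
<a href="https://example.com/authors/jane">About the author, Jane</a>

<!-- The same URL may also appear in structured data, but don't rely on it
     being crawled from there: -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "author": { "@type": "Person", "url": "https://example.com/authors/jane" }
}
</script>
```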

Sign up for our Webmaster Hangouts today!

GET IN CONTACT TODAY AND LET OUR TEAM OF ECOMMERCE SPECIALISTS SET YOU ON THE ROAD TO ACHIEVING ELITE DIGITAL EXPERIENCES AND GROWTH 

Are You Ready for Google Analytics 4?

With all the changes in the digital marketing landscape over the past decade, a more sophisticated way to collect and organise user data was much needed. In the fall of 2020, Google introduced updated software called Google Analytics 4 (GA4), a version that has so far worked in parallel with its predecessor, Universal Analytics (UA). However, Google recently announced that UA will sunset on July 1, 2023, and that its premium version, 360 Universal Analytics, will stop processing data in October of next year. It is worth noting that the premium features of 360 Universal Analytics will be rolled into the new iteration of GA4.

Getting used to new software takes time, especially in this case, considering that Google Analytics 4 presents an entirely different interface and configuration to UA. This is most certainly why Google made this announcement in advance: to allow businesses still using UA to migrate and get used to the latest version. It is also worth noting that GA4 doesn’t carry over any historical data you’ve tracked in Universal Analytics, which is another good reason to start migrating to GA4 now, since data continuity and reporting are paramount to your business.

Some of the main tools integrated with Google Analytics 4

Event-based data model

In probably the most significant change to the software, Google Analytics 4 introduces an event-based model that offers flexibility in the way data is collected, while also providing a new set of reports built on that model.

With Universal Analytics, businesses relied on “sessions”, a more fragmented model since it only collected data in limited slots. It also only worked with specific, predefined information categories, making custom data much more challenging to obtain. Now that everything can be an event, there’s a broader opportunity to understand and compare client behaviour through different custom data across various platforms.
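To make the event model concrete, here is a minimal sketch using the standard gtag.js bootstrap; the measurement ID is a placeholder, and the event and parameter names follow Google’s documented ecommerce “purchase” schema:

```javascript
// GA4 collects everything as events with parameters, pushed onto the
// gtag.js dataLayer. This is the standard bootstrap snippet:
var dataLayer = [];
function gtag() { dataLayer.push(arguments); }

gtag('js', new Date());
gtag('config', 'G-XXXXXXXXXX'); // placeholder measurement ID

// A purchase is just another event, with whatever parameters describe it:
gtag('event', 'purchase', {
  transaction_id: 'T-1001',
  value: 59.90,
  currency: 'AUD',
  items: [{ item_id: 'SKU123', item_name: 'Trail Shoe', quantity: 1 }]
});
```

Because every interaction is the same event-plus-parameters shape, custom behaviours can be tracked without being forced into predefined categories the way UA required.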

Operation across platforms

Prior to GA4, businesses required different tools to analyse website and app data separately, which made it difficult to obtain a global picture of their user traffic. GA4 adds a new kind of property that merges app and web data for reporting and analysis.

Thanks to this update, if you have users coming to you through different platforms, you can now use a single set of data to know which marketing channels are acquiring more visitors and conversions.

No need to rely on cookies

As mentioned at the beginning of this article, a lot has changed in the last decade regarding digital marketing; this includes an ever-growing emphasis on user privacy.

Big tech companies, such as Apple, have started to adopt privacy-first policies, which is why Safari began blocking all third-party cookies in 2020. So it comes as no surprise that Google has announced it will do the same in Chrome in 2023.

With GA4, Google is moving away from a cookies-dependent model, no longer needing to store IP addresses for its functionality and looking to be more compliant with today’s international privacy standards.

Audience Triggers

This is a cool feature that lets brands set conditions for a user to move from one audience group to another (for example, when they’ve bought from a specific product category). You can then better personalise the ad experience, offering complementary or similar products across Display, Video and Discovery placements to bring them back to shop with you again.

More Sophisticated insights

GA4 promised a more modernised way of collecting and organising data. Still, the most important thing for businesses is “how” to use this data. Advanced AI learning has been applied in Google Analytics 4 to generate sophisticated predictive insights about user behaviour and conversions, pivotal to improving your marketing.

Integrations

GA4 brings deeper integrations with other Google products, such as Google Ads, allowing you to optimise marketing campaigns by using data to build custom audiences that are more relevant to your marketing objectives, and to use Google Optimize for A/B testing.

In summary, Google Analytics 4 combines features designed to understand client behaviour in more detail than Universal Analytics previously allowed whilst prioritising user privacy. It also brings about a very friendly interface, with some drag-and-drop functionality to help build reports, reminiscent of Adobe Analytics Workspace.

Adobe Analytics Workspace

GA4 Drag and Drop

You can chat with the team at LION Digital and we can help you set up on GA4

We had a good chat with a colleague at our first Shopify Plus Partner meetup who was developing a Shopify Plus site for their client. They noted that the GA4 setup they had to do was quite complex and time-consuming, as event tracking needed to be configured, including eCommerce tracking, and Data Studio reports needed to be rebuilt. It took him a good three hours that he was keen not to repeat. Thankfully, we’ve got a bunch of skilled specialists to help you set up GA4, and we can connect it to our Digital ROI Dashboard to help you get the insights you need and look at your Channel Action Plans.

GET IN CONTACT TODAY AND LET OUR TEAM OF ECOMMERCE SPECIALISTS SET YOU ON THE ROAD TO ACHIEVING ELITE DIGITAL EXPERIENCES AND GROWTH

Contact Us

Article by

Dimas Ibarra –
Digital Marketing Executive

What to Expect When Performance Max Replaces Smart Shopping

ICYMI: Google is rolling out a new campaign type called Performance Max that replaces Smart Shopping campaigns. Smart Shopping has been great from a channel diversification perspective, expanding your real estate from the Shopping tab across Google’s Display Network, YouTube and Gmail without having to set up specific campaigns across these verticals.

Performance Max builds on Smart Shopping by making new inventory available, including Dynamic Search Ads, Discovery Ads and YouTube Instream. Google likes PMAX so much that they’re requiring everyone to migrate to these campaigns by July 1st. This means the time to test and learn is closing fast, and we have seen there is an element of learning on the machine’s side before ROAS returns to normal and starts trending in the right direction.

Similar to Google Analytics 4’s event-based system, PMAX is touted as a goals-based, automated solution targeted at unlocking maximum performance, comprising the following three components:

  1. A single, goal-based campaign focused on achieving the performance objectives, leveraging automation and machine learning.
  2. Path-to-purchase aware, ensuring the right ad is served at the right time, in line with your marketing objectives.
  3. Access to the best inventory across Google properties to reach customers where they are, efficiently and at scale.

Are you ready to move from Smart Shopping to Performance Max?

Performance Max is about to be adopted by all eCommerce spenders, and the window of first-mover advantage and test and learn is closing rapidly. Act now. LION Digital’s Search specialists can support you throughout your transition to PMAX and other adaptive ad technologies.

We recently achieved outstanding results for our client Helly Hansen!
To know more check our case study.

Work with leaders in the eCommerce space to transform your channel strategy.

LION stands for Leaders In Our Niche. We pride ourselves on being true specialists in each eCommerce marketing channel. LION Digital has a team of certified experts, led by a head of department with 10+ years of experience in eCommerce and SEM. We follow an ROI-focused approach in paid search, backed by seamless coordination and detailed reporting, helping our clients meet their goals.

GET IN CONTACT TODAY AND LET OUR TEAM OF ECOMMERCE SPECIALISTS SET YOU ON THE ROAD TO ACHIEVING ELITE DIGITAL EXPERIENCES AND GROWTH

Contact Us

Article by

Leonidas Comino – Founder & CEO

Leo is a Deloitte award-winning and Forbes-published digital business builder with over a decade of success in the industry, working with market-leading brands.

Like what we do? Come work with us

Thought Starters to help your business thrive in an economic downturn

No doubt you’ve read in the news that there are concerns about an economic downturn in Australia and abroad.

COVID tailwinds, the conflict in Ukraine, the US stock market declines, and the increased cost of shipping goods from China have both businesses and consumers concerned for their livelihoods.

We wanted to write a Thought Starter piece (the first of many to come) that summarises consumer and business reactions to past downturns, shifts in consumer behaviour observed over the past two years, and suggested actions you can take to prepare for the road ahead.

Consumer and Business Reactions to downturns

We’ve just come out of a pandemic, which typically produces a rapid V-shaped halt and subsequent snapback, as illustrated by the 2020-2021 consumer confidence swings. But we don’t see this in 2022, which instead shows a gradual decline in confidence as more news reaches consumers and their purchasing behaviour responds accordingly.

Source: ANZ Roy Morgan Consumer Confidence Index 2020-2022

Consumers respond in different ways to economic downturns; they are quick to act when they sense a tightening but less responsive when times are good. During trying times, they may buy fewer consumer durables, become more price-sensitive and seek cheaper alternatives, stockpile savings and shift spend away from status and lifestyle purchases to focus on items of necessity (non-perishables, essential items, healthcare, and apparently, toilet paper).

Businesses respond to downturns by reviewing and cutting discretionary spend – which often includes a round of redundancies and declines in marketing spend. There have been many studies from academics like Peter Field, Byron Sharp and Mark Ritson advising against cutting marketing budgets during a recession: with fewer companies advertising, you can maintain your Share of Voice at a lower cost. Cutting spend has been proven wrong time and again, with studies like the one below from Engagement Labs showing that without positive marketing messages, consumers focus on the negatives they may hear in the news or, worse, forget you completely, wrapped up in competitor narratives.

The other side of this, which many brands are still grappling with from the pandemic, is that when consumers switch to alternatives, they are often slow to switch back when normality returns. Consider how many people now have a coffee machine instead of doing their morning coffee run: great for a coffee supplier, but not so great for the coffee shop owner.

However, there is only so far that prior studies can take us, as there are a number of unique factors that still linger from COVID that need to be considered when making judgements about the upcoming economic environment.

Macro factors of the 2020s

Ad spend reaches record high

First and foremost, the ad industry in Australia has surged to over $22.8 billion, with year-to-date financial ad spend reaching record levels, up 14.2 per cent. Digital, powered by search and social, grew 24.2 per cent (thanks to eCommerce, Government and Election campaigns), Outdoor bookings grew 11.9 per cent and TV ad spend is up 7.4 per cent. Ad execs don’t expect to see a spending decline until December or even into 2023.

Source: IAB, SMI, April 2022

It’s been reported in some industries that CPCs for search terms have risen by as much as 800%; coupled with Google’s rollout of Performance Max next month, this means brands are likely to be paying more to achieve the same results if they’re not careful.

Any eCommerce vendor who hasn’t tested Performance Max yet should look to expedite their migration as a priority, as PMAX campaigns tend to experience a learning curve for the first few weeks before ROI starts to trend in the right direction.

Shifting Consumer Behaviours

The second unique factor in the current environment is the sweeping change in consumer buying behaviour that preceded it. Key takeaways from the AusPost 2022 eCommerce Report include:

  • 81% of households bought online in the last 12 months
  • Average 23% growth in spending on online physical goods
  • Consumers are shopping with more retailers – averaging 15/year vs 9 in 2019
  • Consumers are shopping more frequently than ever before – with almost 60% of Australians shopping online more than once a month (up from 42% in 2019)

As we noted earlier, consumers are slow to return to the norm after switching preferences, and they’ve barely had a chance to do so before economic concerns arose. We can expect more people shopping online and greater price sensitivity, with Shopping, Marketplaces and offers from many vendors influencing consumer buying decisions in the months ahead.

Source: Australia Post: 2022 Inside Australian Online Shopping – Ecommerce Industry Report

Fading mental availability and attention

Attention has been cited as a challenge for many brands targeting Gen Z consumers. However, recent research by Karen Nelson-Field, backed by Peter Field and Orlando Wood, indicates this is a much broader issue, highlighting that left-brain targeted ads, particularly in the field of display, are not resonating with the type of right-hemisphere attention that parses the world and our place in it.

People, not product, they argue, should be front and centre:

That means advertising, by and large, that involves the living [i.e. people not products] doing interesting things in a definable place. Not always, but mostly those are the sorts of things that capture our attention, elicit an emotional response and put things into long-term memory.

Orlando Wood

They warn that left-brain-targeted advertising is eroding the tried and tested ESOV (excess share of voice) principles that have underpinned advertising for the past 30 years.

The trio will present their findings and advice for marketers at the Cannes Lions Festival of Creativity this week.

Source: Mi-3 – Why mental availability, ESOV are fading: Peter Field, Karen Nelson-Field, Orlando Wood warn ad industry faces triple jeopardy threat, effectiveness rulebook ripped up

So what can you do about all this?

Here are a few Thought Starters to help you plan and prepare for the uncertainty ahead.

Find your own alternatives

Price may be a conscious factor in your consumers’ decision to move away from you, so can you find alternatives that reduce your overheads – like sharing or renting warehouse space, trialling new suppliers or looking at drop-shipping solutions – to pass savings on to your customers? This is a short-term solution, but if your consumers are thinking this way, it’s best to be proactive and consider what you can do to keep them.

Talk to your customers

Above all, don’t forget to talk to your customers. They are feeling the same way you are and a bit of reassurance and care can go a long way! You’ll likely find ideas from even the most casual consumer that can help you navigate this environment and retain your customers.

Review Marketing ROI and how this has changed

Take a good look at where you are directing your marketing dollars and the ROI this yields – look beyond ROAS to actual purchase outcomes, order value growth and expected lifetime value of your customers. The Assisted Conversions and Model Comparison tools in Google Analytics are a good way of measuring cross-channel interplay; we’ve seen time and again that consumers first touch and engage through Search and convert through Direct – you can’t neglect the source channel and expect Direct sales to continue growing.

Look at how your cost of acquisition has changed since the pandemic – could you afford similar jumps as competition and CPCs rise?

At what point do you face diminishing returns on your Marginal ROI? 

What are your contingency plans and channels you can shift to if this occurs?

These are critical questions to discuss with your team, and they will help you plan for the future.

What channels have you not considered?

We know Performance is the driving force for acquisition but what other channels are you not playing in that can yield new customers and returns? You would know best what you’re doing and what you’re not, but look to competitors and market leaders for ideas, or look even further afield to related industries for inspiration (I’ve seen some great Health Insurance providers leaning into the Healthcare space to become more involved with consumers with health concerns, and they are front of mind when premiums come up as a result).

Email remains the top-converting channel – do you have segmented audiences in play, with offers going out to your customers to bring them back to the store? Have you considered gamifying this channel with quizzes to better understand the products your customers like, which can in turn drive better segmentation, more compelling offers and more engaging emails? Is your CRM up to the task, or is it time to enrich your audience list, bolstering it with earned activity or by buying third-party prospecting lists? This is particularly effective in the B2B space, so it may be worth taking a page out of their book!

Loyalty – Do you have a loyalty program? How is it performing? How do you know what your customers want from you? Don’t be afraid to ask the questions – consumers love being engaged and you will likely get really valuable insights and excite your loyal customers with the offers to come!

SMS – this might surprise you: with the right offers and a focus on value, SMS can be a powerful channel! It works effectively when paired with email offers, but consider the role it should play in the overall mix, and be cautious with frequency, lest you lose subscribers.

Content – Yes, content is a channel! Going back to Peter Field’s earlier point, performance marketing is only so effective during the decision-making process. Be there with content that helps your prospective customer understand the category, the questions they need to be asking, and what’s really important when comparing like-for-like products; they will love you for it! Need help with ideas? Chat to the LION SEO team: we can research topics using SEO tools and pull insights from your own data to connect you to the questions consumers are asking, and you’ll have everything you need to get started.

Partnerships – You are not alone; many business owners are concerned about the near term. What contacts do you have (suppliers, friends, friends of friends, frenemies) that you can support, and what can they do for you in return? Reciprocity is the key to good partnerships!

Search/Social – Yes, we’ve come back to Search and Social! But the power of these channels cannot be overstated, and it’s all about finding the right levels of investment: investment of time on the organic side (focused on reach), and smart investment on the paid side (focused on revenue and ROI). There is always more you can do with Search, so look closely at what’s performing well and where you have untapped potential. Look to outsmart your competitors rather than outspend them – use your depth of knowledge in the industry to win customers over, and leverage the wealth of data in your campaigns to leapfrog your sleeping competitors.

Consolidate channels – We see lots of business owners and marketing managers who work with specialist agencies for different channels. While there is merit to the argument that silos are great for specialism, managing many vendors, switching hats each time and aligning the agencies yourself stretches you too thin, not to mention the higher costs and the holistic, cross-channel efficiencies you miss out on by working with disparate teams.

Pick a niche player – Another challenge we see eCommerce businesses dealing with is working with generalised agencies that work across local, service and eCommerce clients. You know what they say, Jack of all trades, master of none.

LION stands for Leaders In Our Niche. We pride ourselves on being among the top experts in each eCommerce Marketing Channel!

Thanks for reading! If any of what you’ve read resonates, call me for a chat.

Article by

Leonidas Comino – Founder & CEO

Leo is a Deloitte award-winning and Forbes-published digital business builder with over a decade of success in the industry, working with market-leading brands.

Like what we do? Come work with us

WebMaster Hangout – Live from May 06, 2022

Can I use Web Components for SEO?

Q. (03:04) Is there any problem with using web components for SEO, or is it OK to use them for my website?

  • (03:15) So web components are essentially a way of using predefined components within a website. Usually, they’re integrated with a kind of JavaScript framework, and the general idea is that, as a web designer or developer, you work with these existing components and don’t have to reinvent the wheel every time something specific is supposed to happen. For example, if you need a calendar on one of your pages, you can just use a web component for a calendar, and then you’re done. When it comes to SEO, these web components are implemented using various forms of JavaScript, and we can process pretty much most JavaScript when it comes to Google Search. And while I would like to just say blindly that everything will be supported, you can test this, and you should. The best way to test it is the Inspect URL tool in Google Search Console. There, you can insert your URL and see what Google will render for that page: first on a screenshot, essentially, and then in the rendered HTML that you can look at as well. You can double-check what Google is able to pick up from your web components. If you think the important information is there, then you’re probably all set. If some of the important information is missing, then you can drill down and try to figure out where it’s getting stuck. We have a lot of documentation on JavaScript websites and web search nowadays, so I would double-check that. Also, Martin Splitt on my team has written a lot of these documents. He’s findable on Twitter, and he sometimes does office hours like this, where he goes through some of the JavaScript SEO questions. So if you’re a web developer working on web components and you’re trying to make sure that things are being done well, I would check that as well.
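As a rough illustration (the element name and its content here are made up), a web component that renders into the light DOM leaves its text in the rendered HTML that the URL Inspection tool shows, which is exactly what you would be checking for:

```html
<!-- Hypothetical component: content rendered into the element itself
     (light DOM) stays visible in the rendered HTML Googlebot sees. -->
<my-calendar></my-calendar>
<script>
  class MyCalendar extends HTMLElement {
    connectedCallback() {
      this.innerHTML = '<h2>Upcoming events</h2><p>No events this week.</p>';
    }
  }
  customElements.define('my-calendar', MyCalendar);
</script>
```

If the component instead rendered everything into a closed shadow root or loaded its content only after a user interaction, that text might not appear in the rendered HTML, which is the kind of gap the Inspect URL tool helps you spot.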

Is it ok to use FAQ schema across different parts of a page?

Q. (05:18) Is it OK to use the FAQ schema to markup questions and answers that appear in different sections of a blog post that aren’t formatted as a traditional FAQ list? For example, a post may have ten headings for different sections. A few of those are questions with answers.

  • (05:35) So I double-checked the official documentation. That’s where I would recommend you go for these kinds of questions as well. And it looks like it’s fine. The important part when it comes to FAQ snippets and structured data, in general, is that the content should be visible on the page. So it should really be the case that both the question and the answer are visible when someone visits that page, not that it’s kind of hidden away in a section of a page. But if the questions and the answers are visible on the page, even if they’re in different places on the page, that’s perfectly fine. The other thing to keep in mind is that, like all structured data, FAQ snippets are not guaranteed to be shown in the search results. Essentially, you make your pages eligible to have these FAQ snippets shown, but it doesn’t guarantee that they will be shown. So you can use the testing tool to make sure that everything is implemented properly. And if the testing tool says that’s OK, then probably you’re on the right track. But you will probably still have to kind of wait and see how Google actually interprets your pages and processes them to see what is actually shown in the search results. And for structured data, I think it’s the case for FAQs, but at least for some of the other types of schema, there are specific reports in Search Console as well that give you information on the structured data that was found and the structured data that was actually shown in the search results, so that you can roughly gauge, is it working the way that you want it to, or is it not working the way that you want it to? And for things like this, I would recommend trying them out and making a test page on your website, kind of seeing how things end up in the search results, double-checking if it’s really what you want to do, and then going off to actually implement it across the rest of your website.
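As a sketch of the markup itself (the question and answer below are hypothetical, and the same text must also be visible somewhere on the page), an FAQPage block can collect question/answer pairs even when they appear in different sections:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "How long does delivery take?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Most orders arrive within 3 to 5 business days."
    }
  }]
}
</script>
```

You can paste a test page using this markup into the Rich Results Test to confirm it validates before rolling it out site-wide.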

Our site is not user-friendly with JavaScript turned off. Is it a problem?

Q. (10:23) Our website is not very user-friendly if JavaScript is turned off. Most of the images are not loaded, and our flyout menu can’t be opened. However, the Chrome ‘Inspect’ feature shows that all menu links are there in the source code. Might our dependence on JavaScript still be a problem for Googlebot?

  • (10:45) From my point of view, like with the first question that we had on web components, I would test it. So probably everything will be OK. And probably, I would assume if you’re using JavaScript reasonably, if you’re not doing anything special to block the JavaScript on your pages, it will probably just work. But you’re much better off not just believing me but rather using a testing tool to try it out. And the testing tools that we have available are quite well-documented. There are lots of variations on things that we recommend with regards to improving things if you run into problems. So I would double-check our guides on JavaScript and SEO and think about maybe, I don’t know, trying things out, making sure that they actually work the way that you want, and then taking that to improve your website overall. And you mentioned user-friendly with regards to JavaScript. So from our point of view, the guidance that we have is essentially very technical, in the sense that we need to make sure that Googlebot can see the content from a technical point of view and that it can see the links on your pages from a technical point of view. It doesn’t primarily care about user-friendliness. But, of course, your users care about user-friendliness. And that’s something where maybe it makes sense to do a little bit more so that your users are really for sure having a good experience on your pages. And this is often something that isn’t just a matter of a simple testing tool, but rather something where maybe you have to do a small user study, or kind of interview some users, or at least do a survey on your website to understand where do they get stuck? What kind of problems are they facing? Is it because of these, I don’t know. You mentioned the fly-out menus. 
Or is it something maybe completely different where they see problems: maybe the text is too small, or they can’t click the buttons properly, those kinds of things which don’t really align with technical problems but are more user-side things. If you can improve those, and if you can make your users happier, they’ll stick around, they’ll come back, and they’ll invite more people to visit your website as well.
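One way to do the "view source" check the question describes is to parse the raw HTML, before any JavaScript runs, and list the anchor links it contains. A minimal sketch using Python's standard-library parser and a made-up navigation snippet (the paths are hypothetical):

```python
from html.parser import HTMLParser

# Stand-in for the raw page source you would fetch or copy from
# "view source"; in practice, use your own page's unrendered HTML.
raw_html = """
<nav>
  <a href="/shoes">Shoes</a>
  <a href="/bags">Bags</a>
</nav>
<div id="flyout-menu"></div>
"""

class LinkCollector(HTMLParser):
    """Collects href values from <a> tags in static HTML."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

collector = LinkCollector()
collector.feed(raw_html)
print(collector.links)  # the links present without rendering JavaScript
```

This only checks the unrendered source; the testing tools mentioned above show what Googlebot sees after rendering, which is the more authoritative check.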

We use some static HTML pages and some WordPress pages, does that matter?

Q. (13:07) Our static page is built with HTML, and our blog is built with WordPress. The majority of our blog posts are experiencing indexing issues in Google. How do I fix this?

  • (13:21) So I think, first of all, it’s important to know that these are essentially just different platforms. And essentially, with all of these platforms, you’re creating HTML pages. And the background or the backend side of your website that ends up creating these HTML pages that’s something that Googlebot doesn’t really look at. Or at least, that’s something that Googlebot doesn’t really try to evaluate. So if your pages are written in HTML, and you write them in an editor, and you load them on your server, and they serve like that, we can see that they’re HTML pages. If they’re created on the fly on your server based on a database in WordPress or some other kind of platform that you’re using, and then it creates HTML pages, we see those final HTML pages, and we essentially work with those. So if you’re seeing kind of issues with regards to your website overall when it comes to things like crawling, indexing, or ranking, and you can kind of exclude the technical elements there, and  Googlebot is able to actually see the content, then usually what remains is kind of the quality side of things. And that’s something that definitely doesn’t rely on the infrastructure that you use to create these pages, but more it’s about the content that you’re providing there and the overall experience that you’re providing on the website. So if you’re seeing something that, for example, your blog posts are not being picked up by Google or not ranking well at Google, and your static HTML pages are doing fine on Google, then it’s not because they’re static HTML pages that they’re doing well on Google, but rather because Google thinks that these are actually good pieces of content that it should recommend to other users. And on that level, that’s where I would take a look, and not focus so much on the infrastructure, but really focus on the actual content that you’re providing. And when it comes to content, it’s not just the text that’s like the primary part of the page. 
It’s like everything around the whole website that comes into play. So that’s something where I would really try to take a step back and look at the bigger picture. And if you don’t see kind of from a bigger picture point of view where maybe some quality issues might lie or where you could improve things, I would strongly recommend doing a user study. And for that, maybe invite, I don’t know, a handful of people who aren’t directly associated with your website and have some tasks on your website, and then ask them really tough questions about where they think maybe there are problems on this website, or if they would trust this website or any kind of other question around understanding the quality of the website. And we have a bunch of these questions in some of our blog posts that you can also use for inspiration. It’s not so much that I would say you have to ask the questions that we ask in our blog post. But sometimes, having some inspiration for these kinds of things is useful. In particular, we have a fairly old blog post on one of the early quality updates, and we have a newer blog post; maybe I guess it’s like two years old now. It’s not a super new blog post on core updates. And both of these have a bunch of questions on them that you could ask yourself about your website. But especially if you have a group of users that are willing to give you some input, then that’s something that you could ask them, and really take their answers to heart and think about ways that you can improve your website overall.

I have some pages with rel=canonical tags, some without. Why are they both shown in the search?

Q. (17:10) I have set canonical URLs on five pages, but Google is showing other pages as well. Why is it not only showing the URLs on which I’ve set a canonical?

  • (17:30) So I’m not 100% sure I understand this question correctly. But kind of paraphrasing, it sounds like on five pages of your website, you set a rel=canonical. And there are other pages on your website where you haven’t set a rel=canonical. And Google is showing all of these pages kind of indexed essentially in various ways. And I think the thing to keep in mind is the rel=canonical is a way of specifying which of the pages within a set of duplicate pages you want to have indexed. Or essentially, which address do you want to have used. So, in particular, if you have one page, maybe with the file name in uppercase, and one page with the file name in lowercase, then in some situations, your server might show exactly the same content; technically, they are different addresses. Uppercase and lowercase are slightly different. But from a practical point of view, your server is showing exactly the same thing. And Google, when it looks at that, says, well, it’s not worthwhile to index two addresses with the same content. Instead, I will pick one of these addresses and use it kind of to index that piece of content. And with the rel=canonical, you give Google a signal and tell it, hey, Google, I really want you to use maybe the lowercase version of the address when you’re indexing this content. You might have seen the uppercase version, but I really want you to use the lowercase version. And that’s essentially what the rel=canonical does. It’s not a guarantee that we would use the version that you specify there, but it’s a signal for us. It helps us to figure out all things else being kind of equal; you really prefer this address, so we will try to use that address. And that’s kind of the preference part that comes into play here. And it comes into play when we’ve recognised there are multiple copies of the same piece of content on your website. And for everything else, we will just try to index it to the best of our abilities. 
And that also means that for the pages where you have a rel=canonical on it, sometimes it will follow that advice that you give us. Sometimes our systems might say, well, actually, I think maybe you have it wrong. You should have used the other address as the primary version. That can happen. It doesn’t mean it will rank differently, or it will be worse off in search. It’s just, well, Google systems are choosing a different one. And for other pages on your website, you might not have a rel=canonical set at all. And for those, we will just try to pick one ourselves. And that’s also perfectly fine. And in all of these cases, the ranking will be fine. The indexing will be fine. It’s really just the address that is shown in the search results that varies. So if you have the canonical set on some pages but not on others, we will still try to index those pages and find the right address to use for those pages when we show them in search. So it’s a good practice to have the rel=canonical on your pages because you’re trying to take control over this vague possibility that maybe a different address will show. But it’s not an absolute necessity to have a rel=canonical tag.
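The preference signal described above is a single link element in the head of the page. As a sketch with a hypothetical address, telling Google you prefer the lowercase version:

```html
<!-- Placed in the <head> of both the uppercase and lowercase variants,
     signalling (not guaranteeing) the preferred address to index -->
<link rel="canonical" href="https://example.com/shoes/summer-sale">
```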

What can we do about spammy backlinks that we don’t like?

Q. (20:56) What can we do if we have thousands of spammy links that are continuously placed as backlinks on malicious domains? They contain spammy keywords and cause 404s on our domain. We see a strong correlation between these spammy links and a penalty that we got after a spam update in 2021. We disavowed all the spammy links, and we reported the domains listed as the source of the spam links. What else can we do?

  • (21:25) Yeah. I think this is always super frustrating as a site owner when you look at it and you’re like, someone else is ruining my chances in the search results. But there are two things I think that are important to mention in this particular case. On the one hand, if these links are pointing at pages on your website that are returning 404, so they’re essentially linking to pages that don’t exist, then we don’t take those links into account because there’s nothing to associate them with on your website. Essentially, people are linking to a missing location. And then we’ll say, well, what can we do with this link? We can’t connect it to anything. So we will drop it. So that’s kind of the first part. Like a lot of those are probably already dropped. The second part is you mentioned you disavowed those spammy backlinks. And especially if you mention that these are like from a handful of domains, then you can do that with the domain entry in the disavow backlinks tool. And that essentially takes them out of our system as well. So we will still list them in Search Console, and you might still find them there and kind of be a bit confused about that. But essentially, they don’t have any effect at all. If they’re being disavowed, then we tell our systems that these should be taken into account neither in a positive nor a negative way. So from a practical point of view, both from the 404 sides and from the disavow, probably those links are not doing anything negative to your website. And if you’re seeing kind of significant changes with regards to your website in Search, I would not focus on those links, but rather kind of look further. And that could be within your own website kind of to understand a little bit better what is actually the value that you’re providing there. What can you do to really stand up above all of the other websites with regards to kind of the awesome value that you’re providing users? 
How can you make that as clear as possible to search engines? That’s kind of the direction I would take there. So not lose too much time on those spammy backlinks. You can just disavow the whole domain that they’re coming from and then move on. There’s absolutely nothing that you need to do there. And especially if they’re already linking to 404 pages, they’re already kind of ignored.
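For reference, the disavow file mentioned above is a plain-text upload in Search Console's disavow links tool. A sketch with hypothetical domains (the `domain:` form covers every link from that host):

```text
# Comment lines start with "#".
# Disavow all links from these hosts:
domain:spammy-links-example.com
domain:malicious-example.net
# Or disavow a single linking page:
https://another-example.org/spam-page.html
```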

What’s the stand on App Indexing?

Q. (26:51) What’s the latest stand on app indexing? Is the project shut down? How do you get your app ranked on Google if app indexing is no longer working?

  • (26:58) So app indexing was a very long time ago, I think a part of Search Console and some of the things that we talked about, where Google will be able to crawl and index parts of an app as it would appear, and try to show that in the search results. And I think that migrated a long time ago over to Firebase app indexing. And I double-checked this morning when I saw this question, and Firebase has also migrated to yet another thing. But it has a bunch of links there for kind of follow-up things that you can look at with regards to that. So I would double-check the official documentation there and not kind of listen to me talk about app indexing as much because I don’t really know the details around Android app indexing. The one thing that you can do with regards to any kind of an app, be it a mobile phone, smartphone app like these things, or a desktop app that you install, you can absolutely make a homepage for it. And that’s something that can be shown in Search like anything else. And for a lot of the kinds of smartphone apps, there will also be a page on the Play Store or the App Store. I don’t know what they’re all called. But usually, they’re like landing pages that also exist, which are normal web pages which can also appear in Search. And these things can appear in search when people search for something around your app. Your own website can appear in search when people are searching for something around the app. And especially when it comes to your own website, you can do all of the things that we talk about when it comes to SEO for your own website. So I would not like to say, oh, app indexing is no longer the same as it was 10 years ago. Therefore, I’m losing out. But rather, you have so many opportunities in different ways to be visible in Search. You don’t need to rely on just one particular aspect.

Our CDN blocks Translate. Is that bad for SEO?

Q. (29:08) The bot crawling is causing a real problem on our site. So we have our CDN block unwanted bots. However, this also blocks the Translate This Page feature. So my questions are, one, is it bad for Google SEO if the Translate This Page feature doesn’t work? Does it also mean that Googlebot is blocked? And second, is there a way to get rid of the ‘Translate This Page’ link for all of our users?

  • (29:37) So I think there are different ways or different infrastructures on our side to access your pages. And there’s, on the one hand, Googlebot and the associated infrastructure. And I believe the translate systems are slightly different because they don’t go through robots.txt, but rather they look at the page directly kind of thing. And because of that, it can be that these are blocked in different ways. So, in particular, Googlebot is something you can block on an IP level using a reverse DNS lookup, or you can allow it on an IP level. And the other kinds of elements are slightly different. And if you want to block everything or every bot other than Googlebot, or other than official search engine bots, that’s totally up to you. When it comes to SEO, Google just needs to be able to crawl with Googlebot. And you can test that in Search Console to see does Googlebot have access? And through Search Console, you can get that confirmation that it’s working OK. How it works for the Translate This Page backend systems, I don’t know. But it’s not critical for SEO. And the last question, how can you block that Translate This Page link? There is a “no translate” meta tag that you can use on your pages that essentially tells Chrome and the systems around translation that this page does not need to be translated or shouldn’t be offered as a translation. And with that, I believe you can block the Translate This Page link in the search results as well. And the “no translate” meta tag is documented in our search developer’s documentation. So I would double-check that.
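The “no translate” meta tag referred to above is a one-line addition to the page head; this is the form described in Google's documentation (double-check the current Search developer docs, as suggested above):

```html
<!-- Asks Chrome and Google's translation systems not to offer
     translation of this page -->
<meta name="google" content="notranslate">
```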
