Ask a News SEO: Glenn Gabe
Glenn unpacks Google’s broad core updates, explains why sites experience volatility and how publishers can future-proof their SEO
#SPONSORED
NewzDash wins the Best Global SEO Software Suite at Global Search Awards
🚀 The ONLY SEO tool that provides: 1) LIVE Search Visibility 2) Real-Time Search Volume 3) Unlimited Trends Monitoring & Rankings 4) Historic Performance & KPIs 5) News SEO Recommendations 6) Global Competitive Analysis 7) Publish-to-Indexing Tracking and much more!
Hello, and welcome back. Jessie and Shelby here, both back from working Super Bowl weekend. We optimized, we snacked, we watched Kendrick Lamar dunk on one of Canada’s biggest exports (after maple syrup). Before that, Jessie almost (almost!) finished sewing her scrap sweater while Shelby found true happiness watching her corgi puppy get lost in the snow. We really don’t deserve dogs.
This week: Ask a News SEO interview with Glenn Gabe on Google updates! Glenn unpacks Google’s broad core updates, why sites experience volatility and how publishers can future-proof their SEO with the “kitchen sink” approach.
And stay tuned — we loved chatting with Glenn so much, we’ll be back for part two!
THE INTERVIEW
WTF is SEO?: When Google announces a core update, what are the first metrics or trends that you analyze?
Glenn Gabe: I take a bit of a unique position because I’m looking at a whole slew of different sites and tracking across various industries. It’s really important to understand that when these updates roll out, Google could potentially be updating multiple systems along the way. That’s why you get these multi-week rollouts, like the March 2024 core update, which was a massive rollout.
Google told us there would be tons of volatility. I was tracking that throughout the update, watching as different systems were being updated. And sometimes they can counterbalance each other for certain sites.
It’s really important not to take any significant action until a core update is fully rolled out, but you definitely want to start checking visibility. GSC (Google Search Console) data is the gold standard, but visibility reporting — like what Sistrix, Semrush and Ahrefs provide — is so important because they’re looking at rankings and not necessarily traffic, which might be affected by seasonality or other factors.
GSC data and visibility trending will enable you to see if you're doing well during the update or not.
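For anyone who wants to pull that GSC trending programmatically, here's a minimal sketch using the Search Console API (the searchconsole v1 service). The service account file, the property and the date window are placeholders you'd swap for your own:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# All names below are placeholders: swap in your own key file and property.
SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=creds)

# Daily clicks and impressions spanning the weeks before, during and after
# the announced rollout window.
response = service.searchanalytics().query(
    siteUrl="sc-domain:example.com",
    body={
        "startDate": "2024-02-19",
        "endDate": "2024-05-03",
        "dimensions": ["date"],
        "rowLimit": 25000,
    },
).execute()

for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"])
```

Charting daily clicks and impressions against the announced rollout window makes it easier to tell update-related movement apart from seasonality.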
We’ve seen it with almost every broad core update: some sites start surging, and by the end, they’re going back down, sometimes ending up worse than where they started. That situation is terrible for site owners.
But it’s something I’ve seen time and again — different systems can counterbalance each other.
That’s why it’s so important to wait until the update is fully rolled out. Then, you can take a proper look at everything: what dropped, did you win or lose, which types of content were impacted and so on.
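Once Google confirms the rollout is complete, a quick before/during/after comparison by content section is one way to see what dropped and what held up. A rough sketch, assuming a page-level daily export (the file name, dates and URL pattern are illustrative):

```python
import pandas as pd

# Hypothetical page-level export: one row per page per day, pulled from the
# Search Analytics API with the "page" and "date" dimensions.
df = pd.read_csv("gsc_page_daily.csv", parse_dates=["date"])

ROLLOUT_START = pd.Timestamp("2024-03-05")
ROLLOUT_END = pd.Timestamp("2024-04-19")  # March 2024 core update window

df["period"] = "during"
df.loc[df["date"] < ROLLOUT_START, "period"] = "before"
df.loc[df["date"] > ROLLOUT_END, "period"] = "after"

# Bucket pages into rough content sections by their first path segment.
df["section"] = (
    df["page"].str.extract(r"https?://[^/]+/([^/]+)/")[0].fillna("homepage")
)

# Average daily clicks per section in each period, so the before/during/after
# windows stay comparable even though they cover different numbers of days.
days_per_period = df.groupby("period")["date"].nunique()
clicks = df.pivot_table(index="section", columns="period",
                        values="clicks", aggfunc="sum")
daily = clicks.div(days_per_period, axis=1)

daily["delta"] = daily["after"] - daily["before"]
print(daily.sort_values("delta"))
```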
Google’s core ranking system is super sophisticated. There are many systems that make up Google’s core ranking system, and Google itself can’t even tell us how many systems there are, which is fascinating.
With the March 2024 core update, Barry Schwartz interviewed Elizabeth Tucker, and she basically said, “We can't really tell you how many systems there are.” For example, Core Web Vitals is technically a ranking factor, but it’s tiny, so that might be a smaller system.
Then you have authority — basically mentions, citations and links — which are obviously very powerful.
Let’s say several smaller systems are updated, and your site starts to surge — great. But then, another system gets updated later in the rollout, and it’s more powerful and drags your site back down.
That’s exactly why the March 2024 core update was unique — it had a really long rollout, where multiple systems were being updated over time. It's super important to understand that all of these systems are at play as they get updated, and they can counterbalance each other.
“Counterbalance” is something Google has also explained. This is unfortunately how things work until we get to the point where broad core updates no longer roll out, and things are continually updating.
I believe that's where we are headed. Google hinted in December that it wants core updates to happen more often. Danny Sullivan said Google wants to get to a point where these are just happening and it doesn't have to announce them.
WTF is SEO?: If that's the future of core updates, how should publishers prepare?
Glenn Gabe: I'm a firm believer that a site owner should not wait until a broad core update impacts their site due to quality. Site owners should continually audit their sites through the lens of broad core updates. Google's systems are evaluating a site over time; you can catch issues and nip them in the bud.
I think it’s about really evaluating the site through the lens of quality. It's about Google's view of the site's overall quality and relevance. If you look at it through that lens continually, you probably won't get into a situation where you take a massive hit from a core update due to quality.
Sites can always drop due to relevancy adjustments or intent shifts, but it's important to maintain a high level of quality over time.
The kitchen sink approach to remediation is something I've been talking about for a really long time. The reason is simple: it works really well.
Back in medieval Panda days (circa 2011), I audited sites that were impacted by Panda, and the site owners would say, “Great, thanks. We’re going to implement 15 per cent of those changes.” And I’d be like, “Wait, you’re missing 85 per cent of the changes.”
What I saw over time was that sites that made significant changes recovered — and remained strong.
The sites that were cherry-picking changes either wouldn’t recover, or they would recover and then drop back down. That’s when I started telling clients, “You really need to be serious about making a lot of these changes and improving the site.”
Some changes are not easy. When it comes to over-monetization in publishing, obviously, advertising is a big part of it. But sometimes it goes way overboard, and it impacts the user experience, which Google is looking at, especially through Navboost. Suddenly, the user experience is so bad that Google’s systems decide they are not going to rank this as highly.
With the kitchen sink approach, it’s about identifying all potential quality problems and fixing as many as you can. It’s something I’ve seen work extremely well over time.
WTF is SEO?: Sometimes publishers can't get to every fix. What are the biggest movers?
Glenn Gabe: Highly visible, low-quality content is where you should start, because that's where Google is gaining the majority of its Navboost data.
Most of your visitors are arriving through those queries and top landing pages. If you analyze things and find potential quality problems on those specific pages, fix those first. As more traffic comes through, you don’t want negative user interaction signals that could potentially throw things off.
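In practice, that prioritization can be as simple as joining a GSC page export with the flags from a content audit and sorting by visibility. A rough sketch, with made-up file and column names:

```python
import pandas as pd

# Hypothetical inputs: a GSC page-level export and a spreadsheet of quality
# flags from a manual content audit (thin, outdated, over-monetized, etc.).
pages = pd.read_csv("gsc_pages_last_90_days.csv")  # columns: page, clicks, impressions
audit = pd.read_csv("content_audit_flags.csv")     # columns: page, quality_flags

merged = pages.merge(audit, on="page", how="left")
merged["quality_flags"] = merged["quality_flags"].fillna("")

# Pages that attract the most impressions AND carry quality flags are the ones
# feeding Google the most (potentially negative) user signals, so they go to
# the top of the remediation queue.
flagged = merged[merged["quality_flags"] != ""]
priority = flagged.sort_values("impressions", ascending=False)

print(priority[["page", "impressions", "clicks", "quality_flags"]].head(20))
```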
WTF is SEO?: Let’s talk about classifiers and precision with Google’s updates.
Glenn Gabe: I recently listened to Mark Zuckerberg on the Joe Rogan podcast because I wanted to hear what Mark had to say about the future of AI and augmented/virtual reality. Then, I came across a section that really struck me — not from a social standpoint, but from an SEO standpoint.
Mark was talking about creating classifiers. For Meta, it was about catching dangerous content, but for Google or SEO, we could think of unhelpful, low-quality or AI-generated content. When building systems to catch that, you have to create a classifier.
What Mark explained was that the precision of the classifier is super important. If you want to catch a high percentage of unhelpful content, and set the precision of that classifier at 99 per cent, that’s great. There won’t be much collateral damage, but you’re going to miss a whole bunch of stuff that should be caught. If you drop that precision to maybe 85 or 90 per cent, you’re going to catch a lot more of the bad content, but that also means that some of what you’re catching is collateral damage.
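To make the tradeoff concrete, here's a toy example (not Google's actual systems, just synthetic scores for a hypothetical "unhelpful content" classifier). Raise the decision threshold and precision climbs while recall falls; lower it and you catch far more bad content, plus some innocent pages as collateral damage:

```python
import numpy as np
from sklearn.metrics import precision_score, recall_score

rng = np.random.default_rng(0)

# Synthetic "unhelpfulness" scores: 1,000 genuinely low-quality pages and
# 9,000 fine pages, with overlapping score distributions.
y_true = np.concatenate([np.ones(1_000), np.zeros(9_000)])
scores = np.concatenate([
    rng.normal(0.70, 0.15, 1_000),  # low-quality pages tend to score high
    rng.normal(0.40, 0.15, 9_000),  # good pages mostly score lower
])

for threshold in (0.90, 0.75, 0.60):
    y_pred = (scores >= threshold).astype(int)
    print(
        f"threshold={threshold:.2f}  "
        f"precision={precision_score(y_true, y_pred, zero_division=0):.2f}  "
        f"recall={recall_score(y_true, y_pred):.2f}"
    )
```

In that toy run, the strict threshold flags almost nothing incorrectly but misses most of the bad pages, while the loose threshold catches most of them at the cost of sweeping up good pages too.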
That’s not necessarily brand new information, but when you think about it in terms of SEO — especially in the context of the Helpful Content Update — it’s really important.
That update was botched and included a classifier that ended up catching a bunch of sites that Google later admitted shouldn’t have been caught. That’s really important to understand.
I have site owners reach out to me all the time, saying that after one update they’re surging, and then with the next update, they plummet. It looks crazy and I call that yo-yo trending.
And one reason this could be happening is Google could be adjusting the precision levels for specific classifiers — not just one, but several. And boom, you surge. Or maybe the engineers take a look and decide they need to adjust the classifier’s precision down a little bit. Boom, now you tank.
This is why, in my opinion, site owners should work to get out of the grey area of Google’s algorithms. Significantly improve quality over the long term.
Also, after broad core updates I often hear people say, "That site surged and didn’t do anything to improve!" And I’m like, it could just be a classifier where Google adjusted the precision level. That doesn’t mean they’ll stay there — maybe they’ll drop with the next update when Google recalibrates that classifier and its precision level.
The interview and the topic were just really interesting. This whole idea — and my blog post — came from Zuckerberg talking about social media, but it also applies directly to SEO. Go figure.
#SPONSORED - The Classifieds
Get your company in front of almost 13,000 writers, editors and digital marketers working in news and publishing. Sponsor the WTF is SEO? newsletter!
RECOMMENDED READING
Google news and updates
🤖 Barry Schwartz: Google added the term "generative AI" to the quality raters guidelines and better outlined where AI is and isn’t acceptable.
🤖 Roger Montti: John Mueller explained what to do if Google indexes duplicate URLs with query parameters.
🤖 Abner Li: Google is internally testing an “AI Mode” that takes “exploratory” questions and returns AI Overview-style, AI-generated responses.
🤖 Barry Schwartz: Google says focus on originality to win in 2025.
Even more recommended reading
🔑 Kevin Indig: The keyword is dead. Here’s why aggregate organic traffic is a better metric.
📈 Sistrix: Here are the SEO losers on search in the US in 2024.
😠 Kyle Orland: Cursing disables Google’s AI overviews.
👥 Amanda Natividad: A guide to audience research in 2025.
🐁 Tracy McDonald: How AI Overviews impact search click-through rates.
🕵️ Brenna Kelly: Investigating ChatGPT Search — insights from 80 million clickstream records.
💻 Peter Richman: How to decide between one domain or many for SEO.
🔗 Tyler Einberger: Best practices for using @id in schema.org markup for SEO, LLMs and Knowledge Graphs.
📹 Ana de la Cruz: YouTube content strategy — making sense of the SEO impact and cross-platform brand awareness.
Catch up: Last week’s newsletter
Have something you’d like us to discuss? Send us a note on Twitter (Jessie or Shelby) or to our email: seoforjournalism@gmail.com.
Written by Jessie Willms and Shelby Blackley