Ask a News SEO: Christine Liang
Christine Liang, Senior Director of SEO for The New York Times, discusses the technical SEO concepts everyone should know, innovation at scale and how to plan for algorithm updates.
Hello, and welcome back. Jessie here, back and hyped by the thrill of seeing shows again! An evening of live music: a joy, a gift and the exclusive reason my spine is screaming this morning. (The War on Drugs: totally worth it.)
This week: Ask a News SEO is back. In June, Shelby and I were so thrilled to meet up with Christine Liang, Senior Director of SEO for The New York Times, for our first-ever IRL news SEO interview. We chatted about the top technical concepts every SEO should know, innovation at scale and planning – or not planning – for algorithm updates. (An extra special thanks to Christine for interrupting her parental leave to reply to our follow-up questions!)
Next week is somehow the last real weekend of summer (August: it really did slip away, like a bottle of wine) – and a long weekend in Canada. As such, there will be no new WTF is SEO.
Catch up on our previous news SEO interviews: Barry Adams, Lindsey Wiebe, Dan Smullen, Bryan Flaherty, Jake Banas and Claudio Cabrera. We’ll be back on September 5.
This interview has been condensed and edited for clarity.
What are the three technical concepts news SEOs should know?
When it comes to technical SEO, the three concepts that matter most are crawling (including indexing and ranking), internal linking and structured data. We obsess over and worry about these things consistently. And we know that, after doing product work, they’re what drive the most impact.
Crawling, indexing and ranking: There’s a misconception that ranking happens immediately after a piece is published. That is not the case. Instead, a piece of content starts ranking after it’s been crawled by spiders. So you need to make sure spiders can get to your content and don’t run into any snags along the way. There’s a lot of behind-the-scenes work that needs to happen to help a piece rank well.
The ranking component comes after the crawling. To get there, the SEO editorial team needs to understand search intent, include the right target keyword in the headline and make body content recommendations. Good ranking is accelerated by having sound crawling and indexing.
Internal linking: Second is internal links. Links create pathways for spiders to find your content. So it’s key to have links pointing to the most important content. That could mean featuring the link on the homepage, on section fronts, or in highly relevant, popular areas of the site where you can make the linked content more discoverable.
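The idea of links as pathways can be sketched as a simple reachability check over a site's internal-link graph. This is a toy illustration, not anything The Times uses: the URLs and the graph are hypothetical, and a real crawler would discover links by fetching pages rather than reading a dict.

```python
from collections import deque

# Hypothetical internal-link graph: page -> pages it links to.
links = {
    "/": ["/world", "/live/election"],
    "/world": ["/world/story-a"],
    "/live/election": [],
    "/world/story-a": [],
    "/orphan-story": [],  # published, but no page links to it
}

def reachable_from(start, graph):
    """Pages a crawler can discover by following links from `start` (BFS)."""
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in graph.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Anything not reachable from the homepage is an "orphan" a link-following
# spider will never find, no matter how good the content is.
orphans = set(links) - reachable_from("/", links)
# orphans == {"/orphan-story"}
```

Featuring a link on the homepage or a section front is, in graph terms, adding an edge from a highly crawled node so the content stops being an orphan.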
The thing is, Google doesn't have all the time in the world to crawl every site nonstop, especially a site with as many published pieces as The New York Times. So maximizing the crawl budget is top of mind. We’re always thinking about how we can help Google prioritize what to crawl and how to crawl more efficiently. That’s where link optimization and structure come into the picture.
We do our best, from a technical hygiene perspective, to give Google the cleanest signals. We ask ourselves: Are we serving Google proper clean URLs? Are we linking to the canonical version? Are we making it easy to find all this content by linking from highly crawled pages? That’s how internal linking feeds directly into crawling and indexing. It allows Google to find all this content pretty quickly.
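One piece of that hygiene check, making sure a page points at its canonical version, can be automated. Here is a minimal sketch using Python's standard-library HTML parser; the page markup and URLs are made up for illustration.

```python
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Collect the href of a <link rel="canonical"> tag, if present."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

def canonical_matches(html, clean_url):
    """True if the page declares clean_url as its canonical version."""
    parser = CanonicalParser()
    parser.feed(html)
    return parser.canonical == clean_url

# A page served with tracking parameters should still declare the clean URL.
page = ('<html><head>'
        '<link rel="canonical" href="https://example.com/2022/08/story.html">'
        '</head></html>')
canonical_matches(page, "https://example.com/2022/08/story.html")  # True
```

Running a check like this across highly crawled pages is one way to catch the "misaligned signals" Christine describes before Google sees them.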
Structured data: The web is moving to a more structured place, so in order to futureproof the site for what’s to come, we want to take advantage of markup. Structured data markup powers all those rich features you get in the news search space: live blog, video, news article. To activate any of those features, you need to give Google clues indicating what the content is actually about.
Leveraging structured data gives Google information about how to interpret and display our content. If we didn't include structured data, it would be a guessing game for Google. And why make Google guess? We want to remove ambiguity from the equation. We want to make sure there are no misaligned signals.
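In practice, those clues take the form of schema.org markup embedded in the page as JSON-LD. The `NewsArticle` type and its property names below are real schema.org vocabulary; the headline, date and author values are hypothetical placeholders.

```python
import json

# Hypothetical values; NewsArticle and its properties come from schema.org.
article = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Example Headline About a Breaking Story",
    "datePublished": "2022-08-25T09:00:00-04:00",
    "author": [{"@type": "Person", "name": "Jane Reporter"}],
}

# This JSON goes in the page inside a <script type="application/ld+json"> tag,
# which is how Google reads it to enable rich results.
json_ld = json.dumps(article, indent=2)
```

With markup like this, Google doesn't have to guess that the page is a news article rather than, say, a recipe or a product listing.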
The New York Times is known as a hotbed of innovation. How does that scale down to the daily tasks, or how do you make sure that culture of innovation is there all the time?
In terms of innovation, The New York Times is the leader. I am very fortunate to be on a team that is audience-centric. I’m part of the Audience team but also embed with the Audience Product team. My hybrid role lets me sit at the intersection of everything happening on the newsroom and product sides. That means I’m able to tune into the needs of the newsroom and bring them back to the product and engineering team to figure out how to address those needs.
Every half, we plan out the technical SEO work and set the product roadmap. All our initiatives have to support the larger newsroom and product goals. We have to account for both short- and long-term goals, and determine the types of technical work to achieve those goals. So a lot of the technical SEO innovation is planned out.
And because we’re a news site, we also have to quickly respond to news demand and the news cycle. And then there’s responding to Google and changes we’re seeing in the search results. We leave a lot of space to be scrappy. That includes room to test, experiment, and take on additional requests. We’re constantly working on things that feed into our everyday work (such as improving our internal SEO tools and monitoring AMP). On the technical SEO side, we’re looking for creative, innovative solutions.
An example of innovating on-demand is Google Web Stories. Web Stories used to be strictly for evergreen and lifestyle content, but now we're seeing this feature show up on newsy keywords. So, what does that mean to us? What is our POV? How do we factor this into our roadmap? On a day-to-day basis, we can try to understand how something like Google Web Stories serves our audience, then understand what it takes — from a technical perspective — to actually get a change implemented. We’re working through these kinds of questions every single day.
How do you approach core algorithm updates? How are you both proactive and reactive to them?
Lucky for us, as The New York Times, core updates don't keep us awake at night. Because we’re a reputable site producing high quality journalism, we rarely get penalized during these updates. If we were to get dinged, it’s most likely a reversal of Google over-correcting for us.
We don't let core updates influence or drive our overall strategy. In terms of core updates, there’s nothing to plan for. We just have to continue following SEO best practices. And for us that involves actively investing in technical health, producing quality content, improving user experience, and reducing friction. The goal is to not do anything to upset Google and hurt our search performance.
Where we’re proactive is post–core update. We audit to ensure everything is stable and we’re not seeing sudden ranking drops for particular sections, templates, or keywords. We're always monitoring our search visibility, but even more so after a core update. That way we can properly communicate observations and trends to stakeholders.
What do you wish you knew about SEO when you first started?
I wish I knew how much this space would evolve. When I first started, a SERP had three paid links and 10 blue links. Things were simple. I had no idea organic search was going to become such a beast, so complex and so volatile from keyword to keyword. And now we also have personalization, localization, and page performance as factors. There’s a lot to think about!
All these factors, some in our control and some not, make the job quite challenging but also extremely fascinating. You’re constantly reading and experimenting. The job encapsulates a lot, so you need to have an appetite for learning on the fly.
If you’re up for constant change, then this space is for you. There’s never a dull moment.
RECOMMENDED READING
Brodie Clark on Twitter: Does word count matter for SEO? Google is experimenting with labels for so-called “quick reads.”
Techmeme on Twitter: Tribune Publishing says getting rid of AMP had little impact on mobile search referrals. (Here’s what Barry Adams has to say in response.)
This is not about news SEO, but friend of the newsletter Josh O’Kane’s second book – Sideways: The City Google Couldn't Buy – is now available for pre-order. Click if you’re keen to understand the Sidewalk Labs fiasco in Toronto. We’re really proud of you, Josh!
Have something you’d like us to discuss? Send us a note on Twitter (Jessie or Shelby) or to our email: seoforjournalism@gmail.com.
Written by Jessie Willms and Shelby Blackley