
How The New York Times Wins At SEO: Key Webinar Takeaways

When it comes to publishing, The New York Times is globally iconic, critically acclaimed, and known for its editorial excellence and reliability. Over the years, the NYT has keenly transitioned from a print-only brand to one of the largest online news sites, but mastering its digital presence and earning that top spot in organic search has been – and continues to be – a true team effort. To explore that further, Botify recently hosted a webinar with Christine Liang, The New York Times’ Director of SEO, and Ryan Sholin, Director of Business Development at WordPress VIP. 

“How The New York Times Wins at SEO,” which is available on-demand, covered a lot of ground, from technical SEO focus areas and editorial alignment to platform scalability and managing tentpole events. We’ve distilled the conversation into four key takeaways that publishers across the web can leverage. There were also many audience questions we couldn’t address live; Christine was kind enough to answer them offline, and we’ve incorporated her responses into this post.

Technical Foundations First 

A solid technical foundation comes first: Christine is extremely conscious of website health and ruthless about optimizing crawl budget. Whether it’s reviewing status codes after every crawl to spot anomalies or removing redirect loops and chains, that discipline yields results at scale. And when it comes to managing archived content and its impact on crawl budget, Christine noted, “Google, for the most part, is good at prioritizing crawling fresh content – something certainly true in the news space. We can’t really delete or block old content from being crawled because there are editorial implications to this, but we do try our best to help Google discover these URLs faster. A lot of it falls under linking and promoting, so I’d say those are your best levers.”

An example from inside Botify of the HTTP status codes you can review to spot and diagnose site issues
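To make that concrete, here is a minimal sketch (not Botify’s tooling or the NYT’s workflow) of how you might scan a crawl export for non-200 status codes and redirect chains. The file name and CSV columns (`url`, `status`, `redirect_target`) are assumptions for illustration.

```python
# Illustrative only: flag non-200 URLs and follow redirect chains in a
# crawl export. The file name and columns are hypothetical.
import csv

def load_crawl(path):
    """Index crawl rows by URL."""
    with open(path, newline="") as f:
        return {row["url"]: row for row in csv.DictReader(f)}

def non_200_urls(rows):
    """URLs whose status code is anything other than 200."""
    return [url for url, row in rows.items() if row["status"] != "200"]

def redirect_chain(rows, url, max_hops=10):
    """Follow 3xx redirect targets until a non-redirect, a loop, or the hop limit."""
    chain, seen = [url], {url}
    while rows.get(url, {}).get("status", "").startswith("3") and len(chain) <= max_hops:
        url = rows[url].get("redirect_target", "")
        if not url or url in seen:  # dead end or redirect loop
            break
        seen.add(url)
        chain.append(url)
    return chain

if __name__ == "__main__":
    crawl = load_crawl("crawl_export.csv")  # hypothetical export file
    print(f"{len(non_200_urls(crawl))} non-200 URLs found")
    for url in crawl:
        chain = redirect_chain(crawl, url)
        if len(chain) > 2:  # more than a single redirect hop
            print("chain:", " -> ".join(chain))
```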

Part and parcel of publisher SEO is adaptability to change, and core to that is keeping pace with Google’s algorithm updates. “Algorithm updates are something we monitor religiously. We want to get better at scaling the monitoring, but for now we leverage internal and external tools to help us get a sense of fluctuations in the search landscape. Historically, Google’s indexation bugs and glitches have impacted us more than core algorithm updates.” Given the recognized authority and sheer size of a publisher like the NYT, it makes sense that smaller bugs would affect it significantly more than core updates. The NYT is a shining example of a site that has always prioritized the value of quality content, and every core update in recent years has focused on elevating and rewarding exactly that approach.
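For teams without dedicated monitoring in place, even a simple statistical check on daily organic clicks can surface the kind of fluctuation Christine describes. The sketch below is an illustration only, not the NYT’s internal stack, and assumes you already export daily click totals from your analytics or Search Console.

```python
# Illustrative only: flag days whose organic clicks deviate sharply from a
# trailing baseline, as a crude proxy for search-landscape fluctuations.
from statistics import mean, stdev

def flag_anomalies(daily_clicks, window=14, threshold=2.5):
    """Return (day_index, clicks) pairs more than `threshold` standard
    deviations away from the trailing `window`-day mean."""
    anomalies = []
    for i in range(window, len(daily_clicks)):
        baseline = daily_clicks[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma and abs(daily_clicks[i] - mu) > threshold * sigma:
            anomalies.append((i, daily_clicks[i]))
    return anomalies

# Hypothetical daily click totals; the sudden drop on the last day is flagged.
clicks = [1000, 1020, 990, 1010, 1005, 995, 1015, 1000,
          1010, 990, 1020, 1005, 1000, 995, 1010, 600]
print(flag_anomalies(clicks))
```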

Sitemaps Still Play A Large Role 

In other verticals, sitemaps are often set once and forgotten. For Christine and the team at the NYT, the webinar highlighted the value and return they see in optimizing their sitemaps – a result of the large number of URLs they have across the domain. They took the time to evaluate what types of content they have and how best to arrange them in their sitemaps. When asked what a “perfect” sitemap might look like, Christine answered, “It really depends on your site structure. We broke it down by content type so it’s more manageable and easier for us to spot check.”
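As an illustration of that content-type breakdown, here is a minimal sketch that writes one sitemap file per content type plus a sitemap index that references them. The bucket names, URLs, and hosting path are hypothetical, not the NYT’s actual setup.

```python
# Illustrative only: one sitemap per content type plus an index file.
# Content-type buckets, URLs, and hosting path are hypothetical.
from xml.sax.saxutils import escape

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def write_sitemaps(urls_by_type, base_url="https://example.com/sitemaps"):
    """Write sitemap-<type>.xml for each bucket and an index referencing them."""
    index_entries = []
    for content_type, urls in urls_by_type.items():
        filename = f"sitemap-{content_type}.xml"
        url_entries = "\n".join(
            f"  <url><loc>{escape(u)}</loc></url>" for u in urls
        )
        with open(filename, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                    f'<urlset xmlns="{SITEMAP_NS}">\n{url_entries}\n</urlset>\n')
        index_entries.append(f"  <sitemap><loc>{base_url}/{filename}</loc></sitemap>")
    with open("sitemap-index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                f'<sitemapindex xmlns="{SITEMAP_NS}">\n'
                + "\n".join(index_entries)
                + "\n</sitemapindex>\n")

# Hypothetical content-type buckets.
write_sitemaps({
    "news": ["https://example.com/news/article-1"],
    "recipes": ["https://example.com/recipes/weeknight-pasta"],
})
```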

Sitemap maintenance for enterprise websites is also important because errors can go unnoticed if you aren’t paying attention. One problem Christine identified early in her time at the NYT was the state of its sitemaps: “Our technical audit showed the XML sitemap wasn’t complete, and contained many non-200 status code URLs. We also saw how many times Googlebot was hitting these files so it was essential to give the spider clean, valid URLs to improve crawling and indexation.” On a website the size of the NYT, these kinds of changes can have a dramatic impact on overall website health.
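A simple way to keep non-200 URLs out of an XML sitemap is to re-check every `<loc>` entry before publishing. The sketch below assumes a standard `<urlset>` sitemap saved locally; it is not how the NYT’s audit was performed.

```python
# Illustrative only: re-check every <loc> in a local sitemap file and report
# anything that does not answer 200. Assumes a standard <urlset> sitemap.
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(path):
    """Extract <loc> values from a sitemap file on disk."""
    root = ET.parse(path).getroot()
    return [loc.text.strip() for loc in root.findall("sm:url/sm:loc", NS)]

def status_of(url):
    """HTTP status for a HEAD request; None if the request fails entirely.
    Note: urllib follows redirects, so a 301 chain ending in 200 reports 200."""
    request = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(request, timeout=10) as response:
            return response.status
    except urllib.error.HTTPError as err:
        return err.code          # 4xx / 5xx responses land here
    except urllib.error.URLError:
        return None              # DNS or connection failure

def non_200_entries(path):
    """(url, status) pairs for every sitemap URL that is not a clean 200."""
    checked = ((url, status_of(url)) for url in sitemap_urls(path))
    return [(url, code) for url, code in checked if code != 200]

if __name__ == "__main__":
    for url, code in non_200_entries("sitemap-news.xml"):  # hypothetical file
        print(code, url)
```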

SEO Across An Enterprise Organization

In every organization, SEO is a cross-departmental function, whether it’s defined that way or not. At the NYT, the search team sits within the audience team alongside its editorial partners, but collaboration with engineering, product, and marketing, to name a few, is foundational.

Team Structure at NYT

“The SEO team is composed of seven people. We have some supporting cast members that will embed with the team from time to time. They usually come in when we have a lot going on, or when we are down on resources.” 

– Christine Liang

All of these teams look to Christine for insights to guide their roadmaps and prioritize projects, and Christine in turn advocates for education across the organization so that even non-SEOs understand the impact of their roles on organic search. Key to success is making sure everyone is on the same page, notably the editorial team. While there may only be seven people on the NYT SEO team, they work closely with editorial staff and understand the importance of managing up in the organization. “The SEO editors on the team do a fantastic job selling in SEO to the newsroom. They send trend emails, they send out a wrap-up note, so everyone is informed and can act on these insights. They also highlight wins, particularly for the biggest news stories of the day. People love hearing about wins! So it’s an on-going feedback loop,” says Christine.

This strategy can also work well when trying to get prioritization for technical SEO across a product or developer team. 

Evergreen Content Is Always Useful 

The backbone of many publisher strategies is maximizing evergreen content. At the NYT, while winning the news cycle is critical, driving engagement from evergreen content is also a core focus for 2021, requiring its own strategy and transparent collaboration with editorial teams. “Article updates are essential in keeping content up-to-date and accurate for readers. Editors also see how article updates play a major role in improving rankings, securing top search rankings, and beating out the competition. They know this because the SEO team constantly communicates with them, relaying performance updates and positive results. This type of rapport has editors seeing and believing.”

Working evergreen content updates into the wider content calendar also helps an organization plan its workload. For example, maintaining a calendar of which content will be updated throughout the year helps ensure that new content development gets just as much attention as pruning older content and re-optimizing it for current search intent.
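One lightweight way to build that calendar is to rank evergreen URLs for refresh by how stale they are and how much traffic they still earn. The sketch below is purely illustrative; the fields (`url`, `last_updated`, `monthly_clicks`) are assumed inputs from your CMS or analytics, not anything the NYT uses.

```python
# Illustrative only: rank evergreen articles for refresh by staleness and the
# traffic they still earn. Field names are hypothetical CMS/analytics exports.
from datetime import date

def refresh_priority(article, today=None):
    """Older articles with more monthly clicks float to the top of the queue."""
    today = today or date.today()
    days_stale = (today - article["last_updated"]).days
    return days_stale * article["monthly_clicks"]

articles = [
    {"url": "/guides/roast-chicken", "last_updated": date(2020, 3, 1), "monthly_clicks": 12000},
    {"url": "/guides/running-shoes", "last_updated": date(2021, 1, 15), "monthly_clicks": 4000},
]

for article in sorted(articles, key=refresh_priority, reverse=True):
    print(article["url"])
```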

Want to learn more about ways to win at SEO? Download our Publisher SEO Playbook for a deep understanding of the complexities and common challenges facing publisher SEOs, as well as solutions for navigating them. Produced in collaboration with WordPress VIP, the playbook provides use cases and examples, as well as best practices for everyday publishers.
