
In this SEO refresher, we’ll focus on enterprise SEO for developers. We’ll cover the search engine basics developers need to know, some technical SEO, and how SEOs and developers can work better together.

Key outcomes:

  • Understand key SEO components
  • Understand how technical platforms influence SEO performance
  • Spot items/areas where your work can affect SEO

Agenda:

  1. Search engine basics
  2. Technical SEO
  3. Working better together

Search Engine Basics

SEO: A Byproduct of Good UX and Marketing

Fullstack SEO Team

Developers don’t do SEO. They make sure sites are SEO-ready. That means developers hold the key to SEO. It’s true. If you’re a developer, laugh maniacally. You’re in control.

You control three things: 

  1. Viability
  2. Visibility
  3. Site flexibility

Technical SEO for Enterprises

The bigger the site, the more technical SEO matters.

A slow and steady approach is necessary for large sites: give Google time to understand changes and update its index. When large changes are implemented across a website, rankings often fluctuate. With steady month-on-month gains, we’ve seen results of +73% growth by the end of a year.

Google as a User with Specific Accessibility Needs

Google Understands Patterns of Quality

Google is a machine; it does not understand quality. It interprets patterns, matches them against quality benchmarks (set via direct human input or derived from machine computation and algorithms), and relies on user data for validation.

Google’s Experience of a Site

Search engines and humans experience the same content differently. Here is a breakdown of Google’s experience of a site: Crawl → Render → Index → Rank → Traffic.

  • Crawl: Crawl as many pages as I can based on the resources allocated for this site.
  • Render: Load and “read” the content of each URL so I can understand it.
  • Index: Store pages that meet quality requirements and are fully rendered by Google’s bots.
  • Rank: Rank based on search query + context from a user’s point of view, and based on content, quality and user signals.
  • Traffic: Users click on our relevant search results and land on our page.

The SEO Pyramid of Needs

Mozlow’s Hierarchy of SEO Needs: Ranking Factors in 2020 (According to Moz)

  1. Quality links to the site
  2. Good content with relevant keywords
  3. Good (accessible) user experience
  4. User data (quality feedback loop)

Influence on Google’s Algorithm: 2020 Ranking Factors

Ranking signals, their approximate share of ranking importance, and the key success factors for each:

  • Technical (30%): Stable platform, consistent experience, maximise use of crawl resources, good UX-related metrics (speed, bounce, 404s), mobile deep-linking.
  • Content (30%): Good quality content, limited duplication, use of structured markup for easy content analysis.
  • Authority Signals (40%): Monitor and reduce spam linking. Maximise authority flow to deeper pages.

Search Engine Basics for Developers

  • SEO success is a collective effort. SEOs need developers’ help to make sure the site is working well for both users and Google.
  • Google wants the same thing as humans. However, it has different accessibility needs.
  • You do not race against Google; you race against competitors. Create a better user experience for both users and Google.
  • Technical SEO and platform stability are key to enterprise SEO success. Slow, steady (and structured) wins the large-scale enterprise SEO race.

Technical SEO

Viability: Server Setup and Maximising Google Resources

Google’s Experience of a Site as a User with Accessibility Needs

(Recall: Crawl → Render → Index → Rank → Traffic, as broken down in Search Engine Basics above.)

Too Many Pages, Not Enough Quality

Many enterprise companies have huge sites with many pages but without enough quality pages. When we have too many pages, Google can have a hard time accessing all these URLs. For example, a huge site with many pages could mean only 0.5% of pages make it to the index and only 0.2% of pages actually get traffic. 

As an example, last year Greek e-commerce giant Skroutz grew their average page position on Google by 1.2 positions. This doesn’t sound like a huge increase, but for a site with millions of pages, this small push resulted in 6.6 million additional organic clicks. After thorough analysis, the main issues addressed were crawl budget optimisation and the elimination of duplicate URLs.

Robots.txt: Your 1st Solution to Manage Crawls

A robots.txt file tells search engine crawlers which pages or files the crawler can or can’t request from your site. This is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a web page out of Google, you should use noindex directives, or password-protect your page.

  • Enables us to stop Google from crawling a page
  • Does not stop Google from indexing a page
  • Blocks user-specific pages
  • Blocks resources that we don’t need Google to crawl
  • Lets Google crawl JS files if they are used to render the page
  • Best used when URLs have parameters we don’t want crawled (see the example below)
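
As a minimal sketch, a robots.txt for a hypothetical site (the domain and all paths here are illustrative, not taken from a real project) that protects crawl budget might look like this:

  # Served at https://example.com/robots.txt
  User-agent: *
  # Keep bots out of account pages and parameter combinations we don't want crawled
  Disallow: /account/
  Disallow: /checkout/
  Disallow: /*?sort=
  # Make sure the JavaScript needed to render pages stays crawlable
  Allow: /assets/js/

  Sitemap: https://example.com/sitemap.xml

Remember that this only manages crawling. If a URL must stay out of Google entirely, use a noindex directive or password protection, as noted above.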

Small Changes in Server Configuration Can Have a Huge Impact

  • Even small, smooth changes can impact our crawl rate
  • Crawl freshness is important to keep up to date records in the index
  • Fresh records in the index are more likely to be trusted and thus ranked higher

Changes to IA and Page Volume Have a Huge Impact

Changes to your information architecture and page volume can have a huge impact. Here’s an example:

Googlebot will happily DDoS your site. Why? Let’s say you add a bunch of attributes to your pages, creating millions of new URLs in a short period of time. Google gets excited, your servers get exhausted, and the result is hundreds of errors for bots and human users alike.

Stability: Creating a Trusted Experience

Index Bloat

Enterprise websites often suffer from index bloat, and URL parameters are a common cause. While parameters are common practice, consider using fragments (hashtags) instead where you can: anything after the hash is treated as a location on the page rather than a new URL, so /t-shirts?colour=red creates another crawlable URL while /t-shirts#colour=red does not.

How to Fix Index Bloat

Fix index bloat by effectively showcasing the existing supply to satisfy a non-trivial audience demand.

  • Balance opportunity to provide specific landing page experience vs the number of pages we want to provide to Google
  • Look at how many searches exist for a given category/subcategory/attribute/location combination
  • Use internal search data and patterns to build rules

Should I index a specific faceted page?

Ask yourself:

  1. Does my audience search using queries that specifically target this facet?
  2. Is the search volume for those faceted queries high enough to justify the indexation effort?
  3. Are there enough specifically relevant products, services or offerings to feature on the faceted page, alongside a targeted description?

If the answer is no, don’t index it.

We can either block such pages via robots.txt or obfuscate their internal linking, so Google doesn’t spend crawl resources on non-indexable pages.

Deduplication

Another thing many enterprise websites suffer from is duplication. For example, you could have a website with suggested searches that are near identical.

Plurals vs Singulars

For example, you may be offering suggestions with plurals and singulars, such as “t-shirt” and “t-shirts”, resulting in different pages offering the same information.

Synonyms

Another example could be offering suggestions like “t-shirts” and “tops”.

Near-zero results

Another issue could stem from suggestions that return almost no results, offering very little value or information.

Canonical to the Rescue?

Canonical tags can help consolidate near-duplicate pages. Pointing a duplicate (like the singular “t-shirt” page) at its canonical equivalent is (kinda) OK. Relying on canonicals to paper over large-scale duplication is NOT OK: a canonical is a hint to Google, not a directive.
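
As a sketch, assuming the hypothetical URLs from the plural/singular example above, the duplicate page would declare its preferred version in its head:

  <!-- Served on https://example.com/t-shirt (the near-duplicate singular page) -->
  <link rel="canonical" href="https://example.com/t-shirts/" />

Google then tends to consolidate signals onto the plural page, though it may still choose a different canonical if the two pages’ content diverges.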

Type of Keywords

While enterprise businesses can often rank well for head keywords, the true potential lies in long-tail keywords. The effort should be in balancing the volume and value of pages while giving your users what they want. 

What is Link Equity and Why is it Important?

Link equity is how a page’s ranking power flows from one link to another. Both internal and external links can pass link equity.

Principles of Link Equity 
  1. External links generally give more ranking ability than internal links.
  2. Well-linked-to pages have more equity to pass than poorly-linked-to pages. Think about internal links from the homepage, and then links from pages that got linked from the homepage, and so on.
  3. Pages with fewer links tend to pass more equity to their targets than pages with more links. You need to be selective and strategic! 
  4. Redirects and canonicalisation lose a small amount of link equity. 

Internal Link Value and Flow Lets You Rank for Long-tail

A website’s link graph shows how Link Equity is passed to pages within the website (link value and flow).

Having an Optimal Link Graph:

  1. Pass page authority down from pages with high equity to pages with lower equity: for example, from the homepage to category pages, and from categories to deeper product pages.
  2. Every page should be linked from somewhere on the website. Orphan pages are not ideal for SEO and should be avoided.
  3. Don’t use nofollow. It burns PageRank/link value rather than sculpting it. If you need to remove links to non-indexable pages, use a fragment (behind #), as sketched below.
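
As an illustrative sketch (the URLs are hypothetical), the fragment approach keeps a facet usable for visitors without creating another crawlable URL or burning link value:

  <!-- Avoid: nofollow throws the link's PageRank away rather than redirecting it -->
  <a href="/t-shirts?colour=red" rel="nofollow">Red t-shirts</a>

  <!-- Prefer: the fragment is treated as a location on the same page, so no extra URL is crawled -->
  <a href="/t-shirts#colour=red">Red t-shirts</a>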

Stability: Creating a Trusted Experience

  • Create a trusted experience by ensuring pages presented to Google are either canonical + live or redirect to such a page.
  • Canonicals and robots.txt can help you carve up the site, but they’re a hack. Find a more effective solution to create and manage pages at scale.
  • Deduplication is a huge opportunity for enterprise SEO. Work on a solution that reduces low quality and duplicate pages.
  • Focus on long-tail keywords. Find a solution to balance user needs and pages with enough value for Google.

Visibility: Making Content Visible to Google (JavaScript SEO)

Google’s Experience of a Site as a User with Accessibility Needs

(Recall: Crawl → Render → Index → Rank → Traffic, as broken down in Search Engine Basics above.)

Standard HTML Rendering by Google

Websites like mobile.walmart and Twitter have shared research showing that increasing the speed of the first page load improved overall site performance.

This research supports the practice of server-side rendering the first page so it appears as quickly as possible, while the rest of the code loads as the user browses.

As a result, when users load the first page they won’t see a “loading…” message; they’ll see a functional page, giving them a better user experience (UX) and a better experience of the app in general.

Google Can Only Index a Fully Rendered Page

Second-wave rendering (Google renders JavaScript in a later, second pass after the initial HTML crawl) could be missing critical elements (see the sketch after this list), including:

  • Metadata 
  • Correct HTTP Code
  • Canonical 
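
As a minimal sketch (the values and URLs are illustrative), these are the kinds of elements worth serving in the initial HTML response rather than injecting with client-side JavaScript; the correct HTTP status code likewise has to come from the server response itself:

  <head>
    <title>Red t-shirts | Example Store</title>
    <meta name="description" content="Browse our range of red t-shirts." />
    <meta name="robots" content="index, follow" />
    <link rel="canonical" href="https://example.com/t-shirts/red/" />
  </head>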

Dynamic rendering is dangerous. Why? Because it’s a hack: serving a pre-rendered version only to search engines is likely to create a different experience from the one users get.

Google Suggests Hybrid Rendering as the Best Method

Server-side rendering is currently fine, although Google is likely to push this hybrid rendering model (server-side render the first load, then hydrate on the client) in the future.

Googlebot Now Updated to Have a Similar Experience to the Latest Chrome Version

Googlebot has been updated to render pages like the latest versions of Google Chrome, with only short delays behind new Chrome releases.

The Googlebot user agent will be updated more frequently.

Let’s Talk About Speed

Speed matters a great deal to Google and to enterprise SEO.

How to boost your speed?

  • Third-Party Scripts
  • Use DNS Prefetch
  • Use Prefetch
    • Find the most popular resources on your site and use prefetch (not to be confused with DNS prefetch, above). That loads the asset when the browser is idle, reducing load time later: <link rel="prefetch" href="fonts.woff" />. Be careful with prefetch: too much will slow down the client. Pick the most-accessed pages and other resources and prefetch those (see the sketch below).
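
A minimal sketch of both hints together (the domain and file names are illustrative):

  <head>
    <!-- Resolve a third-party domain early, before any request is made to it -->
    <link rel="dns-prefetch" href="https://cdn.example.com" />

    <!-- Fetch a frequently used asset while the browser is idle so it's cached for later pages -->
    <link rel="prefetch" href="/assets/fonts.woff" />
  </head>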

Working Better Together

Things Change in SEO. Constantly.

Google changes constantly and with that, so do our recommendations as SEOs.

SEOs Are Always Testing Something


  • SEOs work against a black box
  • We are never 100% sure how Google will react
  • Your site and content are very dynamic
  • We are always finding bugs, weird things and opportunities to grow
  • Many of our tests may not turn into something worthwhile
  • But without testing and learning, we are stalling progress

Building a Viable, Stable, Flexible Enterprise SEO Platform will Take a Few Iterations

As a developer, you need to work WITH your SEO team. The more you know, the less friction.

You can make/break SEO performance so SEOs need your help.

Remember, steady, stable and structured wins the enterprise SEO race. The more organised developers and SEOs are with content and URLs produced and maintained, the more likely they are to win against their competitors.

Google wants the same things as human users. It just has different accessibility needs and processes information differently. Improved UX that Google can also understand often results in better rankings and organic growth.

Reduce your workload and improve efficiencies by allowing your SEO team to tweak and adjust site performance.

Need some help? We help many enterprise businesses and their developers with SEO. Get a free consultation.


 

Now that you’re refreshed on the basics, it’s time to move on to content at scale in our next enterprise SEO training.
