
Enterprise SEO Audit

For enterprise organisations, an SEO audit is a complex task that merges code inspection, data-backed analytics, server logs, keyword research and more. At Metric Labs we work with a range of medium to large organisations, offering comprehensive consultations to capture lost opportunities, accelerate growth and provide consistent digital marketing support and service to keep every enterprise at its most competitive.

Here, we are going to outline the general process of an Enterprise SEO Audit and exhibit the breadth and range of comprehensive checks and analysis involved in delivering world-leading search engine optimisation from a consultative and management perspective.

At first, internal marketing directors may think that enterprise SEO audits are one-off tasks; by describing this process we hope to illustrate the ongoing value a trusted digital agency can provide to global organisations across multiple markets. We also hope to guide readers through the process so that they are well informed and can make knowledgeable choices when selecting their enterprise digital agency.

Enterprise SEO Audit Tasks

Initially there is a series of audits which can be broken down into various sections; however, at times these cross over and are interrelated.

  1. Crawling Audit
  2. Indexing & Technical Audit
  3. Product, Persona User & Keyword Audit
  4. Content Audit
  5. Performance Audit
  6. External link Audit
  7. Implementation Strategy

Before going through the various areas that form an enterprise SEO audit report and analysis, we want to illustrate the range of tools used to evaluate the health of enterprise resources.

Enterprise-Level Technical SEO Site Audit Tools

To understand how your digital agency achieves its insights, the following is a list of essential tools and requirements:

  • Website(s) Administrative Access, including cPanel access, the .htaccess file, server logs, Google Analytics, Google Tag Manager and other search engine analytics tools.
    • Depending upon the scope, access to all relevant digital assets will be required, whether multiple domains, sub-domains or other relevant digital properties.
  • A Website Crawler: to perform manual inspections of each site.
  • Web Ranking and Indexation Tools: these tools allow your digital agency to view the performance of your site from the search engines' perspective in terms of ranking, crawling and indexing.
    • SEMrush: one of the leading tools for enterprise analysis. It offers not only keyword rank checking but also a comprehensive simulation of the Googlebot web spider.
    • Ahrefs: provides a great list of organic keyword positions, referring domains and pages, which helps evaluate the authority of each page for you and your competition.
    • Google Search Console: an SEO’s essential tool, which provides direct communication with Google about the status of your site; it also provides valuable tools to analyse each of Google's crawls of your site in detail. (Note: other webmaster tools, e.g. Bing, Baidu, will also be used.)
  • Technical Website Inspection Tools: A series of specific tools to analyse technical website elements.
  • Old Fashioned Code Inspection
    • For professional digital agencies, sometimes the quickest and easiest way to get to the root of a problem is to view the source and inspect the code manually.

There are additional tools scattered across the web; however, in our opinion these provide the greatest functionality and analysis.

Enterprise Crawling Audit

Enterprise-level businesses usually have more than one website, with a wide range of product or service offerings mixed with a healthy volume of blogs, whitepapers, videos and other media, so interpreting the whole information architecture is key. Existing online is continually increasing in complexity as the various technologies evolve and the volume of content on each site expands. The initial task is to assess the current state of performance in terms of crawling and indexation.

At the user experience (UX) level, things may appear to be running relatively smoothly: pages, menus and forms may all appear to be working. However, performing a series of crawls across an enterprise's assets will often reveal shocking limitations to performance, usability and ultimately indexing.

Crawl Audits investigate results from a number of different sources:

  • Manual Crawls
  • Sitemap Crawls
  • Server Logs
  • Googlebot Crawl Reports
  • SEMrush Crawl Audits
  • Code Validation Audits
  • Lighthouse Audits
  • Code Inspection
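At their core, most of these crawls do the same thing: fetch a page, extract its links and follow them. As an illustrative sketch (not any particular tool's implementation), a minimal link extractor using only the Python standard library might look like this:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every anchor tag in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical page fragment; a real crawler would fetch this over HTTP
# and queue each discovered URL for its own crawl.
page = '<nav><a href="/products">Products</a> <a href="/blog">Blog</a></nav>'
extractor = LinkExtractor()
extractor.feed(page)
print(extractor.links)  # ['/products', '/blog']
```

The full crawl tools listed above layer status-code checks, depth limits and politeness rules on top of this same loop.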

All of these audits and reports are run across every relevant online enterprise asset, and we collate the results to reveal larger systemic issues. Crawl audits reveal fundamental discrepancies which, upon deeper inspection, we can trace to the exact type of error. From this point we can establish one (or more) implementations to achieve our outcome. These issues may be restricted to one asset or cascade across multiple areas.

As the initial starting point, these crawling limitations can have dramatic effects upon organic performance, conversion rate and ultimately the success of an enterprise online for years.

What's problematic for enterprise organisations is that while in-house marketing teams and highly skilled, contracted IT support teams exist, there is a gap between them when it comes to the subtleties of implementing large-scale online infrastructure, supply-chain integration and consumer-facing websites.

A valuable global or enterprise-aligned digital agency should be considered an essential asset, working seamlessly between each component of the enterprise.

Enterprise Indexing & Technical Audit

Once an agency has completed a full inspection of crawl limitations within the existing enterprise assets, the next logical step is to interpret how search engines (not just Google) evaluate the enterprise's sites. A website can always change; however, it is not possible to change the search engine's algorithm for interpreting it.

With no central filing system for the World Wide Web, each search engine attempts to canvass the information from more than a billion websites (and growing) so that it may retrieve relevant content for a search query. This indexation plays an integral part in how search engines display web pages.

When a crawler such as Googlebot, Bingbot (or one of the 54 Baidu spiders) visits a site, it will request a file called “robots.txt”. This file tells the crawler which files it can request, and which files or sub-folders it is not permitted to crawl.
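Python's standard library ships a parser for exactly these rules, which is a quick way to sanity-check what a given crawler is (and is not) permitted to fetch. A small sketch, using a hypothetical rule set for example.com:

```python
from urllib import robotparser

# Hypothetical robots.txt content for example.com
rules = [
    "User-agent: *",
    "Disallow: /admin/",
    "Allow: /",
    "Sitemap: https://example.com/sitemap.xml",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("Googlebot", "https://example.com/products/"))       # True
print(rp.can_fetch("Googlebot", "https://example.com/admin/settings"))  # False
```

Running every important URL through a check like this quickly exposes pages that have been accidentally blocked from crawling.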

From this point the spiders explore the fabric of the website(s), interacting with a number of components which make up the site:

  • Protocols (HTTP vs. HTTPS)
  • XML Sitemap
  • .htaccess File
  • Meta Data (Title, Descriptions)
  • Images & ALT Tags
  • Robots.txt & Robots Tags
  • Caching & Delivery
  • Internal Links
  • Content
  • JavaScript & AJAX

If not correctly aligned, all of these components will reduce the amount of time (literally, electricity and network bandwidth) a search engine will spend parsing and indexing your website. The range of errors that affect indexation can be particularly broad.

Typical Protocol Issues

  • Multiple entry points (http, https, http://www, https://www)
  • Invalid certificates or certificate expiration
  • Mixed content serving
  • Redirect Loops

Typical Sitemap Issues

  • Malformed sitemaps and structures
  • Missing components from sitemaps
  • Sitemap address within the robots.txt file
  • Sitemap submission to search engines
  • XML Sitemap size
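Malformed structure and oversized files are both mechanical to check: the sitemaps.org protocol caps each sitemap file at 50,000 URLs, beyond which the site needs multiple sitemaps plus an index file. A minimal sketch of a well-formed generator:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS = 50_000  # per-file limit in the sitemaps.org protocol

def build_sitemap(urls):
    """Emit a minimal, well-formed XML sitemap for a list of URLs."""
    if len(urls) > MAX_URLS:
        raise ValueError("split into multiple sitemaps plus a sitemap index")
    urlset = ET.Element("urlset", {"xmlns": SITEMAP_NS})
    for loc in urls:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["https://example.com/", "https://example.com/products"])
print(xml)
```

The same parser, run in reverse over an existing sitemap, surfaces malformed entries and URLs missing from the file.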

Typical .htaccess Issues

  • Incorrect mod_rewrite rules
  • Unoptimised mod_rewrite rule order
  • Security vulnerabilities
  • Incorrect crawl access given to system folders.
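The mod_rewrite rules that enforce a canonical host are a frequent source of both the protocol issues above and redirect chains. One commonly used pattern (illustrative only; example.com and the non-www choice are assumptions) collapses HTTPS enforcement and host canonicalisation into a single 301 hop:

```apache
RewriteEngine On
# One rule, one hop: anything that is not already https://example.com/...
# is 301-redirected straight to it (avoids http -> https -> non-www chains).
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\. [NC]
RewriteRule ^ https://example.com%{REQUEST_URI} [R=301,L]
```

Ordering matters: placing broad rules like this after more specific ones (or vice versa, depending on intent) is exactly the "unoptimised rule order" issue listed above.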

Meta Data (Title, Descriptions) Issues

  • Missing
  • Duplicates
  • Too long or short
  • Unoptimised with keywords.
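A first pass over missing and badly sized metadata can be automated across every crawled page. The length thresholds below are commonly cited guideline ranges, not limits Google publishes, so treat them as assumptions:

```python
def audit_meta(title, description):
    """Flag the most common title/description problems on a single page."""
    issues = []
    if not title:
        issues.append("missing title")
    elif not 30 <= len(title) <= 60:
        issues.append("title length outside the ~30-60 character guideline")
    if not description:
        issues.append("missing description")
    elif not 70 <= len(description) <= 160:
        issues.append("description length outside the ~70-160 character guideline")
    return issues

print(audit_meta("", ""))  # ['missing title', 'missing description']
```

Run over a full crawl, the same pass also surfaces duplicates by grouping pages on identical title strings.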

Images & ALT Tag Issues

  • Excessive image strings
  • No ALT Tags
  • Image size and format

Robots.txt & Robots Tag Issues

  • Incorrect use of nofollow or noindex
  • Incorrect Allow or Disallow directives
  • Allow query string crawling
  • Missing Sitemap Address

Caching and Delivery Issues

  • CDN Deployment Mis-configuration
  • Browser Caching
  • Load Impedance
  • Script and CSS Size and Delivery
  • Time to first Byte
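Several of these issues reduce to response headers. As one hedged illustration (assuming Apache with mod_expires enabled; the lifetimes shown are arbitrary examples, not recommendations for every site), long-lived static assets can be given far-future cache headers so returning visitors do not re-download them:

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Static assets can be cached aggressively...
  ExpiresByType image/webp "access plus 1 year"
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
  # ...while HTML stays fresh so content changes appear quickly.
  ExpiresByType text/html "access plus 0 seconds"
</IfModule>
```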

Internal Linking Issues

  • Weak internal link structure (More than 3 links away from being accessible)
  • Lack of Anchor Text and ALT Tags
  • Incorrect Rel Attributes
  • Link network does not flow to money (important) pages.
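The "more than 3 links away" rule above is easy to measure once a crawl has produced a page-to-page link graph. A sketch, using a small hypothetical graph, computes each page's click depth from the homepage with a breadth-first search:

```python
from collections import deque

# Hypothetical internal link graph from a crawl: page -> pages it links to
links = {
    "/": ["/products", "/blog"],
    "/products": ["/products/widget"],
    "/blog": ["/blog/post-1"],
    "/products/widget": [],
    "/blog/post-1": ["/products/widget"],
}

def click_depths(graph, root="/"):
    """Breadth-first search: minimum number of clicks from root to each page."""
    depths = {root: 0}
    queue = deque([root])
    while queue:
        page = queue.popleft()
        for target in graph.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

deep = [p for p, d in click_depths(links).items() if d > 3]
print(deep)  # [] -- nothing here is buried deeper than three clicks
```

Pages that never appear in the result at all are orphans: reachable by sitemap, perhaps, but not by internal links.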

Content Issues

  • Content volume (too short)
  • Incorrect Keyword usage
  • Incorrect HTML Tags for Keywords
  • Pages are not optimised towards the query they are targeting.
  • Note: This will be addressed further in the following section.

JavaScript & AJAX Issues

  • Incompatible implementation with search engine spiders
  • Concealing content, form, links and products
  • Un-indexable menus

Enterprise Product, Persona & Keyword Audit

When it comes to presenting an enterprise's range of services or products online, there can be a genuine misalignment between how internal personnel refer to their own products and how the market itself does. Typical issues associated with enterprise websites (especially in the IT services and infrastructure industries) are:

  • Differences between brand marketing and market product/service terminology.
  • Lack of attribution between a user's query and their purchasing intent.
  • Customer personas and customer journeys have not been paired with keyword search terminology.
  • Regional, country and language differences in search terminology and behaviour.

As a general rule, pages created to match a set of keyword terminologies need not only to reflect the best organic search volume, but also to answer the user's query at a particular stage of their consumer journey.

This ultimately translates into significant permutations depending upon the range of products or services, personas and customer journeys. Analysis and decisions made here will have significant impacts upon how an optimised site is constructed. The site's information architecture (IA) is strongly informed by this research and supported by expansive keyword research.

Enterprise Content Audit

New and existing content alike needs to be aligned with the keyword research, customer journey and persona modelling to inform the structure of content on-site. Most enterprise clients already have vast sites with landing pages, conversion funnels, whitepapers and more.

At times existing content is lacking in various ways:

  • Heading Keyword Choice
  • Content Positioning
  • Sub-Headings
  • Content Length
  • Calls to Action
  • Pop-up / Excessive Banners and Imagery

The process of reorganising and mapping content across an enterprise site requires engaging with marketing managers, stakeholders, development teams and content writers. Decisions here are backed by:

  • Keyword Research
  • SEO Best Practice
  • Heatmapping
  • A/B Testing
  • Conversion Optimisation

Enterprise Performance Audit

With the knowledge gained from performing the crawling, indexing, technical and content audits concurrently, the digital agency will deliver an audit of current performance. This performance audit takes the elements that are working across the enterprise websites and incorporates a range of recommendations to deliver longer-term improvements.

At the enterprise level, most sites have some level of success, whether within paid search, branded keywords, or one specific product matched to an organic keyword phrase.

The value a digital agency adds is to expand all areas of performance: load times, conversions, brand awareness, organic keywords and overall sales.

Enterprise External Link Audit

Enterprise-level web assets usually have a surprisingly large amount of site equity already established. Over years of operating, a company may have generated hundreds of thousands of links across all assets. In itself this is a great achievement; however, it is usually full of unrealised potential to push organic growth further.

Some common enterprise link issues are:

  • Expired Inbound links not captured
  • Site acquisition link opportunities
  • Blanket redirects from existing inbound links
  • Redirect Chains
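Redirect chains in particular are cheap to find and fix once the existing redirect map has been exported. A sketch (the paths are hypothetical) that collapses each chain to a single hop, with a guard against redirect loops:

```python
# Hypothetical exported redirect map: source path -> current destination
redirects = {
    "/old-page": "/interim-page",
    "/interim-page": "/new-page",
}

def flatten(redirect_map):
    """Point every source directly at its final destination (one 301 hop)."""
    flat = {}
    for src, dst in redirect_map.items():
        seen = {src}
        while dst in redirect_map and dst not in seen:
            seen.add(dst)
            dst = redirect_map[dst]
        flat[src] = dst
    return flat

print(flatten(redirects))  # {'/old-page': '/new-page', '/interim-page': '/new-page'}
```

Each extra hop in a chain wastes crawl budget and can dilute the equity those inbound links pass, which is why flattening them is a standard early win.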

These issues (and more) can have a substantial impact not only upon Domain Authority but also upon specific keyword terminologies and their placement within organic search results.

If your enterprise is operating in a highly competitive marketplace (as most do, operating transnationally), this can mean a significant loss of potential traffic, sales and conversions.

Implementation Strategy

With all of these audits complete, the digital agency will usually deliver a comprehensive enterprise SEO audit. Presentations are usually necessary to competently deliver all issues, risks and opportunities to marketing and development teams. It is important for all parties to openly discuss and probe the data findings, projections and timelines.

Depending upon the complexity of your agency's findings, the strategies and scope of solutions will come in a variety of deliverables, generally phased to ensure all parties are prepared to take action.

Typical Enterprise SEO Audit Deliverables

  • SEO Technical Audit & Recommendations
  • International SEO Strategy Document
  • Keyword Ranking Performance
  • Keyword Research Datasheet
  • Keyword Insights Documents
  • Inbound Link Reports
  • Competition Performance & Risk Analysis
  • SEO Metadata Schema
  • SEO Information Architecture (IA) Document
  • SEO Content Planner
  • Link Building Strategy

Note: This is not an exhaustive list; deliverables depend very much upon the nature of the enterprise site(s).

Closing Remarks

Having now read this general overview of the processes involved in an Enterprise SEO Audit, you should have a new-found understanding of the technical and data-backed depths an SEO agency goes to. This should help illustrate how a professional SEO partner can identify key issues across an enterprise's digital assets, make changes, accelerate returns, manage risk and give your enterprise a strategic edge.

New developments, tools and data markups come out regularly, so it's important to keep up to date with them and test everything where possible. If you're currently looking for professional, expert and refined enterprise SEO services, look no further than Metric Labs. With a wealth of experience in international and enterprise-level websites, we are happy to discuss delivering a premium solution in the ever-competitive online enterprise industry.

Contact us today.
