A SaaS Technical SEO Checklist – And How Not to Waste Your Time

Published: April 14, 2022 | Updated: February 27, 2024

Noindex Tags and Canonical Tags

Optimize your Robots.txt file

404 Pages and Broken Internal Links

Set Up Your XML Sitemap

Broken External Links

Duplicate Title Tags and Meta Descriptions

Duplicate Content

H1s and H2s

Avoid 301 Redirect Chains

Serve Images in Next-Gen Format

Page Loading Speeds

Add Alt Tags

Conclusion

The Amplifyed Approach to Technical SEO for SaaS

When you think of technical SEO for SaaS, the first thought is often auditing your current site. A technical audit is a great place to start: it diagnoses any major problems that could be holding your website back from taking your SaaS company’s online presence to the next level. It’s part of any comprehensive SaaS SEO service worth its salt, and it’s a relatively easy way to kickstart your SEO efforts with a black-and-white checklist to work through.

That’s not to say fixing technical SEO for SaaS doesn’t require expertise: knowing how to navigate the backend of your site, and exactly how the issues flagged in your audit impact your SEO, is a skill honed through years of turning websites around. With that said, here’s a list of issues you may see on your technical SEO audit, in rough order of importance.

Get more unbeatable SEO tips from the SaaS SEO experts on our blog!

1. Noindex Tags and Canonical Tags

As a webmaster, you have the ability to tell a search engine not to index certain pages. This can be a great practice to help combat index bloat or to section off pages with little to no SEO value. We also use these tags to guide search engines toward the best version of a particular page.

A great example is on e-commerce websites. Let’s say you sell an item in 15 sizes and each size has its own webpage. All of the pages have nearly identical content, which would typically be a bad thing. To combat the duplicate content problem, we would add a canonical tag to each page that would select one version as the master copy and point all SEO equity there.

The problem is that, sometimes, webmasters will accidentally add a noindex tag or a canonical tag to a page without realizing it. A stray noindex keeps the page out of search results entirely, and a stray canonical tells search engines to rank a different URL instead; either way, that page won’t rank, no matter how much time you’ve invested in content marketing.
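
In HTML, both tags live in the page’s head. A quick sketch (the canonical URL is a made-up example):

```html
<head>
  <!-- Keeps this page out of search results entirely: -->
  <meta name="robots" content="noindex">

  <!-- Or: keep the page indexable, but point ranking signals at the
       master copy (hypothetical URL for illustration): -->
  <link rel="canonical" href="https://www.yourwebsite.com/product/widget">
</head>
```

A stray copy-paste of either line onto the wrong page is all it takes to produce the problem described above, which is why it’s worth spot-checking important pages.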

One place to look to ensure your pages are being indexed properly is your robots.txt file, especially for a new site. The robots.txt file is where Googlebots and other search engine crawlers check first to see where they’re allowed to go on your site. Often, a website developer will disallow all robots while a site is in development and forget to remove that command before the site goes live. If you’re noticing that all or a large percentage of your site isn’t showing up anywhere on SERPs, check there first.
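
To see the difference this one leftover line makes, here’s a small sketch using Python’s standard-library robots.txt parser (the URL is a placeholder):

```python
from urllib.robotparser import RobotFileParser

# A robots.txt accidentally left over from development: it blocks everything.
rp_dev = RobotFileParser()
rp_dev.parse(["User-agent: *", "Disallow: /"])
print(rp_dev.can_fetch("Googlebot", "https://example.com/pricing"))  # False

# What you usually want once the site is live: no paths disallowed at all.
rp_live = RobotFileParser()
rp_live.parse(["User-agent: *", "Disallow:"])
print(rp_live.can_fetch("Googlebot", "https://example.com/pricing"))  # True
```

The same check is what crawlers effectively perform before fetching any page on your domain, which is why a single "Disallow: /" can make an entire site vanish from SERPs.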

Search Console Inspection Tool

Pictured above: what your robots.txt should look like if you want a search engine to crawl your entire site.

By pasting the URL of pages from your SaaS website into the Inspect Tool within Google Search Console, you can ensure that your pages are positioned for success from the start. You do have Google Search Console set up, don’t you?

Use Google Search Console’s free URL Inspection Tool to get a clear picture of how Google views the most important URLs on your website.

2. Optimize your Robots.txt file

Speaking of your robots.txt file, it’s worth making sure you’re using it as effectively as possible. These days, some webmasters assume robots.txt files aren’t necessary, because Google can usually find and index pages on your site automatically. However, a well-designed robots.txt file actually delivers a lot of benefits. 

It allows you to block non-public pages you don’t want Google to crawl (such as log-in pages or staging pages). It also gives you an opportunity to maximize your “crawl budget”: by blocking unimportant pages with robots.txt, you tell Google to spend more time on the pages that really matter. You can usually find out whether your site already has a robots.txt file by typing your homepage URL into your browser’s address bar and adding “/robots.txt” to the end.

If you don’t have a robots.txt file, you can either use a generator tool online, or create your file in a standard plain text editor. You’ll need to familiarize yourself with the syntax used in a robots.txt file before getting started, but Google has a great resource to help you here. 

When implementing your robots.txt file, make sure you:

  • Use new lines for each directive: Each directive in a Robots.txt file should sit on a new line, to ensure search engines can read and follow your instructions.
  • Use each user agent once: While you can list the same user-agent multiple times, referencing each one only once reduces the risk of human error. 
  • Experiment with wildcards: Wildcards (*) can apply a directive to all user agents and match URL patterns. These allow you to simplify the directions you give the search engines.
  • Use $ to mark the end of a URL: Adding $ to the end of a pattern tells crawlers the URL must end there, so a rule like Disallow: /*.pdf$ matches only URLs ending in .pdf.
  • Use # symbols for comments: Crawlers will ignore anything that starts with a hash symbol, so you can use these to add comments to your robots.txt file for developers. 
  • Use different files for subdomains: Robots.txt files only control crawling behavior on the subdomains they’re hosted on. This means you need a separate robots.txt file for every subdomain you’re working with. 
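
Putting those directives together, a robots.txt for a hypothetical SaaS site might look like this (all paths are invented for illustration):

```text
# Comment lines start with "#": crawlers ignore them.
User-agent: *
# Block the staging area and internal search results:
Disallow: /staging/
Disallow: /search
# "$" anchors the pattern to the end of the URL: block PDFs only.
Disallow: /*.pdf$

# Point crawlers at your XML sitemap while you're at it:
Sitemap: https://www.yourwebsite.com/sitemap.xml
```

Each directive sits on its own line, the user agent appears once, and the file is commented so the next developer knows why each rule exists.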

 

3. 404 Pages and Broken Internal Links

Nothing is more frustrating than landing on the page you’re looking for and realizing that it’s broken. 404’d pages not only provide a poor user experience, but Google despises them! To make things worse, a 404’d page implies the existence of broken internal links on your site pointing to that non-existent page, which makes the problem twice as bad. When a Googlebot crawling your site hits an internal link that leads to a 404, it has nowhere to go, which makes it harder for your site to be properly indexed and understood by Google.

The good news is that if you can identify all of your broken internal links and correct them, you will have resolved your 404 page problem as well. This is arguably the highest priority item to tackle when fixing your technical SEO. A site that flows through uninterrupted links provides an enjoyable experience and passes SEO equity properly throughout the domain.

You never want potential customers to hit a 404; no SaaS SEO strategy is complete without cleaning up all the 404s on your website!
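
To make the idea concrete, here’s a toy sketch of the check a site auditor performs, run against a tiny in-memory “site” (the paths and pages are invented for illustration):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

# A hypothetical site: URL path -> page HTML.
site = {
    "/": '<a href="/pricing">Pricing</a> <a href="/old-page">Old page</a>',
    "/pricing": '<a href="/">Home</a>',
}

def broken_internal_links(site):
    """Return (source page, broken link) pairs for internal links with no target."""
    broken = []
    for path, html in site.items():
        collector = LinkCollector()
        collector.feed(html)
        for href in collector.links:
            if href.startswith("/") and href not in site:
                broken.append((path, href))
    return broken

print(broken_internal_links(site))  # [('/', '/old-page')]
```

A real crawler like Screaming Frog does the same thing at scale: collect every internal link, fetch each target, and flag the ones that come back 404.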

4. Set Up Your XML Sitemap

Your sitemap is the map that makes it easier for Googlebots to understand the layout of your site. While a page being in the sitemap doesn’t guarantee that it gets indexed, it does provide context for Google to present your site the way you intend it to be presented to users.

For websites built on a platform like WordPress that integrates with Yoast SEO or a similar plugin, the XML sitemap is usually built automatically and lives at www.yourwebsite.com/sitemap.xml. If you have a custom site, or don’t have or want an SEO plugin, there are plenty of free tools on the internet to build an XML sitemap.
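
If you do build one by hand, the skeleton defined by the sitemaps.org protocol is small. A minimal sketch (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.yourwebsite.com/</loc>
    <lastmod>2024-02-27</lastmod>
  </url>
  <url>
    <loc>https://www.yourwebsite.com/pricing</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```

One url entry per page you want crawled, with an optional lastmod date so crawlers can prioritize recently changed pages.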

5. Broken External Links

The same principle applies to links that flow from your site into the wider web. If you reference other articles, partnered businesses, charts, graphs, or socials on your site, and those links point to external 404 pages, you’ll want to get those fixed as well. Start by identifying every instance of a broken external link across your site with audit tools like Screaming Frog, SEMrush, or Ahrefs. Once you’ve done so, try to find an operational page that you can link to instead.

While this doesn’t necessarily have a direct negative impact on your SEO, it is a huge part of user experience, which is a large component of how a search engine views the value of a page.

Use a site auditor tool (like this free one from Ahrefs) to locate broken links on your website.

Broken external link error


6. Duplicate Title Tags and Meta Descriptions

Every page on the web is supposed to have a title tag and a meta description to help explain the overall topic to a search engine and the public at large. In SEO, these two inputs are golden opportunities to add your target keyword for maximum value.

Sometimes programmers will duplicate existing pages to save time during the design process. However, it’s easy to forget to give the duplicate page a new title tag and meta description. This leaves multiple pages with the same title and description, which can be very confusing to a search engine or a person. Five pages titled “Homepage” would confuse me too!

Each page on your site is an opportunity to rank for valuable SEO keywords! Include them in the title tags and meta descriptions to get more of those juicy SaaS SEO rankings.

Make sure that your title tags have unique verbiage to distinguish one page from another.

Need help writing your title tags and meta descriptions? Take a look at what the sites on page 1 are doing with theirs. Keep in mind that the title tag is more for the search engine, while the meta description is more for the user. Use the title tag for your target keyword, and use your description to entice a user to click. 
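
As a hypothetical sketch of that division of labor (all wording here is invented):

```html
<head>
  <!-- Title tag: leads with the target keyword, for the search engine -->
  <title>SaaS SEO Services for B2B Software Companies</title>
  <!-- Meta description: written to earn the click, for the user -->
  <meta name="description"
        content="Stuck on page 2? See how keyword-driven content and clean technical SEO turn rankings into inbound demos.">
</head>
```

Keep the title roughly under 60 characters and the description under 160 so neither gets truncated on the SERP.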

Duplicate title tags


7. Duplicate Content

You know that movie Groundhog Day where every day is the same? That’s how it feels to browse a site with tons of duplicate content. Google’s not a fan either!

Here’s an example: A painting company offers interior and exterior painting services. When designing the site they simply copied their interior painting page and replaced every instance of the word “interior” with “exterior.” These pages now share 95% of the exact same verbiage, which a search engine like Google can read.

Google loves unique content, and when pages are nearly identical, Google thinks of the website as lazy and uninteresting (which is definitely not a good thing). According to Zippia, there are about 6,000 SEO content writers employed in the U.S. With the increasing importance of quality content, we can expect that number to increase.

Be sure to make your pages stand out with their own verbiage, links, and images.

Free tools like Siteliner allow you to easily identify serious duplicate content offenders on your site.

Not everyone has time to write unique, interesting content that will rank on Google, so book a call with Amplifyed to hire experts who will!

Duplicate content report


8. H1s and H2s

H1s and H2s, despite what many SEOs might tell you, don’t carry the same ranking power they did in the early days of exact-match ranking. However, that doesn’t mean you should completely ignore the duplicate H1s and H2s that show up in your audit.

Headings on your website are a visually appealing way to break up your content into more digestible sections for a user, and as we know, user experience is one of the most important factors when it comes to ranking a page.

In addition to readability, H1s and H2s act as a signal to a search engine for what certain sections of content are about. This is especially useful if you think a section of your page could appear in the answer box at the top of a SERP for a specific search query. If you get an opportunity to tell Google more about your content, helping it understand the content and its context, take it.
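
In practice, that just means using headings as a nested outline of the page. For example (the topics are invented):

```html
<h1>SaaS Onboarding Software</h1>   <!-- one H1: the page's main topic -->
<h2>Why automate onboarding?</h2>   <!-- H2s: the major sections -->
<h2>How an onboarding flow works</h2>
<h3>Step 1: Import your users</h3>  <!-- H3s: sub-points within a section -->
```

One H1 per page, H2s for major sections, and deeper levels only when a section genuinely has sub-sections.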

9. Avoid 301 Redirect Chains

301 redirects are a great tool for cleaning up your website and ensuring that all the proper “link juice” is being passed along to the right pages while getting rid of those ugly 404s that we talked about earlier.

However, they don’t come without pitfalls. A 301 redirect slows your website down just a hair, and if you have a wealth of outdated or incorrect content built up over years of running your site, it’s easy to accidentally create a 301 chain, where one 301 redirects to another 301. The load time adds up, which hurts UX and Google’s Core Web Vitals.

Once a year, make it a point to review the 301 redirects on your site and clean up any chains that have formed.
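
For example, in an Apache .htaccess file (the paths here are hypothetical), a chain and its fix might look like:

```apache
# Before: a chain that built up over two redesigns (two hops per visit):
#   Redirect 301 /old-pricing /pricing-2022
#   Redirect 301 /pricing-2022 /pricing

# After: every legacy URL points straight at the final destination:
Redirect 301 /old-pricing /pricing
Redirect 301 /pricing-2022 /pricing
```

The fix isn’t deleting the old redirects; it’s updating every legacy URL to point directly at the current page so visitors and crawlers make exactly one hop.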

10. Serve Images in Next-Gen Format

Images are an important, and often underrated, component of a great website. They break up blocks of text, provide important context to content, and make a site look modern and visually appealing.

Obviously, your images should never be low-quality. What good does it do you or a user if all they’re looking at is a pixelated mess?

The problem with high-res images is that they affect your page load time, a metric that Google places a high value on.

A good way to combat this is by serving your images in a next-gen format, which compresses images to manageable sizes while retaining the high quality your site needs. The two main next-gen formats are AVIF and WebP.
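
In HTML, the picture element lets you offer next-gen formats with a safe fallback for older browsers (the filenames below are placeholders):

```html
<picture>
  <!-- Browsers pick the first format they support, smallest first: -->
  <source srcset="/img/dashboard.avif" type="image/avif">
  <source srcset="/img/dashboard.webp" type="image/webp">
  <!-- Fallback for browsers that support neither: -->
  <img src="/img/dashboard.png" alt="Product dashboard screenshot"
       width="800" height="450">
</picture>
```

Setting explicit width and height also prevents layout shift while the image loads, which helps your Core Web Vitals.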

Remember, a lot of technical SEO is just fixing issues that you think a user might find annoying.

11. Page Loading Speeds

Improving page loading performance is one of the most important things you can do to ensure your site performs well in the search engines from a technical perspective. Google has used “page speed” as a ranking factor for more than a decade. In 2018, the search engine also implemented the “speed update”, which increased the importance of having a fast-loading site. 

The easiest way to check your page loading speed, and any issues you might have, is to use Google’s PageSpeed Insights tool. It will provide insights into exactly which issues are preventing your pages from loading as quickly as they should. If your site isn’t performing according to your expectations, there are a few things you can do to address the problem:

  • Compress images: Images often take up around 50-90% of a page’s size, having a direct impact on loading speeds. If you’re using a WordPress site, you can use plugins to compress images automatically. If not, there are plenty of other image compression options available online for you to experiment with. 
  • Clean your code: Minifying the resources found on your page is crucial to improving loading speeds. This means cleaning up your HTML, CSS, JavaScript, and any other code you need the search engines to read. Work with a developer to get rid of any complex or unnecessary pieces of code that don’t add value to your site.
  • Upgrade your hosting: Sometimes, the best way to improve your loading speed is to simply invest in better hosting. If you’re sharing a server with millions of other websites, your pages just aren’t going to load as quickly as they could. Consider upgrading your hosting service if you’re struggling to get the right results. 
  • Activate browser caching: Browser caching won’t increase loading speeds for people who are visiting your site for the first time. However, it will make pages load a lot faster for anyone who’s a repeat visitor to your website. You can set up caching using either a plugin, or your .htaccess file. 
  • Use a CDN: A content delivery network, or CDN, is one of the best ways to improve page loading speeds. CDNs work by determining where website visitors are located, then serving site resources from the server closest to them. 

  • Remove unused or outdated plugins: Unnecessary plugins can damage your site’s loading speed and lead to security issues. Take the time to remove any plugins you’re not using completely, rather than just hiding them from the front-end of your site.
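
On an Apache server, for instance, browser caching can be switched on in .htaccess with mod_expires (the 30-day lifetime is just an illustration; tune it to how often your assets change):

```apache
# Cache static assets in the browser for 30 days (requires mod_expires):
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/webp "access plus 30 days"
  ExpiresByType text/css "access plus 30 days"
  ExpiresByType application/javascript "access plus 30 days"
</IfModule>
```

Repeat visitors then load those files from their own disk instead of your server, which is exactly the speedup described above.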

12. Add Alt Tags

Alt tags are HTML attributes added to images to describe them. Google and other search engines are unable to understand images without context, and the alt tag provides exactly that. Alt tags should be keyword-rich, descriptive attributes that are useful for a number of reasons:

  • They tell Google what an image depicts so it can show up in image searches
  • They are used by text-to-voice readers that users with vision impairments rely on, so instead of reading something like “IMG005,” it will read an accurate description of the image.
  • If, for whatever reason, an image on your site doesn’t load, the alt text displays in its place, so a user at least knows what image is supposed to be there.
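
For example (the filename and wording are invented):

```html
<!-- Without alt text, a screen reader may fall back to the filename: -->
<img src="IMG005.jpg">

<!-- Descriptive, keyword-aware alt text: -->
<img src="IMG005.jpg"
     alt="Dashboard showing monthly recurring revenue growth in a SaaS analytics app">
```

Describe what the image actually shows; work a keyword in only where it fits naturally.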

It all comes down to UX, with the added benefit of searchability for the images on your site!

This is a labor-intensive and tedious task, so focus on the site’s most important images and most important pages first.

Conclusion

By using resources like Google Search Console, Screaming Frog, or another SEO tool, you can identify all of the problems listed above. A SaaS SEO expert will be able to prioritize these items to tackle the most important issues first and offer valuable insight into where you can improve. A complete SaaS SEO strategy does include an in-depth technical SEO audit to accompany your keyword research, content creation, and link building.

The Amplifyed Approach to Technical SEO for SaaS

Amplifyed is the premier SaaS SEO agency that focuses specifically on B2B SEO. Our hyper-focused approach to search engine optimization allows us to maximize potential for your SaaS business. Our tried-and-true SaaS SEO services revolve around the most impactful aspect of creating an SEO-friendly website: quality content with relevant keywords. During the content creation process, we also implement the SaaS technical SEO best practices above.

We receive SaaS SEO questions day and night. One of the most frequent requests when someone comes to us is that they want an audit of their website’s technical SEO.

Maybe they want us to review their site structure, title tags, schema, and more. 9 times out of 10, that would be a complete waste of time.

Because if you don’t have a solid keyword and content strategy in place, contracting SEO services to do technical SEO for your SaaS company won’t do a dang thing for your rankings.

Here’s why:

The main elements of a strong SaaS SEO strategy, in order, are content, link building, and technical SEO.

Quality content is by far the most important. Your rankings will soar once you have keyword-optimized content about topics people actually care about.

Then at some point, your ranking will get stuck. It depends on how competitive the keyword is, but for example, let’s say your page gets to the top of the 2nd page of Google.

That’s where link building comes in. Building links to that page will get it unstuck and over the hump in the search results, generating more juicy SaaS leads for ya.

The role of technical SEO for SaaS websites is to ensure your site doesn’t have issues that are sabotaging your rankings. Slow websites and broken links are the most common offenders.

Your website can be a technical masterpiece, but if it doesn’t have a strong content marketing strategy and backlinks, you’re not going to rank.

Agencies love doing a technical SEO audit because they’re easy! We can just run a tool.

But it’s probably not what you need.

If you’re looking to get an SEO campaign going for your SaaS company so you can get more inbound sales, start with the keyword research & content marketing strategy, not a technical audit.

Scott Johnson

Founder

Hey! I live in San Diego and have been involved with SEO since 2010. Our amazing team at Amplifyed specializes in helping SaaS and tech companies dominate the search rankings. We serve as an extension of your team to make sure your content ranks and drives the right people to your website. Let’s connect on LinkedIn and schedule a chat.

Book a 20-minute call where I guarantee 3 SEO fixes that will increase traffic.