- If you have good content and high-quality backlinks, you’ll want to make sure the technical aspects of your site are up to snuff.
- Any page that doesn’t feature unique content risks being omitted from relevant search results, so evaluating your content is a good place to start.
- Screaming Frog, PageSpeed Insights, MozBar, and Google’s robots.txt tester are a few of our favorite, easy-to-use tools.
- Pay attention to 301 redirects, 404 errors, and what your competitors are doing.
Digital marketing encompasses many disciplines, both technical and non-technical, that combine to drive traffic to a website.
So you’ve got the links and great content, but what do you do when the code on your website needs sorting? You have two options: hire a technical SEO agency, or educate yourself on a few of the following issues that can negatively impact your site.
Factor 1: Duplicate content
Duplicate content is bad for SEO because search engine algorithms are designed to filter only the most unique and relevant content so they can return the best possible search result. Any page that does not feature unique content risks being omitted from relevant search results.
The truth is, a lot of websites feature duplicate content, knowingly or unknowingly, for a variety of reasons. The simplest way to find out whether yours is one of them is to download the Screaming Frog SEO Spider and run a crawl of your website. This way you can see your site the way Google does.
Pro tip: You can crawl up to 500 URLs with the free version of Screaming Frog. Simply select “Spider” from the Mode menu, enter a URL, and hit “Start.” Once the crawl is complete, navigate to “Overview” and view your canonical tags. A canonical tag tells a search engine which URL is the preferred one for indexation purposes. To manually inspect a page’s canonical tag, right-click and hit “View Source,” then press Ctrl+F and type “canonical” to highlight the canonical URL. Once you’ve performed the analysis, you can populate an Excel doc with URLs that contain duplicate content (column A) and map each one to a preferred “canonical” URL (column B).
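For reference, here’s what a canonical tag looks like in a page’s source. It lives in the page’s head, and the URLs below are placeholders:

```html
<!-- On a duplicate page (say, a URL with tracking or sort parameters), -->
<!-- the canonical tag points search engines at the preferred version: -->
<head>
  <link rel="canonical" href="https://example.com/preferred-page/" />
</head>
```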
Bonus tip: Leverage a Screaming Frog crawl to uncover duplicate metadata, such as page titles and meta descriptions.
Factor 2: 301 redirects
Let’s say your website recently migrated, and a bunch of 404 (page not found) errors were created in the process. The correct solution is to build a list of the 404 pages and map each one to a new preferred landing page with a 301 (permanent) redirect. This ensures that any traffic arriving at a missing page is automatically sent to relevant content, which makes for a better user experience and lower bounce rates.
Use Screaming Frog to view a list of 404 pages: under “Response Codes,” filter for “Client Error (4xx).” You can then export the list via the “Bulk Export” main menu. From there, it’s a manual process of matching each broken URL to its best replacement. Once you have a document of mapped solutions, set a meeting with your developer to explain what on the site needs fixing.
Pro tip: Make sure not to create any redirect chains (a missing URL should reach its final destination in a single hop, not redirect multiple times).
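To make that concrete, here’s a minimal sketch of what the mapping might look like if your site runs on Apache (the paths and domain are hypothetical; other servers have equivalent directives):

```apache
# .htaccess -- 301 (permanent) redirects for missing pages.
# Each old URL goes straight to its final destination: no chains.
Redirect 301 /old-services.html https://example.com/services/
Redirect 301 /2016/summer-sale https://example.com/promotions/

# Avoid a chain like:
# /old-services.html -> /services-v2/ -> /services/
```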
Factor 3: Robots.txt file
A robots.txt file is a manually created text file that tells web crawlers how to crawl your website. In most instances, a given website will have pages you don’t want crawled, in which case you list the off-limits paths within the text file itself using User-agent and Disallow directives. (Be careful: User-agent: * followed by Disallow: / blocks crawlers from your entire site.) The easiest way to test your robots.txt file is to head over to Google’s very own robots.txt tester.
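As a minimal sketch, a robots.txt file that keeps crawlers out of a couple of hypothetical private sections while leaving the rest of the site open might look like this:

```
# robots.txt -- served from the site root, e.g. https://example.com/robots.txt
User-agent: *      # applies to all crawlers
Disallow: /admin/  # keep crawlers out of the admin area
Disallow: /cart/   # and out of cart pages

# "Disallow: /" by itself would block the ENTIRE site -- handle with care.
Sitemap: https://example.com/sitemap.xml
```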
Factor 4: PageSpeed Insights
Don’t shy away from other helpful troubleshooting tools like PageSpeed Insights, a free, user-friendly web tool anyone can use to analyze on-page content and generate suggestions for making pages load faster.
Pro tip: Compare the scores of the desktop and mobile versions of your site to gain a better understanding of where the trouble areas are. Pay special attention to your mobile site: with Google’s mobile-first indexing, the mobile version of your pages is the one Google primarily uses for rankings.
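If you’d rather script your checks than use the web interface, PageSpeed Insights also exposes a public API. A sketch of a mobile analysis request, with a placeholder target URL:

```sh
# Query the PageSpeed Insights API (v5) for a mobile analysis.
# strategy can be "mobile" or "desktop"; swap in your own page URL.
curl "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=https://example.com&strategy=mobile"
```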
Factor 5: Competitive analysis
Competitive analysis should never be overlooked. Pay close attention to what your competitors are doing. Are they using structured data to mark up pages?
Download the MozBar and simply visit a competitor’s page. If you toggle the toolbar on and click “Markup,” you can see what type of structured data markup a page is using to enhance its ranking signals. Create a separate document to log your findings, and when the time is right, talk to your developers to see where they can implement structured data; a sketch of what that markup looks like follows below.
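For context, structured data is most commonly added as a JSON-LD block in the page source using the schema.org vocabulary. A minimal sketch for a hypothetical article, with all values as placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Five Technical SEO Factors to Check",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2018-06-01"
}
</script>
```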

Content, links, and code are the three main ranking factors. If you’ve got the content and you’ve got the backlinks, it’s time to address the code on your website. The tools we covered in this article should have you off and running in the right direction. Don’t be afraid to flex those technical skills of yours! For more questions related to technical SEO, contact us today to see how we can help!