SEO Part 2: SEO Ranking Factors and How to Check Them

Posted: January 14, 2015

After giving you some time to let the first installment of this blog sink in, we are ready to share the technical side of things.

In the first blog of this series on DIY SEO, I told you about the tools you need to conduct your own audit. Now you'll want to know exactly how to use them. To do that, you should know what you're looking for when you use each one.

Bot Access/Site Reachability

In order for your website to show up (rank) in search results, a web robot has to have access to your site to crawl all of its pages. When a search engine crawls the pages of your site, it files them in that search engine's index. A site owner should ensure these bots have access to the pages of a website so those pages can rank in search. VisionPoint recommends checking your website's /robots.txt file and sitemap to ensure crawling access. You can also assess current page ranking to check for any critical issues that need immediate attention.

Robots.txt

/Robots.txt is located at yourschool.edu/robots.txt. It's a file that tells web robots which pages they can and cannot crawl. Check this file to ensure the website is not blocking bots from crawling any critical pages that should be indexed by search engines. Pages that need to be crawled include any pages you are trying to drive traffic to, such as the home page, landing pages and informational pages. The /robots.txt file is typically used to block bots from crawling non-public parts of a website and duplicate content (like print versions of an HTML page). To check this file, just append /robots.txt to your website's homepage URL and scan the file to ensure no important paths are being disallowed.
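If you'd rather check programmatically, Python's standard library includes a robots.txt parser. Here's a minimal sketch; the domain and the page paths are placeholders for your own:

    # Check whether key pages are crawlable according to robots.txt.
    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser("https://yourschool.edu/robots.txt")
    rp.read()  # fetch and parse the live robots.txt file

    # Pages you want search engines to be able to crawl (placeholders)
    important_pages = ["/", "/admissions", "/academics"]

    for path in important_pages:
        allowed = rp.can_fetch("Googlebot", "https://yourschool.edu" + path)
        print(path, "->", "crawlable" if allowed else "BLOCKED by robots.txt")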

Sitemap

A sitemap is a list of all the pages on a website that should be indexed by search engines. It's important to have one on your website so web bots can index it easily. A sitemap should include every page from your website that you want to show up in search. There should not be any duplicate URLs in your sitemap, and every URL should be absolute (fully qualified), rather than relative.
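For reference, here's what a minimal sitemap looks like under the sitemaps.org protocol. The URLs are placeholders, and note that each one is absolute:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://yourschool.edu/</loc>
      </url>
      <url>
        <loc>https://yourschool.edu/admissions</loc>
      </url>
    </urlset>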

The Screaming Frog SEO Spider Tool that I referenced in the toolkit can be used to generate a sitemap. The sitemap should be placed on the website and then submitted through the site's Google Webmaster Tools account. Doing this tells Google which pages you want crawled, and any issues with accessing URLs on the sitemap will be reported through Webmaster Tools.

Google usually crawls a sitemap within 24 hours of submission, and Webmaster Tools will report how many of the listed URLs have been indexed by Google. Once the number of pages indexed is known, divide that number by the total number of pages on the sitemap to come up with the inclusion ratio. The higher the inclusion ratio, the healthier the indexation. A low inclusion ratio signals an indexation problem and points to problematic URLs that should be fixed.
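The math is simple. For example (with made-up numbers):

    # Inclusion ratio = indexed URLs / URLs submitted in the sitemap.
    # These counts are illustration values, not benchmarks.
    indexed_urls = 1800   # reported by Webmaster Tools
    sitemap_urls = 2000   # total URLs in your sitemap

    print(f"Inclusion ratio: {indexed_urls / sitemap_urls:.0%}")  # -> 90%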

Page Ranking

Once bots know your website exists and can access its pages, it's time to look at how search engines rank your website's pages in search results. PageRank is affected by many elements of your website and of the individual page, including the website's accessibility as well as the content and keywords on a page. Because PageRank is affected by so many things, VisionPoint likes to start by taking a look at a website's current situation.

By going to pagerankchecker.com, you can see where your homepage, and other important pages, sit on a search engine results page (SERP). This can be an indication of the health of your website. Other factors of your domain can be checked on this site as well. When the domain is typed into the rank checker, a profile is generated showing your domain's PageRank, the keyword density of the domain's most prevalent keywords, and social signals. Social signals show counts of Facebook likes and shares as well as Twitter and Google +1 activity for the domain.

Through the process of checking your top pages' rank on search engines, along with the sitemap and /robots.txt file, you'll have an idea of your domain's current performance. If the /robots.txt file is blocking bots from crawling your homepage and your website isn't ranking in the top 10 SERP results, then you know right at the start of your audit that there is a critical issue to address. After checking all of these things, you have successfully finished the first part of your SEO audit!

User Experience (UX)

User experience is an important factor for how people interact with your site. UX is about the ease with which users can find and navigate your content. A good user experience helps visitors accomplish their goals and, in turn, helps your institution meet its goals.

The following factors should be assessed to optimize a website’s UX:

User Friendly URLs

URLs are important for ease of navigability. URLs with excessive parameters are hard to read and can create confusion for users. URLs on a site should be simple and straightforward, including keywords so a user will know what page they are being taken to just by reading the URL. A readable URL looks like visionpointmarketing.com/about-us. By reading this URL you can pretty safely assume you will be taken to VisionPoint’s About Us section. Every URL on a website should be written like this one.

Other guidelines to follow when creating URLs for your site:

  • Keep URLs to 155 characters or less
  • Have them read in a natural fashion. For example, visionpointmarketing.com/about-me vs visionpointmarketing.com/=?4275n
  • Use a hyphen (-) to separate words rather than an underscore (_)
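If you want to spot-check a list of URLs against these guidelines in bulk, a rough Python sketch like the one below works. The URLs and the "reads naturally" heuristic are just illustrations:

    import re

    # Placeholder URLs; in practice, paste in your exported list
    urls = [
        "https://visionpointmarketing.com/about-us",
        "https://visionpointmarketing.com/current_students",
        "https://visionpointmarketing.com/=?4275n",
    ]

    for url in urls:
        problems = []
        if len(url) > 155:
            problems.append("longer than 155 characters")
        if "_" in url:
            problems.append("underscores instead of hyphens")
        # Crude readability heuristic: the path should contain only
        # lowercase words, digits, hyphens and slashes
        path = url.split("//", 1)[-1].split("/", 1)[-1]
        if not re.fullmatch(r"[a-z0-9/-]*", path):
            problems.append("parameters or unreadable characters")
        print(url, "->", "; ".join(problems) or "OK")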

To avoid clicking through your entire website to check all of your URLs, you can just download a single report from the Screaming Frog tool and quickly analyze your top landing pages for best practices. Ideally you would analyze every single URL on your site for optimization. However, time and budget are valuable and often limited resources. VisionPoint recommends focusing on your website’s most important pages. Pages that don’t contribute to measurable business goals, don’t address a common user goal, or don’t really matter in the first place can be dismissed in this part of the audit. We want to focus on those critical pages that prospective students should have easy access to.

First, log into your Google Analytics profile and determine the most common entry points to your website and focus your efforts on these. Depending on the size of the website you should compile a list of 10-30 top landing pages. Find your landing pages by scrolling down to Behavior in the left navigation on the main dashboard of the profile page, then click Site Content and then Landing Pages. Export the top 50 landing pages into a CSV file and then sort them by Sessions in descending order.

Going to the Internal Links tab of the Screaming Frog crawl and clicking Export generates an Excel file. At first glance this spreadsheet may look like an overwhelming amount of data. Here at VisionPoint, we love pivot tables, so throw the data into one and find your top landing pages. Analyze their URLs for readability. Do they read naturally? Is each word separated with a hyphen? Are they each 155 characters or less? Improving your website's URL structure is an easy way of optimizing website UX!
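If you'd rather script the pivot than build it by hand, here's a sketch using pandas. The file names and column headers ("Destination", "Landing Page", "Sessions") are assumptions; match them to your actual exports:

    import pandas as pd

    links = pd.read_csv("all_inlinks.csv")       # Screaming Frog internal links export
    landing = pd.read_csv("landing_pages.csv")   # Google Analytics landing pages export

    # Pivot: count how many internal links point at each destination URL
    inlink_counts = links.pivot_table(index="Destination", aggfunc="size")

    # Check the top landing pages by sessions
    top = landing.sort_values("Sessions", ascending=False).head(30)
    for url in top["Landing Page"]:
        print(url, "- internal links pointing here:", inlink_counts.get(url, 0))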

Another UX factor to look at is your site's structure. Site structure refers to the hierarchy of pages on your website. It's important to arrange information so it can be easily found by users and easily crawled by search engines. Typically, you want a flat (rather than deep) website structure, with pages no more than 3 clicks away from the home page. That standard can seem nearly impossible to reach on a site with 10,000 pages, so we again focus on the top pages, ensuring that they are no more than 3 clicks away from the home page and that they are easily accessible to users and search engines.

An easy way to look at your website's structure is through your Screaming Frog crawl. On the right side of the window there is a Site Structure tab that lets you view the layout of your site as a graph, with depth (the number of clicks needed to reach a page) across the x-axis and the number of pages at that depth along the y-axis. This is a good high-level view of a website.

If you download your internal crawl from Screaming Frog you can take a look at your top pages to see how deep in the site’s structure they are located. If your Student Admissions page is located in a place where a visitor has to click on 7 different links before they reach it, they may give up before finally finding the page.
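Screaming Frog reports crawl depth for you, but if you want to compute click depth yourself from an edge list of internal links, a breadth-first search does it. A minimal sketch with a toy link graph:

    from collections import deque

    # Toy internal-link graph: page -> pages it links to. In practice,
    # build this from the Screaming Frog internal links export.
    edges = {
        "/": ["/academics", "/about-us"],
        "/academics": ["/academics/admissions"],
        "/about-us": [],
        "/academics/admissions": [],
    }

    depth = {"/": 0}          # the home page is 0 clicks away
    queue = deque(["/"])
    while queue:
        page = queue.popleft()
        for linked in edges.get(page, []):
            if linked not in depth:        # first visit = shortest click path
                depth[linked] = depth[page] + 1
                queue.append(linked)

    for page, d in sorted(depth.items(), key=lambda kv: kv[1]):
        print(f"{d} clicks: {page}" + ("  <- too deep" if d > 3 else ""))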

Sharing/Linking Content

Social Engagement

Your social footprint is another important off-page factor that can help optimize your site for better search performance. If you think about it, social media is just another form of link building. Just a few years ago you were ahead of the game if your institution had its own Facebook or Twitter account. Now you are missing out on valuable exposure if you aren't plugged in and constantly networking. A social footprint is crucial for continuing to spread brand awareness and encouraging brand engagement.

Active social media profiles drive high-quality traffic! If someone is engaging with your social media profile, they are an interested user. This is why it's important to create relevant and engaging content for your social media profiles and to encourage sharing. Further, separate profiles for individual programs help you share relevant content with the right audience. This may call for a long-term social media strategy for your institution, which can be well worth the time if it means driving a highly targeted, high-quality audience to your site.

Backlinks

A backlink profile refers to all external links pointing to your website. The external websites that make up this profile are important indicators to search engines of your website’s credibility and popularity. Links on the Internet form relationships between pages and websites. By building a strong backlink profile you are collecting votes that signal quality and relevance to search engines so your website can bypass competing websites and rank higher in organic search results. There are some measures you can take to ensure your website has a healthy backlink profile.

You can download your backlink profile through your Google Webmaster Tools account into a spreadsheet to see who is already linking to your site. Ideally your website's backlink profile should be diverse and full of natural links (links from real websites with relevant content).

When analyzing the information, keep your eye out for spammy links, websites that have no relevance to the topics of your website, and websites with low domain authority. As a rule of thumb, I usually eliminate links from websites with a domain authority of 25 or lower. Bulkdachecker.com is a good tool for checking domain authority because it lets you check up to 200 URLs at a time and download the results into an Excel spreadsheet for easy sorting.
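Once you have domain authority scores alongside your backlinks, flagging the low-authority ones is a quick filter in pandas. The file and column names here are assumptions; adjust them to your own spreadsheet:

    import pandas as pd

    # Backlink export merged with domain authority scores (placeholder names)
    backlinks = pd.read_csv("backlinks_with_da.csv")

    low_quality = backlinks[backlinks["Domain Authority"] <= 25]
    print(f"{len(low_quality)} of {len(backlinks)} backlinks flagged for review")
    low_quality.to_csv("backlinks_to_review.csv", index=False)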

After singling out the unwanted backlinks you need to disassociate them from your website. You can request to have your website's link removed from the linking site, or you can disavow the link. Disavowing backlinks is the act of asking Google not to take particular links into account when assessing your website for quality and relevance. Per Google, the disavow tool should not be used excessively. Abuse of the tool can do your site's performance more harm than good, so it's important to only disavow links that are particularly spammy or malicious, and only if you are certain those links are causing your website issues.
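For reference, the disavow file you upload to Google is plain text: one URL or domain per line, with a domain: prefix to disavow an entire site and # for comments. The sites below are placeholders:

    # Spammy directory that ignored our removal request
    domain:spammy-directory.example.com

    # A single low-quality page rather than the whole site
    http://low-quality.example.com/links.html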

It’s important to note that having 100 backlinks doesn’t mean you’ll be ranked at the number one spot on a SERP. The quality of these links is a much more important indicator to search engines than the quantity of the links. You want to focus your efforts on obtaining links from high quality sites that are relevant to yours. You can gain these links by creating valuable content that can be shared and referenced by other people. It’s important to encourage the sharing of your content and make it easy to do so.

Get Started!

If you hit the biggest pressure points in the 4 main categories of SEO above, and lean on Google Help for some extra support, I guarantee you’ll have a better understanding of what search engines measure when pulling up search engine results and how you can optimize your website for better performance. That being said, even the simplest SEO audit can be pretty involved. If you’d like to talk more in depth, don’t hesitate to reach out. We’re always happy to lend a hand.