6 Ways to Make SEO Gains

Posted on June 11, 2020 by .BIBLE Registry Categories: SEO Research

Are you doing your part to get your website indexed by search engines? Have you checked all the necessary boxes to make your site bot-ready?

There remains a bit of mystery around search engine ranking, but we do know some of the critical components for creating content that is indexed and ranked as relevant. Let’s explore those points, and also provide a little clarity to the subject while we’re at it.

First, let’s explore how your website ends up on a search engine results page (SERP).

Moz explains that web crawlers, also referred to as spiders or bots, have three main functions: to crawl, index and rank.

  • Bots crawl the Internet looking for content to analyze. They’re almost always operated by search engines, software systems like Google, Bing or DuckDuckGo.
  • Once the information (web pages, images, PDFs, etc.) is analyzed, it’s indexed in a way that’s organized and retrievable by the search engines.
  • Finally, the content is ranked so the search engine can “provide the pieces of content that will best answer a searcher's query.”

Cloudflare paints a great picture of this process:

“A web crawler bot is like someone who goes through all the books in a disorganized library and puts together a card catalog so that anyone who visits the library can quickly and easily find the information they need. To help categorize and sort the library's books by topic, the organizer will read the title, summary, and some of the internal text of each book to figure out what it's about.”

Once a site is logically categorized and retrievable by search engines, gains can be made in the area of search engine optimization (SEO).

What kinds of things can you do to help this indexing process and achieve SEO? Use this checklist to cover critical components of making your site crawler-friendly.

1. Submit a sitemap

The structure of the site needs to be crawlable. Although a sitemap isn’t required, it does greatly help the bots crawl, index and rank your site. This is particularly true if your site is large or if it’s new and not backlinked yet.

For example, suppose you create a web page for a new project and distribute that link for promotional purposes during a high-profile campaign, but the page isn’t accessible through the site navigation. That page, and the valuable content it holds, will likely not be indexed. Avoid that pitfall and submit a sitemap through Google Search Console.
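As a sketch of what gets submitted, here is a minimal sitemap file following the sitemaps.org protocol; the URL and date are placeholders, not real entries:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled and indexed -->
  <url>
    <loc>https://example.bible/new-project</loc>
    <lastmod>2020-06-11</lastmod>
  </url>
</urlset>
```

Save the file as sitemap.xml at your site root, then submit its URL in the Sitemaps section of Google Search Console.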

2. Name every file

Do you have a friend with a label maker? Think of this exercise as virtual labeling. If every piece of content is named in a way that aptly describes the corresponding resource, it has the chance to be ranked as more relevant. If you want to attempt an even higher ranking, try connecting the name to a keyword used in your site. This is especially important for files like images and videos.

Pro tip: To be safe and help ensure indexing, add the description or connected text within the HTML markup of your webpage, for example as alt text on images.
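For instance, a descriptively named image file paired with alt text gives crawlers two signals about the same resource (the filename and alt text below are illustrative):

```html
<!-- Descriptive filename plus alt text: both are readable by crawlers -->
<img src="/images/jerusalem-old-city-walls.jpg"
     alt="Stone walls of Jerusalem's Old City at dusk">
```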

3. Curate what’s indexed

Whether you want to hide or highlight pages, there are methods to achieve your objective. Pages that you want to be indexed should be visible on your sitemap, as discussed above. There might be items you don’t care to have indexed, like duplicates or pages that are used internally. There’s simple code to block the bots from indexing those pages. Learn more and find the code on Neil Patel’s blog.
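As an illustration of the kind of code involved (not the specific snippet from Neil Patel’s blog), a robots meta tag asks compliant crawlers not to index a page:

```html
<!-- Place in the <head> of duplicate or internal-only pages -->
<meta name="robots" content="noindex">
```

A related but distinct tool is robots.txt, which blocks crawling of whole paths rather than indexing of individual pages:

```
User-agent: *
Disallow: /internal/
```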

4. Clean up material that isn’t easily crawlable

Here’s a sampling of material on your site that will stall the crawl:

  • JavaScript isn’t 100% crawlable. Follow the philosophy of graceful degradation and either replace JavaScript-only links or add plain HTML to pages whose content might not be crawled.
  • Clean up bad backlinks, where low-ranking sites link to your site.
  • Eliminate links to social bookmarking sites like Reddit.
  • Press release links often aren’t indexed because of how easily they can be published. If you want that material indexed, find another short-form method of sharing.
  • Delete links from questionable sites.
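The graceful-degradation point above can be sketched as a link that works with or without JavaScript; the openStudy function and URL here are hypothetical, but the pattern is standard: the plain href gives crawlers a crawlable path even if the script never runs.

```html
<!-- Crawlable fallback: bots follow the href; browsers with JS
     run the handler instead -->
<a href="/studies/john" onclick="openStudy('john'); return false;">
  Gospel of John study
</a>
```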

5. Carefully link

Crawlers like content, but won’t necessarily treat every link listed on a page as indexable. Add helpful links in logical places on the site, and review your site architecture to make sure every page can be reached. Don’t forget to remove bad backlinks floating around the web, which is a hefty but worthwhile undertaking.

6. See what the bots see

Take a look at your site through a crawler simulator. These tools give you the vantage point of a bot, showing your pages without any bells and whistles. From there you can spot flaws and plan changes to improve your site.

With a simulator, you can see more clearly which content is preventing search engines from ranking your site on the SERPs.
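To see roughly what a simulator shows, here is a minimal sketch using Python’s standard-library html.parser. The CrawlerView class and sample page are illustrative, not a real search-engine crawler: it simply strips scripts and styles, then collects the visible text, image alt text, and links a bot could follow.

```python
from html.parser import HTMLParser

class CrawlerView(HTMLParser):
    """Rough approximation of what a bot 'sees': visible text and links."""
    def __init__(self):
        super().__init__()
        self.links = []   # hrefs a crawler could follow
        self.text = []    # text a crawler could index
        self._skip = 0    # nesting depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
        elif tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)
        elif tag == "img":
            alt = dict(attrs).get("alt")
            if alt:
                self.text.append(alt)  # alt text is crawlable

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.text.append(data.strip())

# A tiny sample page (hypothetical content) to run through the parser
html = """<html><head><script>var x=1;</script></head>
<body><h1>Bible Study Guide</h1>
<a href="/resources">Resources</a>
<img src="cross.jpg" alt="Wooden cross at sunrise">
</body></html>"""

view = CrawlerView()
view.feed(html)
print(view.text)   # indexable text, including alt text
print(view.links)  # links a bot can follow
```

Note that the script contents never appear in the output, which mirrors the point above: content that only exists inside JavaScript may be invisible to crawlers.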

Creating a website that’s easily indexed is paramount to being found and listed on SERPs. Plan to work with your team to start making some of these changes, which will improve your user experience as well as your site’s indexability.

Learn more about SEO and digital marketing best practices on the .BIBLE blog.