In Website 102: 11 Tasks To Search Optimization you will complete very important tasks that have a huge impact on the health and effectiveness of your website in getting found online. You’ll be doing key items to make sure Google, Bing, and Yahoo! are able to find and index your website, as well as implementing key enhancements for measurement. It’s all about establishing your online presence.

Once a number of these site-wide optimizations are completed, we will then do page-by-page optimizations to take advantage of key opportunities.

Create an account at Google Search Console and implement it on the website.

If you reviewed Website 101: 11 Steps to Set Up Your Online Success, most of this is already completed. However, we need to dig deeper.

About Task

Google Search Console provides a free and easy way to make your website more Google-friendly. See your website the way Google sees it:

  • View which of your pages are included in Google’s index
  • See any errors encountered while Google attempted to crawl or index your site
  • Find search queries that list your site as a result
  • Find out which sites link to yours

How To Complete

To get set up with Google Search Console, click here, submit your sites, and start collecting data in your Search Console Dashboard.

Click here for more information.

Make sure all content on your website is unique.

About Task

Having unique content on your website is one of the most important factors in contemporary search engine optimization. Major penalties can occur if search engines find that some of the information on your website is the same as information found elsewhere on the web (unless you provide a clear reference to the source). Since the Google Panda algorithm update, the uniqueness of a site’s content has been an essential part of every business’s Internet marketing.

How To Complete

Click here and insert your domain in the box. If no duplicate content is found, you are good to go! If some duplicate content is found, make sure you either remove it from your website completely or replace it with unique content.

Confirm that your website’s robots.txt settings allow search engines to crawl and index your website.

About Task

“Robots.txt” is a text file that tells search engines whether they can access and thus crawl parts of your website. If this text file is set up incorrectly, it may unintentionally keep your website from getting included in search engines.

How To Complete

If you edit your site’s files directly, you will most likely find robots.txt in the root directory of your site. Do not worry if it does not exist, as it is not a mandatory file. You can address this task as needed per the Google Webmaster Help Center.

If you use a content management system (CMS) to manage your site, refer to your individual CMS documentation about how to manage robots.txt through the CMS.

Note that not all CMS providers directly control the robots.txt file.
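
As an illustration, a permissive robots.txt that lets every crawler reach the whole site might look like the sketch below (the domain is a placeholder):

```
# Allow every crawler to access the entire site
User-agent: *
Disallow:

# Optional: point crawlers to your XML Sitemap
Sitemap: https://www.example.com/sitemap.xml
```

An empty Disallow line blocks nothing; a line like “Disallow: /” would block the entire site, which is exactly the mistake this task is meant to catch.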

Confirm that your website’s meta tag settings allow search engines to crawl and index your website.

About Task

A meta noindex tag can be used to entirely prevent a page’s contents from being listed in search engine results even if other websites link to it. If a meta noindex tag is accidentally inserted into your website, it will prevent the page from showing up in Google’s search results.

How To Complete

This is a technical task that requires you to edit your site’s HTML directly. To check for this tag on your homepage, search for the following text in your homepage’s source code:

<meta name="robots" content="noindex">

If you are certain that you do not want the page included in search engines, leave the tag as is. If you do want the page to be included, delete the tag and save your changes.
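
If you are comfortable with a little scripting, here is a minimal sketch using only Python’s standard library that checks a page’s HTML for a noindex directive (the sample HTML strings are hypothetical):

```python
from html.parser import HTMLParser

class RobotsMetaChecker(HTMLParser):
    """Collects the content of any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.directives.append((attrs.get("content") or "").lower())

def has_noindex(html: str) -> bool:
    """Return True if the HTML contains a robots meta tag with noindex."""
    checker = RobotsMetaChecker()
    checker.feed(html)
    return any("noindex" in directive for directive in checker.directives)
```

Run it against your homepage’s saved source to confirm whether the tag is present before deciding to keep or remove it.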

Confirm that you are not serving duplicate content over both http and https.

About Task

If your site allows both the http and the https version of pages to exist with the same content served for each page, search engines may view that as duplicated content.

E.g., http://www.example.com & https://www.example.com

How To Complete

There are two major solutions for duplicate content:

  • Redirect duplicate content to the canonical URL using a 301-redirect
  • Add a rel="canonical" link element to the duplicate page pointing towards the canonical URL
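
On an Apache server with mod_rewrite enabled (an assumption; other servers use different syntax), the 301 approach can be sketched in .htaccess roughly like this:

```
# Force every http request to its https equivalent with a 301 redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

Check with your host or website designer before editing .htaccess, since a typo here can take the whole site offline.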

Confirm that you are not serving duplicate content due to a trailing “/” for any pages.

About Task

Google will treat a URL with and without a trailing slash as two unique URLs:

  • http://example.com/page/ (with trailing slash)
  • http://example.com/page (without trailing slash)

If you have identical content on both the trailing slash and non-trailing slash versions of the page, and both pages return a 200 response code (no redirect, no 404-error), Google will recognize this as duplicate content. Duplicate content is bad for your search engine optimization and should be avoided.

How To Complete

There are two major solutions for duplicate content:

  • Redirect duplicate content to the canonical URL using a 301-redirect
  • Add a rel="canonical" link element to the duplicate page pointing towards the canonical URL

Read more about the trailing slash from Google’s Webmaster Central Blog: To Slash Or Not To Slash
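
The same idea can be sketched in code: pick one canonical form and normalize every URL to it. This hypothetical Python helper assumes you have chosen the trailing-slash version as canonical:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url: str, trailing_slash: bool = True) -> str:
    """Normalize a URL so every page has exactly one canonical form."""
    scheme, netloc, path, query, frag = urlsplit(url)
    if path in ("", "/"):
        path = "/"  # the site root always keeps its slash
    elif trailing_slash:
        path = path.rstrip("/") + "/"
    else:
        path = path.rstrip("/")
    return urlunsplit((scheme, netloc, path, query, frag))
```

Whichever form you pick, the point is consistency: one version serves the content, the other 301-redirects to it.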

Confirm that you are redirecting to either www.URL.com or URL.com.

About Task

When you own a domain, you can actually have your site show up for two web addresses:

www.YourURL.com or YourURL.com

For the purposes of search engine optimization, you want to take users and search engines to one web address. You should select either www.URL.com or URL.com and then have one site “redirect” to the other.

For example, if you want users and search engines to all go to www.URL.com, you would have URL.com redirect to www.URL.com. In this scenario, if a user types “URL.com” into the address bar of a browser, they will actually be taken to www.URL.com.

How To Complete

In order to do a redirect from one version of your web address to the other, you need to do a “301 Redirect.”

This is a pretty technical process and you should reach out to either your website designer or contact support related to your content management system (CMS).
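
For reference, on an Apache server with mod_rewrite (an assumption; your host may use a different setup), the bare-domain-to-www redirect can be sketched in .htaccess like this (example.com is a placeholder):

```
# 301-redirect the bare domain to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301]
```

Many hosts and CMS dashboards also offer a “preferred domain” or redirect setting that does this without editing files.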

Monkeyhouse does NOT currently have a blog post on how to set up a 301 redirect. (Maybe we should.)

Until then, if you would like to try it on your own, here are two articles that provide some how-to instructions:

Daily Blog Tips – How to set up a 301 redirect.

Bruce Clay – How to implement a 301 redirect.

Please contact us if you have any questions on this.

Create and submit an XML Sitemap to Google and Bing.

About Task

An XML Sitemap is a file that lists all of the pages on your website. Creating and submitting a Sitemap helps make sure that the engines know about all the pages on your site, including URLs that may not be discoverable by normal crawling processes.

How To Complete

Click here to create your XML Sitemap. Run the XML creation script, download the sitemap file, and upload it to the “public_html/” or root folder of your site.

Be sure to alert Google that you have created a Sitemap. You can do this by adding your sitemap URL in your Google Search Console account.
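
If you would rather generate the file yourself, a minimal Sitemap can be sketched with Python’s standard library (the page URLs are placeholders):

```python
import xml.etree.ElementTree as ET

def build_sitemap(page_urls):
    """Build a minimal XML Sitemap string from a list of page URLs."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in page_urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")
```

Save the resulting string as sitemap.xml in your site’s root folder, then submit its URL to Google and Bing.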

Identify and fix any broken links on the website.

About Task

Dead pages are also known as ‘broken’ links. These are links on a page of your site that lead nowhere and can hurt your standing with Google.

Dead pages return a 404 error message when you click on them. This means the page no longer exists on the web. It may have been taken down by the host or moved to a different URL.

How To Complete

Run your site through this ‘broken’ link checker. It will tell you which links no longer work. If you find broken links, go into your website and remove or update all of the dead links.
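
A small script can also surface broken links. This sketch uses only Python’s standard library; extracting the links works offline, while the status check needs network access (the HTML and URLs are hypothetical):

```python
from html.parser import HTMLParser
from urllib.error import HTTPError, URLError
from urllib.request import Request, urlopen

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def extract_links(html: str):
    """Return every link found in the given HTML, in document order."""
    collector = LinkCollector()
    collector.feed(html)
    return collector.links

def is_dead(url: str) -> bool:
    """Return True if the URL returns 404 or cannot be reached at all."""
    try:
        with urlopen(Request(url, method="HEAD")):
            return False  # 2xx/3xx: the link works
    except HTTPError as err:
        return err.code == 404
    except URLError:
        return True
```

Feed each page’s HTML to extract_links, run is_dead on each result, and fix whatever comes back dead.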

Implement fixes to address any PageSpeed issues identified.

About Task

After running Google’s PageSpeed Insights tool, work to fix any priority items with a red exclamation point icon. These will have the largest impact on your page’s performance.

How To Complete

The results of your PageSpeed Insights test will determine the magnitude of this task. Learn more about PageSpeed Insights and a few of the best practices associated with the tool here: About PageSpeed Insights

If your website is being penalized by Google, disavow links that could be problematic.

About Task

If your site is penalized due to problematic links, you will need to disavow those links before submitting your site for reconsideration.

You can find Google’s disavow tool here.

How To Complete

Using Google’s disavow tool requires a lot of time and attention to detail, but can be extremely rewarding if done the right way.

Here is a guide to properly using Google’s disavow tool: About Disavow Tool
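
For reference, a disavow file is a plain text file with one domain or URL per line; the entries below are hypothetical examples of the format Google’s documentation describes:

```
# Lines starting with "#" are comments and are ignored
# Disavow all links from an entire domain
domain:spammy-example.com
# Disavow links from a single page
http://bad-example.com/links-page.html
```

Upload the finished file through the disavow tool; only use it for links you are confident are harming your site.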