Search engine optimization, or SEO, grows decidedly more complicated every year. Back in the late 2000s, you could still create a highly ranked website by stuffing it full of keywords and links. That is no longer the case, and search engines like Google continue to tighten their many rules around what is and isn’t good optimization. They also communicate (through forums, social media networks, and periodic announcements) what webmasters should be doing to improve SEO.

Google deliberately hides its ranking algorithm so that webmasters can’t game the system to rank #1 in the search engine results pages. However, several “hard and fast” rules do exist about what constitutes good SEO. These rules also help determine which mistakes should be avoided and corrected.

Here are several SEO mistakes that many webmasters make in their rush to optimize, along with ways that these issues can be resolved.

Keyword stuffing

Keywords should not be mentioned more than a few times in good long-form content; otherwise, the content comes off as spammy. Despite Google’s repeated warnings not to stuff keywords into content unnecessarily, many webmasters are still tempted to overuse their target terms.

Google uses Latent Semantic Indexing (LSI) to judge whether a piece of content flows naturally, looking at phrase similarities and related keywords. So, if your content repeats a single keyword over and over, there’s a good chance that Google will treat it as spam.

To correct this problem, use a free keyword density analyzer from SEMrush or Moz to measure what percentage of your content your selected keyword occupies. If the density is unusually high (above 3%), delete the extra mentions of the target keyword and substitute synonyms. Those related terms will in turn be picked up by Google’s LSI and work in your favor.
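
If you’d rather check density yourself, the calculation is simple: keyword occurrences divided by total word count. Here is a minimal Python sketch of that check; the file name and keyword below are placeholders for your own page copy and target term.

    import re

    def keyword_density(text, keyword):
        """Percentage of the page's words taken up by the keyword phrase."""
        words = re.findall(r"[a-z0-9']+", text.lower())
        phrase = keyword.lower().split()
        if not words or not phrase:
            return 0.0
        n = len(phrase)
        # Count every position where the full phrase appears in sequence.
        hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase)
        return 100.0 * hits * n / len(words)

    page_copy = open("article.txt").read()  # placeholder: your page copy
    density = keyword_density(page_copy, "seo mistakes")  # placeholder keyword
    print(f"{density:.2f}%")  # anything above ~3% is worth trimming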

Unfocused content

Just like a person who gets distracted at work by a bunch of unrelated tasks and never finishes the original job, content that veers off on tangents weakens a page’s overall SEO.

Rambling content with several different foci will confuse search engines, even if that content is long-form and otherwise good. This is because search engines have one mission only: to deliver relevant content that quickly and succinctly answers the user’s query. When a search engine like Google encounters unrelated content, it flags the page for irrelevant keywords.

If you are adding unrelated keywords and content with the intention of ranking for multiple keywords, stop this practice. A page should ideally rank well for one keyword, not three or five.

Not using the keyword tool

Google offers the Keyword Planner through its AdWords program, and this resource enables webmasters to create content centered around in-demand keywords.

[Screenshot: Google AdWords Keyword Planner]

Some webmasters simply forget to incorporate the planner into their content creation strategy. Others “shoot for the moon” and target keywords with unusually high competition. Such keywords have a low probability of ranking well because they are competing for recognition against older, better-established websites.

Focus on using low-competition, long-tail keywords within your niche via the Keyword Planner. Such keywords have a better chance of earning a decent ranking and, with it, traffic.
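
The Keyword Planner lets you export its suggestions as a CSV, which makes it easy to filter for the long-tail, low-competition terms described above. The sketch below assumes column headers of “Keyword” and “Competition” (with values such as “Low”); check your own export, since Google has changed these labels over time.

    import csv

    def long_tail_low_competition(path, min_words=3):
        """Yield long-tail keywords whose competition is marked 'Low'."""
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                # Assumed header names; match them to your own export.
                keyword = row["Keyword"].strip()
                if len(keyword.split()) >= min_words and row["Competition"] == "Low":
                    yield keyword

    for kw in long_tail_low_competition("keyword_planner_export.csv"):
        print(kw)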

Not registering with Google/Bing Webmaster Tools

Both Google and Bing offer webmaster tools that help website owners learn more about and optimize their content. Even better, submitting a website to Google Webmaster Tools or Bing Webmaster Tools is fairly simple.

[Screenshot: Google Search Console]

Once your website is registered, you can access information about its traffic, internal issues, device compatibility, and more. You also get helpful hints from each search engine about how to improve your stats and your site’s “crawlability.” This matters because, as Google’s Matt Cutts has noted, a site that cannot be crawled easily will be indexed poorly.
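
One quick crawlability check you can run yourself is confirming that robots.txt isn’t accidentally blocking pages you want indexed. Python’s standard library ships a robots.txt parser for exactly this; example.com and the paths below are placeholders for your own site.

    from urllib.robotparser import RobotFileParser

    # Fetch and parse the site's robots.txt.
    rp = RobotFileParser("https://example.com/robots.txt")
    rp.read()

    # Confirm that Googlebot may crawl the pages you care about.
    for path in ("/", "/blog/", "/products/"):
        url = "https://example.com" + path
        status = "crawlable" if rp.can_fetch("Googlebot", url) else "blocked"
        print(f"{url}: {status}")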

Not using unique titles and meta descriptions

Title tags are crawled by search engine bots and help a website rank higher for certain search keywords. It’s a shame, then, that some website owners use the same title for several pages, which ends up confusing both search bots and humans.

Meta descriptions don’t factor directly into rankings. However, each page’s meta description offers the webmaster a chance to create a kind of “sales pitch,” and this pitch is useful to readers looking for specific content. Unfortunately, too many webmasters ignore the meta description or, even worse, replicate the same pitch across all of their pages.

A good meta description tells viewers what your page is about and draws in traffic that is relevant to it. So, don’t let this opportunity pass you by.
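
Auditing titles and descriptions by hand gets tedious past a handful of pages, so a small script helps. This sketch uses only Python’s standard library to pull each page’s title and meta description and flag duplicates or gaps; the URL list is a placeholder for your own pages.

    from collections import defaultdict
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class TitleMetaParser(HTMLParser):
        """Collects the <title> text and meta description of one page."""
        def __init__(self):
            super().__init__()
            self.title, self.description = "", ""
            self._in_title = False

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "title":
                self._in_title = True
            elif tag == "meta" and attrs.get("name") == "description":
                self.description = attrs.get("content", "")

        def handle_endtag(self, tag):
            if tag == "title":
                self._in_title = False

        def handle_data(self, data):
            if self._in_title:
                self.title += data

    urls = ["https://example.com/", "https://example.com/about"]  # placeholder list

    seen_titles = defaultdict(list)
    for url in urls:
        parser = TitleMetaParser()
        parser.feed(urlopen(url).read().decode("utf-8", errors="replace"))
        seen_titles[parser.title.strip()].append(url)
        if not parser.description:
            print("Missing meta description:", url)

    for title, pages in seen_titles.items():
        if len(pages) > 1:
            print(f"Duplicate title {title!r}: {', '.join(pages)}")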

Not creating an XML sitemap

A website’s XML sitemap is a blueprint of all the pages on the site. It is this blueprint that search engine bots “see” and crawl to better understand the site’s structure. XML sitemaps are also called “URL inclusion tools” because they tell search engine bots exactly what should be crawled.

A good XML sitemap will contain URLs that complement each other in terms of topics and subject areas. Search engines also compare URLs against each other for relatedness.

While not having an XML sitemap doesn’t directly hurt a website’s SEO, having one does help search engines index your pages fully, which can improve rankings.
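
Many CMS platforms generate a sitemap for you, but rolling one by hand is straightforward. Here is a minimal sketch using Python’s standard library; the page list is a placeholder, and in practice you would pull it from your CMS or site router before submitting the finished file through Webmaster Tools.

    import xml.etree.ElementTree as ET

    pages = [  # placeholder URLs; substitute your site's real pages
        "https://example.com/",
        "https://example.com/blog/seo-mistakes",
        "https://example.com/contact",
    ]

    # Build the <urlset> root in the sitemap namespace, one <url><loc> per page.
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in pages:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = page

    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)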

Improving your website’s SEO

You don’t have to spend endless hours on your site to improve its SEO. Completing just one of the tasks outlined above, such as registering your website with Google’s or Bing’s webmaster tools, is a big step forward. You can do as little or as much optimization as your schedule allows. You might also choose to focus on a single page’s SEO and see how well you do. In short, any effort undertaken for SEO is better than none.

Last Updated: December 2, 2016
