10 Sure-Shot Ways to Get Penalized and Lose Your Rankings in the Post-EMD, Panda and Disavow Scenario

Internet marketing has turned into a fast-paced survival game in which newbies and gurus alike dread the scathing whims of Google. The search engine giant has made it clear with the recent Panda, EMD and Disavow updates that it will not rest until it has destroyed every SEO shortcut that previously used to work. The current scenario is one of fear and anxiety, as internet marketers go to sleep wondering whether their rankings will disappear overnight. In fact, many so-called ‘gurus’ have begun using scare tactics to lure newbies and inexperienced marketers into ‘update-proof’ SEO courses, many of which turn out to be scams.

What Google looks for in websites is clearly stated in its guidelines. Though it may feel as if Google has hired a pack of henchmen to run around the web taking down thousands of ‘low quality’ sites, the algorithm is just a program: a body of code that ranks sites according to their quality and relevance. Its preferences are not top-secret, and plenty of people understand a great deal about how it works.

Here are the top 10 reasons for which Google may downgrade a website. None of these practices is guaranteed to land a website on Google’s hit-list, but each of them exposes a site to penalization. Nor is it guaranteed that avoiding them will keep a website safe forever, but sites that refrain from committing such crimes will have an edge over others in holding their rankings longer.

Over-optimization

Over-optimization has become the most prominent topic in the post-Penguin world. Though it is a general concept that most internet marketers are familiar with, the finer details remain obscure to many.

According to Google, a website should rank by its natural relevance to the searched keyword. Over-optimization refers to deliberately inflating the relevance of a website’s content to a keyword. The practice of search engine optimization is not wrong in itself, but there is a limit. Over-optimization can be either on-page or off-page.

1.  Content Over-optimization

Content is king. It always has been. Search engines, whichever they may be, prefer sites with high quality content, and content created for the sole purpose of SEO is bound to repel them. There was a time when over-optimized content could crawl into the top rankings, but not anymore.

‘Keyword stuffing’ is the name given to the overuse of targeted keywords in content. Though Google does not state any permissible keyword density values in its guidelines, the range generally considered safe is 1-2%.

Try reading the content a few times. Make sure that it reads well, provides some value to the readers and makes sense. Use keywords only where they blend naturally into the text, and avoid repeating the same keyword too frequently.

Using keyword variations is a hugely search-engine-friendly habit. For example, for a keyword like ‘exercises for body building’, try variations such as ‘body building exercises’ or ‘exercise and build the body’. Keyword variations tell the search engines that the content was not written for the sole purpose of ranking.
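For those who like to check the numbers, keyword density can be estimated with a few lines of code. The sketch below (in Python, with a made-up sample text) counts how often a keyword phrase appears relative to the total word count; the 1-2% range mentioned above is only a rule of thumb, not a Google-published threshold.

```python
import re

def keyword_density(text, keyword):
    """Return the keyword phrase's share of total words, as a percentage."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    kw_words = keyword.lower().split()
    n = len(kw_words)
    # Count occurrences of the phrase among consecutive words.
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == kw_words)
    # Density: words belonging to the keyword relative to all words.
    return 100.0 * hits * n / len(words)

sample = ("Body building exercises work best when paired with rest. "
          "These exercises build strength over time.")
print(round(keyword_density(sample, "exercises"), 1))
```

Running the same function over a draft for each target keyword and its variations gives a quick sense of whether any one phrase dominates the text.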

2. Over-optimized anchor text

Frequently using the main keyword as anchor text was one of the main practices hit by the recent updates. Webmasters who used their primary keywords widely in anchor text suffered rank drops, and some websites even got de-indexed. Using exact-match keywords in links to boost relevance was once common practice, but in the current scenario it would be wise not to resort to it.

Try using more natural anchor text like “Click here” or “Click to find out more.” Note that overuse of any one text can be harmful; the link text must be varied as much as possible.
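One rough way to spot anchor-text over-optimization is to tally the anchor texts in a backlink report and flag any phrase that dominates. The sketch below uses an invented list of anchors, and the 30% cut-off is an arbitrary illustration, not a figure Google publishes.

```python
from collections import Counter

# Hypothetical anchor texts gathered from a backlink report.
anchors = [
    "click here", "credit management", "click here", "read more",
    "credit management", "credit management", "credit management",
    "this guide", "credit management", "homepage",
]

counts = Counter(anchors)
total = len(anchors)
for anchor, n in counts.most_common():
    share = 100 * n / total
    # Flag anchors that make up a suspiciously large share of all links.
    flag = "  <- over-used?" if share > 30 else ""
    print(f"{anchor}: {share:.0f}%{flag}")
```

In this made-up report, ‘credit management’ accounts for half of all anchors and would be flagged, which is exactly the kind of skewed profile the updates punished.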

3. Over-optimized meta-tags

Always try to keep the meta-tags natural. Meta-tags are vital to any page, as they let the search engines know what the page is about and add to its relevance in searches. Over-optimizing meta-tags means keyword stuffing them.

For example, for the keyword ‘Credit Management’, it would be better to use variations like ‘Credit Management Company’ or ‘Manage Your Credit’ in the meta-tags rather than directly using ‘Credit Management’ or ‘Good Credit Management’.

4. Paid Linking

The paid linking strategy has been on Google’s hit-list from the very beginning, and it is treated even more harshly now. Never sell or buy links, period. In the case of paid advertising, display the link as sponsored and use only nofollow links. A long line of sites has been de-indexed over paid linking.
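Before publishing a page that carries paid placements, the outgoing links can be scanned for missing nofollow attributes. This is a minimal sketch using Python’s standard html.parser; the sample page markup is invented.

```python
from html.parser import HTMLParser

class PaidLinkChecker(HTMLParser):
    """Collect hrefs of <a> tags that lack rel="nofollow"."""

    def __init__(self):
        super().__init__()
        self.followed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").lower().split()
        if "nofollow" not in rel:
            # This link would pass ranking credit to its target.
            self.followed.append(attrs.get("href"))

page = ('<p>Sponsor: <a href="http://example.com/ad" rel="nofollow">Ad</a> '
        'and <a href="http://example.org/paid">another ad</a></p>')
checker = PaidLinkChecker()
checker.feed(page)
print(checker.followed)
```

Any href the checker reports for an ad slot should be reviewed and given a nofollow attribute before the page goes live.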

5. Non-contextual Links

Always try to use contextual links. Many webmasters resort to linking within page headers or footers, and this simply does not look natural to the search engines.

6. Poor Link-Diversity

Link diversity is all about looking natural to the search engines. It is ideal to have a mixture of relevant and irrelevant, high and low quality links in the website’s backlink profile, obtained from a wide variety of sources. A 70-30 ratio between high quality and low quality links is considered ideal.

7.  Rapid Backlinking

Ever wondered how fast a link building campaign should be? As slow as possible, period. Google looks for natural websites with high quality content and a diverse backlink profile built over a long period. A new site will never attract backlinks as fast as an established authority site, so just be slow and steady.

8. Duplicate Content

Content is king, period. Never steal content from other websites; if discovered, it can lead to de-indexing in no time. Many article directories provide content with usage rights, but credit must be given to the author. It is a no-brainer that Google undervalues sites with duplicate content.
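A crude self-check for duplication is to compare a new article against its suspected source with a string-similarity measure. The sketch below uses Python’s standard difflib with two invented passages; real plagiarism checks rely on far more robust methods (shingling, search queries) than a raw character ratio.

```python
import difflib

original = ("Content is king. Search engines prefer sites with high "
            "quality, original articles written for real readers.")
rewrite = ("Content is king. Search engines prefer sites with high "
           "quality original articles that are written for readers.")

# Ratio near 1.0 means the texts are close to identical.
ratio = difflib.SequenceMatcher(None, original, rewrite).ratio()
print(f"similarity: {ratio:.0%}")
```

A near-copy like the one above scores very high; genuinely original writing on the same topic would score far lower.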

9. Thin or Spun Content

Article spinning was once an SEO revolution: webmasters could easily create tons of ‘unique’ content from a single article. However, spinning is against Google’s guidelines. In most cases, spun content is unreadable, filled with grammatical and structural errors that hinder readability. Google does not like this one bit.

Thin content refers to poorly written content used for the sole purpose of directing traffic to PPC or affiliate ads. It can be too shallow, with very few words, or of such low quality that it provides little or no value to visitors. Google treats sites with thin content as created merely to host ads rather than to serve the user.

10. Content Cloaking

Content cloaking is a spammy method in which the content shown to search engine bots differs from the content shown to users. In the current scenario, cloaking is utterly pointless; it is simply impossible to fool Google. Cloaking might have worked earlier, but now it is a ‘do and die’ strategy that can get a site banned in no time. It would be wise never to resort to such infamous practices, which are sure to antagonize the search engines.
