Why Has My Website Been Removed from Google?

It’s a website owner’s worst nightmare. Having a site removed from Google’s index means you no longer receive traffic from the search engine. With no search engine traffic, your sales plummet, and your online business’s future is at risk. Being removed from Google’s index isn’t permanent, though, as long as you fix the underlying problem, which can range from an egregious violation of Google’s guidelines to a simple server configuration error. Here are some quick ways to identify the issue and get your site back into Google’s good graces.

Google Webmaster Tools URL Removal

One of the most common reasons for a site to drop from the index is misuse of the URL removal tool in Webmaster Tools. Webmaster Tools has a section where webmasters can quickly remove URLs from the index. It’s intended for pages containing sensitive information that must come down immediately, or for batches of indexed pages that you no longer want in the index.

Many SEOs and webmasters misuse this tool to remove pages that return a 404 (not found) server message or to work around canonical issues. The tool should never be used for either purpose, and misusing it this way can end with the SEO or webmaster accidentally removing the entire site.

In some cases, a disgruntled employee or an SEO with access to your Webmaster Tools console may maliciously remove the domain from the index. For this reason, always give SEOs and employees “read only” access to Webmaster Tools.

To remedy this issue, go into Webmaster Tools and click the “Remove URLs” link in the “Google Index” section. Click “All” in the drop-down to view all removal requests, then click “Cancel” if your domain is listed. It takes a few days for the URLs to reappear, but Google will restore the site to its previously indexed status.

Spun Content or Extremely Poor Quality Content

Spun content is horrible content, and you should never rely on machine-generated text. Software called a “spinner” takes an existing article and attempts to create “unique” content by replacing certain words, typically adjectives and adverbs, with related words. The result is unreadable text that’s never useful for readers.

If your site is caught spinning content, Google places a manual penalty on the site, and in severe cases, Google removes the site entirely from the index. The only way to recover from this issue is to remove all spun content from the site. In some cases, this is all the content on the site. However, you must remove all spun content to have the manual penalty removed. After you remove the content, file a reconsideration request in Webmaster Tools.

Google recently started displaying manual penalties in Google Webmaster Tools. Click “Search Traffic” in the navigation panel and then click “Manual Actions.” This page tells you whether any penalties are applied to the site, so you know whether you need to take action to fix indexing issues.

Server Configuration Errors

Any server configuration error that blocks Google from accessing the site affects your site’s index status. If you have DNS errors, Google can’t reach the domain at all; if your server times out, Google can’t crawl your site.

One big cause of indexing problems is a poorly configured firewall. Some web hosts configure firewalls to block clients that make too many consecutive page requests, because a rapid burst of requests can trigger denial-of-service (DoS) detection software.

That’s useful when the client is a malicious attacker, but search engine bots crawl pages very quickly, and legitimate crawling can look like a DoS attempt. If your host’s firewall mistakes Googlebot for an attack, it blocks Google, and your site can drop in rank or be removed entirely. The only way to correct this issue is to ask your host to fix the firewall configuration or to move to another host.
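One rough way to test for this, sketched below under the assumption that your firewall rate-limits by request frequency, is to fire several requests at your own site back to back and watch the status codes (“www.example.com” is a placeholder for your own domain). If early requests succeed and later ones return 403 or stop responding, the firewall may be throttling crawlers.

```python
import urllib.error
import urllib.request

# Placeholder; point this at your own site.
URL = "http://www.example.com/"

# Fire requests back to back, the way a crawler would.
# A switch from 200 to 403 (or timeouts) partway through
# suggests the firewall is rate-limiting consecutive requests.
for i in range(10):
    try:
        code = urllib.request.urlopen(URL, timeout=10).getcode()
    except urllib.error.HTTPError as e:
        code = e.code
    except urllib.error.URLError as e:
        code = "failed: %s" % e.reason
    print("request %2d -> %s" % (i + 1, code))
```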

If any server configuration errors exist, Google usually reports them in Webmaster Tools. Check Webmaster Tools for errors that could affect your site’s indexing, such as 500 errors or DNS errors. Server 500 errors are general errors, but they typically point to coding issues. DNS errors mean Google can’t resolve your domain name to your server’s IP address; your website host can help you fix them.
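You can also spot-check both conditions yourself. The following sketch uses only the Python standard library; the host name and page URLs are placeholders for your own.

```python
import socket
import urllib.error
import urllib.request

# Placeholders; substitute your own domain and pages.
HOST = "www.example.com"
PAGES = ["http://www.example.com/", "http://www.example.com/about/"]

# DNS check: if this fails, Google can't find your server either.
try:
    print("DNS ok, resolves to", socket.gethostbyname(HOST))
except socket.gaierror as e:
    print("DNS error:", e)

# Status check: 200 is healthy; 5xx codes are the server errors
# Webmaster Tools reports.
for url in PAGES:
    try:
        print(url, "->", urllib.request.urlopen(url, timeout=10).getcode())
    except urllib.error.HTTPError as e:
        print(url, "->", e.code)
    except urllib.error.URLError as e:
        print(url, "-> connection failed:", e.reason)
```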

“Noindex” on Your Pages

The “noindex” meta tag tells Google not to index your pages. The tag can be manually coded into pages, and some content management systems, such as WordPress and Joomla, have settings in their control panels that add the tag for you.

To find out if the meta tag is set in your site’s code, open your website in any browser, right-click the page, and select “View Source” (the exact label varies by browser). Meta tags live within the “head” HTML tags; find all the meta tags in this section and make sure none of them contain “noindex.”
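The tag typically looks like <meta name=“robots” content=“noindex”>. If you have many pages to check, a short script can do the scanning for you; here is a minimal sketch using only the Python standard library, with “www.example.com” standing in for your own URL.

```python
import urllib.request
from html.parser import HTMLParser

class NoindexFinder(HTMLParser):
    """Collects any meta tag whose content attribute includes 'noindex'."""
    def __init__(self):
        super().__init__()
        self.hits = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if "noindex" in (attrs.get("content") or "").lower():
                self.hits.append(attrs)

def find_noindex(url):
    html = urllib.request.urlopen(url, timeout=10).read()
    finder = NoindexFinder()
    finder.feed(html.decode("utf-8", "replace"))
    return finder.hits  # empty list means the page is indexable

# Placeholder; run this against your own pages.
print(find_noindex("http://www.example.com/"))
```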

To fix this issue, remove the meta tag and wait for Google to recrawl the pages. If you use a content management system such as WordPress or Joomla, check the software’s control panel for any settings that block search engines from indexing the pages.

The Site is Brand New and You Need Patience

Site owners often assume Google indexes pages instantly, but Google needs to crawl the site before indexing anything. Indexing usually takes a few days unless you already have a backlink pointing to the site. If you submit a sitemap in Webmaster Tools, wait about two weeks before panicking. In most cases, your site is indexed within a week, but some sites take longer.

These are just a few places to look when checking for indexing issues in Google. Google rarely indexes every page on a domain, but as long as your site focuses on users and you maintain quality, Google will index and rank your website in time. While indexing is easy, ranking is difficult, so always focus on users and quality, and have patience when working on your site’s search engine optimization.

10 Sure-Shot Ways to Get Penalized and Lose Your Rankings in the Post EMD, Panda and Disavow Scenario

Internet marketing has turned into a fast-paced survival game where newbies and gurus alike dread the scathing whims of Google. The search engine giant has made it clear with the recent Panda, EMD and Disavow updates that it will not rest until it has destroyed every SEO shortcut that used to work. The current scenario is one of fear and anxiety, as internet marketers go to sleep wondering whether their rankings will disappear overnight. In fact, many of the so-called ‘gurus’ have begun using scare tactics to lure newbies and inexperienced marketers into ‘update-proof’ SEO courses, many of which turn out to be scams.

What Google looks for in websites is clearly spelled out in its guidelines. Though the whole thing can look like Google hiring a pack of henchmen to run around the web and take down thousands of ‘low quality’ sites, the algorithm is a program: a body of code that ranks sites according to their quality and relevance. Its preferences are not top-secret, and there are many who know a great deal about how it works.

Here are the top 10 reasons Google may downgrade a website. None of these tactics is guaranteed to land a website on Google’s hit list, but each one exposes a site to penalization. Likewise, staying away from them is no guarantee of permanent safety, but websites that refrain from committing such crimes will hold their rankings longer than those that don’t.

Over-optimization

Over-optimization has become the most prominent topic in the post-Penguin world. Though it is a general topic most internet marketers are familiar with, the finer details remain obscure to many.

According to Google, a website should rank according to its natural relevance to the searched keyword. Over-optimization refers to purposefully inflating the relevance of a website’s content to a keyword. The practice of search engine optimization is not wrong, but there is a limit. Over-optimization can be either on-page or off-page.

1.  Content Over-optimization

Content is king. It always has been. Search engines, whichever they may be, prefer sites with high quality content, and content created for the sole purpose of SEO is bound to repel them. There was a time when over-optimized content could climb to the top of the search rankings, but not any more.

‘Keyword stuffing’ is the name given to the overuse of targeted keywords in the content. Though Google does not state a permissible keyword density in its guidelines, 1-2% is widely considered the safe range.
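Keyword density is simple arithmetic: the number of times the keyword appears, times the number of words in the keyword, divided by the total word count. Here is a small sketch of one common way to compute it; the sample text is made up for illustration and is deliberately stuffed.

```python
import re

def keyword_density(text, keyword):
    """Density = (occurrences x words in keyword) / total words x 100."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase = keyword.lower().split()
    n = len(phrase)
    occurrences = sum(
        1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase
    )
    return 100.0 * occurrences * n / max(len(words), 1)

sample = (
    "Body building takes patience. Good body building exercises pair "
    "compound lifts with rest, and a body building plan needs real food."
)
# Prints about 28.6% -- far above the 1-2% guideline, i.e. stuffed.
print("%.1f%%" % keyword_density(sample, "body building"))
```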

Try reading the content a few times. Make sure it reads well, provides some value to the readers, and makes sense. Use keywords only where they blend naturally into the text, and avoid repeating the same keyword too frequently.

Using keyword variations is a very search-engine-friendly habit. For example, for a keyword like ‘exercises for body building’, try variations such as ‘body building exercises’ or ‘exercise and build the body’. Using keyword variations tells the search engines that the content was not written for the sole purpose of ranking.

2. Over-optimized anchor text

Frequently using the main keyword as anchor text was one of the main practices hit by the search engine updates. Webmasters who used their primary keywords heavily in anchor text suffered rank drops, and some websites were even de-indexed. It was once usual practice to use exact-match keywords in links to boost relevance, but in the current scenario it would be wise not to resort to such practices.

Try using more natural anchor text like “Click here” or “Click to find out more.” Note that overusing any single text can be harmful; vary the link text as much as possible.
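A quick way to audit this is to tally the anchor texts in your backlink profile. The sketch below assumes a hypothetical backlink export as (page, anchor) pairs; the entries are made up for illustration.

```python
from collections import Counter

# Made-up backlink export (linking page, anchor text); in practice you
# would pull this from a backlink tool.
backlinks = [
    ("http://blog-a.example/post",   "credit management"),
    ("http://forum-b.example/reply", "click here"),
    ("http://site-c.example/page",   "this useful guide"),
    ("http://blog-d.example/entry",  "credit management"),
    ("http://news-e.example/story",  "credit management"),
]

counts = Counter(anchor.lower() for _, anchor in backlinks)
total = sum(counts.values())
for anchor, n in counts.most_common():
    print("%-20s %4.0f%%" % (anchor, 100.0 * n / total))
# If one exact-match anchor dominates the list, the profile looks unnatural.
```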

3. Over-optimized meta-tags

Always try to keep the meta-tags natural. Meta-tags are vital to any website page because they tell the search engines what a page is about and contribute to the page’s relevance in searches. Over-optimizing meta-tags means keyword stuffing these tags.

For example, for the keyword ‘Credit Management’, it would be better to use ‘Credit Management Company’ or ‘Manage Your Credit’ in the meta tags rather than repeating ‘Credit Management’ or ‘Good Credit Management’ directly.
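To see how stuffed your own tags are, you can pull the title and meta description out of a page and count keyword repeats. Here is a self-contained sketch; the sample head section is made up and deliberately stuffed for illustration.

```python
from html.parser import HTMLParser

class MetaGrabber(HTMLParser):
    """Pulls the <title> text and meta description out of a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            attrs = dict(attrs)
            if (attrs.get("name") or "").lower() == "description":
                self.description = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Made-up, deliberately stuffed head section.
page = """<html><head>
<title>Credit Management | Credit Management Tips | Credit Management</title>
<meta name="description" content="Credit management, credit management advice, credit management help.">
</head><body></body></html>"""

grabber = MetaGrabber()
grabber.feed(page)
for label, text in (("title", grabber.title), ("description", grabber.description)):
    print(label, "->", text.lower().count("credit management"), "repeats")
```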

4. Paid Linking

The paid linking strategy has been on Google’s hit list from the very beginning, and it is worse now. Never sell or buy links, period. In the case of paid advertising, label the link as sponsored and use only nofollow links. A long line of sites has been de-indexed for paid linking.
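A nofollow link simply carries rel=“nofollow” in its anchor tag. If you want to verify that the paid links on a page actually carry it, a minimal scanner such as the one below can help (“www.example.com” is a placeholder for a page carrying paid links):

```python
import urllib.request
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Lists links that pass PageRank, i.e. links without rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.followed = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            rel = (attrs.get("rel") or "").lower()
            if "href" in attrs and "nofollow" not in rel:
                self.followed.append(attrs["href"])

# Placeholder; run this against the page hosting your paid links.
html = urllib.request.urlopen("http://www.example.com/", timeout=10).read()
audit = LinkAudit()
audit.feed(html.decode("utf-8", "replace"))
print("links passing PageRank:", audit.followed)
```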

5. Non-contextual Links

Always try to use contextual links. Many resort to placing links in page headers or footers, and that simply does not look natural to the search engines.

6. Poor Link-Diversity

Link diversity is all about looking natural to the search engines. It is ideal to have a mixture of relevant and irrelevant, high and low quality links in the website’s backlink profile, obtained from a wide variety of sources. A 70-30 ratio of high quality to low quality links is often cited as ideal.

7.  Rapid Backlinking

Ever thought about how fast a link building campaign should be? It should be as slow as possible, period. Google looks for natural websites that have high quality content, and a diverse backlink arsenal created over a long period. A new site will never attract backlinks as fast as an established authority site. Just be slow and steady.

8. Duplicate Content

Content is king, period. Never steal content from other websites; stolen content, if discovered, can lead to de-indexing in no time. Many article directories provide content with usage rights, but credit must be given to the author. It is a no-brainer that Google undervalues sites with duplicate content.

9. Thin or Spun Content

Article spinning was once an SEO revolution: webmasters could easily create tons of ‘unique’ content from a single article. However, spinning is against Google’s guidelines. In most cases, spun content is unreadable, filled with grammatical and structural errors that ruin readability. Google does not like this one bit.

Thin content refers to poorly written content whose sole purpose is to direct traffic to PPC or affiliate ads. It can be too shallow, with very few words, or simply low quality, providing little or no value to visitors. Google treats sites with thin content as sites created just to host ads, not to serve users.

10. Content Cloaking

Content cloaking is a spammy method in which the content shown to search engine bots differs from the content shown to users. In the current scenario, the practice is utterly pointless; it is simply impossible to fool Google. Cloaking might have worked earlier, but now it is a suicidal strategy that can get a site banned in no time. It would be wise never to resort to such infamous practices, which will only anger the search engines.
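For the record, the anti-pattern usually boils down to branching on the crawler’s user agent. The toy sketch below exists only so the pattern is easy to recognize, not to use:

```python
def render_page(user_agent):
    """The cloaking anti-pattern: branch on the crawler's user agent and
    serve it different content than real visitors see. Shown only so the
    pattern is easy to recognize -- doing this can get a site banned."""
    if "Googlebot" in user_agent:
        return "<h1>Keyword-rich copy written only for the crawler</h1>"
    return "<h1>Thin page of ads shown to human visitors</h1>"

print(render_page("Googlebot/2.1"))
print(render_page("Mozilla/5.0"))
```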