Why Has My Website Been Removed from Google?

It’s a website owner’s worst nightmare. Having a site removed from Google’s index means you no longer receive traffic from the search engine. With no search engine traffic, your sales plummet, and your online business’s future is at risk. Being removed from Google’s index isn’t permanent as long as you fix the problem. The cause can be an egregious violation of Google’s guidelines, or it can be a simple server configuration error. Here are some quick ways to identify the issue and get your site back into Google’s good graces.

Google Webmaster Tools URL Removal

One of the most common reasons for a site to drop from the index is misuse of the URL removal tool in Webmaster Tools. Webmaster Tools has a section where webmasters can quickly remove URLs from the index. It’s intended for pages containing sensitive information that you need removed, or for times when you have several indexed pages that you no longer want in the index.

Many SEOs and webmasters misuse this tool to remove pages that return a 404 (not found) server response or to work around canonical issues. The tool should never be used for either of these problems. The result is that the SEO or webmaster accidentally removes the entire site.

In some cases, disgruntled employees or SEOs given access to your Webmaster Tools console will maliciously remove the domain from the index. For this reason, you always want to give SEOs and employees “read only” access to Webmaster Tools.

To remedy this issue, go into Webmaster Tools and click the “Remove URLs” link in the “Google Index” section. Click “All” in the drop-down to view all removal requests. Click “Cancel” if your domain is listed. It takes a few days for the URL to return to the index, but Google will return the site to its previously indexed status.

Spun Content or Extremely Poor Quality Content

Spun content is horrible content, and you should never engage in machine-created content. Software called a “spinner” attempts to create “unique” content by replacing certain words (adjectives, adverbs and so on) with related words. The result is horrible text that’s never useful for readers.

If your site is caught spinning content, Google places a manual penalty on the site, and in severe cases, Google removes the site entirely from the index. The only way to recover from this issue is to remove all spun content from the site. In some cases, this is all the content on the site. However, you must remove all spun content to have the manual penalty removed. After you remove the content, file a reconsideration request in Webmaster Tools.

Google recently started displaying manual penalties in Google Webmaster Tools, so if your site has any type of manual penalty, you can see it there. Click “Search Traffic” in the navigation panel and click “Manual Actions.” This page tells you whether any penalties are applied to the site, so you know whether you need to take action to fix indexing issues.

Server Configuration Errors

Any type of server configuration error that blocks Google from accessing the site affects your site’s index status. If you have DNS errors, it means Google can’t access the domain. If your server times out, it means Google can’t crawl your site.

One big issue with websites and indexing is poorly configured firewalls. Some web hosts configure firewalls to block too many consecutive page requests. Several requests made consecutively can trigger denial of service (DOS) detection software.

While this is good when the requests come from a malicious hacker, legitimate bots crawl website pages very quickly, and the crawling can look like a DOS attempt. If your host incorrectly detects a bot as a DOS attempt, the firewall blocks Google, and your site can drop in rank or be removed entirely. The only way to correct this issue is to ask your host to fix the firewall configuration or to move to another hosting provider.
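To see why, consider a minimal sketch of the kind of sliding-window rate limit a firewall might apply. The class name, request limit and window length here are all hypothetical, not any real firewall's configuration:

```python
from collections import deque

class NaiveRateLimiter:
    """Blocks an IP that makes more than `limit` requests in `window` seconds."""

    def __init__(self, limit=10, window=1.0):
        self.limit = limit
        self.window = window
        self.hits = {}  # ip -> deque of recent request timestamps

    def allow(self, ip, now):
        q = self.hits.setdefault(ip, deque())
        # Drop timestamps that have fallen out of the window
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.limit:
            return False  # looks like a DOS attempt, even if it's Googlebot
        q.append(now)
        return True
```

Simulated at twenty requests in under a second, a limiter like this admits the first ten and blocks the rest, which is exactly how a fast but legitimate crawler gets mistaken for an attack.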

If any server configuration errors exist, Google usually reports them in Webmaster Tools. Check Webmaster Tools for any errors that could affect your site’s index status, such as 500 errors or DNS errors. Server 500 errors are general errors, but they typically indicate coding issues. DNS errors mean Google can’t resolve your domain name to your server’s IP address. Your website host can help you fix any DNS errors.

“Noindex” on Your Pages

The “noindex” meta tag tells Google not to index your pages. The meta tag can be manually coded into pages, and some content management systems such as WordPress and Joomla have settings in the software’s control panel that add the tag for you.

To find out if you have the meta tag set in your site’s code, open your website in any browser. Right-click the page and select “View Page Source.” The meta tags are within the “head” HTML tags. Find all meta tags in this section and make sure none of them has “noindex” in the code.
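If you'd rather not eyeball the source by hand, a short script can do the same check. This is a generic sketch built on Python's standard-library HTML parser, not an official Google tool:

```python
from html.parser import HTMLParser

class NoindexFinder(HTMLParser):
    """Flags any <meta name="robots"|"googlebot" content="...noindex..."> tag."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = dict(attrs)
        name = (a.get("name") or "").lower()
        content = (a.get("content") or "").lower()
        if name in ("robots", "googlebot") and "noindex" in content:
            self.noindex = True

def page_blocks_indexing(html):
    """Return True if the page's HTML carries a noindex robots meta tag."""
    finder = NoindexFinder()
    finder.feed(html)
    return finder.noindex
```

Feed it the raw HTML of any page (for example, fetched with your HTTP client of choice) and it reports whether the page is telling search engines to stay away.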

To fix this issue, remove the meta tag and wait for Google to recrawl the pages. If you use a content management system such as WordPress or Joomla, check the software’s control panel for any settings that block search engines from indexing the pages.

The Site is Brand New and You Need Patience

Site owners often assume that Google indexes sites instantly, but Google needs to crawl a site before indexing any pages. Indexing usually takes a few days unless you already have a backlink to the site. If you submit a sitemap in Webmaster Tools, wait about two weeks before panicking. In most cases, your site is indexed within a week, but some sites take longer.

These are just a few ideas when checking for indexing issues in Google. Google rarely indexes all pages in a domain, but as long as your site focuses on users and you maintain quality, Google will index and rank your website in time. While indexing is easy, ranking is difficult, so always focus on users and quality and have patience when working with your site’s search engine optimization.

Six Effective Ways to Reduce Cart Abandonment in Your Online Store

In 2012, comScore estimated that 67 percent of shopping carts are abandoned by customers at checkout. In 2013, Listrak pegged the shopping cart abandonment rate at 75 percent. It appears that this conversion-lowering woe of e-commerce site owners has yet to be properly addressed.

Use your checkout page to drive conversion. Simple tweaks to your e-commerce site design can work wonders for your sales and reduce the number of abandoned carts. Here are six basic ways to minimize cart abandonment.

1. Do Not Ask a Customer to Register Before Shopping

Prompt your customer to register after he or she has completed placing the order. An annoying pop-up or redirect asking users to create an account before you allow them to shop turns away potential buyers. On a different note, social sign-in functionality from Facebook, Google, LinkedIn or PayPal could be tried out, depending on the audience you are targeting.

2. Help Your Customers by Suggesting Related Products

Suggest related products before your customer reaches the checkout stage. Many shoppers relish being confident about their purchase. And one way of ensuring that they have added to their shopping carts the product options that they find most satisfactory is to situate your recommended related items before checkout. If you sell electronic gadgets, for instance, a customer may forget to buy a much-needed adapter for a certain product. Preempt that need and along with reducing cart abandonment, you also increase your cross-selling or upselling rate.

3. Show All the Fees While the Customer Loads Items Onto the Shopping Cart

“Hidden” fees that only appear upon checkout are the bane of many online shoppers. Display taxes and shipping fees on the cart at all times. If you show applicable fees only at the final checkout page, then you risk having the customer abandon his cart.

Also, consider offering free shipping. A Forrester Research study revealed that 44 percent of online shoppers abandoned their carts due to high shipping costs, and 22 percent did so because there was no mention of shipping fees at all. If you cannot afford to offer free shipping, then offer a flat shipping rate instead.

4. Improve the Usability of Your Shopping Cart

Items added to the cart must be visible at all times. Display small images of the items that are already loaded by the customer to the cart. That way, he does not need to backtrack. Show all the applicable fees, as well.

And while the customer is looking for other products to buy from your store, the shopping cart must be readily accessible from all web pages, preferably through a dropdown menu on the upper right hand corner of the screen.

Make the shopping cart button distinct from the checkout button. There should also be enough space separating them. Help the user avoid clicking the wrong button as that can easily put him off from completing the purchase.

Design your shopping cart to accommodate changes in product quantity. Your customer may remove items from the cart, and he should be given the ability to do so in one click.

Another way to limit cart abandonment is to allow your customers to easily email or print out the contents of their cart. An assistant, for example, is buying for his boss and may need a go-signal before charging a corporate credit card. If there are interruptions to the checkout process, make it easy for the customer to resume buying and finally pay for the items.

Amazon.com allows users to add products to their wishlist and save items on their cart. It is an excellent cart usability feature that encourages future sales. You might want to consider adopting it.

5. Shorten Your Checkout Process

If you have multiple forms to fill out, a webpage for survey questions and recommended products, and another webpage to showcase your current promotions before the customer reaches the billing page, you might as well count that customer as an abandoned-cart case.

Consider making guest checkout an option. Book Depository is successful not only because of its free shipping deal. Buy from Book Depository using PayPal as the mode of payment and notice how the cumbersome registration and billing forms are missing. The checkout process is remarkably seamless and fast. There are only two web pages after clicking the checkout-with-PayPal button, lessening the chances of a customer changing his mind about the purchase.

6. Offer Different Payment Options

On top of credit cards, accept third-party online payment services like PayPal, regional options like Interac (in Canada), and debit cards if you want more customers to complete the checkout process. Accept a variety of payment methods and grow your customer base.

The Hashtag: Pervasive, Powerful and Indispensable

The “hashtag” is a powerful tool that originated on Twitter and has since spread to other social media sites. At its most basic, the hashtag is a word or group of words that follows the pound sign and is used to assign messages to a particular topic. Casual users utilize hashtags to organize their messages and make them searchable under a given search term. Internet marketers can use them to solidify their online presence and increase brand awareness, although many marketers remain ignorant of the power of this tool.

The surface-level benefits of hashtags are many and immediate. Most importantly, placing a hashtag within a tweet makes it visible to anyone searching for that hashtag. Furthermore, when other users click on the hashtag, they are taken to a list of all other tweets that contain it. This makes it easy for users to organize their tweets under a single banner, making their message more memorable to other Twitter members. Twitter users can place a hashtag anywhere within a tweet.

Internet marketers are well advised to embrace hashtag technology. Awareness of the hashtag and the power that it represents is increasing. During the 2013 Super Bowl, for instance, 38% of advertisements contained hashtags. If a marketer is not using hashtags, the chances are good that their competitors are. Additionally, other social networks including Facebook, Instagram, Pinterest, Google+ and Tumblr have adopted it. Functionality of the tool remains largely the same across platforms.

Hashtag usage still occurs most often on Twitter, and full-scale adoption on social network Facebook has been slow. However, many Internet marketers anticipate Facebook adoption to increase rapidly as casual users learn of the benefits of the tool. Twitter marketers can search for a hashtag to use on hashtag.org. Anyone can use a hashtag, and there is no need to register one before use. Consequently, marketers are encouraged to create a hashtag that is both relevant to their brand and not in use. Facebook users can look for any hashtag using the site’s search functionality found at the top of almost every page.

Marketers wishing to take advantage of this powerful tool should ensure that they are using the same hashtag on Twitter and other social media platforms. Using multiple hashtags confuses fans and reduces brand recognition. A single hashtag also allows marketers to gain a clear bird’s-eye view of all the discussion concerning their brand.

To begin working toward hashtag cohesion, marketers are advised to use the same hashtag on their tweets or posts for several weeks. During this time, it is likely that fans will pick up on this gentle prompting and follow suit. To further raise awareness of their brand, marketers can use their new hashtag in a promotional campaign. Promotions, whether on Twitter, Facebook or any other channel, create heightened awareness of the pertinent brand for their duration. Facebook has dropped its requirement that contests be run through dedicated apps, making it easier than ever to run promotions.

To take advantage of trending topics or fads—and the traffic they generate—marketers can simply identify the hashtag associated with the event and use it in a tweet, along with their own hashtag. This tactic results in their hashtag showing up in the more popular hashtag’s tweet thread. It is important, however, that users integrate the trending hashtag into their tweets in a natural way.

Once users gain basic proficiency with hashtags, they should branch out into wider tags. This tactic may be especially effective on Facebook or anywhere else where many people are posting to a general yet popular hashtag, such as “#summer,” “#baseball,” or “#fastfood.” Users can simply integrate the wider hashtag naturally into a tweet or post that contains their own hashtag. This is an excellent way to reach new fans and customers. All Internet marketers are encouraged to make full use of hashtags. They provide an efficient way to reach new leads and keep customers engaged.

Mobile SEO Now More Critical after Google Alters Algorithms

It is not often that an algorithm change from Google goes relatively unnoticed, but the latest move from Google has flown under the radar a bit despite the big impact it could have on many websites. In a recent post on the company’s webmaster blog, Google announced that algorithm changes were now in place that could negatively impact those sites that offer a poor user experience to visitors using smartphones or other mobile devices.

Recognizing the obvious shift toward mobile internet usage, Google is encouraging webmasters to offer visitors more than just a mobile version of their site. Google wants all users to get the full internet experience they are looking for, even when they access the web via smartphones. As a result, algorithm changes from Google will now boost the rankings of those pages that offer top notch mobile infrastructure, while punishing sites that do not offer a good mobile experience.

Identifying Two Common Mistakes

The blog post pointed out that Google has yet to roll out the planned algorithm changes to enhance mobile user experience, meaning there is time for webmasters to fix key areas of concern before the algorithms begin to have a negative impact on page ranking. Google highlighted two common mistakes that webmasters make when developing mobile versions, offering reasons for the problems and helpful tips for solving them.

First, it was pointed out that faulty redirects are a common problem that irritates smartphone users and is easily fixed by a capable webmaster. Faulty redirects occur when a mobile user is redirected by a site to the main page of the mobile version rather than to the mobile equivalent of the desktop page they were on. The following diagram puts this concept into image form:

In simpler terms, when a smartphone user is on a sub-page and tries to navigate to the mobile version of that sub-page, they are more often than not redirected to the mobile version’s homepage. This means they have to start their navigation to the sub-page over again. While it may be a minor inconvenience, it can interrupt a visitor’s flow and cause them not to return to the site in the future.

The problem is easily fixed by setting up redirects such that users are sent from the desktop version directly to the same content on a mobile version. If the site doesn’t have the content available in a mobile version, it is best to leave them on the desktop version rather than interrupt their workflow on the site.
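As a sketch of that logic, the rule is: redirect to the same path on the mobile site when it exists, otherwise leave the visitor on the desktop page. The `m.example.com` domain and the set of mobile paths below are hypothetical stand-ins for your own site's URL structure:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical: paths that actually exist on the mobile site
MOBILE_PATHS = {"/", "/products", "/products/widget-a"}

def mobile_redirect_target(desktop_url):
    """Return the same-content mobile URL for a smartphone visitor,
    or None when no mobile equivalent exists (no redirect at all)."""
    parts = urlsplit(desktop_url)
    if parts.path in MOBILE_PATHS:
        return urlunsplit(("http", "m.example.com", parts.path, parts.query, ""))
    return None  # keep the visitor on the desktop page
```

The key point is the `None` branch: when the mobile site lacks the page, doing nothing beats dumping the visitor on the mobile homepage.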

Secondly, Google pointed out that many mobile sites have smartphone-only errors on them. This means that desktop or tablet visitors using the desktop version might not experience any errors on a given page, while smartphone users encounter an error page when loading the same content in the mobile version.

Common mistakes in smartphone-only errors, as listed by Google, included the following:

– Incorrect handling of Googlebot-Mobile. When employed incorrectly, Googlebot-Mobile handling creates an infinite redirect loop where mobile visitors are redirected to feature-phone-optimized sites, which in turn redirect smartphones back to the desktop site.

– Unplayable videos on smartphones. A lot of video content is embedded on websites and designed for desktop viewing, but fails to load on smartphones.

The issue with Googlebot-Mobile can be avoided by setting up Googlebot-Mobile user agents to identify smartphones as smartphones, not feature phones, and redirecting them to the appropriate mobile version (if it exists, remember no faulty redirects) rather than simply sending them directly to the desktop version.
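A rough sketch of that routing logic might look like this. The user-agent substrings are illustrative only, and a real deployment should rely on a maintained device-detection library rather than a handful of tokens:

```python
def crawler_destination(user_agent, path):
    """Route smartphone agents (including Googlebot's smartphone crawler,
    which announces an iPhone-like user agent) to the smartphone site,
    feature phones to the WAP site, and everyone else to the desktop
    site, with no redirect loops between the three. All domains are
    hypothetical examples."""
    ua = user_agent.lower()
    if "iphone" in ua or ("android" in ua and "mobile" in ua):
        return "http://m.example.com" + path      # smartphone site
    if "midp" in ua or "symbian" in ua:
        return "http://wap.example.com" + path    # feature-phone site
    return "http://www.example.com" + path        # desktop site
```

Because the smartphone check fires on the device tokens rather than on the Googlebot-Mobile name, the smartphone crawler lands on the smartphone site and never bounces through the feature-phone pages.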

Additional Misconfiguration Problems

While Google highlighted the issues above as the most common, webmasters will need to address all mobile configuration issues in order to avoid punishment at the hands of the new algorithms. The blog post contained links to specific smartphone misconfiguration issues Google believes webmasters should be aware of. These other misconfigurations included:

– App download ads that hinder usage: Some on-page advertisements for a site’s apps actually hinder a smartphone user’s experience. Google recommends using either a smart app banner or a simple HTML image with a link to the proper app download store.

– Content that loads slowly on mobile versions: Smartphone users face steeper costs for data and restrictions on usage as unlimited plans disappear. Pages should be designed to load faster and offer an optimized experience for mobile users.

Lasting Message for SEOs

In the end, Google’s overall message with these algorithm changes appears to be that SEOs and webmasters need to focus on delivering the right content at the right time, and less on delivering a specific mobile experience on a device-by-device basis. Mobile access is diversifying as more and more tablets and smartphones hit the market.

SEOs and webmasters need to adopt a more responsive design that delivers the right content to the mobile user, even if that content is not necessarily designed for the specific mobile device being used to access the site.

Five Common Social Media Mistakes

More and more small businesses are catching on to the fact that social media is more than the latest hype, and they are beginning to see its value for marketing purposes. Larger businesses have paved the way for these smaller businesses to follow. They have done this by making a lot of mistakes from which small businesses can learn a few valuable lessons. Below is an overview of five common mistakes you should try to avoid making with your business.

Mistake 1: Constantly offering special deals.

While daily or weekly offers are no doubt appealing to consumers, you run the risk of your business developing the reputation of being an online discount store. Be sure to offer added value to consumers beyond being the cheapest. Your goal is to get people interested in buying from you for other reasons than just price.

Mistake 2: Waiting for consumers to find you.

Just because your business has a Facebook page and a Twitter account, that doesn’t mean that consumers will spontaneously drop by. Setting up a social media account or two alone is not enough. You will need to put some time and effort into the various social media channels by engaging in conversations with potential fans or followers.

Mistake 3: Posting long (news) items.

It is best to avoid posting entire articles on Facebook or Google+, no matter how interesting or relevant the information might be to your fans or followers. Instead, you should post a teaser. This usually consists of a short introduction and a link to the relevant web page.

Mistake 4: Frequently organizing competitions.

If you run competitions all the time, consumers will start to feel less emotionally connected with your brand. In time, they might not even remember what products or services you offer. Instead you will forever be associated with the chance of winning something. You can, however, run competitions periodically, as long as you make sure that you leave plenty of time in between competitions and you never run two competitions at the same time.

Mistake 5: Blocking negative feedback.

Larger companies have been known to ignore or even remove negative comments by consumers. Funnily enough, consumers are suspicious of businesses that receive only positive comments all of the time. When consumers see some negative comments as well, they get a much better impression of the business, because their suspicion that the comments might be fake falls away completely.

Social media is a great marketing tool for small businesses, but when things go wrong, they go wrong very quickly. With social media, your mistakes are there for everyone to see. That is why it is important to set out a clear strategy before you begin using social media. Some sound practical judgment will stand you in good stead.

10 Sure-Shot Ways to Get Penalized and Lose Your Rankings in the Post EMD, Panda and Disavow Scenario

Internet marketing has turned into a fast-paced survival game where newbies and gurus alike dread the scathing whims of Google. The search engine giant has made it clear with the recent Panda, EMD and Disavow updates that it will never rest till it destroys every single SEO shortcut that previously used to work. The current scenario is one of fear and anxiety as internet marketers go to sleep wondering whether their rankings will disappear overnight. In fact, many of the so-called ‘gurus’ have now begun to use scare tactics to lure newbies and inexperienced marketers into ‘update-proof’ SEO courses, many of which end up being scams.

What Google looks for in websites is clearly stated in its guidelines. Though the whole thing looks like Google hiring a pack of henchmen to run around the web and take down thousands of ‘low quality’ sites, the algorithm is a program: a body of code that ranks sites according to their quality and relevance. It is widely known that the preferences of the algorithm are not top-secret, and there are many who know a lot about how it works.

Here are the top 10 causes for which Google may downgrade a website. It is not guaranteed that any one of these methods will land a website on Google’s hit-list, but each of them can expose a site to penalization. Nor is it guaranteed that staying away from these tactics will keep a website safe forever, but websites that refrain from committing such offenses will have an edge over others in ranking longer.


Over-optimization has turned into the most prominent topic in the post-Penguin world. Though over-optimization is a general topic that most internet marketers are familiar with, the finer details remain obscure to most.

According to Google, a website must rank by its natural relevance to the searched keyword. Over-optimization refers to purposefully inflating the relevance of a website’s content to a keyword. The practice of search engine optimization is not wrong, but there is a limit. Over-optimization can be either on-page or off-page.

1.  Content Over-optimization

Content is king. It always has been. Search engines, whichever they may be, prefer sites with high quality content. Content created for the sole purpose of SEO is bound to repel the search engines. There used to be a time when over-optimized content dominated the top rankings in the search engines, but not any more.

‘Keyword stuffing’ is the name given to the overuse of targeted keywords in the content. Though Google does not state any permissible keyword density values in its guidelines, the commonly cited safe range is 1-2%.

Try reading the content a few times. Make sure that it has good readability, provides some value to the readers and that it makes sense. Use keywords only where they can be used, and only where they blend into the text. Avoid using the same keyword over and over frequently.

Using keyword variations is a hugely search-engine-friendly habit. For example, for a keyword like ‘exercises for body building’, try using the variations ‘body building exercises’ or ‘exercise and build the body’. Using keyword variations tells the search engines that the content was not written for the sole purpose of ranking.
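For a rough sanity check, keyword density can be computed as the share of the text's words accounted for by the phrase. This is a generic sketch, and the 1-2% figure above is a convention rather than a published Google rule:

```python
import re

def keyword_density(text, phrase):
    """Percent of the words in `text` accounted for by occurrences
    of the (possibly multi-word) `phrase`."""
    words = re.findall(r"[\w']+", text.lower())
    target = phrase.lower().split()
    n = len(target)
    if not words or not target:
        return 0.0
    # Count every position where the phrase's words appear in sequence
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
    return 100.0 * hits * n / len(words)
```

For instance, a 100-word article that uses the two-word phrase ‘body building’ once comes out at 2%, right at the top of the conventional range.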

2. Over-optimized anchor text

Using the main keyword frequently as anchor text was one of the main practices hit by the search engine updates. Webmasters who used their primary keywords widely in anchor text suffered rank drops, and some websites even got de-indexed. It was once usual practice to use exact-match keywords in links to boost relevance, but in the current scenario it would be wise not to resort to such practices.

Try using more natural anchor text like “Click here” or “Click to find out more.” It is to be noted that over-usage of any text can be harmful. The link text must be varied as much as possible.
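One way to audit this is to flag any anchor text that dominates the link profile. The 30% cutoff below is an arbitrary illustration, not a known Google threshold:

```python
from collections import Counter

def overused_anchors(anchors, threshold=0.3):
    """Return anchor texts that make up more than `threshold` (a
    fraction) of all inbound links. The 0.3 default is illustrative."""
    if not anchors:
        return []
    counts = Counter(a.strip().lower() for a in anchors)
    total = len(anchors)
    return [text for text, c in counts.most_common() if c / total > threshold]
```

Run it over the anchor texts from a backlink export: a profile where one exact-match phrase accounts for most links gets flagged, while a naturally varied profile comes back empty.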

3. Over-optimized meta-tags

Always try to keep the meta-tags natural. Meta-tags are vital to any website page as they let the search engines know about the content on a page and add to the page’s relevance in searches. Over-optimizing meta-tags refers to keyword stuffing these tags.

For example, for the keyword ‘Credit Management’, it would be better to use ‘Credit Management Company’ or ‘Manage Your Credit’ in the meta tags rather than directly using ‘Credit Management’ or ‘Good Credit Management’.

4. Paid Linking

The paid linking strategy has been on Google’s hit list from the very beginning, and enforcement is harsher now. Never sell or buy links, period. In the case of paid advertising, disclose the link as sponsored and use only no-follow links. There has been a long line of sites that got de-indexed due to paid linking.

5. Non-contextual Links

Always try to use contextual links. Many webmasters resort to linking within page headers or footers, and this just does not look natural to the search engines.

6. Poor Link-Diversity

Link diversity is all about looking natural to the search engines. It is always ideal to have a mixture of both relevant and irrelevant, high and low quality links in the backlink structure of the website. Try to obtain links from a wide variety of sources. A 70-30 ratio between the high quality and low quality links is considered ideal.

7.  Rapid Backlinking

Ever thought about how fast a link building campaign should be? It should be as slow as possible, period. Google looks for natural websites that have high quality content, and a diverse backlink arsenal created over a long period. A new site will never attract backlinks as fast as an established authority site. Just be slow and steady.

8. Duplicate Content

Content is king, period. Never steal content from other websites. Stealing content from other sites, if discovered, can lead to de-indexing in no time. Many article directories provide content with usage rights, but credit must be given to the author. It is a no-brainer that Google devalues sites with duplicate content.

9. Thin or Spun Content

Article spinning was once an SEO revolution. Webmasters could easily create tons of ‘unique’ content from a single article. However, spinning is against Google’s guidelines. In most cases, spun content is unreadable, filled with grammatical and structural errors. Google does not like this one bit.

Thin content refers to poorly written content used for the sole purpose of directing traffic to PPC or affiliate ads. It can either be too shallow with very few words, or can be of low quality providing little or no value to the visitors. Google considers sites having thin content as those created just for hosting ads, and not as a service to the user.

10. Content Cloaking

Content cloaking is a spammy method in which the content shown to search engine bots is made to differ from the content shown to users. In the current scenario, the practice of cloaking is utterly meaningless; it is simply impossible to fool Google. Cloaking might have worked earlier, but now it is a do-and-die strategy that can get a site banned in no time. It would be wise never to resort to such infamous practices, which may antagonize the search engines.

How to “Steal” Your Competitors’ Links Legally

Many SEO companies are nothing more than link building companies. While there’s nothing wrong with that, these link builders typically only have one way to build links: they buy them. Yes, you’re not supposed to do it, but many big names in the industry do it, and are getting pretty covert about it. Of course, for every link broker, there are clients. Big-name companies are engaging in link buying on a massive scale.

Unless you’re a big company with a big budget, link buying might not be feasible. Even if it is, it might not be a smart idea. Companies like J.C. Penney have the means to get back into the good graces of Google. You might not. You need SEO tactics that can crush your competition without going blackhat.

Take Advantage of Your Lazy Competition

Let’s face it: most of your competitors are lazy. Even the big boys. They’d rather throw money at their ranking problems and try to solve them that way. Take advantage of this. There comes a point when many people give up on an idea they had for a website. They get lazy, they lose motivation, or something else happens.

However, chances are they created a lot of content and they actively worked to get backlinks. Old sites that are no longer kept up in good working order probably have a lot of broken links. This means that there are websites out there that are pointing to 404 pages. These 404 pages used to contain some good stuff. They don’t anymore – obviously.

This is a goldmine. Find these broken websites (e.g., old GeoCities communities) and drop the 404 URL into the Wayback Machine. Find out what used to be on there. Find out who links to these now-defunct pages. Contact those websites and tell them, “Hey, you’re linking to a dead page.” Create content that is at least as good as the dead content on that 404 page. Now you have a valuable offer for the people who are linking to the 404 page.

These websites can continue linking out to a broken webpage, or they can link to you. Remember, you came to them with a value proposition. You’re not trying to scam a link from them. You actually have something they want – otherwise they wouldn’t have linked out to that old website in the first place. You can help their SEO and yours at the same time. They’ll probably even thank you for it.

Rinse. Repeat. Crush your competition.

Outranking Your Competitors

If you want to outrank your competitors and get that first spot in Google’s search results, you need to investigate the websites that are ranking above you right now. You need to know what they are doing right so that you can copy their moves. But copying their moves will not be enough to outrank them; you will just be two equally good websites. You have to be better than they are, with greater content, better backlinks, and a better reputation.

Check the Backlinks of Your Competitors

Do some research to find out where your competitors get their backlinks. You can do a simple Google search by typing “link:websitename.com” into the search field, but this will only show you some of the sites that link to your competitor. There are also free tools online that let you check a website’s backlinks. Go through the list and select the sites that look good and relevant and have a PageRank above 3. Then find out how you can get a backlink from each site. Does it allow DoFollow comments, or should you email the owner about guest blogging opportunities?

You should not focus on how many backlinks your competitors have, but rather on the quality of those backlinks. If, for instance, they have ten thousand backlinks but only five come from websites with a high PageRank, you have a good chance of outranking them if you can earn those five high-quality backlinks as well.

So try to find some really great sites, such as CNN, BBC, or another popular online news site, and see if it is possible to get a backlink from one of them.
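As a rough sketch, the filtering step described above might look like this in Python. The domains and PageRank values are made-up illustrations, not real measurements:

```python
# Hypothetical backlink data: (linking domain, PageRank) pairs you might
# collect from a backlink-checker tool for a competitor's site.
competitor_backlinks = [
    ("cnn.com", 8),
    ("some-blog.example", 2),
    ("news-site.example", 5),
    ("forum.example", 1),
    ("bbc.co.uk", 9),
]

def worthwhile_prospects(backlinks, min_pagerank=3):
    """Keep only the links worth pursuing, strongest first."""
    good = [(domain, pr) for domain, pr in backlinks if pr > min_pagerank]
    return sorted(good, key=lambda pair: pair[1], reverse=True)

# Only the high-PageRank domains survive the filter.
print(worthwhile_prospects(competitor_backlinks))
```

Ten thousand weak links collapse to a short, prioritized outreach list, which is the point of the quality-over-quantity advice above.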

Offer Something That They Do Not

Take a look at your competitors’ websites and see what they offer people. What kind of information do they offer? Do they have some sort of free download, or a community such as a forum or membership site? Write down everything you can find on your competitors and why you think it makes them successful. For instance, why would a forum make a website popular? People love to be part of a community, especially if there are not many websites in that particular niche. They like to share their own knowledge and experience and get feedback and help from others who are in the same situation and interested in the same topic.

Are They Bookmarked?

See if you can find any social bookmarking or sharing buttons on their sites, such as StumbleUpon, Digg, Reddit, or a Facebook Like button. If they have some of these, you will sometimes be able to see how many people have liked, tweeted, or stumbled their pages. This will give you an idea of how popular the sites really are.

How is Their Content?

Take some time to actually read some of the website content to see how good the sites really are. Is the content unique or just like what you could find on most websites and blogs? If the content is not that great you will have an easier time outranking them.

Make Keyword Targeted Videos

Submitting videos to well-known sites such as YouTube and Metacafe is a really good idea, as Google sometimes ranks videos much higher than websites. So make some videos targeting your keywords and aim for Google to place them in the top spots.

So improve your ranking and outrank all of your competitors. Getting to the top in Google means you will get most of the visitors, which means you will get more traffic, which leads to more sales in the end.

Is LSI better than backlinking for SEO?

Have you ever wondered how Google can guess what you’re searching for after just a few letters? Were you ever surprised that the advertisement on the side of the screen always seemed to match whatever you happened to be thinking about? That’s no coincidence. That is a clever technique known as Latent Semantic Indexing, in development at Google and now progressing in leaps and bounds.

In short, LSI is the ability to spot patterns in words. People can remember dozens of synonyms – words which mean basically the same thing – and if in doubt, we can fish out a thesaurus or, yes, look it up on the internet. Even though a computer can tell us word alternatives, it doesn’t understand them in the same sense, because it doesn’t speak human languages. It just processes data rather than ‘reading’ the way a person does so easily. This makes matching text by content far more complicated.

The analysis comes from a mathematical technique called Singular Value Decomposition, or SVD (simply put, an algorithm for extracting statistical structure from data). You can read the Wikipedia article on LSI for a better understanding. After finding that its AdSense campaign was pretty successful, Google acquired Applied Semantics to help improve the smooth running of its search engine. Before that, irrelevant pages would sometimes appear simply because they were stuffed with keywords.
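For the curious, here is a toy illustration (not Google’s actual implementation) of how SVD groups synonyms. A tiny term-document matrix is decomposed, and documents that use “car” and “automobile” end up pointing the same way in the reduced latent space even though they share no words:

```python
import numpy as np

# Tiny term-document matrix. Rows = terms, columns = documents.
# "car" and "automobile" co-occur in doc 0, which is how the SVD
# learns they belong to the same latent topic.
terms = ["car", "automobile", "flower", "petal"]
A = np.array([
    [1, 1, 0, 0, 0],   # car
    [1, 0, 1, 0, 0],   # automobile
    [0, 0, 0, 1, 1],   # flower
    [0, 0, 0, 1, 0],   # petal
], dtype=float)

U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 2                                 # keep the two strongest latent topics
docs = (np.diag(s[:k]) @ Vt[:k]).T    # one k-dimensional vector per document

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Doc 1 contains only "car"; doc 2 contains only "automobile".
print(cosine(docs[1], docs[2]))  # close to 1: treated as the same topic
print(cosine(docs[1], docs[4]))  # close to 0: unrelated to "flower"
```

The keyword-stuffed page fails here: repeating “car” fifty times adds nothing new to the latent topic, while genuinely related vocabulary does.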

Any webmaster familiar with Search Engine Optimization will know how it used to work – the text of a website would be filled to the brim with the same words over and over again, in order to trick search engines into thinking that the site had more relevant content than it really did. This ‘keyword stuffing’ technique bumped up the page ranking of these sites so that they appeared on the first page of search results. So, whatever you typed, you’d be likely to get a boring article, so full of key phrases it was practically unreadable.
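A crude way to see why keyword stuffing is easy for a machine to spot: measure what fraction of a page’s words are the target keyword. The sample texts below are invented for illustration:

```python
import re

def keyword_density(text, keyword):
    """Fraction of all words in `text` that are the given keyword."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

stuffed = ("cheap shoes cheap shoes buy cheap shoes best cheap shoes "
           "cheap shoes online cheap shoes")
natural = ("Looking for affordable footwear? Our guide compares running "
           "shoes, sneakers and boots, and explains how to find a good fit.")

print(keyword_density(stuffed, "cheap"))   # well above any sane threshold
print(keyword_density(natural, "cheap"))   # the keyword never even appears
```

Notice that the natural text would still rank for the topic under semantic indexing, because “footwear”, “shoes”, and “sneakers” are related terms, while the stuffed text is trivially flagged.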

On the other end of the scale, companies would insert irrelevant words into their text in order to ensnare people who were looking for something completely unrelated. Sneakily inserting popular search words to get hits used to work (and still does on sites like YouTube), but with the semantic indexing project, all this does is water everything down. It’s a bit like planting seeds over too wide a field: some of them get neglected. If a webmaster scatters keywords over too many distant topics, Google won’t recognize the site as a specialist in any of them.

Google’s ability to search for related content is a breakthrough. It ignores repetition and strikes offending sites from the rankings. Making use of this is a simple matter of using a whole bunch of related terms which people commonly search for – a technique called Latent Semantic Indexing. It sounds complicated, but it is actually a fairly basic part of SEO and something which we do naturally.

For the best success, you will need to find the most searched-for terms. Using Google itself, any web author can put a tilde (~) in front of a search term to find results running along the same theme. To see which terms are likely to be most popular, head to a ‘high authority’ site – a website renowned for accurate information on whatever subject your site is about. Usually this is a big company, like Apple. Whichever terms they use, web architects should use too.

Not only that, but Google can tell when two words relate to different topics but are spelled the same (called ‘polysemy’). It will not confuse inflation of the economy with inflation of a balloon. The spiders crawl through the results and find out which words and phrases relate to each sense. A piece about inflation in the monetary sense is not likely to contain the words ‘hot air’. So if you search ‘inflation hot air’, you will not get a piece from the Wall Street Journal.

In that case, it’s better not to use any terms that might confuse the poor Google spiders. If you wrote ‘this panic about inflation is a load of hot air’, your article might be mistaken for one about ballooning, and those who want to read about the economy won’t find your site with the usual search terms.

Now that Google and other search engines analyze content more closely, the need for endless backlinks and anchor text has lessened. This should be a huge relief to any individual webmaster or starting-out company without many links to show for themselves – now the new kids on the block get a piece of the pie, rather than having all the website visits taken by big corporations that throw money at IT specialists.

Before, topping the results meant bumping up the page rank using links to other related sites (particularly ‘high authority’ ones) or internal links to other pages on the same website. Keyworded-up, the links drew the spiders. They explored the site, checked how many reliable, functioning links there were, and made assumptions about the content of the site based on how many other decent sites it referenced. All the links and colors made such pages a hideous strain on the eyes, and rarely was the content any good – scraps of info laced with advertising opportunities.

These days, it’s actually better to reduce the number of links. Too many links going all over the place send your friendly internet spiders away from the most important places. Say you linked to an ICT firm and your main site from your blog. You probably want more focus on your main site, so that your main site can get an increased page rank, right? That won’t happen. Those links have equal value, so the spiders will branch off in all directions and get distracted.

The best way to set up a site that makes full use of LSI is to have numerous pages relating to a number of select topics within a theme, linked together in a straightforward way. If you make the search engine crawlers’ job easier, you will reap the benefits. The site should be mapped as follows:

Every page contains only one link, which leads directly to the next page. The last page leads to an external website. The site is divided into distinct sections, so the subjects covered aren’t competing with each other. In each topic, it’s best to pick a title and an angle that isn’t likely to have been used before. That way, you have no competitors for that article.

On a general subject, like nutrition, you might have three themes (minimum). Those loose themes are ‘landing pages’, an area for all the related topics to be pooled. The themes could be “Common Food Myths”, “The Uses of Supplements” and “The Truth about Exercise”. On these themes, there should be at least five separate pages linking into each other, each explaining an individual topic on the theme – e.g. on Food Myths, one article on meat, another on diets, etc. This layout will help both spiders and people navigate the site. A search of ‘meat myths’ will fetch up that one specific page that links to the rest of your site. This is better than putting all the good stuff on one page and risking looking irrelevant in the all-seeing eyes of Google.
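Using the article’s own nutrition example (with the extra page names invented for illustration), the silo layout can be sketched as a simple data structure in which every page carries exactly one onward link:

```python
# A minimal sketch of the "theme silo" layout described above. The three
# themes come from the article; the individual page slugs are hypothetical.
site = {
    "common-food-myths": ["meat-myths", "diet-myths", "sugar-myths",
                          "fat-myths", "organic-myths"],
    "uses-of-supplements": ["protein", "vitamins", "minerals",
                            "omega-3", "creatine"],
    "truth-about-exercise": ["cardio", "strength", "stretching",
                             "recovery", "frequency"],
}

def internal_links(site):
    """Each page links only to the next page in its own silo, so crawlers
    follow one clear path per theme instead of branching everywhere.
    The last page in each silo links out to an external site."""
    links = {}
    for theme, pages in site.items():
        chain = [f"/{theme}/"] + [f"/{theme}/{p}/" for p in pages]
        for here, nxt in zip(chain, chain[1:]):
            links[here] = [nxt]            # exactly one onward link per page
        links[chain[-1]] = ["https://external-authority.example/"]
    return links

links = internal_links(site)
print(links["/common-food-myths/"])
```

Each landing page pools its related topics, and a crawler entering at any page walks a single unambiguous path through one theme.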

Smart web maintenance involves checking for the presence of keywords related specifically to the subject. Take this blog – even as you read it, it has been written for semantic search engines to increase its probability of being picked up. Each paragraph you have read contains a handful of words relating to LSI without resorting to repetition. Any writing on semantically related words will contain basic related words like ‘webpage’ and ‘Google’, plus more specific terms like ‘synonym’ – vital tools for a webmaster to cover all the variables of SEO. Care has also been taken not to overlap with polysemic words or clash with other major sites. All it takes is some research and careful planning. High rankings go to the best-prepared blogs and articles.

A search-optimized piece of writing does not have to be over-technical or mind-numbingly dull. No longer do good search engines favour pages that endlessly repeat themselves. LSI is a system that, like most new tech, improves the way we use the internet, allowing the freedom to express what needs saying without worrying about falling in the rankings. It’s a new era for web content, and every web writer is champing at the bit to stay on top – understand Latent Semantic Indexing and you will achieve this. Note that LSI doesn’t completely overrule the value of backlinking, although with the launch of ‘Google Caffeine’ it appears that LSI’s importance now edges out backlinking. LSI is just one of the components you should consider if you want to optimize your site for search engines.

Google Panda and the online world

Google Panda: A New Beginning for Some, but the Beginning of the End for Others

Google was founded in 1998 and almost immediately threw the likes of Yahoo and AltaVista out of competition with the unique algorithm its search engine was founded on. Whereas the latter two search engines focused more on the quality of content and its journalistic integrity, Google had found another – and supposedly better – way of ranking sites. Although the new algorithm initially took website owners by surprise, SEO specialists soon discovered how to make their websites rank better than those with professional journalists on staff.

Writing for Search Engines

After a period of time, wily website owners discovered that writing for search engines rather than for the readers themselves was what would give their websites higher ranking in Google’s search page results. This meant regularly updating their websites with keyword-rich content. Google’s algorithm apparently didn’t place much importance on whether or not new content actually added value to the website. 

Introducing Google Panda

In February 2011, Google once again surprised the Internet community with Google Panda: a series of updates to Google’s algorithm designed to reverse the entire ranking process. Google wanted high-quality websites back on the map, while low-quality sites were meant to suffer the consequences of their low-quality content. Google Panda was supposed to be different because it incorporated more “user feedback signals”.

Changes Caused by Google Panda

The changes caused by Google Panda were not just immediate but significant as well. News websites reported a quick, huge surge in rankings. Fox News, which had previously ranked #89, climbed to #23 after one of the Panda updates rolled out.

Conversely, websites like EZineArticles.com and AssociatedContent.com suffered the backlash with lowered rankings. Other “content farm” websites like WikiHow and eHow also suffered, as did comparison websites like NexTag. Unfortunately, the same changes also caused a decrease in rankings for websites with original and legitimate content, such as the British Medical Journal.

CNET’s report also revealed many other interesting results. The report showed that Google Panda did not affect the top rankings enjoyed by websites like YouTube, Amazon, Wikipedia, and IMDB. These websites could be safely considered as those with dominant authority in their respective niches.

Google Panda updates also appeared to cause a slight improvement in the rankings of social Web 2.0 sites such as Facebook and Twitter, while other sites like WebMD, Flickr, Yelp, and even Apple.com suffered a minimal decrease. On the upside, government websites enjoyed an increase in their rankings. The White House’s official website, for instance, surged to #79 from its previous ranking of #125.

No More Updates for 2011

Google recently announced that it will not release any further Panda updates for the remainder of 2011. Although that leaves website owners only a few weeks at most to regroup and rethink their strategies, additional time for adjustment could still mean a lot. With so many news reports flooding the Internet to help affected website owners survive Google Panda, there is no reason website owners cannot gradually re-learn the old ways and place greater priority on content quality, as they once did.


How to Survive in the Post-Panda Apocalypse

In 1998, Google turned the Internet upside-down with its new search algorithm, one that SEO experts found a way to exploit and which eventually allowed low-quality websites to lord it over high-quality ones. In February 2011, Google once more initiated a shakedown with the launch of Google Panda, which updated and supposedly improved its search algorithm. The new algorithm was supposed to bring back the former glory of high-quality websites. Whether this remained true or not is, in the end, unimportant. As Google Panda seems here to stay, website owners have no choice but to follow the beat of the drummer and march accordingly.

Black Hat versus White Hat SEO Techniques

In the past, black hat SEO techniques were frowned upon but were nonetheless allowed to exist and help improve website rankings. Now Google has become much stricter, and websites that employ black hat SEO techniques are likely to suffer a huge blow in rankings. In fact, it has already happened to “content farm” websites such as EZineArticles.com and FindArticles.com.

A black hat SEO technique is best defined as anything meant to please search engines with no care for how it serves human readers or visitors. Examples of commonly used black hat SEO techniques include invisible keyword text, keyword stuffing, and mirror websites. If your website uses any black hat SEO technique, it’s best to rectify the matter immediately, before Panda catches on and red-flags your website.

White hat SEO techniques are, of course, the opposite. These techniques please both search engines and human readers. Examples of white hat SEO techniques would include posting original and well-researched and immensely readable content.

Obviously, employing white hat SEO techniques could appear to be an enormous challenge. This is especially true for website owners who had gotten lax over the years and mostly relied on black hat SEO techniques to improve their rankings. Change, however, is not just inevitable but is already taking place. Websites have to learn how to adapt to Google Panda if they want to survive. 

Top Tips on How to Survive Google Panda

Even if your website has already been flagged, the good news is that Google is a relatively forgiving search engine. You can always start again by making the necessary changes to your website, and Google will rank your website accordingly. The tips below can help you get started.

  • Duplicated content is a big no-no. Ignoring the results of copyright-infringement detection tools like CopyScape is one of the worst mistakes you can make. If there’s content on another website that you truly wrote, then rewrite and repost it. Google doesn’t mind you doing that, but it does care if you simply cut and paste the entire content onto your website.
  • Place greater emphasis on original, value-rich content. Conversely, reduce the amount of web space you dedicate to advertisements. The new algorithm frowns upon sites that have tons of these, so tone them down as soon as possible.
  • Google Panda loves buzz-worthy websites. The new search algorithm ranks webpages higher when they are shared socially. Think of webpages that get bookmarked, tweeted, and shared through Facebook, and blog posts that have lots of comments. This means not only opening your site to user comments, but also integrating tools that allow for social media sharing.
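One common, simple way to check for cut-and-paste duplication (the same family of techniques that tools like CopyScape build on) is shingling: compare the sets of overlapping word sequences in two texts. A minimal sketch, with invented sample texts:

```python
def shingles(text, k=4):
    """The set of all k-word sequences ("shingles") in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Overlap between two shingle sets: 1.0 means identical text."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

original = ("google panda rewards original well researched content "
            "that readers actually want to share")
pasted = ("google panda rewards original well researched content "
          "that readers actually want to share")
rewritten = ("panda now favors sites whose articles are researched "
             "carefully and written for real readers")

print(jaccard(shingles(original), shingles(pasted)))     # cut-and-paste
print(jaccard(shingles(original), shingles(rewritten)))  # genuine rewrite
```

A verbatim copy scores 1.0 while a real rewrite shares almost no 4-word sequences, which is why rewriting your own syndicated content (rather than re-pasting it) keeps you on the safe side.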


Major Areas Affected by the Google Panda Update 

One cannot avoid being reliant on Google’s search engine for visibility in the online world. After all, many of the search engine optimization techniques conceived so far are based on how this search giant develops its algorithms for indexing and ranking sites on its organic search results pages. Thus it comes as no surprise that a huge number of complaints and arguments have surfaced since the Panda update rolled out. In an attempt to understand this significant change in the online visibility game, presented here is an outline of some of the types of websites that have been greatly affected by the Panda.

Affiliate Marketing

According to a poll conducted by Search Engine Roundtable, affiliate sites seem to be the hardest hit. Given that the general aim of the update was to improve user experience, stricter criteria have been applied to the content of indexed websites. It is too often forgotten that usability goes hand-in-hand with search engine optimization. Unfortunately, this lapse occurs most frequently in affiliate marketing sites, which tend to over-optimize.

One specific aspect that usually goes against such sites is their ad-to-content ratio. Some of these sites can have too many clickable banners that put off visitors. From a monetization perspective this may be understandable. But content is king now more than ever with the coming of Panda and affiliate marketers need to adjust to these stricter standards for quality content to survive the update. 

Online Stores

Web retailers come in second place in that poll. Some Yahoo! Stores were also listed on the Sistrix Visibility Index, which presented specific domains negatively affected by the update. The way pundits analyze the loss in traffic for e-commerce sites, it seems that once again content is the issue.

Admittedly, it is not easy to come up with unique descriptions and blurbs for every product when one has an inventory of hundreds. The typical solution has been to use templates and what is called boilerplate text. It is not hard to see how this can pull down an online store’s content quality and thus affect its ranking under the new search algorithms.

Answer and How-to

Google has publicly stated that content farms are the main targets of the update. While putting more focus on separating the wheat from the chaff is certainly a most welcome initiative, the process can be quite complex in terms of online quality content. Can one unambiguously evaluate sites such as eHow or WonderHowTo as content farms?

The latter’s CTO, Bryan Crow, divulged that scrapers as well as legitimate partner sites were actually outranking them in the SERPs after the update. The painful point here is that WonderHowTo was the original source of the duplicated content. Syndication and curation are part of the operations of such sites, which means certain content is inevitably going to get re-published. According to his analysis, one possible solution is to look for specific problem content and apply a noindex tag to those pages. As the Panda update works at the domain level, this method would spare the other pages on the site that actually have unique, quality content. On Google’s part, they have at least accommodated such situations by publicly asking for data points from sites that are losing to scrapers.
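The page-level approach described here can be sketched as follows. The paths and duplicate-content scores are hypothetical, standing in for whatever detection you run; the meta tag shown is the standard robots noindex directive:

```python
# Hypothetical page inventory: URL path plus a duplicate-content score
# from your detection of choice (1.0 = fully duplicated elsewhere).
pages = {
    "/how-to/fix-a-flat-tire": 0.05,
    "/how-to/syndicated-recipe": 0.92,   # republished widely by partners
    "/how-to/unique-tutorial": 0.10,
}

NOINDEX_TAG = '<meta name="robots" content="noindex">'

def pages_to_noindex(pages, threshold=0.8):
    """Pages whose content is mostly duplicated elsewhere get the
    noindex tag, sparing the rest of the domain from the penalty."""
    return sorted(path for path, score in pages.items() if score >= threshold)

for path in pages_to_noindex(pages):
    print(path, "->", NOINDEX_TAG)
```

Only the heavily syndicated page is withdrawn from the index; the unique tutorials keep earning search traffic for the domain.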


Implications of the Google Panda Update on E-Commerce

Browse through the news regarding this latest major overhaul in Google’s search engine, and you can see that e-commerce websites are among those that got hit by the Panda. After playing by the rules of SEO and tenaciously maintaining their visibility, a lot of small and medium online retailers seem to be at a loss at their sudden drop in ranking in the SERPs (search engine results pages).

The changes Google made to its algorithms refocus and put more emphasis on content quality. Although ‘quality’ is quite a subjective term, the search giant was good enough to publish guidelines. Here is how some online marketing experts would apply the new criteria for those who sell products or services on the web.


Over-Optimization

Sometimes the major concern of visibility can lead owners to go over the top. There’s a thin line between keyword optimization and spamming, and retailers should be careful, as the modified search algorithm is more merciless about such practices. Retailers also tend to put too much emphasis on optimization at the expense of user experience. Getting traffic is certainly important, as it is a foot in the door. But conversion is what actually brings in the profits, and that depends more on what the user finds on the site – which means it all depends on how useful the content is.


Duplicated Content

Maintaining a multipage site filled with products under various categories can be a tedious operation. More often than not, retailers simplify things by using cookie-cutter text slapped into overused design templates. While this makes editing and updating easier, such monotony can readily drain any initial interest in the prospective customer. And when the same product description can be found on a competitor’s site where similar merchandise is available, it is likely to be flagged as duplicated content.

Devalued Content

One can have good content but still fail to show it. The really useful stuff could be buried so deep in other pages that the search engine spiders can’t find it. One can also make the mistake of going too far in the opposite direction and distributing it indiscriminately through feeds and affiliate sites. There are cases seen after the update where shopping and affiliate sites outranked the online stores where the re-published content actually originated. Some SEO experts recommend that when you do come up with unique, quality content, it is best to keep it on your own site and maintain its uniqueness.


Repetitive Anchor Text

Linked texts that use keywords as anchors are often used for site navigation, such as getting back to the top-level categories. In this respect, retailers can still commit the repetition mistake. It can be annoying for users to keep returning to the same page no matter what they click on. There has to be a clear and diversified path for prospective customers to follow, where each piece of content builds on the previous one. This creates a cascade of compelling messages that hopefully leads to conversion.

Social Media Elements

One of the significant changes that Google implemented was to put more weight on user feedback. Ever since social media gained more ground on the online world, its impact on e-commerce has been discussed frequently by marketing experts. This update only serves to solidify this inevitable influence. Product reviews, commenting and all the other relevant social media plug-ins when used properly can positively affect online businesses directly as well as improve SERP ranking when the new Googlebots drop by.


Social Media’s Role in a Post-Google Panda SEO World

There’s no doubt about it: the changes since Google Panda’s rollout have been sweeping. Websites that were once on top of the search engine results saw a marked decrease in their traffic. Needless to say, practically everyone with real estate on the web was caught off-guard. Once the ruckus died down, website owners – and even website management firms and SEO specialists – were left with websites that needed a total overhaul.

This overhaul is needed, plain and simple. And one of the crucial factors you need to look into is your site’s overall social media value. Google’s Panda update places significant weight on a site’s buzz-worthiness, and if yours is anything but, you’ll have a challenging time crawling back into Google’s good graces.

Have Content That’s Worth Sharing

Content has always been and will always be a crucial and reliable gauging mechanism of a site’s quality. But because of Google’s prior algorithm that favored pages with tons of keywords, marketers found a clever way to out-Google Google, and that is by churning out keyword-heavy articles by the thousands, and on a daily basis too.

But it wasn’t long before end consumers complained, as all they got were pages of useless content – most of them ad-heavy web pages stuffed with keyword-spammy articles.

But this practice can no longer survive post-Google Panda. Putting out quality content on a regular – preferably daily – basis should be your top priority. Once you’ve got this in order, your goal of having a buzz-worthy site will slowly materialize and everything should fall into place. Your site will once again be indexed favorably by Google, and the people in your social media network will, without a second thought, share your content with their respective networks as well.

Catch New Visitors’ Attention With Your Sitelinks

Perhaps the newest and most noticeable change that happened after the Google Panda was rolled out was the appearance of sitelinks for specific website searches. Sitelinks provide a one-click access to specific subpages within a site which Google thinks might be of interest to the searcher. Sitelinks automatically appear under the top entry for a search to enable users to quickly find relevant content.

Now, if you’ve long had your own website, then you know just how difficult it is to entice visitors, particularly new ones, to click through to your site’s subpages. Internet users have very fleeting attention, even those who have been directed to your site by their social media network’s prodding.

So assign high-value subpages with short, concise descriptions as your sitelinks. Doing so ensures you catch users’ interest by providing snippets of your content, giving them a quick idea of what they’ll see on your site right there and then, without ever leaving the Google search page.
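Since results-page snippets get cut off after roughly 155 characters, it’s worth trimming your descriptions yourself at a word boundary rather than letting Google truncate them mid-word. A minimal sketch (the length limit is an approximation, not an official figure):

```python
def snippet(description, limit=155):
    """Trim a meta description to `limit` chars without cutting mid-word."""
    if len(description) <= limit:
        return description
    cut = description[:limit].rsplit(" ", 1)[0]   # back up to the last space
    return cut.rstrip(",;:") + "…"

desc = ("Our nutrition hub debunks common food myths, explains what "
        "supplements actually do, and separates exercise fact from fiction, "
        "with new, fully referenced articles published every week.")
print(len(desc))       # longer than a typical results-page snippet
print(snippet(desc))   # trimmed at a word boundary, ellipsis added
```

A description that fits the snippet window reads as a deliberate pitch instead of a sentence chopped off mid-thought.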

Maximize Google+ Business

Putting up a website for your business used to be the paradigm. But this practice alone no longer proves sustainable, not only because of Google Panda, but because people will always check their networks for advice on a product or service before actually shelling out money for it. You simply can’t put up a business website and hope and pray that your target audience will eventually see it. It’s up to you to offer your business to the people who really want it.

Of course, you’ll have to do so in an unabashed and sophisticated manner so that you don’t end up dissuading your audience. And you can do this by maximizing everything Google+ has to offer, particularly its Business Pages. For one, having a profile on the leading search engine’s own social media channel will ensure that your site gets indexed every time, especially for localized searches.

Secondly, Google+ Business Pages make use of a ton of applications that are very useful and handy for managing your overall business. You’ll have access to its built-in payment system, so financial transactions can be completed within it. You can also use advertising and analytics tools to check whether the strategy you’ve taken is effective.

And best of all, Google+ Business includes productivity and communication tools. You can manage your online operations using the built-in word processors and calendars, plus connect with your network through text and video chat. Practically everything can now be done within Google. Think of it this way: as a business owner, you already do much of your operation using Google, right? So imagine the impact of offering those in your social media network the chance to connect with you, and with everyone else on the web, without ever leaving their Google accounts.

No website owner wants to start from scratch every single time Google releases an algorithm update. So while there won’t be any more of these updates for the rest of 2011, it is imperative that you keep abreast of the latest algorithm weather reports from the search engine giant so that you can implement changes promptly.