Mobile SEO Now More Critical after Google Alters Algorithms

It is not often that an algorithm change from Google goes relatively unnoticed, but the latest move from Google has flown under the radar a bit despite the big impact it could have on many websites. In a recent post on the company’s webmaster blog, Google announced that algorithm changes were now in place that could negatively impact those sites that offer a poor user experience to visitors using smartphones or other mobile devices.

Recognizing the obvious shift toward mobile internet usage, Google is encouraging webmasters to offer visitors more than just a mobile version of their site. Google wants all users to get the full internet experience they are looking for, even when they access the web via smartphones. As a result, Google's algorithm changes will now boost the rankings of pages that offer a top-notch mobile experience, while punishing sites that do not.

Identifying Two Common Mistakes

The blog post pointed out that Google has yet to roll out the planned algorithm changes to enhance mobile user experience, meaning there is time for webmasters to fix key areas of concern before the algorithms begin to have a negative impact on page ranking. Google highlighted two common mistakes that webmasters make when developing mobile versions, offering reasons for the problems and helpful tips for solving them.

First, it was pointed out that faulty redirects are a common problem that irritates smartphone users and is easily fixed by a capable webmaster. A faulty redirect occurs when a mobile user is redirected by a site to the main page of the mobile version rather than to the mobile equivalent of the desktop page they were on. The following diagram puts this concept into image form:

In simpler terms, when a smartphone user is on a sub-page and tries to navigate to the mobile version of that sub-page, they are more often than not redirected to the mobile homepage. This means they have to start their navigation to the sub-page all over again. While it may be a minor inconvenience, it can interrupt a visitor's flow and cause them not to return to the site in the future.

The problem is easily fixed by setting up redirects such that users are sent from the desktop version directly to the same content on a mobile version. If the site doesn’t have the content available in a mobile version, it is best to leave them on the desktop version rather than interrupt their workflow on the site.
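The rule described above can be sketched as a simple URL-mapping function. This is an illustrative sketch only; the domain names, paths, and the `MOBILE_PAGES` set are assumptions, not part of Google's guidance:

```python
# Sketch of a redirect rule that sends desktop URLs to their mobile
# equivalents, falling back to the desktop page when no mobile version
# exists (avoiding a "faulty redirect" to the mobile homepage).
MOBILE_PAGES = {"/", "/products", "/products/widgets"}  # pages with mobile versions

def mobile_target(desktop_path):
    """Return the URL a smartphone visitor should be sent to."""
    if desktop_path in MOBILE_PAGES:
        # Same content, mobile version.
        return "https://m.example.com" + desktop_path
    # No mobile equivalent: keep the visitor on the desktop page
    # rather than dumping them on the mobile homepage.
    return "https://www.example.com" + desktop_path

print(mobile_target("/products/widgets"))  # deep page with a mobile version
print(mobile_target("/careers"))           # no mobile version: stay on desktop
```

The key design choice is the fallback branch: when in doubt, serving the desktop page preserves the visitor's workflow.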

Secondly, Google pointed out that many mobile sites have smartphone-only errors. This means that desktop or tablet visitors using the desktop version might not experience any errors on a given page, while smartphone users will encounter an error page when loading the same content in the mobile version.

Common mistakes in smartphone-only errors, as listed by Google, included the following:

– Incorrect handling of Googlebot-Mobile. When handled incorrectly, Googlebot-Mobile can trigger an infinite redirect loop: smartphone visitors are redirected to a feature-phone-optimized site, which in turn redirects smartphones back to the desktop site.

– Unplayable videos on smartphones. A lot of video content is embedded on websites and designed for desktop viewing, but fails to load on smartphones.

The issue with Googlebot-Mobile can be avoided by configuring user-agent handling so that smartphones are identified as smartphones, not feature phones, and redirected to the appropriate mobile version (if it exists; remember, no faulty redirects) rather than simply sent to the desktop version.
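The user-agent handling described above can be sketched as a small classifier. The token lists below are a deliberately minimal assumption for illustration; real detection libraries match far more devices:

```python
# Illustrative user-agent classification: treat smartphones as smartphones
# rather than feature phones, so they are never bounced through a
# feature-phone site and back to desktop (the redirect loop described above).
SMARTPHONE_TOKENS = ("iphone", "android")   # assumption: minimal token set
FEATURE_PHONE_TOKENS = ("midp", "wap")      # assumption: minimal token set

def classify(user_agent):
    """Return which site variant this visitor should receive."""
    ua = user_agent.lower()
    if any(token in ua for token in SMARTPHONE_TOKENS):
        return "smartphone"      # send to the smartphone-optimized page, if one exists
    if any(token in ua for token in FEATURE_PHONE_TOKENS):
        return "feature-phone"   # send to the feature-phone site
    return "desktop"

print(classify("Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X)"))
```

Because the smartphone check runs first, an iPhone is never misrouted to the feature-phone site even if its user-agent happens to share substrings with feature-phone signatures.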

Additional Misconfiguration Problems

While Google highlighted the issues above as the most common, webmasters will need to address all mobile configuration issues in order to avoid punishment at the hands of the new algorithms. The blog post contained links to specific smartphone misconfiguration issues Google believes webmasters should be aware of. These other misconfigurations included:

– App download ads that hinder usage: Some on-page advertisements for a site's apps actually hinder a smartphone user's experience. Google recommends using either a smart app banner or a simple HTML image with a link to the proper app download store.

– Content that loads slowly on mobile versions: Smartphone users face steeper costs for data and restrictions on usage as unlimited plans disappear. Pages should be designed to load faster and offer an optimized experience for mobile users.

Lasting Message for SEOs

In the end, Google's overall message with these algorithm changes appears to be that SEOs and webmasters need to focus on delivering the right content at the right time, and less on delivering a specific mobile experience on a device-by-device basis. Mobile access is diversifying as more and more tablets and smartphones hit the market.

SEOs and webmasters need to adopt a more responsive design that delivers the right content to the mobile user, even if that content is not necessarily designed for the specific device they are using to access the site.

10 Sure-Shot Ways to Get Penalized and Lose Your Rankings in the Post EMD, Panda and Disavow Scenario

Internet marketing has turned into a fast-paced survival game where newbies and gurus alike are dreading the scathing whims of Google. The search engine giant has made it clear with the recent Panda, EMD and Disavow updates that it will never rest till it destroys every single SEO shortcut that used to work. The current scenario is one of fear and anxiety as internet marketers go to sleep wondering whether their rankings will disappear overnight. In fact, many of the so-called 'gurus' have now begun to use scare tactics to lure newbies and inexperienced marketers into 'update-proof' SEO courses, many of which end up being scams.

What Google looks for in websites is clearly mentioned in its guidelines. Though the whole thing looks like Google hiring a pack of henchmen to run around the web and take down thousands of 'low quality' sites, the algorithm is a program, a body of code that ranks sites according to their quality and relevance. It is a widely known fact that the preferences of the algorithm are not top-secret, and there are many who know a lot about it.

Here are the top 10 practices for which Google may downgrade a website. Using any one of these methods does not guarantee that a website will land on Google's hit-list, but each of them exposes a site to penalization. Nor does staying away from these tactics guarantee that a website stays safe forever, but websites that refrain from committing such crimes will have an edge over others and rank longer.


Over-optimization has turned into the most prominent topic in the post-Penguin world. Though over-optimization is a general topic that most internet marketers are familiar with, the finer details remain obscure to most.

According to Google, websites must rank by their natural relevance to the searched keyword. Over-optimization refers to purposefully inflating the relevance of a website's content to a keyword. The practice of search engine optimization is not wrong, but there is a limit. Over-optimization can be either on-page or off-page.

1.  Content Over-optimization

Content is king. It always has been. Search engines, whichever they may be, prefer sites with high-quality content. Content created for the sole purpose of SEO is bound to repel the search engines. There was a time when over-optimized content occupied the top rankings in the search engines, but not anymore.

‘Keyword stuffing’ is the name given to the overuse of targeted keywords in content. Though Google does not state any permissible keyword density values in its guidelines, the generally recommended range is 1-2%.
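A density check like the one implied above is easy to run on a draft. The function below is a rough sketch; the sample text is made up, and the 1-2% range mentioned above is a community guideline, not an official Google number:

```python
# Rough keyword-density check: what share of the words in a text
# belong to occurrences of the target keyword phrase.
import re

def keyword_density(text, keyword):
    """Return the keyword's density in the text as a percentage."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase = keyword.lower().split()
    n = len(phrase)
    # Count whole-phrase occurrences with a sliding window.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase)
    return 100.0 * hits * n / max(len(words), 1)

sample = "Body building takes patience. Good body building plans mix diet and rest."
print(round(keyword_density(sample, "body building"), 1))
```

A result far above the low single digits is a signal to rewrite the passage with variations, as the article suggests.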

Try reading the content a few times. Make sure that it has good readability, provides some value to the readers and that it makes sense. Use keywords only where they can be used, and only where they blend into the text. Avoid using the same keyword over and over frequently.

Using keyword variations is a hugely search-engine-friendly habit. For example, for a keyword like ‘exercises for body building’, try using the variations ‘body building exercises’ or ‘exercise and build the body’. Using keyword variations tells the search engines that the content was not written for the sole purpose of ranking.

2. Over-optimized anchor text

Using the main keyword frequently as anchor text was one of the main practices that got hit in the search engine updates. Webmasters who widely used their primary keywords in anchor text suffered rank drops, and some websites even got de-indexed. It was once usual practice to use exact-match keywords in links to boost relevance, but in the current scenario it would be wise not to resort to such practices.

Try using more natural anchor text like “Click here” or “Click to find out more.” It is to be noted that over-usage of any text can be harmful. The link text must be varied as much as possible.
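One way to audit the variation advice above is to check whether a single anchor text dominates a backlink profile. This is a hypothetical sketch; the 50% warning threshold and the sample profile are assumptions, not a known penalty boundary:

```python
# Flag a backlink profile where one anchor text makes up too large
# a share of all links, which can look unnatural to search engines.
from collections import Counter

def dominant_anchor(anchors, threshold=0.5):
    """Return (anchor_text, share) if one anchor exceeds the threshold, else None."""
    counts = Counter(anchors)
    text, n = counts.most_common(1)[0]
    share = n / len(anchors)
    return (text, share) if share > threshold else None

profile = ["credit management", "credit management", "credit management",
           "click here", "example.com"]
print(dominant_anchor(profile))  # the exact-match anchor dominates this profile
```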

3. Over-optimized meta-tags

Always try to keep the meta-tags natural. Meta-tags are vital to any website page as they let the search engines know about the content on a page, and add up to the page’s relevance in searches. Over-optimizing meta-tags refers to keyword stuffing of these tags.

For example, for the keyword ‘Credit Management’, it would be better using ‘Credit Management Company’ or ‘Manage Your Credit’ for the meta tags rather than directly using ‘Credit Management’ or ‘Good Credit Management’.

4. Paid Linking

The paid linking strategy has been on Google's hit list from the very beginning, and it is worse now. Never sell or buy links, period. In the case of paid advertising, display the link as sponsored and use only nofollow links. There has been a long line of sites that got de-indexed due to paid linking.

5. Non-contextual Links

Always try to use contextual links. Many webmasters resort to linking within page headers or footers, and this just does not look natural to the search engines.

6. Poor Link-Diversity

Link diversity is all about looking natural to the search engines. It is ideal to have a mixture of both relevant and irrelevant, high- and low-quality links in the backlink structure of a website. Try to obtain links from a wide variety of sources. A 70-30 ratio between high-quality and low-quality links is considered ideal.

7.  Rapid Backlinking

Ever thought about how fast a link building campaign should be? It should be as slow as possible, period. Google looks for natural websites that have high quality content, and a diverse backlink arsenal created over a long period. A new site will never attract backlinks as fast as an established authority site. Just be slow and steady.

8. Duplicate Content

Content is king, period. Never steal content from other websites. Stealing content, if discovered, can lead to de-indexing in no time. Many article directories provide content with usage rights, but credit must be given to the author. It is a no-brainer that Google devalues sites with duplicate content.

9. Thin or Spun Content

Article spinning was once an SEO revolution. Webmasters could create tons of unique content easily from a single article. However, spinning is against Google's guidelines. In most cases, spun content is barely readable, filled with grammatical and structural errors that hinder readability. Google does not like this one bit.

Thin content refers to poorly written content used for the sole purpose of directing traffic to PPC or affiliate ads. It can either be too shallow with very few words, or can be of low quality providing little or no value to the visitors. Google considers sites having thin content as those created just for hosting ads, and not as a service to the user.

10. Content Cloaking

Content cloaking is a spammy method in which the content shown to the search engine bots differs from the content shown to users. In the current scenario, the practice of cloaking is utterly meaningless; it is simply impossible to fool Google. Cloaking might have worked earlier, but now it is a do-and-die strategy that can get a site banned in no time. It would be wise never to resort to such infamous practices, which may antagonize the search engines.

Google Panda and the online world

Google Panda: A New Beginning for Some, but the Beginning of the End for Others

Google was founded in 1998 and almost immediately threw the likes of Yahoo and AltaVista out of competition with the unique algorithm its search engine was founded on. Whereas the latter two search engines focused more on the quality of content as well as its journalistic integrity, Google had found another, supposedly better, way of ranking sites. Although the new algorithm initially took website owners by surprise, SEO specialists soon discovered a way to make their websites gain better rankings than those with professional journalists on staff.

Writing for Search Engines

After a period of time, wily website owners discovered that writing for search engines rather than for the readers themselves was what would give their websites higher ranking in Google’s search page results. This meant regularly updating their websites with keyword-rich content. Google’s algorithm apparently didn’t place much importance on whether or not new content actually added value to the website. 

Introducing Google Panda

In February 2011, Google once again surprised the Internet community with Google Panda, a series of updates to Google's algorithm designed to reverse the entire ranking process. Google wanted high-quality websites back on the map, while low-quality sites were meant to suffer the consequences of their low-quality content. Google Panda was supposed to be different because of its incorporation of more "user feedback signals".

Changes Caused by Google Panda

Changes caused by Google Panda were not just immediate but considerably significant as well. News websites reported a quick and huge surge in rankings. Fox News, which had previously ranked #89, climbed to #23 after one of the Panda updates rolled out.

Conversely, other websites suffered the backlash with lowered rankings. "Content farm" websites like WikiHow and eHow also suffered. Even comparison websites like NexTag were hit. Unfortunately, the same changes also caused a decrease in ranks for websites with original and legitimate content such as the British Medical Journal.

CNET’s report also revealed many other interesting results. The report showed that Google Panda did not affect the top rankings enjoyed by websites like YouTube, Amazon, Wikipedia, and IMDB. These websites could be safely considered as those with dominant authority in their respective niches.

Google Panda updates also appeared to cause a slight improvement in the rankings of social Web 2.0 sites such as Facebook and Twitter, while sites like WebMD, Flickr, and Yelp suffered a minimal decrease in their rankings. On the upside, government websites enjoyed an increase in their rankings. The White House's official website, for instance, surged to #79 from its previous ranking of #125.

No More Updates for 2011

Google recently announced that it will no longer release any updates for Google Panda for the remainder of 2011. Although that only leaves website owners just a few weeks at most to regroup and rethink their strategies, additional time for adjustment could still mean a lot. With so many news reports recently flooding the Internet to help affected website owners survive Google Panda, there is no reason why website owners will not gradually re-learn the old ways and place greater priority on content quality as they once did.


How to Survive in the Post-Panda Apocalypse

In 1998, Google turned the Internet upside-down with its new search algorithm, one that SEO experts found a way to exploit and which eventually allowed low-quality websites to lord it over high-quality websites. In February 2011, Google once more initiated a shakedown with the launch of Google Panda, which updated and supposedly improved its search algorithm. The new algorithm was supposed to bring back the former glory of high-quality websites. Whether this remained true or not is, in the end, unimportant. As Google Panda seems here to stay, website owners have no choice but to follow the beat of the drummer and march accordingly.

Black Hat versus White Hat SEO Techniques

In the past, black hat SEO techniques were frowned upon but were nonetheless allowed to exist and help improve website rankings. Now, Google has become much stricter, and websites that employ black hat SEO techniques are likely to suffer a huge blow in terms of rankings. In fact, it has already happened to "content farm" websites.

A black hat SEO technique is best defined as anything meant to please search engines without regard for how it would please human readers or visitors. Examples of commonly used black hat SEO techniques include invisible keyword text, keyword stuffing, and mirror websites. If your website uses any black hat SEO technique, it's best to rectify the matter immediately before Panda catches on and red-flags your website.

White hat SEO techniques are, of course, the opposite. These techniques please both search engines and human readers. Examples of white hat SEO techniques would include posting original and well-researched and immensely readable content.

Obviously, employing white hat SEO techniques could appear to be an enormous challenge. This is especially true for website owners who had gotten lax over the years and mostly relied on black hat SEO techniques to improve their rankings. Change, however, is not just inevitable but is already taking place. Websites have to learn how to adapt to Google Panda if they want to survive. 

Top Tips on How to Survive Google Panda

Even if your website has already been flagged, the good news is that Google is a relatively forgiving search engine. You can always start again by making the necessary changes to your website, and Google will rank your website accordingly. The tips below can help you get started.

  • Duplicated content is a big no-no. Ignoring the results of copyright infringement detection tools like CopyScape is one of the worst mistakes you can make. If content on your site also appears on another website, even content you originally wrote, rewrite it before reposting it. Google doesn't mind you doing that, but it does care if you simply cut and paste the entire content onto your website.
  • Place greater emphasis on original and value-rich content. Conversely, reduce the amount of web space that you dedicate to advertisements. The new algorithm frowns upon sites that have tons of these, so tone them down on your site as soon as possible.
  • Google Panda loves buzz-worthy websites. The new search algorithm places higher rankings on websites or pages that are shared socially. Think of webpages that get bookmarked, tweeted, and shared through Facebook, and blog posts that have lots of comments. This means not only fully opening your site to user comments, but also integrating tools that allow for social media sharing of your site.
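The duplicate-content check in the first tip can be approximated in code. The sketch below compares word 3-gram "shingles" between two texts; the shingle size and any similarity threshold you pick are illustrative assumptions, not how CopyScape actually works:

```python
# Rough duplicate-content check: Jaccard similarity over word 3-grams.
# 1.0 means the texts share all their shingles; 0.0 means none.
def shingles(text, n=3):
    """Return the set of word n-grams in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b):
    """Jaccard similarity of the two texts' shingle sets."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "google panda rewards sites with original and value rich content"
copy = "google panda rewards sites with original and value rich content"
print(similarity(original, copy))  # identical texts score 1.0
```

Running such a check against your own syndicated copies before publishing is a cheap way to spot pages that would read as cut-and-paste duplicates.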


Major Areas Affected by the Google Panda Update 

One cannot avoid being reliant on Google's search engine for visibility in the online world. After all, many of the search engine optimization techniques conceived so far are based on how this search giant develops its algorithms for indexing and ranking sites on its organic search results pages. Thus it comes as no surprise that a huge number of complaints and arguments have surfaced since the Panda update rolled out. In an attempt to understand this significant change in the online visibility game, presented here is an outline of some of the types of websites that have been greatly affected by the Panda.

Affiliate Marketing

According to a poll conducted by Search Engine Roundtable, affiliate sites seem to be the hardest hit. Given that the general aim of the update was to improve user experience, stricter criteria have been applied to the content of indexed websites. It is too often forgotten that usability goes hand-in-hand with search engine optimization. Unfortunately this lapse occurs most frequently in affiliate marketing sites that tend to over-optimize.

One specific aspect that usually goes against such sites is their ad-to-content ratio. Some of these sites can have too many clickable banners that put off visitors. From a monetization perspective this may be understandable. But content is king now more than ever with the coming of Panda and affiliate marketers need to adjust to these stricter standards for quality content to survive the update. 

Online Stores

Web retailers come in second place in that poll. Some Yahoo! Stores were also listed on the Sistrix Visibility Index that presented specific domains that were negatively affected by the update. The way pundits analyze the loss in traffic for e-commerce sites, it seems that once again content is the issue.

Admittedly, it is not easy to come up with unique descriptions and blurbs for every product when one has an inventory of hundreds. The typical solution has been to use templates and what is called boilerplate text. It is not hard to see how this can pull down an online store's content quality and thus affect its ranking under the new search algorithms.

Answer and How-to Sites

Google has publicly stated that content farms are the main targets of the update. While putting more focus on separating the wheat from the chaff is certainly a most welcome initiative, the process can be quite complex in terms of online quality content. Can one unambiguously evaluate sites such as eHow or WonderHowTo as content farms?

The latter's CTO, Bryan Crow, divulged that scrapers as well as legitimate partner sites were actually outranking WonderHowTo on the SERPs after the update. The painful point here is that WonderHowTo was the original source of the duplicated content. Syndication and curation are part of the operations of such sites, which means that certain content is inevitably going to get re-published. According to his analysis, one possible solution is to look for specific problem content and apply a noindex tag to those pages. As the Panda update works on the domain level, this method would spare the other pages on the site that actually have unique, quality content. On Google's part, they have at least accommodated such situations by publicly asking for data points from sites that are losing to scrapers.
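The per-page noindex approach described above can be sketched as a simple policy: pages whose content is mostly syndicated emit a robots noindex tag, while unique pages on the same domain stay indexable. The page fields and the 0.5 syndicated-ratio threshold are assumptions for illustration, not Crow's actual criteria:

```python
# Decide per page whether to emit a robots noindex meta tag, so that
# mostly-syndicated pages are excluded while unique pages stay indexed.
def robots_meta(page):
    """Return the robots meta tag for a page record."""
    total = max(page.get("total_words", 1), 1)
    syndicated_ratio = page.get("syndicated_words", 0) / total
    if syndicated_ratio > 0.5:
        # Mostly re-published content: keep it out of the index.
        return '<meta name="robots" content="noindex">'
    # Mostly unique content: leave it indexable.
    return '<meta name="robots" content="index,follow">'

print(robots_meta({"syndicated_words": 800, "total_words": 1000}))
print(robots_meta({"syndicated_words": 50, "total_words": 1000}))
```

The robots meta tag itself is standard; only the decision rule here is hypothetical.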


Implications of the Google Panda Update on E-Commerce

Browse through the news regarding this latest major overhaul in Google’s search engine, and you can see that e-commerce websites are among those that got hit by the Panda. After playing by the rules of SEO and tenaciously maintaining their visibility, a lot of small and medium online retailers seem to be at a loss at their sudden drop in ranking in the SERPs (search engine results pages).

The changes made by Google on their algorithms seem to refocus and put more emphasis on content quality. Albeit ‘quality’ is quite a subjective term, the search giant was good enough to publish guidelines. Here is how some of the online marketing experts would apply the new set of criteria for those who sell products or services on the web. 


Over-Optimization

Sometimes the major concern of visibility can lead owners to go over the top. There's a thin line between keyword optimization and spamming, and retailers should be more careful, as the modified search algorithm is more merciless when it comes to such practices. Retailers also tend to put too much emphasis on optimization at the expense of user experience. Getting traffic is certainly important, as it is a foot in the door. But conversion is what actually brings in the profits, and that depends on how useful the visitor finds the site's content.


Duplicated Content

Maintaining a multipage site filled with products under various categories can be a tedious operation. More often than not, retailers simplify things by using cookie-cutter text slapped into over-used design templates. While this makes editing and updating easier, such monotony can readily drain any initial interest from a prospective customer. And when the same product description can be found on a competitor's site where similar merchandise is available, it is likely to be flagged as duplicated content.

Devalued Content

One can have good content but still fail to show it. The really useful material could be buried too deep in other pages where the search engine spiders can't find it. One can also make the mistake of going too far in the opposite direction and distributing content indiscriminately through feeds and affiliate sites. There are cases seen after the update where shopping and affiliate sites outranked the online stores where the re-published content actually originated. Some SEO experts recommend that when you do come up with unique, quality content, it is best to keep it on your own site and maintain its uniqueness.


Repetitive Anchor Text

Linked texts that use keywords for anchors are often used for site navigation, such as getting back to the top-level categories. In this respect, retailers can still commit the repetition mistake. It can be annoying for users to keep returning to the same page no matter what they click on. There has to be a clear and diversified path for prospective customers to follow, where succeeding content builds on previous content. This creates an avalanche of compelling messages that hopefully leads to conversion.

Social Media Elements

One of the significant changes that Google implemented was to put more weight on user feedback. Ever since social media gained more ground in the online world, its impact on e-commerce has been discussed frequently by marketing experts. This update only serves to solidify this inevitable influence. Product reviews, commenting, and all the other relevant social media plug-ins, when used properly, can positively affect online businesses directly as well as improve SERP ranking when the new Googlebots drop by.


Social Media’s Role in a Post-Google Panda SEO World

There's no doubt about it: the changes after Google Panda's rollout have been sweeping. Websites that were once on top of the search engine results saw a marked decrease in their traffic. Needless to say, practically all of those who have their own real estate on the web were caught off-guard. Once the ruckus died down, what website owners, and even website management firms and SEO specialists, were left with were websites that needed a total overhaul.

This overhaul is needed, plain and simple. And one of the crucial factors you need to look into would be your site’s overall social media value. Google’s Panda update places significant consideration on a site’s buzz-worthiness, and if yours is anything but, then you’ll have a challenging time crawling back to Google’s good favors. 

Have Content That’s Worth Sharing

Content has always been and will always be a crucial and reliable gauging mechanism of a site’s quality. But because of Google’s prior algorithm that favored pages with tons of keywords, marketers found a clever way to out-Google Google, and that is by churning out keyword-heavy articles by the thousands, and on a daily basis too.

But it wasn't long before end consumers complained, as all they got to see was useless content. Add to that the fact that most of the pages they were redirected to were ad-heavy web pages with keyword-spammy articles.

But this practice can no longer survive post-Google Panda. Putting out quality content on a regular, preferably daily, basis should be a top priority. Once you've got this in order, your goal of having a buzz-worthy site will slowly materialize and everything should fall into place. Your site should once again be indexed by Google. Also, the people in your social media network will, without a second thought, share your content with their respective networks as well.

Catch New Visitors’ Attention With Your Sitelinks

Perhaps the newest and most noticeable change that happened after the Google Panda was rolled out was the appearance of sitelinks for specific website searches. Sitelinks provide a one-click access to specific subpages within a site which Google thinks might be of interest to the searcher. Sitelinks automatically appear under the top entry for a search to enable users to quickly find relevant content.

Now, if you've been running your own website for long, then you know just how difficult it is to entice visitors, particularly new ones, to click further through your site's subpages. Internet users have very fleeting attention, even those who have been directed to your site by their respective social media networks' prodding.

So opt to assign high-value subpages with short and concise descriptions to be your sitelinks. Doing so will ensure that you always get users’ interest simply by providing them snippets of your content so that they’ll have a quick idea of what they’ll see on your site right there and then, without at all leaving the Google search page.   

Maximize Google+ Business

Putting up a website for your business used to be the paradigm. But this practice no longer proves sustainable, not only because of Google Panda, but because people will always check their networks for advice on a product or service before actually shelling out money for it. You simply can't put up a business website and hope and pray that your targeted audience will eventually see it. It's up to you to offer your business to the people who you think really want it.

Of course, you’ll have to do so in an unabashed and sophisticated manner so that you don’t end up dissuading your audience. And you can do this by maximizing everything Google+ has to offer, particularly its Business Pages. For one, having a profile on the leading search engine’s own social media channel will ensure that your site gets indexed every time, especially for localized searches.

Secondly, Google+ Business Pages make use of a ton of applications that are very useful and handy in managing your overall business. You'll have access to a built-in payment system so financial transactions can be done within it. You can also make use of advertising and analytics tools to check whether or not the strategy you've taken is effective.

And best of all, within Google+ Business are productivity and communication tools. You can readily manage your online operations using the built-in word processors and calendars, plus connect with your network through text and video chat too. Practically everything can be done within Google now. Think of it this way: as a business owner, you already do much of your operation using Google, right? So just imagine the impact of offering those in your social media network the chance to connect with you, and a whole lot of other people on the web, without ever leaving their Google accounts.

No website owner wants to have to start from scratch every single time Google releases an algorithm update. So while there won't be any more of these updates for the rest of 2011, it is imperative that you keep abreast of the latest algorithm weather reports from the search engine giant so that you can implement changes accordingly, and in a timely manner too.