SEO: Black, White & Grey

With the advent of Search Engine Optimization (SEO), various techniques and methods for improving the visibility of a website or webpage have evolved. Industry experts have tried all sorts of processes to improve the ranking of a particular webpage so that it appears earlier on the search results page. The aim of all this optimization boils down to the basic marketing gimmick – attract more visitors and increase the frequency of their visits.

SEO strategies have moved far beyond the basic techniques: using appropriate keywords, editing HTML, removing barriers to indexing, increasing the number of backlinks, cross-linking, meta tags and much more. Over time, as these techniques gained prominence, SEO split into two broad categories: White Hat and Black Hat. I’ll shed some light on these two genres and also discuss how to test a site by combining White Hat and Black Hat.


Black Hat SEO

Well, the name itself explains a lot. Black Hat SEO serves the same purpose of improving a webpage’s search engine results, yet the search engine optimization community has deemed it unethical.

However, this wasn’t always the case. Black Hat was once legitimate, but a few folks crossed the tacit boundaries, and as a result Black Hat is now shunned completely by the SEO community.

In essence, Black Hat SEO practices go after short-term gains. You run the risk of being penalized (either temporarily or permanently) by search engines if they discover the spam techniques on your website.

Black Hat is also much cheaper, because it is usually done through automated software. I am going to present two basic examples to give you a proper idea of what Black Hat really is.

Example 1:

The SEO community regards Black Hat techniques as deceptive. One such practice uses hidden text: text colored to match the background, placed in an invisible div, or positioned off screen.
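To make the idea concrete, here is a minimal, hedged Python sketch of how such hidden-text signals could be spotted in a page’s markup. It assumes the third-party requests and BeautifulSoup packages, the URL is a placeholder, and the regular expression only covers a few obvious inline-style tricks; real detection is far more sophisticated.

```python
# Minimal sketch: flag a few common hidden-text signals in a page's markup.
# Assumes the `requests` and `beautifulsoup4` packages are installed; the
# heuristics below are illustrative only, not how search engines detect this.
import re
import requests
from bs4 import BeautifulSoup

HIDDEN_STYLE = re.compile(
    r"display\s*:\s*none|visibility\s*:\s*hidden|text-indent\s*:\s*-\d{3,}px",
    re.IGNORECASE,
)

def find_hidden_text(url: str) -> list[str]:
    """Return text from elements whose inline style suggests they are hidden."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    flagged = []
    for tag in soup.find_all(style=True):
        if HIDDEN_STYLE.search(tag["style"]):
            text = tag.get_text(strip=True)
            if text:
                flagged.append(text)
    return flagged

if __name__ == "__main__":
    for snippet in find_hidden_text("https://example.com"):  # placeholder URL
        print(snippet[:80])
```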

Example 2:

Another trick involves serving different versions of the same web page to different visitors: human or search engine. The human user is presented with one set of content, while the version served to search engines is optimized with hidden gimmicks.

 

White Hat SEO

White Hat SEO practices are employed to improve the ranking of a website with a permanent, long-term perspective. White Hat is generally slower, taking time to produce the desired results, but it conforms to the search engines’ guidelines. White Hat practices also cost much more than Black Hat techniques, as they are usually carried out by humans. Both time and money consumption are high for White Hat.

An example of a White Hat SEO practice is creating content for users, not for search engines. The content is still optimized, but with the intention of ensuring that what a search engine indexes and ranks is the same content that is presented to users.

 

Grey Hat SEO

I believe the most important factor to note is that these are tacit guidelines agreed upon by the SEO community. Because the rules are not written down as a commandment or document, the boundaries between Black, White and Grey Hat are quite blurry and often misunderstood.

Grey Hat SEO practices lie on the continuum between the two, with White Hat at one extreme and Black Hat at the other. Grey Hat techniques carry more risk than White Hat processes, but a lower risk of being penalized by search engines than Black Hat. They are questionable, but not as controversial as Black Hat SEO techniques.

But with the evolving nature of the tech world (especially the introduction of new algorithms by search engines), you never know whether today’s Grey Hat techniques will be treated as Black Hat in the foreseeable future.

 

Black Hat Techniques

I am going to talk about a few well-known Black Hat techniques that may or may not still work against the current algorithms used by search engines. They are not recommended at all.

Keyword stuffing

This technique involves using a long list of keywords to increase the keyword count, variety and density. However, such stuffed keywords are usually not really part of your website’s content. Using out-of-context keywords can make the web page look relevant to a search engine, but not to the end user.
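As a rough illustration, here is a small Python sketch of the metric being gamed: keyword density, i.e. how often a term appears relative to the total word count. The tokenization and the sample text are made up for the example, and no particular density threshold is implied.

```python
# Minimal sketch: compute keyword density for a block of page text.
# The tokenisation is simplistic and the sample text is invented;
# density alone says nothing about quality or intent.
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` equal to `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words)

sample = "Cheap shoes. Buy cheap shoes online. Cheap shoes, cheap shoes!"
print(f"Density of 'cheap': {keyword_density(sample, 'cheap'):.0%}")  # 40%
```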

Doorway/Gateway pages

This technique relies on pages built to rank in search engines and then redirect visitors to another website. You have probably seen banners and ads on the side or bottom of a webpage with “click here to enter” tags. Doorway or gateway pages are low-quality web pages, stuffed with keywords and containing little or no real content. As usual, this technique is aimed only at the search engines and has no use for the visitor.

Cloaking

Cloaking originally referred to making a website look search engine friendly by hiding animations and showing text only to crawlers. More generally, cloaking means displaying different versions of the same content to users and to search engines: server-side scripting is used to serve one set of content (or redirect to different URLs) when a human visits the webpage, and different content to search engine spiders such as Googlebot.
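Purely so you can recognize the pattern (serving it on a real site violates search engine guidelines and risks a penalty), here is a hedged sketch of user-agent based cloaking using Flask; the crawler token list and the two response strings are assumptions made for illustration.

```python
# Illustrative sketch only: how user-agent cloaking works in principle.
# Serving crawlers different content than humans violates search engine
# guidelines and can get a site penalized or de-indexed.
from flask import Flask, request

app = Flask(__name__)

CRAWLER_TOKENS = ("googlebot", "bingbot", "slurp")  # assumed token list

@app.route("/")
def index():
    user_agent = request.headers.get("User-Agent", "").lower()
    if any(token in user_agent for token in CRAWLER_TOKENS):
        # Text-heavy, keyword-optimised version shown only to crawlers.
        return "<h1>Keyword-rich text-only page</h1>"
    # Animation-heavy version shown to human visitors.
    return "<h1>Visual landing page</h1>"

if __name__ == "__main__":
    app.run()
```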

Linking

When different websites reference or hyperlink other web pages purely to pass ranking value, the technique is called linking. Such link schemes improve the ranking of web pages by fooling search engines into counting artificial links as genuine endorsements.

Spamming forums, blogs and social media sites

Spamming consists of placing links or keywords on forums, blogs and social media sites. Certain spam blogs are also created solely for commercial purposes. Links are placed on genuine sites such as Facebook, Twitter, Craigslist, etc., but they lead to illegitimate websites featuring bogus or poorly written content.

 

Grey Hat Techniques

If you are thinking about exploring Grey Hat SEO, you have to weigh time, money and the associated risks. Before moving on to how to get the best of both worlds in the SEO universe, let me reiterate what I’ve mentioned earlier.

White Hat is the “honest” approach, but playing nice with search engines (Google in particular) requires time and money. Black Hat, on the other hand, means getting blazing-fast results through automated scripts, with the disadvantage of risking a penalty (a.k.a. getting sandboxed).

Here, I am going to talk about some techniques that are considered “safe” by some: combining Black and White Hat to gain benefits quickly, but keep them for a longer period of time.

Social Bookmarking: Blog/ Forum links

Social bookmarking can be a worthwhile way of getting your website indexed and obtaining high-quality links. Websites such as Digg and StumbleUpon can greatly enhance traffic to your website and yield durable results.

But how does social bookmarking work in favor of SEO? Commenting and posting on blogs and forums can earn you credible links, but doing it at scale is a problem: automated scripts are risky, and doing all the genuine posting yourself is very time consuming. Outsourcing is the key! I know it will add to your budget, but you can outsource the posting and commenting on blogs and forums.

Some marketers use automated bookmarking tools alongside manual work. According to some, 30%-40% automation is a safe mix.

Scraping & Spinning

Scraping means grabbing the same content from different websites for spinning and publishing. You can also install scripts on your server which automatically pull content from the web and create content on your web pages after “spinning” it with the help of synonyms.

You can grab content from RSS feeds or other sites by mapping their divs, rewrite it manually, through offline software or via API calls, and insert it into your sites. WordPress-based blogs come in handy for this, as the relevant plugins are easy to get. You can literally generate a 10,000-page website with a single click, or schedule a fixed or random number of blog posts per day and leave the site on auto-pilot.
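For the sake of illustration only, here is a hedged Python sketch of the scrape-and-spin workflow described above. It assumes the third-party feedparser package; the feed URL and the tiny synonym table are placeholders, and auto-generated content of this kind is exactly what Panda-style updates were built to punish.

```python
# Rough sketch of the scrape-and-spin idea, for illustration only.
# Assumes the `feedparser` package; the feed URL and synonym table are
# placeholders. Auto-generated content like this is what Panda targets.
import random
import feedparser

SYNONYMS = {"fast": ["quick", "rapid"], "cheap": ["affordable", "low-cost"]}

def spin(text: str) -> str:
    """Naive word spinning: swap known words for a random synonym."""
    spun = []
    for word in text.split():
        key = word.lower().strip(".,!?")
        spun.append(random.choice(SYNONYMS[key]) if key in SYNONYMS else word)
    return " ".join(spun)

feed = feedparser.parse("https://example.com/feed.xml")  # placeholder feed
for entry in feed.entries[:3]:
    print(entry.title)
    print(spin(entry.get("summary", "")))
```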

Some companies even offer hosted solutions with API capabilities. Some Grey Hatters outsource the rewriting job for perfect human readability. Others simply grab content, drop more than 60% of it, and insert the rest into blog posts by moving lines and paragraphs around.

Linking

Since reciprocal linking does not bring much value these days in most cases, Grey Hatters tend to engage more in link wheels, link pyramids and/or three-way linking. This can be done manually or through software. According to some, a mix of manual and automated work is the ideal method.

Australian, Canadian and Russian software vendors currently dominate the Grey Hat market with tools that serve this purpose.

Latent Semantic Indexing (LSI)

With Google incorporating LSI into its algorithms, LSI has become a popular technique among SEO experts. With LSI in effect, the algorithms can evaluate content a little more like humans do, and Google is trying to enhance its search experience by pulling up the most relevant and informative pages. I’ve previously written a post on LSI that can be found here.

But what’s in it for Black or Grey Hatters? There are a few LSI keyword generators floating around the web that claim to optimize your content for LSI, but it’s not that simple: you have to make sure there is congruence and relevance between the keywords and the content of the webpage – its words, sentences and phrases. Moreover, relying on a single keyword will adversely affect your website’s ranking; you’ll have to use multiple related keywords.

What I’ve mentioned so far are the basics of the Black and Grey Hat SEO domain. I’ve seen webmasters and marketers use and abuse PR channels, directories, YouTube, Craigslist, Kijiji, Yelp, Backpage and so on to achieve their objectives. Some expert marketers suggest using these techniques in moderation. However, finding the magic combination takes months of experimentation.

We now have new niches that cater to Black and Grey Hatters. If you google properly or visit certain well-known forums, you can buy Facebook fans, Twitter followers, Google+, YouTube views, email accounts, Facebook and Twitter accounts, CAPTCHA breakers and almost anything else that could be used to run Black Hat and Grey Hat experiments. A lot of Black Hatters lean toward Silk Road (only accessible through the Tor network), whereas the Grey Hatters pretty much stick to forums and related portals. More and more webmasters and e-marketers are now getting involved in either Black Hat or Grey Hat marketing, which has caused an increase in sales of proxies, virtual credit cards, AdWords accounts, SEO hosting, offshore VPSes and so on.

All information on Black Hat and Grey Hat in this article is for educational and experimental purposes only and is not intended to promote them.

Is LSI better than backlinking for SEO?

Have you ever wondered how Google can guess what you’re searching for from just a few letters? Were you ever surprised to find that the advertisement on the side of the screen always seemed to match whatever you happened to be thinking about? That’s no coincidence. It’s a clever technique known as Latent Semantic Indexing, in development at Google and now progressing in leaps and bounds.

In short, LSI is the ability to spot patterns in words. People can remember dozens of ‘synonyms’, words which mean basically the same thing, and if in doubt we can fish out a thesaurus or, yes, look it up on the internet. Even though a computer can tell us word alternatives, it doesn’t understand them in the same sense, because it doesn’t speak human languages. It processes data rather than ‘reading’ in the way a person does so easily. This makes matching text by content far more complicated.

The analysis comes from a mathematical technique called Singular Value Decomposition, or SVD (simply put, a matrix factorization method for uncovering statistical patterns). You can read this wiki article on LSI for a better understanding. Google acquired Applied Semantics, whose technology helped make AdSense so successful, and put similar semantic analysis to work on improving its search engine. Before that, irrelevant pages would sometimes appear simply because they were stuffed with keywords.
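To show the underlying idea rather than Google’s actual implementation, here is a hedged sketch of latent semantic analysis in Python using scikit-learn: TF-IDF vectors are reduced with a truncated SVD so that documents sharing related vocabulary end up close together. The three sample sentences are invented for the example.

```python
# Minimal latent semantic analysis sketch (not Google's implementation):
# TF-IDF vectors reduced with truncated SVD, then compared by cosine
# similarity. The sample documents are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "inflation and interest rates worry the central bank",
    "the economy slows as prices and inflation rise",
    "hot air balloon rides over the valley at sunrise",
]

tfidf = TfidfVectorizer(stop_words="english").fit_transform(docs)
lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

# Pairwise similarity in the reduced space; the two economics sentences
# should end up grouped together, away from the ballooning sentence.
print(cosine_similarity(lsa))
```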

Any webmaster familiar with Search Engine Optimization will know how it used to work – the text of a website would be filled to the brim with the same words over and over again, in order to trick search engines into thinking that the site had more relevant content than it really did. This ‘keyword stuffing’ technique bumped up the page ranking of these sites so that they appeared on the first page of search results. So, whatever you typed, you’d be likely to get a boring article, so full of key phrases it was practically unreadable.

On the other end of the scale, companies would insert irrelevant words into their text in order to ensnare people who were looking for something completely unrelated. Sneakily putting popular search words in to get hits on the site used to work (and still does on sites like YouTube), but with the semantic indexing project, all this does is water everything down. It’s a bit like planting seeds over too wide a field: some of them will get neglected. If a webmaster scatters keywords over too many distant topics, Google won’t recognize the site as a specialist in any topic.

Google’s ability to search for related content is a breakthrough. It ignores repetition and strikes offending sites from the rankings. Making use of this is a simple matter of using a whole bunch of related terms which people commonly search for – a technique called Latent Semantic Indexing. It sounds complicated, but it is actually a fairly basic part of SEO and something which we do naturally.

For the best success, you will need to find the most searched-for terms. Using Google itself, any regular web author can put a tilde (~) in front of a search term to find results running along the same theme. To see which terms are likely to be most popular, head to a ‘high authority’ site – a website that is renowned for its accurate information on whatever subject your site is about. Usually this is a big company, like Apple. Whichever terms they use, web authors should use too.

Not only that, but Google can tell when two words relate to different topics despite being spelled the same (so-called ‘polysemy’). It will not confuse inflation of the economy with inflation of a balloon. The spiders are able to crawl through the results and find out which words and phrases relate to each sense. A page about inflation in the monetary sense is not likely to contain the words ‘hot air’. So if you search ‘inflation hot air’, you will not get a piece from the Wall Street Journal.

In that case, it’s better not to use any terms which might cause the poor Google spiders to get confused. If you wrote ‘this panic about inflation is a load of hot air’, your article might be mistaken for a ballooning article, and those who want to read about the economy won’t find your site with the usual search terms.

Now that Google and other search engines analyze content more closely, the need for endless backlinks and anchor text has been reduced. This should be a huge relief to any individual webmaster or starting-out company with not many links to show for themselves – now the new kids on the block get a piece of the pie, rather than having all the website visits stolen by big corporations who throw money at IT specialists.

Before, topping the results included bumping up the page rank using links to other related sites (particularly ‘high authority’ ones) or internal links to other pages on that particular website. Keyworded-up, the links drew the spiders, which explored the site, checked how many reliable, functioning links there were and made assumptions about the content of the site based on how many other decent sites it referenced. All the lines and colors made such pages a hideous strain on the eyes, and rarely was the content any good – scraps of info laced with advertising opportunities.

These days, it’s actually better to reduce the number of links. Too many links going all over the place send your friendly internet spiders away from the most important places. Say you linked to an ICT firm and your main site from your blog. You probably want more focus on your main site, so that your main site can get an increased page rank, right? That won’t happen. Those links have equal value, so the spiders will branch off in all directions and get distracted.

The best way to set up a site that makes full use of LSI is to have numerous pages relating to a number of select topics within a theme, linked together in a straightforward way. If you make the search engine crawlers’ job easier, you will reap the benefits. Here is how the site should be mapped:

Every page contains only one link, which leads directly to the next page, and the last page in each chain leads to an external website. As you can see, the site is divided into distinct sections so that the subjects covered aren’t competing with each other. For each topic, it’s best to pick a title and an angle that isn’t likely to have been used before. That way, you have no competitors for that article.

On a general subject like nutrition, you might have three themes (at minimum). Those loose themes are ‘landing pages’, areas where all the related topics are pooled. The themes could be “Common Food Myths”, “The Uses of Supplements” and “The Truth about Exercise”. Under each theme, there should be at least five separate pages linking into each other, each explaining an individual topic within the theme – e.g. under Food Myths, one article on meat, another on diets, etc. This layout will help both spiders and people navigate the site. A search for ‘meat myths’ will fetch up that one specific page, which then links to the rest of your site. This is better than putting all the good stuff on one page and risking looking irrelevant in the all-seeing eyes of Google.
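As a concrete (and entirely hypothetical) sketch of that layout, the Python snippet below models three landing pages, each chaining through its topic pages one link at a time, and prints the resulting internal link plan. The page slugs are made up for the example.

```python
# Hypothetical sketch of the "landing page -> topic pages" structure
# described above: each page links only to the next one in its theme.
# The page slugs are invented for illustration.
SITE = {
    "common-food-myths": ["meat-myths", "diet-myths", "sugar-myths"],
    "uses-of-supplements": ["vitamins", "protein-powders", "omega-3"],
    "truth-about-exercise": ["cardio", "strength-training", "stretching"],
}

def link_plan(site: dict[str, list[str]]) -> list[tuple[str, str]]:
    """Return (from_page, to_page) pairs: landing -> topic -> ... -> last topic."""
    links = []
    for landing, topics in site.items():
        chain = [landing] + topics
        links.extend(zip(chain, chain[1:]))
    return links

for src, dst in link_plan(SITE):
    print(f"/{src}/ -> /{dst}/")
```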

Smart web maintenance involves checking for the presence of keywords related specifically to the subject. Take this blog – even as you read it, it has been written with semantic search engines in mind to increase its chances of being picked up. Each paragraph you have read contains a handful of words relating to LSI without resorting to repetition: any writing on semantically related words will contain basic terms like ‘webpage’ and ‘Google’, while more specific terms like ‘synonym’ help a webmaster cover all the variables of SEO. Care has also been taken not to overlap with polysemic words or clash with other major sites. All it takes is some research and careful planning. High-ranking blogs and articles go to the best prepared.

A search-optimized piece of writing does not have to be over-technical or mind-numbingly dull. No longer do good search engines favour pages that endlessly repeat themselves. LSA is a system that, like most new tech, improves the way we use the internet, allowing the freedom to express what needs saying without worrying about falling in the ranks. It’s a new era for web content, and every web writer is champing at the bit to stay on top – understand Latent Semantic Indexing and you will achieve this. Also, applying LSI doesn’t completely override the value of backlinking, although with the launch of ‘Google Caffeine’ it appears that LSI’s importance is now a bit higher than backlinking’s. LSI is just one of the components you should consider if you want to optimize your site or portal for search engines.

Google Panda and the online world

Google Panda: A New Beginning for Some, but the Beginning of the End for Others

Google was founded in 1998 and almost immediately threw the likes of Yahoo and AltaVista out of competition with the unique algorithm its search engine was founded on. Whereas the latter two search engines focused more on the quality of content and its journalistic integrity, Google had found another – and supposedly better – way of ranking sites. Although the new algorithm initially took website owners by surprise, SEO specialists soon discovered a way to make their websites rank better than sites with professional journalists on their staff.

Writing for Search Engines

After a period of time, wily website owners discovered that writing for search engines rather than for the readers themselves was what would give their websites higher ranking in Google’s search page results. This meant regularly updating their websites with keyword-rich content. Google’s algorithm apparently didn’t place much importance on whether or not new content actually added value to the website. 

Introducing Google Panda

In February 2011, Google once again surprised the Internet community with Google Panda. It consisted of a series of updates to Google’s algorithm designed to overhaul the ranking process. Google wanted high-quality websites back on the map, while low-quality sites were meant to suffer the consequences of their low-quality content. Google Panda was supposed to be different because it incorporated more “user feedback signals”.

Changes Caused by Google Panda

The changes caused by Google Panda were not just immediate but considerably significant as well. News websites reported a quick and huge surge in rankings. Fox News, which had previously ranked #89, climbed to #23 after one of the Panda updates rolled out.

Conversely, websites like EzineArticles.com and AssociatedContent.com suffered the backlash with lowered rankings. Other “content farm” websites like wikiHow and eHow also suffered, and even comparison websites like NexTag took a hit. Unfortunately, the same changes also caused a decrease in rankings for websites with original and legitimate content, such as the British Medical Journal.

CNET’s report also revealed many other interesting results. It showed that Google Panda did not affect the top rankings enjoyed by websites like YouTube, Amazon, Wikipedia, and IMDb. These websites can safely be considered dominant authorities in their respective niches.

Google Panda updates also appeared to cause a slight improvement in the rankings of social Web 2.0 sites such as Facebook and Twitter, while other sites like WebMD, Flickr, Yelp, and even Apple.com suffered a minimal decrease in their rankings. On the upside, government websites enjoyed an increase in their rankings. The White House’s official website, for instance, surged to #79 from its previous ranking of #125.

No More Updates for 2011

Google recently announced that it will not release any further Panda updates for the remainder of 2011. Although that leaves website owners only a few weeks at most to regroup and rethink their strategies, the additional time for adjustment could still mean a lot. With so many news reports recently flooding the Internet to help affected website owners survive Google Panda, there is no reason why website owners cannot gradually re-learn the old ways and place greater priority on content quality, as they once did.

 

How to Survive in the Post-Panda Apocalypse

In 1998, Google turned the Internet upside-down with its new search algorithm, one that SEO experts found a way to exploit and which eventually allowed low-quality websites to lord it over high-quality websites. In February 2011, Google once more initiated a shake-up with the launch of Google Panda, which updated and supposedly improved its search algorithm. The new algorithm was supposed to bring back the former glory of high-quality websites. Whether this turned out to be true or not is, in the end, unimportant. As Google Panda seems here to stay, website owners have no choice but to follow the beat of the drummer and march accordingly.

Black Hat versus White Hat SEO Techniques

In the past, black hat SEO techniques were frowned upon but were nonetheless allowed to exist and help improve website rankings. Now, Google has become much stricter, and websites that employ black hat SEO techniques are likely to suffer a huge blow in terms of rankings. In fact, it has already happened to “content farm” websites such as EzineArticles.com and FindArticles.com.

A black hat SEO technique is best defined as anything meant to please search engines without a care for how it serves human readers or visitors. Examples of commonly used black hat SEO techniques include invisible keyword text, keyword stuffing, and mirror websites. If your website uses any black hat SEO technique, it’s best to rectify the matter immediately, before Panda catches on and red-flags your website.

White hat SEO techniques are, of course, the opposite. These techniques please both search engines and human readers. Examples of white hat SEO techniques include posting original, well-researched and immensely readable content.

Obviously, employing white hat SEO techniques could appear to be an enormous challenge. This is especially true for website owners who had gotten lax over the years and mostly relied on black hat SEO techniques to improve their rankings. Change, however, is not just inevitable but is already taking place. Websites have to learn how to adapt to Google Panda if they want to survive. 

Top Tips on How to Survive Google Panda

Even if your website has already been flagged, the good news is that Google is a relatively forgiving search engine. You can always start again by making the necessary changes to your website, and Google will rank your website accordingly. The tips below can help you get started.

  • Duplicated content is a big no-no. Ignoring the results of duplicate-content detection tools like CopyScape is one of the worst mistakes you can make. If content on your site also appears on another website, even content you genuinely wrote, rewrite it before reposting it. Google doesn’t mind you doing that, but it does care if you simply cut and paste the entire thing onto your website. A simple similarity check, like the sketch after this list, can help you spot the worst offenders.
  • Place greater emphasis on original and value-rich content. Conversely, reduce the amount of web space that you dedicate to advertisements. The new algorithm frowns upon sites that carry tons of these, so tone them down on your site as soon as possible.
  • Google Panda loves buzz-worthy websites. The new search algorithm gives higher rankings to websites or pages that are shared socially. Think of webpages that get bookmarked, tweeted, shared through Facebook, and blog posts that have lots of comments. This means not only opening your site fully to user comments, but also integrating tools that allow for social media sharing of your site.
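For the first point, here is a hedged, standard-library-only Python sketch of a rough similarity check between two blocks of text; real tools such as CopyScape work at web scale and are far more thorough, and the sample strings are made up.

```python
# Rough duplicate-content check using only the standard library.
# Real services like CopyScape are far more thorough; the sample
# strings below are invented for illustration.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a rough 0-1 similarity ratio between two blocks of text."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

original = "Google Panda rewards original, useful content."
reposted = "Google Panda rewards original and useful content."

score = similarity(original, reposted)
print(f"Similarity: {score:.0%}")  # anything close to 100% is worth rewriting
```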

 *****************

Major Areas Affected by the Google Panda Update 

One cannot avoid being reliant on Google’s search engine for visibility in the online world. After all, most of the search engine optimization techniques conceived so far are based on how this search giant develops its algorithms for indexing and ranking sites on its organic search results pages. So it comes as no surprise that a huge number of complaints and arguments have surfaced since the Panda update rolled out. In an attempt to understand this significant change in the online visibility game, presented here is an outline of some of the types of websites that have been greatly affected by Panda.

Affiliate Marketing

According to a poll conducted by Search Engine Roundtable, affiliate sites seem to be the hardest hit. Given that the general aim of the update was to improve user experience, stricter criteria have been applied to the content of indexed websites. It is too often forgotten that usability goes hand-in-hand with search engine optimization, and unfortunately this lapse occurs most frequently in affiliate marketing sites that tend to over-optimize.

One specific aspect that usually counts against such sites is their ad-to-content ratio. Some of these sites carry so many clickable banners that they put off visitors. From a monetization perspective this may be understandable, but with the coming of Panda content is king now more than ever, and affiliate marketers need to adjust to these stricter standards for quality content to survive the update.

Online Stores

Web retailers come in second place in that poll. Some Yahoo! Stores were also listed on the Sistrix Visibility Index, which tracked specific domains that were negatively affected by the update. The way pundits analyze the loss of traffic for e-commerce sites, it seems that once again content is the issue.

Admittedly, it is not easy to come up with unique descriptions and blurbs for every product when one has an inventory of hundreds. The typical solution has been to use templates and what is called boilerplate text. It is not hard to see how this can pull down an online store’s content quality and thus affect its ranking under the new search algorithms.

Answer and How-to Sites

Google has publicly stated that content farms are the main targets of the update. While putting more focus on separating the wheat from the chaff is certainly a welcome initiative, judging the quality of online content can be quite complex. Can one unambiguously classify sites such as eHow or WonderHowTo as content farms?

The latter’s CTO, Bryan Crow, divulged that scrapers as well as legitimate partner sites were actually outranking them on the SERPs after the update. The painful point here is that WonderHowTo was the original source of the duplicated content. Syndication and curation are part of the operations of such sites, which means that certain content is inevitably going to get re-published. According to his analysis, one possible solution is to look for specific problem content and apply a noindex tag to those pages. As the Panda update works at the domain level, this method would spare the other pages on the site that actually have unique, quality content. On Google’s part, it has at least accommodated such situations by publicly asking for data points from sites that are losing to scrapers.

 *****************

Implications of the Google Panda Update on E-Commerce

Browse through the news regarding this latest major overhaul of Google’s search engine, and you can see that e-commerce websites are among those that got hit by Panda. After playing by the rules of SEO and tenaciously maintaining their visibility, a lot of small and medium online retailers seem to be at a loss over their sudden drop in ranking in the SERPs (search engine results pages).

The changes Google made to its algorithms seem to refocus and put more emphasis on content quality. Although ‘quality’ is quite a subjective term, the search giant was good enough to publish guidelines. Here is how some online marketing experts would apply the new set of criteria to those who sell products or services on the web.

Over-optimized

Sometimes the overriding concern for visibility can lead owners to go over the top. There’s a thin line between keyword optimization and spamming, and retailers should be more careful, as the newly modified search algorithm is more merciless when it comes to such practices. Retailers also tend to put too much emphasis on optimization at the expense of user experience. Getting traffic is certainly important, as it is a foot in the door, but conversion is what actually brings in the profits, and that depends on how useful the user finds the site, which means it all comes down to how useful the content is.

Repetition

Maintaining a multipage site filled with products under various categories can be a tedious operation. More often than not, retailers tend to simplify things by using cookie-cutter texts slapped into over-used design templates. While this makes the editing and updating easier, such monotony can readily drain any initial interest in the prospective customer. When the same product description can be found on a competitor’s site where similar merchandise is available, that’s likely going to be flagged as duplicated content.

Devalued Content

One can have good content but still fail to show it. The really useful stuff could be buried so deep in other pages that the search engine spiders can’t find it. One can also make the mistake of going too far in the opposite direction and distributing it indiscriminately through feeds and affiliate sites. There have been cases after the update where shopping and affiliate sites outranked the online stores where the re-published content actually originated. Some SEO experts recommend that when you do come up with unique, quality content, it is best to keep it on your own site and maintain its uniqueness.

Links

Linked texts that use keywords as anchors are often used for site navigation, such as getting back to the top-level categories. In this respect, retailers can still commit the repetition mistake. It can be annoying for users to keep returning to the same page no matter what they click on. There has to be a clear and diversified path for prospective customers to follow, where each piece of content builds on the previous one. This creates an avalanche of compelling messages that hopefully leads to conversion.

Social Media Elements

One of the significant changes that Google implemented was to put more weight on user feedback. Ever since social media gained ground in the online world, its impact on e-commerce has been discussed frequently by marketing experts, and this update only serves to solidify that inevitable influence. Product reviews, commenting and all the other relevant social media plug-ins, when used properly, can benefit online businesses directly as well as improve SERP rankings when the new Googlebots drop by.

 *****************

Social Media’s Role in a Post-Google Panda SEO World

There’s no doubt that the changes after Google Panda’s rollout have been sweeping. Websites that were once on top of the search engine results saw a marked decrease in their traffic. Needless to say, practically everyone with their own real estate on the web was caught off-guard. Once the ruckus died down, what website owners, and even website management firms and SEO specialists, were left with were websites that needed a total overhaul.

This overhaul is needed, plain and simple. And one of the crucial factors you need to look into is your site’s overall social media value. Google’s Panda update places significant weight on a site’s buzz-worthiness, and if yours is anything but buzz-worthy, then you’ll have a challenging time crawling back into Google’s good graces.

Have Content That’s Worth Sharing

Content has always been and will always be a crucial and reliable gauging mechanism of a site’s quality. But because of Google’s prior algorithm that favored pages with tons of keywords, marketers found a clever way to out-Google Google, and that is by churning out keyword-heavy articles by the thousands, and on a daily basis too.

But it wasn’t long before end users complained, as all they got to see was useless content. Add to that the fact that most of the pages they were redirected to were ad-heavy web pages filled with keyword-spammy articles.

But this practice can no longer survive post-Google Panda. Putting out quality content on a regular, preferably daily, basis should be a top priority. Once you’ve got this in order, your goal of having a buzz-worthy site will slowly materialize and everything should fall into place. Your site should once again be indexed by Google, and the people in your social media network will, without a second thought, share your content with their respective networks as well.

Catch New Visitors’ Attention With Your Sitelinks

Perhaps the newest and most noticeable change after Google Panda rolled out was the appearance of sitelinks for specific website searches. Sitelinks provide one-click access to specific subpages within a site which Google thinks might be of interest to the searcher. They automatically appear under the top entry for a search so that users can quickly find relevant content.

Now, if you’ve been running your own website for long, then you know just how difficult it is to entice visitors, particularly new ones, to click further through your site’s subpages. Internet users have very fleeting attention spans, even those who were directed to your site by their social media networks’ prodding.

So assign high-value subpages with short, concise descriptions to be your sitelinks. Doing so helps capture users’ interest by giving them snippets of your content, so they get a quick idea of what they’ll see on your site right there and then, without ever leaving the Google search page.

Maximize Google+ Business

Putting up a website for your business used to be the paradigm. But this practice no longer proves sustainable, not only because of Google Panda, but because people will always check their networks for advice on a product or service before actually shelling out money for it. You simply can’t put up a business website and hope and pray that your target audience will eventually see it. It’s up to you to offer your business to the people who you think really want it.

Of course, you’ll have to do so in an unabashed and sophisticated manner so that you don’t end up dissuading your audience. And you can do this by maximizing everything Google+ has to offer, particularly its Business Pages. For one, having a profile on the leading search engine’s own social media channel will ensure that your site gets indexed every time, especially for localized searches.

Secondly, Google+ Business Pages make use of a ton of applications that are very useful and handy for managing your overall business. You’ll have access to a built-in payment system so that financial transactions can be handled within it, and you can use advertising and analytics tools to check whether or not the strategy you’ve taken is working.

And best of all, within Google+ Business are productivity and communication tools. You can readily manage your online operations using the built-in word processors and calendars, plus connect with your network through text and video chat too. Practically everything can be done within Google now. Think of it this way: as a business owner, you already do much of your operation using Google, right? So just imagine the impact if you offer those in your social media network the chance to connect with you, and a whole lot of other people on the web, without ever leaving their Google accounts.

No website owner wants to have to claw their way back up every single time Google releases an algorithm update. So while there won’t be any more of these updates for the rest of 2011, it is imperative that you keep yourself abreast of the latest algorithm weather reports from the search engine giant so that you can implement the necessary changes promptly.