Essential Conversion Tips

There are many ways to drive traffic towards your e-commerce website. Search engine optimization can move your site closer to the top of the search results, pulling in droves of interested visitors. Social media can generate buzz for your brand, and a good email list can help keep prospects informed and interested. None of these techniques is worth much, however, if the visitors they attract never convert into sales.

When you’re first starting out, it’s a common error to focus on traffic rather than on leads and sales. True, traffic is the lifeblood of any online enterprise — but only if you can establish a good conversion rate. These tips will help your enterprise to turn clicks into real sales.

Every successful enterprise starts with a solid and effective plan. The same goes for Web-based enterprises. Determine what you’re offering, how you’ll deliver it and what makes your business outstanding. Think about the elements your site will need to communicate with your visitors and what they’ll need in order to make a purchase. The better your focus at the beginning, the less patchy, fragmented or confusing your site will be.

If your site is already up and running, the first item on your checklist should be your page load times. If there’s one thing that will send prospective customers fleeing, it’s a page that takes too long to load completely. Even a fraction of a second can be enough to cause your visitors to disengage. By streamlining your design and getting rid of any extraneous elements, you can shorten load times and make certain that you’re not keeping your customers waiting.
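
As a quick sanity check, you can time how long your key pages take to answer. Below is a minimal sketch in Python using the third-party requests library; the URLs are placeholders for your own pages. Note that this measures server response time only, not the full in-browser render time that your browser’s developer tools will report.

```python
# Minimal sketch: timing how long key pages take to respond.
# The URLs below are placeholders; substitute your own pages.
import time

import requests  # third-party: pip install requests

PAGES = [
    "https://www.example-store.com/",
    "https://www.example-store.com/checkout",
]

for url in PAGES:
    start = time.perf_counter()
    response = requests.get(url, timeout=10)
    elapsed = time.perf_counter() - start
    print(f"{url}: {elapsed:.2f}s (status {response.status_code})")
```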

While multimedia elements such as audio and video can add value to your site, they can also be distracting and irritating. Media should not play automatically or obscure other elements when it launches. If the first thing your visitors have to do when they visit your site is close down an intrusive video or animation, they won’t be well disposed towards buying from you.

Next, examine your site’s overall usability. Your color, font and background choices can affect readability — ditch any low-contrast color schemes, distracting patterns or tiny typefaces. Interestingly, using a larger font size has been shown to improve conversion rates in many cases.

There should be a smooth path from your landing page to the point where a customer can place an order. If you make the process of buying your products too complicated, many people will be discouraged before they ever get as far as your checkout. Include clear pricing for every item and make sure there are easy-to-follow links to different products or sections. You should always include a site map — this is helpful to humans and also to the web crawlers that index your site for search engines like Google.
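
A basic site map is also easy to generate yourself. The sketch below produces a minimal sitemap.xml in the standard sitemaps.org format; the URL list is hypothetical, and a real store would pull its URLs from the product catalog.

```python
# Minimal sketch: generating a basic sitemap.xml from a list of URLs.
# The URLs are illustrative placeholders.
from xml.sax.saxutils import escape

urls = [
    "https://www.example-store.com/",
    "https://www.example-store.com/products/widgets",
    "https://www.example-store.com/products/gadgets",
]

entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("sitemap.xml", "w") as f:
    f.write(sitemap)
```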

If you have a lot of different items for sale, don’t try to cram them all onto the first page. Catalog them and make sure that you have a fast, effective search system. Don’t limit the results per page too stringently — consider making the default 75 or 100 items rather than 10 or 20. You could even use an endless scroll, since most people are happier to scroll down the page than they are to click link after link to find the item they want. Just make sure that your automated scrolling element isn’t knocking navigation links off the bottom of the screen.

This final point can never be stressed too much: Registration should always be optional. Trying to force your customers to set up an account for no real reason can seriously damage your sales. Studies have shown that even if a prospective customer has selected their items and is ready to pay, telling them they have to register an account to complete their transaction will cause almost one-third of them to abandon their purchases. Only ask for the minimum amount of information you need to provide your customers with the items they want and reassure them that their privacy will be respected. Offer the opportunity to set up an account if they wish but always give them the option to check out without registering.

Instead, provide social media integration for your visitors. Giving people an easy way to link their Google Plus or Facebook accounts with your site is a more productive way of creating connections than adding an unnecessary registration step.

By keeping their websites simple and avoiding common pitfalls such as unnecessary registration, Web-based businesses can stop losing valuable sales. The advice above is aimed at those new to Internet marketing and e-commerce, and it boils down to one principle: conversion rates improve when the sales process is easier for customers to navigate.

SEO: Black, White & Grey

With the advent of Search Engine Optimization (SEO), various techniques for improving the visibility of a website or webpage have evolved. Industry experts have tried many processes to improve the ranking of a particular webpage so that it appears earlier in the search results. The aim of optimizing for search engines comes down to a basic marketing goal: attract more visitors and get them to visit more often.

SEO strategies have grown far beyond the basic techniques: using appropriate keywords, editing HTML, removing barriers to indexing, increasing the number of backlinks, cross-linking, meta tags and much more. Over time, SEO practice has come to be divided into two broad categories: White Hat and Black Hat. I’ll shed some light on these two genres and also discuss how to test a site by combining them.

Black Hat SEO

Well, the name itself explains a lot. Black Hat SEO serves the same purpose of improving a webpage’s position in the search results, yet the SEO community has deemed its methods unethical.

This wasn’t always the case, however. Many Black Hat techniques were once legitimate, but a few practitioners crossed the tacit boundaries, and as a result Black Hat is now shunned completely by the SEO community.

In essence, Black Hat SEO chases short-term gains. You run the risk of being penalized (either temporarily or permanently) by search engines if they discover spam techniques on your website.

Black Hat is also much cheaper, because it is usually done with automated software. I am going to present two basic examples to give you a proper idea of what Black Hat really is.

Example 1:

The SEO community regards Black Hat techniques as deceptive. One such practice uses hidden text: text colored to match the background, tucked into an invisible div, or positioned off screen.

Example 2:

Another trick involves serving different versions of the same web page to different audiences: humans and search engines. The human visitor sees the normal content, while the version served to search engine crawlers is loaded with optimization tricks.

White Hat SEO

White Hat SEO practices are employed to improve the ranking of a website with a permanent, long-term perspective. White Hat is generally slower, taking time to produce the desired results, but it conforms to the search engines’ guidelines. White Hat also costs much more than Black Hat, as the work is usually done by humans. Both the time and the money required are high.

An example of a White Hat practice is creating content for users rather than for search engines. The content is still optimized, but with the intention of ensuring that what a search engine indexes and ranks is the same content that users actually see.

Grey Hat SEO

I believe the most important thing to note is that the guidelines agreed upon by the SEO community are tacit. Because these rules are not written down as commandments anywhere, the boundaries between Black, White and Grey Hat are blurry and often misunderstood.

Grey Hat SEO practices lie on a continuum, with White Hat at one extreme and Black Hat at the other. Grey Hat techniques carry more risk than White Hat processes, but less risk of a search engine penalty than Black Hat. They are questionable, but not as controversial as Black Hat SEO techniques.

But with the evolving nature of the tech world (especially the introduction of new search engine algorithms), you never know whether today’s Grey Hat techniques will be treated as Black Hat in the foreseeable future.

Black Hat Techniques

I am going to describe a few well-known Black Hat techniques. They may or may not still work against the current algorithms used by search engines; either way, they are not recommended at all.

Keyword stuffing

This technique involves loading a page with a long list of keywords to inflate the keyword count, variety and density. Such padded keywords are usually not a natural part of your website’s content. Using out-of-context keywords can make the page look relevant to a search engine, but not to the end user.
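
To make the metric concrete, here is a rough Python sketch that computes keyword density, the number being inflated here, for a made-up snippet of stuffed text. The sample text is invented, and real search engines weigh far more signals than raw density.

```python
# Rough sketch: computing keyword density, the metric that
# keyword stuffing inflates. The sample text is made up.
import re
from collections import Counter

text = """Cheap shoes! Buy cheap shoes online. Our cheap shoes
are the cheapest shoes around. Shoes, shoes, shoes."""

words = re.findall(r"[a-z']+", text.lower())
counts = Counter(words)
total = len(words)

for word, count in counts.most_common(3):
    print(f"{word}: {count}/{total} = {count / total:.0%}")
```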

Doorway/Gateway pages

This technique uses low-quality web pages, stuffed with keywords and containing little or no real content, whose only job is to funnel visitors and crawlers on to another website. You have probably seen such pages, with banners and “click here to enter” links at the side or bottom. As usual, this is relevant only to search engines and useless to the visitor.

Cloaking

Cloaking is the technique of making a website appear search-engine friendly by hiding animations and showing crawlers text only. More broadly, cloaking refers to displaying different versions of the same content to users and to search engines: server-side scripting serves one version of a page (or redirects to a different URL) when a human visits, and another version to search engine spiders such as Googlebot.
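
To make the mechanism concrete, here is a deliberately crude sketch of User-Agent-based cloaking using the third-party Flask library. It is illustrative only, in line with the disclaimer at the end of this article: the bot signature list is made up, and modern search engines detect and penalize exactly this behavior.

```python
# Illustrative only: a crude server-side cloaking sketch with Flask.
# It inspects the User-Agent header and serves different content to
# crawlers than to human visitors. Search engines penalize this.
from flask import Flask, request  # third-party: pip install flask

app = Flask(__name__)

BOT_SIGNATURES = ("googlebot", "bingbot", "slurp")  # made-up shortlist

@app.route("/")
def home():
    user_agent = request.headers.get("User-Agent", "").lower()
    if any(bot in user_agent for bot in BOT_SIGNATURES):
        # Keyword-stuffed version shown only to crawlers
        return "<h1>Cheap widgets, best widgets, buy widgets</h1>"
    # What human visitors actually see
    return "<h1>Welcome to our store</h1>"

if __name__ == "__main__":
    app.run()
```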

Linking

When different websites refer to or hyperlink each other’s pages purely to manipulate rankings, the technique is simply called Linking. Artificial link networks improve the ranking of web pages by fooling search engines about how popular those pages really are.

Spamming forums, blogs and social media sites

Spamming consists of placing links or keywords on forums, blogs and social media sites. Some spam blogs are created solely for commercial purposes. Links are placed on genuine sites such as Facebook, Twitter and Craigslist, but lead to illegitimate websites featuring bogus or poorly written content.

Grey Hat Techniques

If you are thinking about exploring Grey Hat SEO, you have to weigh time, money and the associated risks against each other. Before moving on to how to get the best of both worlds in the SEO universe, let me reiterate what I mentioned earlier.

White Hat is the “honest” approach, but playing nice with search engines (read: Google) costs time and money. Black Hat, by contrast, gets blazing-fast results through automated scripts, with the attendant risk of being penalized (a.k.a. sandboxed).

Here, I am going to talk about some techniques that are considered “safe” by some: combining Black and White Hat to gain benefits quickly, but over a longer period of time.

Social Bookmarking: Blog/Forum links

Social Bookmarking can be a worthwhile way of getting your website indexed and obtaining high-quality links. Websites such as Digg and StumbleUpon can greatly increase traffic to your website and yield durable results.

But how does social bookmarking work in SEO’s favor? Commenting and posting on blogs and forums can earn you credible links, but automated scripts won’t earn anything credible, and doing all the genuine posting yourself is very time consuming. Outsourcing is the key: it will add to your budget, but you can pay others to handle the posting and commenting on blogs and forums.

Some marketers use automated bookmarking tools alongside manual work. According to some, a mix of 30%-40% automation is safe.

Scraping & Spinning

You can grab the same content from different websites, spin it and republish it. You can even install scripts on your server that automatically pull content from the web and generate pages after spinning the text with synonyms.

You can grab content from RSS feeds or other sites by mapping divs, then rewrite it manually, through offline software or via API calls, and insert it into your sites. WordPress-based blogs come in handy here, as the necessary plugins are easy to find. You can literally generate a 10,000-page website with a single click, or schedule a fixed or random number of blog posts per day and leave the site on auto-pilot.
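
As a concrete illustration of the scrape-and-spin idea (again, purely for the educational purposes noted at the end of this article), here is a minimal Python sketch using the third-party feedparser library. The feed URL and the synonym table are placeholders, and real spinners are far more elaborate than this word-for-word substitution.

```python
# Illustrative sketch of scrape-and-spin: pull entries from an RSS
# feed and naively swap in synonyms. Placeholders throughout.
import feedparser  # third-party: pip install feedparser

FEED_URL = "https://example.com/feed"  # placeholder feed
SYNONYMS = {"great": "excellent", "fast": "quick", "buy": "purchase"}

def spin(text):
    # Naive word-for-word substitution; real spinners are smarter.
    return " ".join(SYNONYMS.get(word.lower(), word) for word in text.split())

feed = feedparser.parse(FEED_URL)
for entry in feed.entries[:5]:
    print(spin(entry.get("title", "")))
    print(spin(entry.get("summary", "")))
```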

Some companies even offer hosted solutions with API capabilities. Some Grey Hatters outsource the rewriting job for perfect human readability; others simply grab content, alter more than 60% of it, and insert it into blog posts by moving lines and paragraphs around.

Linking

Since reciprocal linking no longer brings much value in most cases, Grey Hatters tend to engage more in Link Wheels, Link Pyramids and/or three-way linking. This can be done manually or through software; according to some, a mix of manual and automatic is the ideal method.

Australian, Canadian and Russian software vendors currently dominate the Grey Hat market with tools that serve this purpose.

Latent Semantic Indexing (LSI)

With Google incorporating LSI into its algorithms, LSI has become a popular technique among SEO experts. With LSI in effect, ranking algorithms can interpret content a little more like humans do, and Google aims to enhance its search experience by pulling up the most relevant and informative pages. I’ve written a previous post on LSI that can be found here.

But what’s in it for Black or Grey Hatters? There are a few LSI keyword generators floating around the web that claim to optimize your content for LSI. It’s not that simple, though: you have to make sure there is congruence and relevance between your keywords and the content of the page, down to its words, sentences and phrases. Moreover, relying on a single keyword will hurt your site’s ranking; you’ll have to use multiple related keywords.

Everything I’ve mentioned so far covers the basics of the Black and Grey Hat SEO domain. I’ve seen webmasters and marketers using and abusing PR channels, directories, YouTube, Craigslist, Kijiji, Yelp, BackPage and so on to achieve their objectives. Some expert marketers suggest using these techniques in moderation; even so, finding the magic combination takes months of experimentation.

We now have entire niches that accommodate Black and Grey Hatters. If you google carefully or visit certain well-known forums, you can buy Facebook fans, Twitter followers, Google+ followers, YouTube views, email accounts, Facebook and Twitter accounts, CAPTCHA breakers and almost anything else that could be used to run Black Hat and Grey Hat experiments. Many Black Hatters lean towards Silk Road (only reachable over the Tor onion-routing network), whereas Grey Hatters pretty much stick to forums and related portals. More and more webmasters and e-marketers are getting involved in Black Hat or Grey Hat marketing, which has driven up sales of proxies, virtual credit cards, AdWords accounts, SEO hosting, offshore VPSes and so on.

All information on Black Hat and Grey Hat in this article is provided for educational and experimental purposes only; none of it is intended to promote these techniques.

Is LSI better than backlinking for SEO?

Have you ever wondered how Google can guess what you’re searching for from just a few letters? Were you ever surprised to find that the advertisement on the side of the screen always seemed to match whatever you happened to be thinking about? That’s no coincidence. It is a clever technique known as Latent Semantic Indexing, in use at Google and now progressing in leaps and bounds.

In short, LSI is the ability to spot patterns in words. People can remember dozens of synonyms, words which mean basically the same thing, and if in doubt we can fish out a thesaurus or, yes, look it up on the internet. A computer can list word alternatives, but it doesn’t understand them in the same sense, because it doesn’t speak human languages: it processes data rather than ‘reading’ the way a person does so easily. This makes matching text by meaning far more complicated.

The analysis comes from a linear algebra technique called Singular Value Decomposition, or SVD: simply put, a matrix factorization that lets a computer extract statistical patterns from word usage. You can read the Wikipedia article on LSI for a better understanding. Google acquired Applied Semantics, whose technology underpinned the successful AdSense program, and put semantic analysis to work improving the relevance of its search engine. Before that, irrelevant pages would sometimes appear simply because they were stuffed with keywords.
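
For readers who want to see the idea in action, here is a small sketch of latent semantic analysis using the third-party scikit-learn library: TF-IDF vectors reduced with the truncated SVD mentioned above. The toy documents are invented, and this illustrates the technique, not Google’s actual implementation.

```python
# Small sketch of latent semantic analysis: TF-IDF + truncated SVD.
# Toy documents; not Google's implementation.
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "inflation and interest rates worry the economy",
    "the economy slows as inflation and rates climb",
    "hot air balloons rise high into the air",
    "balloons and hot air ballooning fill the summer sky",
]

tfidf = TfidfVectorizer(stop_words="english").fit_transform(docs)
svd = TruncatedSVD(n_components=2, random_state=0)
concepts = svd.fit_transform(tfidf)

# Topically similar documents land close together in the reduced
# "concept" space, which is the property LSI exploits.
for doc, vec in zip(docs, concepts):
    print(f"{vec.round(2)}  {doc}")
```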

Any webmaster familiar with Search Engine Optimization will know how it used to work: the text of a website would be filled to the brim with the same words over and over again, in order to trick search engines into thinking that the site had more relevant content than it really did. This ‘keyword stuffing’ technique bumped up the page ranking of these sites so that they appeared on the first page of search results. So, whatever you typed, you’d be likely to get a boring article so full of key phrases it was practically unreadable.

At the other end of the scale, companies would insert irrelevant words into their text in order to ensnare people who were looking for something completely unrelated. Sneakily slipping popular search words into a page used to earn hits (and still does on sites like YouTube), but with the semantic indexing project, all this does is water everything down. It’s a bit like planting seeds over too wide a field: some of them will get neglected. If a webmaster scatters keywords over too many distant topics, Google won’t recognize the site as a specialist in any of them.

Google’s ability to search for related content is a breakthrough. It ignores mere repetition and strikes offending sites from the rankings. Making use of this is a simple matter of using a whole range of related terms that people commonly search for: this is Latent Semantic Indexing at work. It sounds complicated, but it is actually a fairly basic part of SEO and something we do naturally when we write.

For the best results, you will need to find the most searched-for terms. Using Google itself, any web author can put a tilde (~) in front of a search term to find results running along the same theme. To see which terms are likely to be most popular, head to a ‘high authority’ site: a website renowned for its accurate information on whatever subject your site is about, usually a big company like Apple. Whichever terms they use, web authors should use too.

Not only that, but Google can tell when two words are spelled the same yet relate to different topics (a phenomenon called ‘polysemy’). It will not confuse inflation of the economy with inflation of a balloon. The spiders crawl through pages and work out which words and phrases belong to each sense: a piece about inflation in the monetary sense is not likely to contain the words ‘hot air’. So if you search ‘inflation hot air’, you will not get a piece from the Wall Street Journal.

Given that, it’s better not to use terms that might cause the poor Google spiders to get confused. If you wrote ‘this panic about inflation is a load of hot air’, your article might be mistaken for a ballooning piece, and those who want to read about the economy won’t find your site with the usual search terms.

Now that Google and other search engines analyze content more closely, the need for endless backlinks and anchor text has diminished. This should be a huge relief to any individual webmaster or starting-out company without many links to show for themselves: now the new kids on the block get a piece of the pie, rather than having all the website visits taken by big corporations that throw money at IT specialists.

Before, topping the results meant bumping up the page rank with links to other related sites (particularly ‘high authority’ ones) or internal links to other pages on the same website. Stuffed with keywords, the links drew the spiders, which explored the site, checked how many reliable, functioning links there were, and made assumptions about the content of the site based on how many other decent sites it referenced. All the links and colors made such pages a hideous strain on the eyes to read, and rarely was the content any good: scraps of info laced with advertising opportunities.

These days, it’s actually better to reduce the number of links. Too many links going all over the place send your friendly internet spiders away from the most important places. Say your blog linked to an ICT firm and to your main site. You probably want more focus on your main site, so that it can get an increased page rank, right? That won’t happen: those links have equal value, so the spiders will branch off in all directions and get distracted.
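
The dilution effect is easy to demonstrate with the classic PageRank model that underlies this behavior. The toy sketch below uses the third-party networkx library; the page names are hypothetical. Because the blog links to both the ICT firm and the main site, its vote is split equally between them.

```python
# Toy demonstration of link-equity dilution using PageRank.
# Hypothetical pages: a blog linking to a main site and an ICT firm.
import networkx as nx  # third-party: pip install networkx

g = nx.DiGraph()
g.add_edges_from([
    ("blog", "main_site"),  # the link you care about
    ("blog", "ict_firm"),   # this link gets an equal share of the vote
    ("main_site", "blog"),
])

for page, score in sorted(nx.pagerank(g, alpha=0.85).items()):
    print(f"{page}: {score:.3f}")
```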

The best way to set up a site that makes full use of LSI is to have numerous pages covering a number of select topics within a theme, linked together in a straightforward way. If you make the search engine crawlers’ job easier, you will reap the benefits. The basic shape of the site map is simple:

Every page contains only one link, which leads directly to the next page, and the last page leads to an external website. The site is divided into distinct sections, so the subjects covered aren’t competing with each other. Within each topic, it’s best to pick a title and an angle that isn’t likely to have been used before; that way, you have no competitors for that article.

On a general subject like nutrition, you might have three themes (at minimum). Those loose themes are ‘landing pages’, areas where all the related topics are pooled. The themes could be “Common Food Myths”, “The Uses of Supplements” and “The Truth about Exercise”. Under each theme there should be at least five separate pages linking into each other, each explaining an individual topic on the theme, e.g. under Food Myths, one article on meat, another on diets, and so on. This layout will help both spiders and people navigate the site. A search for ‘meat myths’ will fetch up that one specific page, which then links to the rest of your site. This is better than putting all the good stuff on one page and risking looking irrelevant in the all-seeing eyes of Google.

Smart web maintenance involves checking for the presence of keywords related specifically to the subject. Take this blog: even as you read it, it has been tuned for semantic search engines to increase its probability of being picked up. Each paragraph you have read contains a handful of words relating to LSI without resorting to repetition. Any writing on semantically related words will contain basic related terms like ‘webpage’ and ‘Google’, plus more specific ones like ‘synonym’: vital tools for a webmaster trying to cover all the variables of SEO. Care has also been taken not to overlap with polysemic words or clash with other major sites. All it takes is some research and careful planning; high rankings go to the best-prepared blogs and articles.

A search-optimized piece of writing does not have to be over-technical or mind-numbingly dull. No longer do good search engines favour pages that endlessly repeat themselves. LSA is a system that, like most new tech, improves the way we use the internet, allowing the freedom to express what needs saying without worrying about falling in the ranks. It’s a new era for web content, and every web writer chomping at the bit to stay on top should understand Latent Semantic Indexing. Note that applying LSI doesn’t completely overrule the value of backlinking, although with the launch of ‘Google Caffeine’ it appears that LSI now carries somewhat more weight than backlinking. LSI is just one of the components you should consider when optimizing your site or portal for search engines.