Blog posts tagged with 'optimization'

Promoting Your Site to Increase Traffic - Wednesday, September 07, 2011

The main purpose of SEO is to make your site visible to search engines, thus leading to higher rankings in search results pages, which in turn brings more traffic to your site. And having more visitors (and above all buyers) is ultimately the goal of site promotion. To be fair, SEO is only one way to promote your site and increase traffic – there are many other online and offline ways to accomplish the goal of getting high traffic and reaching your target audience. We are not going to explore them in this tutorial, but keep in mind that search engines are not the only way to get visitors to your site, although they are a preferable choice and a relatively easy way to do it.

1. Submitting Your Site to Search Engines, Directories, Forums and Special Sites

After you have finished optimizing your new site, it is time to submit it to search engines. Generally, with search engines you don't have to do anything special in order to get your site included in their indices – they will come and find you. It cannot be said exactly when they will visit your site for the first time or at what intervals they will visit it later, but there is hardly anything you can do to invite them. Sure, you can go to their Submit a Site pages and submit the URL of your new site, but by doing this do not expect them to hop over to you right away. What is more, even if you submit your URL, most search engines reserve the right to judge whether to crawl your site or not. Anyway, here are the submission pages of the three major search engines: Google, MSN, and Yahoo.

In addition to search engines, you may also want to have your site included in search directories. Although search directories also list sites that are relevant to a given topic, they differ from search engines in several respects. First, search directories are usually maintained by humans, and the sites in them are reviewed for relevancy after they have been submitted. Second, search directories do not use crawlers to get URLs, so you need to go to them and submit your site; but once you do, your listing can stay there indefinitely and no further effort on your side is necessary. Some of the most popular search directories are DMOZ and Yahoo! (the directory, not the search engine itself), and here are the URLs of their submission pages: DMOZ and Yahoo!.

Sometimes posting a link to your site in the right forums or special sites can work miracles in terms of traffic. You need to find the forums and sites that are leaders in the fields of interest to you, but generally even a simple search in Google or the other major search engines will retrieve their names. For instance, if you are a hardware freak, type “hardware forums” in the search box and in a second you will have a list of sites that are favorites of other hardware freaks. Then you need to check the sites one by one, because some of them might not allow posting links to commercial sites. Posting to forums is more time-consuming than submitting to search engines, but it can also be pretty rewarding.

2. Specialized Search Engines

Google, Yahoo!, and MSN are not the only search engines on Earth, nor even the only general-purpose ones. There are many other general-purpose and specialized search engines, and some of them can be really helpful for reaching your target audience. You can't imagine how many niches have specialized search engines – from law, to radio stations, to education! Some of them are actually huge sites that gather Web-wide resources on a particular topic, but almost all of them have sections for submitting links to external sites of interest. So, after you find the specialized search engines in your niche, go to their sites and submit your URL – this could prove more traffic-worthy than striving to get to the top of Google.

3. Paid Ads and Submissions

We have already mentioned some alternatives to search engines – forums, specialized sites and search engines, search directories – but if you need to make sure that your site will be noticed, you can always resort to paid ads and submissions. Yes, paid listings are a fast and guaranteed way to appear in search results, and most of the major search engines accept payment to put your URL in the Paid Links section for keywords of interest to you. But keep in mind that users generally do not trust paid links as much as they trust the normal ones – in a sense it looks like you are bribing the search engine to place you where you can't get on your own – so think twice about the pros and cons of paying to get listed.



Introduction – What Is SEO - Wednesday, September 07, 2011

Whenever you enter a query in a search engine and hit 'enter' you get a list of web results that contain that query term. Users normally tend to visit websites that are at the top of this list as they perceive those to be more relevant to the query. If you have ever wondered why some of these websites rank better than the others then you must know that it is because of a powerful web marketing technique called Search Engine Optimization (SEO).

SEO is a technique which helps search engines find and rank your site higher than the millions of other sites in response to a search query. SEO thus helps you get traffic from search engines.

This SEO tutorial covers all the necessary information you need to know about Search Engine Optimization - what it is, how it works, and the differences in the ranking criteria of the major search engines.

1. How Search Engines Work

The first basic truth you need to know to learn SEO is that search engines are not humans. While this might be obvious to everybody, the differences between how humans and search engines view web pages aren't. Unlike humans, search engines are text-driven. Although technology advances rapidly, search engines are far from intelligent creatures that can feel the beauty of a cool design or enjoy the sounds and movement in movies. Instead, search engines crawl the Web, looking at particular site items (mainly text) to get an idea of what a site is about. This brief explanation is not the most precise because, as we will see next, search engines perform several activities in order to deliver search results – crawling, indexing, processing, calculating relevancy, and retrieving.

First, search engines crawl the Web to see what is there. This task is performed by a piece of software called a crawler or a spider (or Googlebot, as is the case with Google). Spiders follow links from one page to another and index everything they find on their way. Given the number of pages on the Web (over 20 billion), it is impossible for a spider to visit a site daily just to see if a new page has appeared or if an existing page has been modified; sometimes crawlers may not visit your site for a month or two.

What you can do is check what a crawler sees on your site. As already mentioned, crawlers are not humans and they do not see images, Flash movies, JavaScript, frames, password-protected pages or directories, so if you have tons of these on your site, you'd better run the site through a spider simulator (such as the Search Engine Spider Simulator tool) to see if these goodies are viewable by the spider. If they are not viewable, they will not be spidered, not indexed, not processed, etc. – in a word, they will be non-existent to search engines.


After a page is crawled, the next step is to index its content. The indexed page is stored in a giant database, from where it can later be retrieved. Essentially, the process of indexing is identifying the words and expressions that best describe the page and assigning the page to particular keywords. A human could not possibly process such amounts of information, but search engines generally deal with this task just fine. Sometimes they might not get the meaning of a page right, but if you help them by optimizing it, it will be easier for them to classify your pages correctly – and for you to get higher rankings.

When a search request comes, the search engine processes it – i.e. it compares the search string in the search request with the indexed pages in the database. Since it is likely that more than one page (in practice, millions of pages) contains the search string, the search engine starts calculating the relevancy of each of the pages in its index to the search string.

There are various algorithms for calculating relevancy. Each of these algorithms gives different relative weights to common factors like keyword density, links, or meta tags. That is why different search engines give different search results pages for the same search string. What is more, it is a known fact that all major search engines, like Yahoo!, Google, Bing, etc., periodically change their algorithms, and if you want to stay at the top, you also need to adapt your pages to the latest changes. This is one reason (the other is your competitors) to devote ongoing effort to SEO if you'd like to be at the top.

The last step in search engines' activity is retrieving the results. Basically, it is nothing more than simply displaying them in the browser – i.e. the endless pages of search results that are sorted from the most relevant to the least relevant sites.

2. Differences Between the Major Search Engines

Although the basic principle of operation of all search engines is the same, the minor differences between them lead to major changes in results relevancy. Different factors are important for different search engines. There were times when SEO experts joked that the algorithms of Bing were intentionally made just the opposite of those of Google. While this might have a grain of truth, it is a matter of fact that the major search engines like different stuff, and if you plan to conquer more than one of them, you need to optimize carefully.

There are many examples of the differences between search engines. For instance, for Yahoo! and Bing, on-page keyword factors are of primary importance, while for Google links are very, very important. Also, for Google sites are like wine – the older, the better – while Yahoo! generally has no expressed preference towards sites and domains with tradition (i.e. older ones). Thus you might need more time for your site to mature enough to be admitted to the top in Google than in Yahoo!.


Dynamic URLs vs. Static URLs - Tuesday, September 06, 2011

The Issue at Hand
Websites that utilize databases which can insert content into a webpage by way of a dynamic script like PHP or JavaScript are increasingly popular. This type of site is considered dynamic. Many websites choose dynamic content over static content because, if a website has thousands of products or pages, writing or updating each static page by hand is a monumental task.

There are two types of URLs: dynamic and static. A dynamic URL is a page address that results from the search of a database-driven web site or the URL of a web site that runs a script. In contrast to static URLs, in which the contents of the web page stay the same unless the changes are hard-coded into the HTML, dynamic URLs are generated from specific queries to a site's database. The dynamic page is basically only a template in which to display the results of the database query. Instead of changing information in the HTML code, the data is changed in the database.

But there is a risk when using dynamic URLs: search engines don't like them. Those at most risk of losing search engine positioning due to dynamic URLs are e-commerce stores, forums, sites utilizing content management systems or blog platforms like Mambo or WordPress, and any other database-driven websites. Many times the URL that is generated for the content in a dynamic site looks something like this:

   http://www.somesites.com/forums/thread.php?threadid=12345&sort=date

A static URL on the other hand, is a URL that doesn't change, and doesn't have variable strings. It looks like this:

   http://www.somesites.com/forums/the-challenges-of-dynamic-urls.htm

Static URLs typically rank better in search engine results pages, and they are indexed more quickly than dynamic URLs, if dynamic URLs get indexed at all. Static URLs are also easier for the end-user to read and make it clearer what the page is about. If a user sees a URL in a search engine query that matches the title and description, they are more likely to click on that URL than on one that doesn't make sense to them.

A search engine wants to list only unique pages in its index. Search engines combat this issue by cutting off URLs after a specific number of variable-string characters (e.g. ?, &, =).

For example, let's look at three URLs:

   http://www.somesites.com/forums/thread.php?threadid=12345&sort=date
   http://www.somesites.com/forums/thread.php?threadid=67890&sort=date
   http://www.somesites.com/forums/thread.php?threadid=13579&sort=date

All three of these URLs point to three different pages. But if the search engine purges the information after the first offending character – the question mark (?) – then all three pages look the same:

   http://www.somesites.com/forums/thread.php
   http://www.somesites.com/forums/thread.php
   http://www.somesites.com/forums/thread.php

Now, you don't have unique pages, and consequently, the duplicate URLs won't be indexed.

Another issue is that dynamic pages generally do not have any keywords in the URL. It is very important to have keyword-rich URLs: highly relevant keywords should appear in the domain name or the page URL. This became clear in a recent study on how the top three search engines, Google, Yahoo, and MSN, rank websites.

The study involved taking hundreds of highly competitive keyword queries, like travel, cars, and computer software, and comparing factors across the top ten results. The statistics show that, of those top ten results, in Google 40-50% have the keyword either in the URL or the domain name, in Yahoo 60%, and in MSN an astonishing 85%! What that means is that to these search engines, having your keywords in your URL or domain name could mean the difference between a top ten ranking and a ranking far down in the results pages.

The Solution
So what can you do about this difficult problem? You certainly don't want to have to go back and recode every single dynamic URL into a static URL. This would be too much work for any website owner.

If you are hosted on a Linux server, then you will want to make the most of the Apache Mod Rewrite module (mod_rewrite), which gives you the ability to inconspicuously redirect one URL to another, without the user's (or a search engine's) knowledge. You will need to have this module installed in Apache; for more information, see the Apache documentation for the module. It saves you from having to recode your dynamic URLs into static ones manually.

How does this module work? When a request comes in to a server for the new static URL, the Apache module redirects the URL internally to the old, dynamic URL, while still looking like the new static URL. The web server compares the URL requested by the client with the search pattern in the individual rules.

For example, when someone requests this URL:
   http://www.somesites.com/forums/thread-threadid-12345.htm

The server looks for and compares this static-looking URL to what information is listed in the .htaccess file, such as:

   RewriteEngine on
   # map a static-looking thread-threadid-<id>.htm request to the dynamic thread.php?threadid=<id>
   RewriteRule thread-threadid-(.*)\.htm$ thread.php?threadid=$1

It then converts the static URL to the old dynamic URL that looks like this, with no one the wiser:
   http://www.somesites.com/forums/thread.php?threadid=12345

You now have a URL that will not only rank better in the search engines, but that your end-users can also understand at a glance, while Apache's Mod Rewrite Rule handles the conversion for you and the dynamic URL is kept behind the scenes.

If you are not particularly technical, you may not wish to attempt to figure out the complex Mod Rewrite code and how to use it, or you simply may not have the time to embark upon a new learning curve. Therefore, it would be extremely beneficial to have something do it for you. This URL Rewriting Tool can definitely help you. What this tool does is implement the Mod Rewrite Rule in your .htaccess file to transparently convert one URL to another, as with the dynamic and static URLs above.

With the URL Rewriting Tool, you can opt to rewrite single pages or entire directories. Simply enter the URL into the box, press submit, and copy and paste the generated code into your .htaccess file on the root of your website. You must remember to place any additional rewrite commands in your .htaccess file for each dynamic URL you want Apache to rewrite. Now, you can give out the static URL links on your website without having to alter all of your dynamic URLs manually because you are letting the Mod Rewrite Rule do the conversion for you, without JavaScript, cloaking, or any sneaky tactics.
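
As a purely illustrative sketch (the pattern, file names, and parameter below are hypothetical, not the tool's actual output), a rule covering an entire directory of static-looking product pages might look like this:

   # assuming RewriteEngine on is already set, as in the example above
   RewriteRule ^products/(.*)\.htm$ product.php?id=$1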

Another thing you must remember to do is to change all of your links in your website to the static URLs in order to avoid penalties by search engines due to having duplicate URLs. You could even add your dynamic URLs to your Robots Exclusion Standard File (robots.txt) to keep the search engines from spidering the duplicate URLs. Regardless of your methods, after using the URL Rewrite Tool, you should ideally have no links pointing to any of your old dynamic URLs.
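
For illustration, a minimal robots.txt along these lines (assuming the forum example above – adjust the path to your own dynamic script) could keep spiders away from the dynamic URLs while leaving the rewritten static ones crawlable:

   User-agent: *
   Disallow: /forums/thread.php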

You have multiple reasons to utilize static URLs on your website whenever possible. When that's not possible, and you need to keep your database-driven content at those old dynamic URLs, you can still give end-users and search engines a static URL to navigate, while behind the scenes they are still your dynamic URLs in disguise. When a search engine engineer was asked if this method was considered "cloaking", he responded that it indeed was not, and that in fact search engines prefer you do it this way. The URL Rewrite Tool not only saves you time and energy by converting static URLs transparently to your dynamic ones, but it will also save your rankings in the search engines.

The Importance of Backlinks - Tuesday, September 06, 2011

If you've read anything about or studied Search Engine Optimization, you've come across the term "backlink" at least once. For those of you new to SEO, you may be wondering what a backlink is and why backlinks are important. Backlinks have become so important to Search Engine Optimization that they are now some of the main building blocks of good SEO. In this article, we will explain what a backlink is, why backlinks are important, and what you can do to gain them while avoiding trouble with the search engines.

What are "backlinks"? Backlinks are links from other sites that point towards your website, also known as inbound links (IBLs). The number of backlinks is an indication of the popularity or importance of a website. Backlinks are important for SEO because some search engines, especially Google, give more credit to websites that have a good number of quality backlinks and consider those websites more relevant than others in their results pages for a search query.

When search engines calculate the relevance of a site to a keyword, they consider the number of QUALITY inbound links to that site. So we should not be satisfied with merely getting inbound links; it is the quality of the inbound link that matters.
A search engine considers the content of the linking sites to determine the QUALITY of a link. When inbound links to your site come from sites whose content is related to yours, these inbound links are considered more relevant to your site. If inbound links are found on sites with unrelated content, they are considered less relevant. The higher the relevance of inbound links, the greater their quality.

For example, if a webmaster has a website about how to rescue orphaned kittens and receives a backlink from another website about kittens, that would count for more in a search engine's assessment than, say, a link from a site about car racing. The more relevant the site that is linking back to your website, the better the quality of the backlink.

Search engines want websites to have a level playing field, and look for natural links built slowly over time. While it is fairly easy to manipulate links on a web page to try to achieve a higher ranking, it is a lot harder to influence a search engine with external backlinks from other websites. This is also a reason why backlinks factor so heavily into a search engine's algorithm. Lately, however, search engines' criteria for quality inbound links have gotten even tougher, thanks to unscrupulous webmasters trying to achieve these inbound links through deceptive or sneaky techniques, such as hidden links or automatically generated pages whose sole purpose is to provide inbound links to websites. Such pages are called link farms, and they are not only disregarded by search engines, but linking to a link farm could get your site banned entirely.

Another reason to achieve quality backlinks is to entice visitors to come to your website. You can't build a website and then expect that people will find it without pointing the way. You will probably have to get the word out about your site. One way webmasters used to get the word out was through reciprocal linking. Let's talk about reciprocal linking for a moment.

There has been much discussion in the last few months about reciprocal linking. In the last Google update, reciprocal links were one of the targets of the search engine's latest filter. Many webmasters had agreed upon reciprocal link exchanges in order to boost their sites' rankings with the sheer number of inbound links. In a link exchange, one webmaster places a link on his website that points to another webmaster's website, and vice versa. Many of these links were simply not relevant and were just discounted. So while the irrelevant inbound link was ignored, the outbound link still got counted, diluting the relevancy score of many websites. This caused a great many websites to drop off the Google map.

We must be careful with our reciprocal links. There is a Google patent in the works that will deal not only with the popularity of the sites being linked to, but also with how trustworthy a site is that you link to from your own website. This will mean that you could get into trouble with the search engine just for linking to a bad apple. We can begin preparing for this future change in the search engine algorithm by being choosier right now about the sites with which we exchange links. By choosing only relevant sites to link with – sites that don't have tons of outbound links on a page and that don't practice black-hat SEO techniques – we will have a better chance that our reciprocal links won't be discounted.

Many webmasters have more than one website. Sometimes these websites are related, sometimes they are not. You also have to be careful about interlinking multiple websites on the same IP. If you own seven related websites, then a link to each of those websites on a page could hurt you, as it may look to a search engine like you are trying to do something fishy. Many webmasters have tried to manipulate backlinks in this way, and having too many links to sites on the same IP address is referred to as backlink bombing.

One thing is certain: interlinking sites doesn't help you from a search engine standpoint. The only reason you may want to interlink your sites in the first place might be to provide your visitors with extra resources to visit. In this case, it would probably be okay to provide visitors with a link to another of your websites, but try to keep many instances of linking to the same IP address to a bare minimum. One or two links on a page here and there probably won't hurt you.

There are a few things to consider when beginning your backlink building campaign. It is helpful to keep track of your backlinks, to know which sites are linking back to you and how the anchor text of the backlink incorporates keywords relating to your site. A tool to help you keep track of your backlinks is the Domain Stats Tool. This tool displays the backlinks of a domain in Google, Yahoo, and MSN. It will also tell you a few other details about your website, like your listing in the Open Directory (DMOZ), whose backlinks Google regards as highly important; your Alexa traffic rank; and how many pages from your site have been indexed, to name just a few.

Another tool to help you with your link building campaign is the Backlink Builder Tool. It is not enough just to have a large number of inbound links pointing to your site; rather, you need a large number of QUALITY inbound links. This tool searches for websites with a theme related to yours that are likely to add your link to their pages. You specify a particular keyword or keyword phrase, and the tool seeks out related sites for you, helping you create quality, relevant backlinks to your site and making the job easier in the process.

There is another way to gain quality backlinks to your site, in addition to related site themes: anchor text. When a link incorporates a keyword into the text of the hyperlink, we call this quality anchor text. A link's anchor text may be one of the most under-estimated resources a webmaster has. Instead of using words like "click here", which probably won't relate in any way to your website, using the words "Please visit our tips page for how to nurse an orphaned kitten" is a far better way to utilize a hyperlink. A good tool for finding your backlinks and the text being used to link to your site is the Backlink Anchor Text Analysis Tool. If you find that your site is being linked to from another website but the anchor text is not being utilized properly, you should request that the website change the anchor text to something incorporating relevant keywords. This will also help boost your quality backlinks score.

Building quality backlinks is extremely important to Search Engine Optimization, and because of their importance, it should be very high on your priority list in your SEO efforts. We hope you have a better understanding of why you need good quality inbound links to your site, and have a handle on a few helpful tools to gain those links.

The Age of a Domain Name - Tuesday, September 06, 2011

One of the many factors in Google's search engine algorithm is the age of a domain name. In a small way, the age of a domain gives the appearance of longevity and therefore a higher relevancy score in Google.

Because spam sites tend to pop up and die off quickly, the age of a domain is usually a sign of whether a site is yesterday's news or tomorrow's popular site. We see this in the world of business, for example. While the novelty that goes with a new store in town brings a short burst of initial business, people tend to trust a business that has been around for a long time over one that is brand new. The same is true for websites. Or, as Rob from BlackwoodProductions.com says, "Rent the store (i.e. register the domain) before you open for business".

Two things that are considered in the age of a domain name are:

  • The age of the website
  • The length of time a domain has been registered

The age of the website is built up of how long the content has actually been on the web, how long the site has been in promotion, and even the last time content was updated. The length of time a domain has been registered is measured not only by the actual date the domain was registered, but also by how long it is registered for. Some domains are only registered for a year at a time, while others are registered for two, five, or even ten years.

In the latest Google update, which SEOs call the Jagger Update, some of the big changes seen were in the importance given to age: the age of incoming links, the age of web content, and the date the domain was registered. There were, in reality, many things that were changed in this last update, but since we're talking about the age of a domain, we'll only deal with those issues specifically. We'll talk more in other articles about other factors Google changed in its evaluation criteria of websites on the Internet.

One of the ways Google uses to minimize search engine spam is by giving new websites a waiting period of three to four months before giving them any kind of PageRank. This is referred to as the "sandbox effect". It's called the "sandbox effect" because it has been said that Google wants to see if those sites are serious about staying around on the web. The sandbox analogy comes from the idea that Google throws all of the new sites into a sandbox and lets them play together, away from all the adults. Then, when those new sites "grow up", so to speak, they are allowed to be categorized with the "adults", or the websites that aren't considered new.

What does this mean to you? For those of you with new websites, you may be disappointed in this news, but don't worry. There are some things you can do while waiting for the sandbox period to expire, such as concentrating on your backlink strategies, promoting your site through Pay-per-click, articles, RSS feeds, or in other ways. Many times, if you spend this sandbox period wisely, you'll be ready for Google when it does finally assign you a PageRank, and you could find yourself starting out with a great PageRank!

Even though the domain's age is a factor, critics believe it only gets a little weight in the algorithm. Since the age of your domain is something you have no control over, it doesn't necessarily mean that your site isn't going to rank well in the Search Engine Results Pages (SERPs). It does mean, however, that you will have to work harder in order to build up your site's popularity and concentrate on factors that you can control, like inbound links and the type of content you present on your website.

So what happens if you change your domain name? Does this mean you're going to get a low grade with a search engine if you have a new site? No, not necessarily. There are a few things you can do to help ensure that your site won't get lost in the SERPs because of the age of the domain.

1. Make sure you register your domain name for the longest amount of time possible. Many registrars allow you to register a domain name for as long as five years, and some even longer. Registering your domain for a longer period of time gives an indication that your site intends to be around for a long time, and isn't going to just disappear after a few months. This will help boost your score with regards to your domain's age.

2. Consider registering a domain name even before you are sure you're going to need it. We see many domains out there that, even though they are registered, don't have a website to go with them. This could mean that the site is in development, or simply that someone saw the use of that particular domain name and wanted to snatch it up before someone else did. There don't seem to be any problems with this method so far, so it certainly can't hurt to buy a domain name you think could be catchy, even if you end up just selling it later on.

3. Think about purchasing a domain name that was already pre-owned. Not only will this allow you to avoid the "sandbox effect" of a new website in Google, but it also allows you to keep whatever PageRank may have already been attributed to the domain. Be aware that most pre-owned domains with PageRank aren't as cheaply had as a new domain, but it might be well worth it to you to invest a bit more money right at the start.

4. Keep track of your domain's age. One of the ways you can determine the age of a domain is with this handy Domain Age Tool. What it does is allow you to view the approximate age of a website on the Internet, which can be very helpful in determining what kind of edge your competitors might have over you, and even what a site might have looked like when it first started.

To use it, simply type in the URL of your domain and the URLs of your competitors, and click submit. This will give you the age of the domains and other interesting information, like anything that had been cached from the site initially. This could be especially helpful if you are purchasing a pre-owned domain.

Because trustworthy sites are going to be the wave of the future, factoring in the age of a domain is a good idea. Still, a site that has been around for years may suddenly go belly-up, and the next big eBay or Yahoo! just might be getting its start, so age alone is not a full measure of how trustworthy a site is or will be. This is why many other factors weigh into a search engine's algorithm, and not just a single factor alone. What we do know is that age has become more important than it was previously, and there are only good things to be said about having a site that's been around for a while.

Ranking in Country Specific Search Engines - Tuesday, September 06, 2011

In the world of Search Engine Optimization, location is important. Search engines like to bring relevant results to a user, not only in the area of keywords and sites that give the user exactly what they are looking for, but also in the correct language. It doesn't do a lot of good for a Russian-speaking individual to continually get websites returned in a search query that are written in Arabic or Chinese. So a search engine has to have some way to return the results the user is looking for in the right language, and a search engine's goal is also to try to get the user as close to home as possible with their search results.

Many people wonder why their websites don't rank well in some search engines, especially if they are trying to get ranked in a search engine based in another country. Perhaps they don't even know their site is in another country. You say that is impossible: how could one not know what country their site is in? It might surprise that individual to find that their website might in fact be hosted in a completely different country, perhaps even on another continent!

Consider that many search engines, including Google, will determine country not only based on the domain name (like .co.uk or .com.au), but also the country of a website's physical location based upon IP address. Search engines are programmed with information that tells them which IP addresses belong to which particular country, as well as which domain suffixes are assigned to which countries.

Let's say, for instance, that you are wishing to rank highly in Google based in the United States. It would not do well, then, for you to have your website hosted in Japan or Australia. You might have to switch your web host to one whose servers reside in the United States.

There is a tool we like to use called the Website to Country Tool. What this tool does is allow you to see in which country your website is hosted. Not only will this tell you what country your site is hosted in, but it can also help you determine a possible reason why your website may not be ranking as highly as you might like in a particular search engine.

It might be disheartening to learn that your website has been hosted in another country, but it is better to understand why your site might not be ranking as highly as you'd like it to be, especially when there is something you can definitely do about it.

Optimization, Over-Optimization or SEO Overkill? - Tuesday, September 06, 2011

The fight to top search engines' results knows no limits – neither ethical, nor technical. There are often reports of sites that have been temporarily or permanently excluded from Google and the other search engines because of malpractice and the use of “black hat” SEO techniques. The reaction of search engines is easy to understand – with so many tricks and cheats that SEO experts include in their arsenal, the relevancy of returned results would be seriously compromised, to the point where search engines would start to deliver completely irrelevant and manipulated search results. And even if search engines do not discover your scams right away, your competitors might report you.

Keyword Density or Keyword Stuffing?

Sometimes SEO experts go too far in their desire to push their clients' sites to top positions and resort to questionable practices, like keyword stuffing. Keyword stuffing is considered an unethical practice because what you actually do is use the keyword in question suspiciously often throughout the text. Bearing in mind that the recommended keyword density is from 3 to 7%, anything above this – say 10% density – starts to look very much like keyword stuffing, and it is likely that it will not go unnoticed by search engines. A text with 10% keyword density can hardly make sense when read by a human. Some time ago Google implemented the so-called “Florida Update” and essentially imposed a penalty for pages that are keyword-stuffed and over-optimized in general.

Generally, keyword density in the title, the headings, and the first paragraphs matters more. Needless to say, you should be especially careful not to stuff these areas. Try the Keyword Density Cloud tool to check whether your keyword density is within the acceptable limits, especially in the above-mentioned places. If you have a high density percentage for a frequently used keyword, then consider replacing some of the occurrences of the keyword with synonyms. Also, words that are in bold and/or italic are generally considered important by search engines, but if every occurrence of the target keywords is in bold and italic, this also looks unnatural and in the best case it will not push your page up.
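
To put some purely illustrative numbers on it: keyword density is simply the number of occurrences of the keyword divided by the total number of words on the page. On a 400-word page, 12 occurrences of the target keyword give 12 / 400 = 3%, which is at the lower end of the recommended range, while 40 occurrences on the same page would give 10% and start to look like stuffing.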

Doorway Pages and Hidden Text

Another common keyword scam is doorway pages. Before Google introduced the PageRank algorithm, doorways were a common practice and there were times when they were not considered an illegitimate optimization technique. A doorway page is a page that is made especially for the search engines, that has no meaning for humans, and that is used to get high positions in search results and to trick users into coming to the site. Although keywords are still very important, today keywords alone have less effect in determining the position of a site in search results, so doorway pages do not bring so much traffic to a site – but if you use them, don't ask why Google punished you.

Very similar to doorway pages is a scam called hidden text. This is text which is invisible to humans (e.g. the text color is the same as the page background) but is included in the HTML source of the page, trying to fool search engines into thinking that the particular page is keyword-rich. Needless to say, both doorway pages and hidden text can hardly be qualified as optimization techniques; they are more manipulation than anything else.

Duplicate Content

It is a basic SEO rule that content is king. But not duplicate content. In Google's terms, duplicate content means text that is the same as the text on a different page on the SAME site (or on a sister site, or on a site that is heavily linked to the site in question and can be presumed to be related to it) – i.e. when you copy and paste the same paragraphs from one page on your site to another, you might expect to see your site's rank drop. Most SEO experts believe that syndicated content is not treated as duplicate content, and there are many examples of this. If syndicated content were duplicate content, the sites of news agencies would have been the first to drop out of search results. Still, it does not hurt to check from time to time whether your site shares duplicate content with another, at least because somebody might be illegally copying your content without your knowledge. The Similar Page Checker tool will help you see if you have grounds to worry about duplicate content.

Link Spam

Links are another major SEO tool and, like the other SEO tools, they can be used or misused. While backlinks are certainly important (for Yahoo! the quantity of backlinks matters most, while for Google it is more important which sites the backlinks come from), getting tons of backlinks from a link farm or a blacklisted site is begging to be penalized. Also, if outbound links (links from your site to other sites) considerably outnumber your inbound links (links from other sites to your site), then you have put too much effort into creating useless links, because this will not improve your ranking. You can use the Domain Stats Tool to see the number of backlinks (inbound links) to your site and the Site Link Analyzer to see how many outbound links you have.

Using keywords in links (the anchor text), domain names, folder and file names does boost your search engine rankings, but again, precise measure is the boundary between topping the search results and being kicked out of them. For instance, suppose you are optimizing for the keyword “cat”, which is a frequently chosen keyword, and as with all popular keywords and phrases, competition is fierce. You might see no other alternative for reaching the top than getting a domain name like http://www.cat-cats-kittens-kitty.com, which no doubt is packed with keywords to the maximum, but it is, first, difficult to remember, and second, if the contents do not correspond to the plenitude of cats in the domain name, you will never top the search results.

Although file and folder names are less important than domain names, now and then (but definitely not all the time) you can include “cat” (and synonyms) in them and in the anchor text of the links. This counts well, provided the anchors are not artificially stuffed (for instance, if you use “cat_cats_kitten” as the anchor for internal site links, this anchor certainly is stuffed). While you have no control over third parties that link to you and use anchors that you don't like, it is up to you to perform periodic checks of what anchor text other sites use to link to you. A handy tool for this task is the Backlink Anchor Text Analysis, where you enter the URL and get a listing of the sites that link to you and the anchor text they use.

Finally, to Google and the other search engines it makes no difference whether a site is intentionally over-optimized to cheat them or the over-optimization is the result of good intentions, so no matter what your motives are, always keep to reasonable practices and remember not to overstep the line.

See Your Site With the Eyes of a Spider - Tuesday, September 06, 2011

Making efforts to optimize a site is great, but what counts is how search engines see your efforts. While even the most careful optimization does not guarantee top positions in search results, if your site does not follow basic search engine optimization truths, then it is more than certain that it will not score well with search engines. One way to check in advance how your SEO efforts are seen by search engines is to use a search engine simulator.

Spiders Explained

Basically, all search engine spiders function on the same principle – they crawl the Web and index pages, which are stored in a database; various algorithms are later used to determine the ranking, relevancy, etc. of the collected pages. While the algorithms for calculating ranking and relevancy differ widely among search engines, the way they index sites is more or less uniform, and it is very important that you know what spiders are interested in and what they neglect.

Search engine spiders are robots and they do not read your pages the way a human does. Instead, they tend to see only particular items and are blind to many extras (Flash, JavaScript) that are intended for humans. Since what spiders see determines whether humans will find your site through search, it is worth considering what spiders like and what they don't.

Flash, JavaScript, Image Text or Frames?!

Flash, JavaScript and image text are NOT visible to search engines. Frames are a real disaster in terms of SEO ranking. All of them might be great in terms of design and usability, but for search engines they are absolutely wrong. An incredible mistake one can make is to have a Flash intro page (frames or no frames, this will hardly make the situation worse) with the keywords buried in the animation. Run a page with Flash and images (and preferably no text or inbound or outbound hyperlinks) through the Search Engine Spider Simulator tool and you will see that to search engines this page appears almost blank.

Running your site through this simulator will show you more than the fact that Flash and JavaScript are not SEO favorites. In a way, spiders are like text browsers and they don't see anything that is not a piece of text. So having an image with text in it means nothing to a spider and it will ignore it. A workaround (recommended as an SEO best practice) is to include a meaningful description of the image in the ALT attribute of the <IMG> tag, but be careful not to use too many keywords in it because you risk penalties for keyword stuffing. The ALT attribute is especially essential when you use images rather than text for links. You can use ALT text for describing what a Flash movie is about but, again, be careful not to cross the line between optimization and over-optimization.
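
A minimal illustration (the file names and wording are made up): the first line is a plain image with a descriptive ALT text, while in the second the ALT text stands in for the link text, since the link itself is an image:

   <img src="kitten-rescue.jpg" alt="Volunteer bottle-feeding an orphaned kitten">
   <a href="adopt.htm"><img src="adopt-button.gif" alt="Adopt an orphaned kitten"></a>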

Are Your Hyperlinks Spiderable?

The search engine spider simulator can be of great help when trying to figure out whether your hyperlinks lead to the right place. For instance, link exchange websites often put fake links to your site with JavaScript (using mouseover events and similar tricks to make the link look genuine), but this is not actually a link that search engines will see and follow. Since the spider simulator would not display such links, you'll know that something is wrong with the link.

It is highly recommended to use the <noscript> tag alongside JavaScript-based menus. The reason is that JavaScript-based menus are not spiderable, so all the links in them will be ignored. The solution to this problem is to put all menu item links in the <noscript> tag. The <noscript> tag can hold a lot, but please avoid using it for link stuffing or any other kind of SEO manipulation.
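
A simplified sketch of such a fallback (the menu script and page names are hypothetical):

   <script type="text/javascript" src="menu.js"></script>
   <noscript>
       <a href="products.htm">Products</a>
       <a href="articles.htm">Articles</a>
       <a href="contact.htm">Contact</a>
   </noscript>

Visitors with JavaScript get the scripted menu, while spiders (and visitors without JavaScript) still find plain, followable links.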

If you happen to have tons of hyperlinks on your pages (although it is highly recommended to have fewer than 100 hyperlinks on a page), then you might have a hard time checking whether they are OK. For instance, if you have pages that return “403 Forbidden”, “404 Page Not Found” or similar errors that prevent the spider from accessing the page, then it is certain that such a page will not be indexed. It is necessary to mention that a spider simulator does not deal with 403 and 404 errors, because it checks where links lead, not whether the target of the link is in place, so you need to use other tools to check whether the targets of your hyperlinks are the intended ones.

Looking for Your Keywords

While there are specific tools, like the Keyword Playground or the Website Keyword Suggestions, which deal with keywords in more detail, search engine spider simulators also help you see, with the eyes of a spider, where keywords are located in the text of the page. Why is this important? Because keywords in the first paragraphs of a page weigh more than keywords in the middle or at the end. And even if keywords visually appear to us to be at the top, this may not be the way spiders see them. Consider a standard Web page laid out with tables. In this case, the code that describes the page layout (like navigation links or separate cells with text that is the same site-wide) comes first in the source and, what is worse, can be so long that the actual page-specific content is screens away from the top of the page. When we look at the page in a browser, everything seems fine – the page-specific content is on top – but since in the HTML code it is just the opposite, the page will not be seen as keyword-rich.
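
A simplified sketch of the problem (the markup is purely illustrative):

   <table>
     <tr>
       <td>
         <!-- the repeated navigation markup comes first in the source -->
         <a href="home.htm">Home</a>
         <a href="products.htm">Products</a>
         <a href="contact.htm">Contact</a>
       </td>
       <td>
         <!-- the keyword-rich, page-specific content only starts here -->
         <h1>How to Rescue Orphaned Kittens</h1>
         <p>The page-specific text that the spider reaches last.</p>
       </td>
     </tr>
   </table>

In a browser the two cells may render with the content prominently placed, but a spider reading the source meets the navigation block first.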

Are Dynamic Pages Too Dynamic to be Seen At All?

Dynamic pages (especially ones with question marks in the URL) are also an extra that spiders do not love, although many search engines do index dynamic pages as well. Running the spider simulator will give you an idea of how well your dynamic pages are accepted by search engines. Useful suggestions on how to deal with search engines and dynamic URLs can be found in the Dynamic URLs vs. Static URLs article.

Meta Keywords and Meta Description

Meta keywords and the meta description, as the names imply, are to be found in the <meta> tags in the <head> of an HTML page. Meta keywords and meta descriptions were once among the most important criteria for determining the relevance of a page, but search engines now employ alternative mechanisms for determining relevancy, so you can safely skip listing keywords and a description in meta tags (unless you want to add instructions there for the spider about what to index and what not to; apart from that, meta tags are not very useful anymore).
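
For illustration (the content values are made up), such tags sit in the <head> of the page; the last one is an instruction telling spiders not to index that particular page:

   <head>
       <title>How to Rescue Orphaned Kittens</title>
       <meta name="description" content="Tips and tools for rescuing and nursing orphaned kittens">
       <meta name="keywords" content="orphaned kittens, kitten rescue, nursing kittens">
       <meta name="robots" content="noindex">
   </head>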

Jumping Over the Google Sandbox - Tuesday, September 06, 2011

It's never easy for newcomers to enter a market, and there are barriers of different kinds. For newcomers to the world of search engines, the barrier is called a sandbox – your site stays there until it gets mature enough to be allowed into the Top Positions club. Although there is no direct confirmation of the existence of a sandbox, Google employees have implied it, and SEO experts have seen in practice that new sites, no matter how well optimized, don't rank high on Google, while on MSN and Yahoo they catch on quickly. For Google, the stay in the sandbox for new sites on new domains is about 6 months on average, although it can vary from less than a month to over 8 months.

Sandbox and Aging Delay

While it might be considered unfair to hold new sites back by artificial means like keeping them at the bottom of search results, there is a fair amount of reasoning behind why search engines, and above all Google, have resorted to such measures. With black hat practices like bulk buying of links, creation of duplicate content, or simply keyword stuffing to get to the coveted top, it is no surprise that Google chose to penalize new sites which overnight get tons of backlinks, or which are used as a source of backlinks to support an older site (possibly owned by the same company). Needless to say, when such fake sites are indexed and admitted to top positions, this deteriorates search results, so Google had to take measures to ensure that such practices will not be tolerated. The sandbox effect works like a probation period for new sites, and by turning the farming of fake sites into a long-term, rather than a short-term, payoff for site owners, it is supposed to discourage the practice.

Sandbox and aging delay are similar in meaning, and many SEO experts use the terms interchangeably. Aging delay is more self-explanatory – sites are “delayed” till they come of age. Well, unlike in legislation, with search engines this age is not defined and it differs. There are cases where several sites were launched on the same day and indexed within a week of each other, but the aging delay for each of them expired in different months. As you see, the sandbox is something beyond your control and you cannot avoid it, but there are still steps you can take to minimize the damage for new sites with new domains.

Minimizing Sandbox Damages

While the Google sandbox is not something you can control, there are certain steps you can take in order to make the sandbox effect less destructive for your new site. As with many aspects of SEO, there are ethical and unethical tips and tricks, and unethical tricks can get you additional penalties or a complete ban from Google, so think twice before resorting to them. The unethical approaches will not be discussed in this article because they do not comply with our policy.

Before we delve into more detail about particular techniques to minimize sandbox damage, it is necessary to clarify the general rule: you cannot fight the sandbox. The only thing you can do is adapt to it and patiently wait for time to pass. Any attempts to fool Google – from writing melodramatic letters to Google, to using “sandbox tools” to bypass the filter – can only make your situation worse. There are many initiatives you can take while in the sandbox, for example:

  • Actively gather content and good links – as time passes by, relevant and fresh content and good links will take you to the top. When getting links, keep in mind that they need to be from trusted sources – like DMOZ, CNN, Fortune 500 sites, or other reputable places. Also, links from .edu, .gov, and .mil domains might help because these domains are usually exempt from the sandbox filter. Don't get 500 links a month – this will kill your site! Instead, build links slowly and steadily.

  • Plan ahead – contrary to the general practice of launching a site when it is absolutely complete, launch a couple of pages as soon as you have them. This will start the clock, and time will run in parallel with your site development efforts.

  • Buy old or expired domains – the sandbox effect is more serious for new sites on new domains, so if you buy an old or expired domain and launch your new site there, you'll experience fewer problems.

  • Host on a well-established host – another solution is to host your new site on a subdomain of a well-established host (however, free hosts are generally not a good idea in terms of SEO ranking). The sandbox effect is not so severe for new subdomains (unless the domain itself is blacklisted). You can also host the main site on a subdomain and host just some of the content on a separate domain linked to the main site. You can also use redirects from the subdomained site to the new one, although the effect of this practice is questionable because it can also be viewed as an attempt to fool Google.

  • Concentrate on less popular keywords – the fact that your site is sandboxed does not mean that it is not indexed by Google at all. On the contrary, you could be able to top the search results from the very beginning! Does that look like a contradiction with the rest of the article? Not at all! You could top the results for less popular keywords – sure, it is better than nothing. And while you wait to get to the top for the most lucrative keywords, you may discover that even less popular keywords are enough to keep the ball rolling, so you may want to do some optimization for them.

  • Rely more on non-Google ways to increase traffic – it is worth remembering that Google is not the only search engine or marketing tool out there. If you plan your SEO efforts to include other search engines, which either have no sandbox at all or where the period of stay is relatively short, this will also minimize the damage of the sandbox effect.

Bad Neighborhood - Tuesday, September 06, 2011

Has it ever happened to you to have a perfectly optimized site, with lots of links and content and the right keyword density, that still does not rank high in search engines? Probably every SEO has experienced this. The reasons for this kind of failure can be really diverse – starting from the sandbox effect (your site just needs time to get mature), to overoptimization, to inappropriate online relations (i.e. the so-called “bad neighborhood” effect).

While there is not much you can do about the sandbox effect but wait, in most other cases it is up to you to counteract the negative effects you are suffering from. You just need to figure out what is stopping you from achieving the rankings you deserve. Careful analysis of your site and the sites that link to you can give you ideas of where to look for the source of trouble and how to deal with it. If it is overoptimization – remove the excessive stuffing; if it is bad neighbors – say “goodbye” to them. We have already dealt with overoptimization as SEO overkill, and in this article we will have a look at another frequent rankings killer.

Link Wisely, Avoid Bad Neighbors

It is a known fact that one of the most important items for high rankings, especially with Google, is links. The Web is woven out of links, and inbound and outbound links are perfectly natural. Generally, the more inbound links you have (i.e. other sites linking to you), the better. On the contrary, having many outbound links is not very good. And what is worse – it can be disastrous if you link to improper places, i.e. bad neighbors. The concept is hardly difficult to comprehend – it is so similar to real life: if you choose outlaws or bad guys for friends, you are considered to be one of them.

It might look unfair to be penalized for things that you have not done but linking to sites with bad reputation is equal to a crime for search engines and by linking to such a site, you can expect to be penalized as well. And yes, it is fair because search engines do penalize sites that use different tricks to manipulate search results. In a way, in order to guarantee the integrity of search results, search engines cannot afford to tolerate unethical practices.

However, search engines tend to be fair and do not punish you for things that are out of your control. If you have many inbound links from suspicious sites, this will not be regarded as a malpractice on your side because generally it is their Web master, not you, who has put all these links. So, inbound links, no matter where they come from, cannot harm you. But if in addition to inbound links, you have a considerable amount of outbound links to such sites, in a sense you vote for them. Search engines consider this as malpractice and you will get punished.

Why Do Some Sites Get Labelled as Bad Neighbors?

We have already mentioned some of the practices that give search engines a reason to ban particular sites. But the “sins” are not limited to being a spam domain. Generally, sites get blacklisted because they try to boost their rankings with illegal techniques such as keyword stuffing, duplicate content (or lack of any original content), hidden text and links, doorway pages, deceptive titles, machine-generated pages, copyright violations, etc. Search engines also tend to dislike meaningless link directories that create the impression of being topically arranged, so if you have a fat links section on your site, double-check what you link to.

Figuring Out Who's Good, Who's Not

The question that probably pops up is: “But since the Web is so vast and so constantly changing, how can I know who is good and who is bad?” Well, you don't have to know each of the sites on the black list, even if that were possible. The black list itself changes all the time, but it looks like there will always be companies and individuals who are eager to earn some cash by spamming, disseminating viruses and porn, or simply performing fraudulent activities.

The first check you need to perform when you have doubts that some of the sites you are linking to are bad neighbors is to see if they are included in the indices of Google and the other search engines. Type “site:siteX.com”, where “siteX.com” is the site you are checking, and see if Google returns any results from it. If it does not return any results, chances are that this site is banned from Google and you should immediately remove any outbound links to siteX.com.
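
If you want to prepare such checks for a whole page of outbound links at once, a few lines of script can collect the external domains for you and print the “site:” queries to run (or to feed into the tool mentioned in the next paragraph). This is only a minimal sketch, assuming the requests and BeautifulSoup libraries are available; the site address is a placeholder.

```python
# Sketch: list the external domains a page links to and print the
# "site:" queries you would run in Google to check their index status.
# Assumes the 'requests' and 'beautifulsoup4' packages are installed.
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

MY_SITE = "www.example.com"  # placeholder for your own domain


def outbound_domains(page_url):
    """Return the set of external domains linked from page_url."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    domains = set()
    for a in soup.find_all("a", href=True):
        host = urlparse(a["href"]).netloc.lower()
        if host and MY_SITE not in host:  # crude filter: keep only external hosts
            domains.add(host)
    return domains


if __name__ == "__main__":
    for domain in sorted(outbound_domains("http://" + MY_SITE + "/links.html")):
        # Run each of these queries manually in Google; zero results is a warning sign.
        print("site:" + domain)
```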

If you have outbound links to many different sites, such checks might take a lot of time. Fortunately, there are tools that can help you in performing this task. The CEO of Blackwood Productions has recommended http://www.bad-neighborhood.com/ as one of the reliable tools that reports links to and from suspicious sites and sites that are missing in Google's index.

Comments (0)
Optimizing Flash Sites - Tuesday, September 06, 2011

If there is a really hot potato that divides SEO experts and Web designers, it is Flash. Undoubtedly a great technology for including sound and pictures on a Web site, Flash movies are a real nightmare for SEO experts. The reason is pretty prosaic – search engines cannot index (or at least not easily) the contents inside a Flash file, and unless you feed them with the text inside a Flash movie, you can simply count this text as lost for boosting your rankings. Of course, there are workarounds, but until search engines start indexing Flash movies as if they were plain text, these workarounds are just a clumsy way to optimize Flash sites, although they are certainly better than nothing.

Why Search Engines Dislike Flash Sites?

Search engines dislike Flash Web sites not because of their artistic qualities (or the lack of these) but because Flash movies are too complex for a spider to understand. Spiders cannot index a Flash movie directly, as they do with a plain page of text. Spiders index filenames (and you can find tons of these on the Web), but not the contents inside.

Flash movies come in a proprietary binary format (.swf) and spiders cannot read the insides of a Flash file, at least not without assistance. And even with assistance, do not count on spiders crawling and indexing all your Flash content. This is true for all search engines. There might be differences in how search engines weigh page relevancy, but in their approach to Flash, at least for the time being, search engines are really united – they hate it, yet they index portions of it.

What (Not) to Use Flash For?

Despite the fact that Flash movies are not spider favorites, there are cases when a Flash movie is worth the SEO efforts. But as a general rule, keep Flash movies at a minimum. In this case less is definitely better and search engines are not the only reason. First, Flash movies, especially banners and other kinds of advertisement, distract users and they generally tend to skip them. Second, Flash movies are fat. They consume a lot of bandwidth, and although dialup days are over for the majority of users, a 1 Mbit connection or better is still not the standard one.

Basically, designers should follow the principle that Flash is good for enhancing a story, but not for telling it – i.e. you have some text with the main points of the story (and the keywords that you optimize for) and then you have the Flash movie to add further detail or just a visual representation of the story. In that connection, the greatest SEO sin is to have the whole site made in Flash! This is simply unforgivable, and do not even dream of high rankings!

Another “No” is using Flash for navigation. This applies not only to the starting page, where it was once fashionable to splash a gorgeous Flash movie, but to the links between the other pages as well. Although it is a more common mistake to use images and/or JavaScript for navigation, Flash banners and movies must not be used to lead users from one page to another either. Text links are the only SEO-approved way to build site navigation.

Workarounds for Optimizing Flash Sites

Although a workaround is not a solution, Flash sites still can be optimized. There are several approaches to this:

  • Input metadata
    This is a very important approach, although it is often underestimated and misunderstood. Although metadata is not as important to search engines as it used to be, Flash development tools make it easy to add metadata to your movies, so there is no excuse to leave the metadata fields empty.

  • Provide alternative pages
    For a good site it is a must to provide html-only pages that do not force the user to watch the Flash movie. Preparing these pages requires more work, but the reward is worth it because not only users but search engines as well will see the html-only pages.

  • Flash Search Engine SDK
    This is the life-belt – the most advanced tool for extracting text from a Flash movie. One of the handiest applications in the Flash Search Engine SDK is the tool named swf2html. As its name implies, this tool extracts text and links from a Macromedia Flash file and writes the output to a standard HTML document, thus saving you the tedious job of doing it manually (see the sketch after this list for one way to post-process that output).
    However, you still need to have a look at the extracted contents and correct them, if necessary. For example, the order in which the text and links are arranged might need a little restructuring in order to put the keyword-rich content in the title and headings or in the beginning of the page.
    Also, you need to check whether there is any duplicate content among the extracted sentences and paragraphs. The font color of the extracted text is another issue – if it is the same as the background color, you will run into hidden text territory.

  • SE-Flash.com
    Here is a tool that visually shows which parts of your Flash files are visible to search engines and which are not. This tool is very useful even if you already have the Flash Search Engine SDK installed, because it provides one more check of the accuracy of the extracted text. Besides, it is not certain that Google and the other search engines use the Flash Search Engine SDK to get contents from a Flash file, so this tool might give completely different results from those that the SDK produces.
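
As a rough illustration of the swf2html clean-up step mentioned above, here is a minimal sketch that scans an extracted HTML file for exact duplicate paragraphs so you can remove them before publishing the alternative page. The output file name is assumed and the exact structure of swf2html's output may differ, so treat this as a starting point only.

```python
# Sketch: flag exact duplicate paragraphs in the HTML produced by swf2html,
# so the alternative page does not end up with repeated content.
# Assumes 'beautifulsoup4' is installed and 'extracted.html' is the SDK output.
from collections import Counter

from bs4 import BeautifulSoup


def duplicate_paragraphs(html_path):
    with open(html_path, encoding="utf-8") as f:
        soup = BeautifulSoup(f.read(), "html.parser")
    texts = [p.get_text(" ", strip=True) for p in soup.find_all("p")]
    counts = Counter(t for t in texts if t)
    return [text for text, n in counts.items() if n > 1]


if __name__ == "__main__":
    for text in duplicate_paragraphs("extracted.html"):
        print("Duplicate paragraph:", text[:80])
```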

These approaches are just some of the most important examples of how to optimize Flash sites. There are many other approaches as well. However, not all of them are clean, and some can be classified as being on the boundary of ethical SEO – e.g. creating invisible layers of text that are delivered to spiders instead of the Flash movie itself. Although this technique is not outright wrong – i.e. there is no duplicate or fake content – it is very similar to cloaking and doorway pages, and it is better to avoid it.

Comments (0)
How to Build Backlinks - Tuesday, September 06, 2011

It is beyond question that quality backlinks are crucial to SEO success. Rather, the question is how to get them. While with on-page content optimization it seems easier because everything is up to you to do and decide, with backlinks it looks like you have to rely on others to work for your success. Well, this is only partially true because, while backlinks are links that start on another site and point to yours, you can discuss with the Web master of the other site details like the anchor text, for example. Yes, it is not the same as administering your own sites – i.e. you do not have total control over backlinks – but still there are many aspects that can be negotiated.

Getting Backlinks the Natural Way

The idea behind including backlinks as part of the page rank algorithm is that if a page is good, people will start linking to it. And the more backlinks a page has, the better. But in practice it is not exactly like this. Or at least you cannot always rely on the fact that your content is good and people will link to you. Yes, if your content is good and relevant you can get a lot of quality backlinks, including from sites with a similar topic to yours (and these are the most valuable kind of backlinks, especially if the anchor text contains your keywords), but what you get without effort could be less than what you need to successfully promote your site. So you will have to resort to other ways of acquiring quality backlinks, as described next.

Ways to Build Backlinks

Even if plenty of backlinks come to your site the natural way, additional quality backlinks are always welcome and the time you spend building them is not wasted. Among the acceptable ways of link building are getting listed in directories and posting in forums, blogs and article directories. The unacceptable ways include inter-linking (linking from one site to another site which is owned by the same owner or exists mainly for the purpose of being a link farm), linking to spam sites or sites that host any kind of illegal content, purchasing links in bulk, linking to link farms, etc.

The first step in building backlinks is to find the places from which you can get quality backlinks. A valuable assistant in this process is the Backlink Builder tool. When you enter the keywords of your choice, the Backlink Builder tool gives you a list of sites where you can post an article, message, posting, or simply a backlink to your site. After you have the list of potential backlink partners, it is up to you to visit each of the sites and post your content with the backlink to your site in it.

You might wonder why sites such as those listed by the Backlink Builder tool provide such a precious asset as backlinks for free. The answer is simple – they need content for their site. When you post an article or submit a link to your site, you do not get paid for this. You provide them for free with something they need – content – and in return they also provide you for free with something you need – quality backlinks. It is a fair trade, as long as the sites where you post your content or links are respected ones and you don't post fake links or content.

Getting Listed in Directories

If you are serious about your Web presence, getting listed in directories like DMOZ and Yahoo is a must – not only because this is a way to get some quality backlinks for free, but also because this way you are easily noticed by both search engines and potential visitors. Generally inclusion in search directories is free but the drawback is that sometimes you have to wait a couple of months before you get listed in the categories of your choice.

Forums and Article Directories

Generally search engines index forums, so posting in forums and blogs is also a way to get quality backlinks with the anchor text you want. If the forum or blog is a respected one, the backlink is valuable. However, in some cases the forum or blog administrator can edit your post, or even delete it if it does not fit the forum or blog policy. Also, sometimes administrators do not allow links in posts unless they are relevant ones. In some rare cases (which are more an exception than a rule) the owner of a forum or a blog has banned search engines from indexing it, and in this case posting backlinks there is pointless.

While forum postings can be short and do not require much effort, submitting articles to directories can be more time-consuming because articles are generally longer than posts and need careful thinking while writing them. But it is also worth it, and it is not so difficult to do.

Content Exchange and Affiliate Programs

Content exchange and affiliate programs are similar to the previous methods of getting quality backlinks. For instance, you can offer interested sites your RSS feeds for free. When the other site publishes your RSS feed, you will get a backlink to your site and potentially a lot of visitors, who will come to your site for more details about the headline and the abstract they read on the other site.

Affiliate programs are also good for getting more visitors (and buyers) and for building quality backlinks but they tend to be an expensive way because generally the affiliate commission is in the range of 10 to 30 %. But if you have an affiliate program anyway, why not use it to get some more quality backlinks?

News Announcements and Press Releases

Although this is hardly an everyday way to build backlinks, it is an approach that gives good results if handled properly. There are many sites (for instance, here is a list of some of them) that publish news announcements and press releases for free or for a fee. A professionally written press release about an important event can bring you many, many visitors, and the backlink from a respected site to yours is a good boost to your SEO efforts. The tricky part is that you cannot issue press releases if there is nothing newsworthy. That is why we say that news announcements and press releases are not a commodity way to build backlinks.

Backlink Building Practices to Avoid

One of the practices to avoid is link exchange. There are many programs which offer to barter links. The principle is simple – you put a link to a site, they put a backlink to your site. There are a couple of important things to consider with link exchange programs. First, take care about the ratio between outbound and inbound links. If your outbound links are several times more numerous than your inbound ones, this is bad. Second (and more important) is the risk that your link exchange partners are link farms. If this is the case, you could even be banned from search engines, so it is too risky to indulge in link exchange programs.
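
If you want a quick feel for that ratio, a short script can count the external links on one of your pages; the inbound figure has to come from a backlink report (a tool or a search engine query), so it is passed in by hand here. This is a minimal sketch, assuming requests and BeautifulSoup; the URL, the inbound count and the threshold are placeholders, not official values.

```python
# Sketch: count external (outbound) links on a page and compare the total
# with an inbound-link figure taken from a backlink report you already have.
# Assumes 'requests' and 'beautifulsoup4' are installed; values are placeholders.
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup


def count_outbound_links(page_url):
    my_host = urlparse(page_url).netloc.lower()
    soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
    hosts = [urlparse(a["href"]).netloc.lower() for a in soup.find_all("a", href=True)]
    return sum(1 for h in hosts if h and h != my_host)


if __name__ == "__main__":
    outbound = count_outbound_links("http://www.example.com/")
    inbound = 120  # placeholder: take this number from your backlink tool
    print("Outbound:", outbound, "Inbound:", inbound)
    if inbound and outbound > 3 * inbound:  # rough rule of thumb, not an official threshold
        print("Outbound links heavily outnumber inbound ones - review your links page.")
```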

Linking to suspicious places is something else that you must avoid. While it is true that search engines do not punish you if you have backlinks from such places because it is supposed that you have no control over what bad guys link to, if you enter a link exchange program with the so called bad neighbors and you link to them, this can be disastrous to your SEO efforts. For more details about bad neighbors, check the Bad Neighborhood article. Also, beware of getting tons of links in a short period of time because this still looks artificial and suspicious.

Comments (0)
Importance of Sitemaps - Tuesday, September 06, 2011

There are many SEO tips and tricks that help in optimizing a site but one of those, the importance of which is sometimes underestimated is sitemaps. Sitemaps, as the name implies, are just a map of your site - i.e. on one single page you show the structure of your site, its sections, the links between them, etc. Sitemaps make navigating your site easier and having an updated sitemap on your site is good both for your users and for search engines. Sitemaps are an important way of communication with search engines. While in robots.txt you tell search engines which parts of your site to exclude from indexing, in your site map you tell search engines where you'd like them to go.

Sitemaps are not a novelty. They have always been part of best Web design practices, but with the adoption of sitemaps by search engines they have become even more important. However, it is necessary to make a clarification: if you are interested in sitemaps mainly from a SEO point of view, you can't get by with the conventional sitemap only (though currently Yahoo! and MSN still keep to the standard html format). For instance, Google Sitemaps uses a special (XML) format that is different from the ordinary html sitemap for human visitors.

One might ask why two sitemaps are necessary. The answer is obvious - one is for humans, the other is for spiders (for now mainly Googlebot but it is reasonable to expect that other crawlers will join the club shortly). In that relation it is necessary to clarify that having two sitemaps is not regarded as duplicate content. In 'Introduction to Sitemaps', Google explicitly states that using a sitemap will never lead to penalty for your site.

Why Use a Sitemap

Using sitemaps has many benefits, not only easier navigation and better visibility by search engines. Sitemaps offer the opportunity to inform search engines immediately about any changes on your site. Of course, you cannot expect that search engines will rush right away to index your changed pages but certainly the changes will be indexed faster, compared to when you don't have a sitemap.

Also, when you have a sitemap and submit it to the search engines, you rely less on external links to bring search engines to your site. Sitemaps can even help with messy internal links - for instance, if you have accidentally broken internal links or orphaned pages that cannot be reached in any other way (though there is no doubt that it is much better to fix your errors than to rely on a sitemap).

If your site is new, or if you have a significant number of new (or recently updated pages), then using a sitemap can be vital to your success. Although you can still go without a sitemap, it is likely that soon sitemaps will become the standard way of submitting a site to search engines. Though it is certain that spiders will continue to index the Web and sitemaps will not make the standard crawling procedures obsolete, it is logical to say that the importance of sitemaps will continue to increase.

Sitemaps also help in classifying your site content, though search engines are by no means obliged to classify a page as belonging to a particular category or as matching a particular keyword only because you have told them so.

Having in mind that the sitemap programs of major search engines (and especially Google) are still in beta, using a sitemap might not generate huge advantages right away but as search engines improve their sitemap indexing algorithms, it is expected that more and more sites will be indexed fast via sitemaps.

Generating and Submitting the Sitemap

The steps you need to perform in order to have a sitemap for your site are simple. First, you need to generate it, then you upload it to your site, and finally you notify Google about it.

Depending on your technical skills, there are two ways to generate a sitemap - to download and install a sitemap generator or to use an online sitemap generation tool. The first is more difficult but you have more control over the output. You can download the Google sitemap generator from here. After you download the package, follow the installation and configuration instructions in it. This generator is a Python script, so your Web server must have Python 2.2 or later installed, in order to run it.

The second way to generate a sitemap is easier. There are many free online tools that can do the job for you. For instance, have a look at this collection of Third-party Sitemap tools. Although Google says explicitly that it has neither tested, nor verified them, this list will be useful because it includes links to online generators, downloadable sitemap generators, sitemap plugins for popular content-management systems, etc., so you will be able to find exactly what you need.
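
If you are comfortable with a little scripting, the XML format itself is simple enough to produce by hand. Below is a minimal sketch of a do-it-yourself generator in Python (the same language the Google generator is written in); the URL list and file name are placeholders, and only the basic tags of the sitemap protocol are used.

```python
# Sketch: write a minimal sitemap.xml for a hand-maintained list of URLs,
# using only the basic tags of the sitemap protocol (loc and lastmod).
from datetime import date
from xml.sax.saxutils import escape

URLS = [  # placeholders: list the pages of your own site here
    "http://www.example.com/",
    "http://www.example.com/articles/",
    "http://www.example.com/contact.html",
]


def build_sitemap(urls):
    today = date.today().isoformat()
    entries = "\n".join(
        "  <url>\n    <loc>%s</loc>\n    <lastmod>%s</lastmod>\n  </url>" % (escape(u), today)
        for u in urls
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            "%s\n</urlset>\n" % entries)


if __name__ == "__main__":
    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write(build_sitemap(URLS))
```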

After you have created the sitemap, you need to upload it to your site (if it is not already there) and notify Google about its existence. Notifying Google includes adding the site to your Google Sitemaps account, so if you do not have an account with Google, it is high time to open one. Another detail that is useful to know in advance is that in order to add the sitemap to your account, you need to verify that you are the legitimate owner of the site.
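
Besides adding the sitemap in your Google Sitemaps account, you can also ping Google whenever the file changes. Google documents a simple ping URL for this; treat the exact address below as an assumption and check the current Google Sitemaps documentation. The sitemap location is a placeholder.

```python
# Sketch: notify Google that the sitemap has been updated via the ping URL
# (assumed endpoint; the sitemap location is a placeholder).
from urllib.parse import quote

import requests

SITEMAP_URL = "http://www.example.com/sitemap.xml"

response = requests.get("http://www.google.com/ping?sitemap=" + quote(SITEMAP_URL, safe=""))
print("Ping status:", response.status_code)  # 200 means the ping was accepted
```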

Currently Yahoo! and MSN do not support sitemaps, or at least not in the XML format used by Google. Yahoo! allows webmasters to submit “a text file with a list of URLs” (which can actually be a stripped-down version of a sitemap), while MSN does not offer even that, though there are rumors that it indexes sitemaps when they are available on the site. Most likely this situation will change in the near future and both Yahoo! and MSN will catch up with Google, because user-submitted sitemaps are simply too powerful an SEO tool to be ignored.

Comments (0)
Keyword Difficulty - Tuesday, September 06, 2011

The wise choice of the right keywords to optimize for is the first and crucial step to a successful SEO campaign. If you fail at this very first step, the road ahead is very bumpy and most likely you will only waste your (or your client's) money and time. There are many ways to determine which keywords to optimize for, and usually the final list is made after a careful analysis of what the online population is searching for, which keywords your competitors have chosen and, above all, which keywords you feel describe your site best. All of this is great and certainly this is the way to go, but if you want to increase your chances of success, additional research is never too much, especially when its results will save you from shooting in the dark.

Dreaming High - Shooting the Top-Notch Keywords?

After you have made a long and detailed list of all the lucrative keywords that are searched by tens of thousands a day, do not hurry yet. It is great that you have chosen popular keywords but it would be even greater if you have chosen keywords for which top positioning is achievable with reasonable effort. If you have many competitors for the keywords you have chosen, chances are, no matter how hard you try, that you will hardly be able to overtake them and place your site amongst the top ten results. And as every SEO knows, if you can't be on the first page (or on the second and in the worst case on the third one) of the organic search results, you'd better think again if the potential gain from optimization for those particular words is worth the effort. It is true that sometimes even sites that are after the first 50 results get decent traffic from search engines but it is certain that you can't count on that. And even if you somehow manage to get to the top, do you have any idea what it will take to keep the good results?

You can feel discouraged that all lucrative keywords are already occupied but it is too early to give up. Low-volume search keywords can be as lucrative as the high-volume ones and their main advantage is that you will have less competition. The SEO experts from Blackwood Productions confirm that it is possible with less effort and within budget to achieve much better results with low-volume search keywords than if you targeted the high-volume search ones. In order to do this, you need to make an estimate about how difficult it would be to rank well for a particular keyword.

Get Down to Earth

The best way to estimate how difficult it would be to rank well for a particular keyword is by using the appropriate tools. If you search the Web, you will find several keyword difficulty tools. Choose a couple of them, for instance Seochat's Keyword Difficulty Tool, Cached's Keyword Difficulty Tool and Seomoz's Keyword Difficulty Tool, and off we go. The idea behind choosing multiple tools is not that you have so much free time that you need to find a way to waste it. If you choose only one tool, you will finish your research faster, but having in mind the different results that each tool gives, you'd better double-check before you start the optimization itself. The Seomoz tool is kind of complicated and if you want to use it you need to make several registrations, but it is worth the trouble (and the patience – while you wait for the results to be calculated).

You may also want to check several keywords or keyword phrases. You will be surprised to see how different the estimated difficulty for similar keywords is! For instance, if you are optimizing a financial site, which deals mainly with credits and loans, and some of your keywords are finance, money, credit, loan, and mortgage, running a check with Seochat's Keyword Difficulty Tool produces results like these (the percentages are rounded but you get the idea): finance - 89%, money - 76%, credit - 74%, loan - 66%, mortgage - 65%.

It seems that the keyword finance is very tough, and since your site is targeted at credits and loans, not at stock exchange or insurance (which are also branches of finance), there is no need to cry over the fact that it is very difficult to compete for the finance keyword.

The results were similar with the second tool, though it does not give percentages but uses a scale starting from Very Easy to Very Difficult. I did not check all the results with the third tool, because it seems that the seomoz report on keyword difficulty for a particular word needs ages to be compiled but the results were similar, so it becomes clear that it is more feasible to optimize for mortgage and loan, rather than for the broader term finance.

You may want to bookmark some of these tools for future use as well. They will be very useful to monitor possible changes on the keyword difficulty landscape. After you have optimized your site for the keywords you have selected, occasionally check again the difficulty of the keywords you are already optimizing for because the percentages are changing over time and if you discover that the competition for your keywords has increased, make some more efforts to retain the gained positions.

Comments (0)
Choosing a SEO Company - Tuesday, September 06, 2011

After you have been dealing with SEO on your own for some time, you may discover that no matter how hard you try, your site does not rank well, or that your site ranks well but optimizing it for search engines takes all your time and all your other tasks lag behind. If this is the case with you, maybe it is better to consider hiring a SEO company to do the work for you. With so many SEO companies out there, you can't complain that you have no choice. Or is it just the opposite – so many companies but few reliable ones?

It is stretching the truth to say that there are no reliable SEO companies. Yes, there might be many scam SEO companies, but if you know what to look for when selecting a SEO company, the risk of hiring fraudsters is reduced. It is much better if you yourself have substantial knowledge of SEO and can easily decide whether they are promising you the stars in the sky or their goals are realistic, but even if you are not quite familiar with SEO practices, here is a list of points to watch for when choosing a SEO company:

  • Do they promise to guarantee #1 ranking? If they do, you have a serious reason to doubt their competencies. As the Google SEO selection tips say, no one can guarantee a #1 ranking in Google. This is true even for not so competitive words.

  • Get recommendations from friends, business partners, etc. Word of mouth is very important for the credibility of a company.

  • Ask in forums. There are many reputable Web master forums, so if you can't find somebody who can recommend you a SEO company right away, consider asking in Web master forums. However, beware that not all forum posters are honest people, so take their opinion (no matter if positive or negative) with a grain of salt. Forums are not such a reliable source of information as in-person contact.

  • Google the company name. If the company is a known fraudster, chances are that you will find a lot of information about it on the Web. However, lack of negative publicity does not mean automatically that the company is great, nor do some subjective negative opinions mean that the company is a scammer.

  • Ask for examples of sites they have optimized. Happy customers are the best form of promotion, so feel free to ask your potential SEO company about sites they have optimized and for references from clients. If you get a refusal because of confidentiality reasons, this should raise a red flag about the credibility of the SEO company - former customers are not supposed to be a secret.

  • Check the PR of their own site. If they can't optimize their site well enough to get a good PR (over 4-5), they are not worth hiring.

  • Ask them what keywords their site ranks for. Similarly to the page rank factor, if they don't rank well for the keywords of their choice, they are hardly as professional as they are pretending to be.

  • Do they use automated submissions? If they do, stay away from them. Automated submissions can get you banned from search engines.

  • Do they use any black hat SEO tricks? You need to know in advance what black hat SEO is in order to judge them, so getting familiar with the most important black hat SEO tricks is worthwhile before you go and start cross-examining them.

  • Where do they collect backlinks from? Backlinks are very, very important for SEO success but if they come from link farms and other similar sites, this can cause a lot of trouble. So, make sure the SEO firm collects links from reputable sites only.

  • Get some personal impressions, if possible. Gut instinct and impressions from meetings are also a way to judge a company, though sometimes it is not difficult to get misled, so use this approach with caution.

  • High price does not guarantee high quality. If you are willing to pay more, this does not mean that you will get more. Just because a firm costs more DOES NOT make them better SEOs. There are many reasons for high prices, and high quality is only one of them. For instance, the company might work inefficiently and this might be the reason for their ridiculously high costs, not the quality of their work.

  • Cheap is more expensive. This is also true. If you think you can pay peanuts for a professional SEO campaign, then you need to think again. Professional SEO companies offer realistic prices.

  • Use tricky questions. Using tricky questions is a double-edged sword, especially if you are not an expert. But there are several easy questions that can help you.
    For instance, you might ask them how many search engines they will automatically submit your site to. If they are scammers, they will try to impress you with big numbers. But in this case, the best answer would be "no automatic submissions".
    Another tricky question is to ask them whether they will place you in the top 10 for some competitive keywords of your choice. The trap here is that it is they, not you, who should choose the words that are best for your site. It is not very probable that they will choose exactly the same words as you suggest, so if they tell you that you just give them the words and they will push you to the top, tell them “Goodbye”.

  • Do they offer subscription services? SEO is a constant process and if you want to rank well and keep it that way, efforts are necessary all the time. Because of this, it is better to select a company that includes post-optimization maintenance than one that pushes your site to the top and then leaves you in the wild on your own. You may even want to see if they offer SEO Pay Per Click services to help you maintain an ongoing PPC campaign to further optimize your site's online marketing.

We tried to mention some of the most important issues in selecting a SEO company. Of course, there are many other factors to consider and each case is different, so give it some thought, before you sign the contract for hiring a SEO company.

Comments (0)
Top 10 SEO Mistakes - Tuesday, September 06, 2011

1 Targeting the wrong keywords

This is a mistake many people make and, what is worse, even experienced SEO experts make it. People choose keywords that in their mind are descriptive of their website, but the average users just may not search for them. For instance, if you have a relationship site, you might discover that “relationship guide” does not work for you, even though it has the “relationship” keyword, while “dating advice” works like a charm. Choosing the right keywords can make or break your SEO campaign. Even if you are very resourceful, you can't think of all the great keywords on your own, but a good keyword suggestion tool, for instance the Website Keyword Suggestion tool, will help you find keywords that are good for your site.

2 Ignoring the Title tag

Leaving the <title> tag empty is also very common. This is one of the most important places to have a keyword, because not only does it help you in optimization but the text in your <title> tag shows in the search results as your page title.
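
A quick way to spot this mistake across a site is to check each page for a missing or empty <title> tag. Below is a minimal sketch, assuming the requests and BeautifulSoup libraries and a placeholder list of URLs.

```python
# Sketch: flag pages whose <title> tag is missing or empty.
# Assumes 'requests' and 'beautifulsoup4' are installed; URLs are placeholders.
import requests
from bs4 import BeautifulSoup

PAGES = [
    "http://www.example.com/",
    "http://www.example.com/about.html",
]

for url in PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    if not title:
        print("Missing or empty <title>:", url)
    else:
        print("%s -> %s" % (url, title))
```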

3 A Flash website without an html alternative

Flash might be attractive but not to search engines and users. If you really insist that your site is Flash-based and you want search engines to love it, provide an html version. Here are some more tips for optimizing Flash sites. Search engines don't like Flash sites for a reason – a spider can't read Flash content and therefore can't index it.

4 JavaScript Menus

Using JavaScript for navigation is not bad as long as you understand that search engines do not read JavaScript and build your web pages accordingly. So if you have JavaScript menus you can't do without, you should consider building a sitemap (or putting the links in a noscript tag) so that all your links will be crawlable.
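
One simple way to put those fallback links in place is to generate a plain <noscript> block from the same list of menu entries the JavaScript menu uses. This is only a minimal sketch; the menu entries are placeholders you would replace with your own.

```python
# Sketch: build a plain-HTML <noscript> fallback for a JavaScript menu,
# so spiders (and users without JavaScript) still get crawlable text links.
from xml.sax.saxutils import escape

MENU = [  # placeholders: mirror the entries of your JavaScript menu here
    ("Home", "/"),
    ("Articles", "/articles/"),
    ("Contact", "/contact.html"),
]


def noscript_menu(items):
    links = "\n".join(
        '    <li><a href="%s">%s</a></li>' % (escape(href, {'"': "&quot;"}), escape(label))
        for label, href in items
    )
    return "<noscript>\n  <ul>\n%s\n  </ul>\n</noscript>" % links


if __name__ == "__main__":
    print(noscript_menu(MENU))  # paste the output next to the JavaScript menu
```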

5 Lack of consistency and maintenance

Our friend Rob from Blackwood Productions often encounters clients who believe that once you optimize a site, it is done forever. If you want to be successful, you need to optimize your site permanently, keep an eye on the competition and on changes in the ranking algorithms of search engines.

6 Concentrating too much on meta tags

A lot of people seem to think SEO is about getting your meta keywords and description correct! In fact, meta tags are becoming (if not already) a thing of the past. You can create your meta keywords and descriptions, but don't expect to rank well only because of this.

7 Using only Images for Headings

Many people think that an image looks better than text for headings and menus. Yes, an image can make your site look more distinctive, but in terms of SEO, images for headings and menus are a big mistake because h1, h2, etc. tags and menu links are important SEO items. If you are afraid that your h1, h2, etc. tags look horrible, try modifying them in a stylesheet or consider this approach: http://www.stopdesign.com/articles/replace_text.

8 Ignoring URLs

Many people underestimate how important a good URL is. Dynamic page names are still very frequent, and no keywords in the URL is more the rule than the exception. Yes, it is possible to rank high even without keywords in the URL, but all else being equal, if you have keywords in the URL (in the domain itself or in file names that are part of the URL), this gives you an additional advantage over your competitors. Keywords in URLs are more important for MSN and Yahoo!, but even with Google their relative weight is high, so there is no excuse for having keywordless URLs.
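
For pages you control, a tiny helper can turn a keyword-rich title into a clean file name or slug instead of a dynamic, parameter-laden URL. A minimal sketch; which keywords you feed it is, of course, up to you.

```python
# Sketch: turn a page title into a keyword-rich, URL-friendly slug
# (lowercase, hyphen-separated, no punctuation) to use as a file name.
import re


def slugify(title):
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse punctuation and spaces into hyphens
    return slug.strip("-")


if __name__ == "__main__":
    print(slugify("Dating Advice for Busy Professionals"))
    # -> dating-advice-for-busy-professionals
```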

9 Backlink spamming

It is a common delusion that more backlinks are ALWAYS better, and because of this web masters resort to link farms, forum/newsgroup spam, etc., which ultimately could get their site banned. In fact, what you need are quality backlinks. Here is some more information on The Importance of Backlinks.

10 Lack of keywords in the content

Once you focus on your keywords, modify your content and put the keywords wherever it makes sense. It is even better to make them bold or highlight them.

Comments (0)
Bing Optimization - Tuesday, September 06, 2011

Ever since Microsoft launched its Bing search engine, it has drawn a lot of interest (and speculation) from the SEO community. On one hand, this is quite logical because Bing is intended to be one more heavy-weight player and it is expected to cut some share from Google. On the other hand, this is hardly the first time a new heavy-weight player comes to the ring, so maybe the expectations that Bing will put an end to Google's monopoly are groundless. Still, Bing is quite different (in a positive way) from the other search engines and this is its major strength.

Is Bing Really That Different?

The first impression you get when you go to Bing.com is that it is different – the background makes it cute but sure, there have been many other cases of search engines with tons of graphical frills to disguise their irrelevant search algorithms. However, when you type a search term, the results you get are a pleasant surprise because they are relevant.

It is this relevance of search results that worries SEO experts. The results you get when you search with Bing are relevant, yet they are very different from Google's. Actually, no matter if you search with Google or with Bing (or if you go to Bingle, you can compare the result sets side by side), you get relevant results and the two sets are very different from one another.

One of the most important things SEO experts are curious to know about Bing is its algorithm. Obviously, Bing's algorithm is different from Google's because when the search term is the same but the set of results is different, a difference in the algorithm is the obvious answer. Actually, the question is exactly what is different between the two algorithms and if the difference is so drastic that it makes it mandatory to reoptimize a site for Bing.

What Do I Need to Do In Order to Optimize My Site for Bing?

Wait. This is the first thing you need to do. Right now it is too early to say what steps (if any) are required in order to optimize your site for Bing.

Additionally, no matter how promising Bing looks, it is still early to predict if it will become a real competitor to Google or if it will become one more failed attempt to dethrone Google. Let's see how users react – will they start Binging more or will they stick to Google. When it becomes clear that Bing will be able to make it, then it will make sense to optimize for it as well. So for now the best you can do is wait.

Which Factors Make a Site Rank Well With Bing

As you probably guess, the exact algorithm of Bing is not publicly available and because of that there is a lot of speculation about what weighs more for Bing (in comparison to Google) and what weighs less. Many SEO experts test different search queries, analyze the results, and based on that try to figure out what of the known SEO tactics works with Bing. For instance, these tests are quite interesting.

Some SEO experts even think that Bing is actually Live Search in new clothes (i.e. user interface), while others say that there are noticeable differences between Live Search and Bing. But there is no doubt that for now Bing is a significant improvement over Live Search in terms of relevance of search results.

Bing is hardly the first time when there is no agreement in the SEO community about the intricacies of the algorithm but if we can summarize, here are some factors, which are (or at least are strongly believed to be) of importance when Bing optimization is concerned:

  • Backlinks are of less importance. If you compare the first 10 results in Bing and Google, it is noticeable that, all else being equal, the winners in Bing have fewer backlinks than the winners in Google. It is unclear whether nofollow matters with Bing.

  • Inbound anchor text matters more. The quantity of quality inbound links might be of less importance for Bing but the anchor text certainly matters more. Actually, since anchor text is one of the measurements of the quality of inbound links, it isn't much different. Get quality anchor text and you will do well in both Bing and Google.

  • Link spamming won't do much for you on Bing. Since the quantity of backlinks (even if they are of supreme quality) seems to be of less importance to Bing, link spamming will be even less effective than with Google.

  • Onpage factors matter more than with Google. This is one of the most controversial points. Many SEO experts disagree but many also think that onpage factors matter more with Bing than with Google. Still, it has nothing to do with the 90s, when onpage factors were definitive.

  • Bing pays more attention to the authority of the site. If this is true, this is bad news for bloggers and small sites because it means that search results are distorted in favor of older sites and/or sites of authoritative organizations. Age of domain is also very important with Bing – even more than with Google.

  • PR matters less. When you perform a search for a competitive keyword and you see a couple of PR2 or even PR1 sites among the top 10 results, this might make you wonder. On Google this is hardly possible but on Bing it looks quite normal.

  • Fresh content matters less. Bing looks a bit conservative – or maybe it just can't index sites that quickly – but it seems that fresh content is not so vital as with Google. This is related to the age of domain specifics and as a result you will see ancient pages rank high (but these ancient pages are relevant to the search query).

  • Bing is more Flash-friendly. Optimizing a Flash site for Google is a bit of a SEO nightmare. It is too early to say but it looks like Bing is more Flash-friendly, which is good news to all sites where Flash is (still) heavily employed.

For now it is too early to say which factors are of primary importance with Bing. But the fact that their search results are relevant means that their algorithm is really precise. Well, maybe the relevant results in Bing are due to the fact that web masters were taken by surprise and they haven't had the time to optimize for Bing. As a result, the content is authentic, there are no SEO gimmicks and artificial pumping. We'll see if this will stay so in the future, when web masters learn how to optimize for Bing as well!

Comments (0)
How to get traffic from Facebook - Tuesday, September 06, 2011

Facebook is not the first social network but it is the most popular one. There have been many other social networks before Facebook and while some of them were popular at some point in time, none could reach the popularity of Facebook. In addition to keeping in touch with your friends, Facebook can be (and is) used for business. You can use it to promote your products and services, to acquire new clients, or to get traffic to your site.

Like Twitter, Facebook is just one of the many ways to get some traffic to your site. Many marketers believe that it is just a matter of time for the traffic from Facebook, Twitter and the other major social networking sites to surpass the traffic their sites get from Google.

While this time might come, don't take this as a promise that even if you do everything right, Facebook, Twitter, or any other similar site will do traffic miracles for you. For some people Facebook works like a charm, for others it doesn't work at all. The same applies to Twitter. You can't know in advance if Facebook and/or Twitter will crash your server with traffic. Just try both and see which one (if any) works for you.

Unlike Twitter, which is very simplistic, Facebook offers more possibilities. Yes, you might need more time in order to explore all the possibilities and take advantage of them but hopefully these efforts will have a great return in terms of traffic. Here are some tips that can help you turn Facebook into a traffic monster:

1 Your profile is your major weapon

As with Twitter and any other social network, if you don't make your profile interesting, you will hardly become popular. Give enough background information about yourself and don't forget to make your profile public, because this way even people who don't know you might become interested when they encounter your profile and become supporters of yours.

2 Include information about your site on your Wall and in the photo gallery

Facebook gives you the opportunity to write a lot about yourself and your endeavors, as well as to include pictures, so use all these opportunities to build interest in you and your products. It is even better to post videos and fill in the other tabs, so if you have something meaningful to put there, just do it.

3 Build your network

As with other social networking sites, your network is your major capital. That is why you need to invite your friends, acquaintances, and partners and ask them to join as your supporter. You should also search for people with interests similar to yours. However, don't be pushy and don't spam because this is not the way to convince people to join your network.

4 Post regularly

No matter how interesting the stuff in your Facebook profile is, if you don't publish new content regularly, the traffic to your Facebook profile (and respectively the Facebook traffic to your site) will slow down. If you can post daily, it is fine but even if you don't post that regularly, try to do it as frequently as you can. If nothing else, updating your status regularly is more than nothing, so do it.

5 Be active

A great profile, an impressive network, and regular posting are just part of the recipe for success on Facebook. You also need to be active – visit the profiles of your supporters, take part in their groups and other initiatives, visit their sites. You are right that all this takes a lot of time and you might soon discover that Facebooking is a full-time occupation, but if you notice an increase in traffic to your site, then all of this is worth it.

6 Arrange your page

Unlike other social networks, Facebook gives you more flexibility and you can move around many of the boxes. If you put the RSS feed with the links to your blog in a visible space, this alone can generate lots of traffic for you.

7 Check what Facebook apps are available

Facebook apps are numerous and new ones are released all the time. While many of these apps are not exactly what you need, there are apps which can work for you in a great way. For instance, the MarketPlace widget/plugin or the Blog Friends widget are very useful and you should take advantage of them. You can also use the widgets for crossposting (i.e. posting directly to Twitter from Facebook) because this saves you time.

8 Use Facebook Social Ads

If you can't get traffic the natural way, you might consider using Facebook Social ads. These are PPC ads and starting a campaign is similar to an Adwords campaign.

9 Start a group

There are many groups on Facebook but it is quite probable that there is a free niche for you. Start a group about something related to your business and invite people to join it. The advantage of this approach is that you are getting targeted users – i.e. people, who are interested in you, your product, your ideas, etc.

10 Write your own Facebook extensions

While this step is certainly not for everybody, if you can write Facebook extensions, this is one more way to make your Facebook profile popular and get some traffic to your site.

11 Use separate profiles

Unfortunately, social networks do expose a lot of personal information, and you are not paranoid if you don't want so much publicity. Many people are rightfully worried about their privacy on social network sites, and that is why it is not uncommon to have one personal profile for friends and one business profile to promote their business. You can have one single profile for both purposes, but if you have privacy concerns, consider separating this profile in two – you'd better be safe than sorry.

Facebook is changing all the time and no matter how hard you try to follow these changes, there will always be new possibilities for you to explore. That is why it is not possible to compile a complete list of all the tactics you can use in order to drive traffic from Facebook to your site. Anyway, if you try just the basics for Facebook success we listed here, chances are that you will see a considerable traffic increase.

Comments (0)
How to Boost your SEO with Google Adwords - Tuesday, September 06, 2011

Many advertisers use Google AdWords as their major PPC network. However, in addition to using AdWords for getting paid traffic to your site, it can also be used for SEO. Here are some ideas how you can use AdWords for SEO.

1 For Keyword Research

The most valuable use of AdWords for SEO is to research keywords. Keywords are the basis of any SEO campaign and even if you are an expert in your niche, you should always research keywords simply because users frequently search for quite unexpected keywords and keyphrases you as an expert will never think of. Needless to say, what matters most for high rankings is which keywords your target audience is searching for, not which keywords you as an expert think are most popular in a particular niche.

In order to find what users are searching for, you need a keyword research tool. It is true that there are many special (free and paid) keyword research tools but Google AdWords Keyword Tool is light years ahead of them all.

It is simple to use AdWords to research keywords. You can either enter the URL of your site or put in some seed keywords; the tool will then automatically generate a whole bunch of suggested keywords. Look at the results and shortlist all the keywords that seem relevant and have a decent global search volume.

You may want to rank well for ALL the generated keywords, but it is best to focus all your efforts on a selected few. The idea now is to find keywords that are relatively easy to optimize for and yet have a decent search volume. These would be the keywords with the least competition in Google. Go to Google.com and enter each of your shortlisted keywords (one at a time). It is best if you search for the exact phrase, so surround your keyword with double quotes. Note how many web results there are for each of the phrases. Now that you have collected the 'Number of web results' for each keyword, calculate a competition ratio by dividing its 'Global search volume' by the 'Number of web results'. The keywords with the higher ratios are the easier ones to optimize for.
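
Put as a few lines of code, the calculation looks like the sketch below. The search-volume and result-count figures are made-up placeholders you would copy by hand from the Keyword Tool and from the quoted Google searches.

```python
# Sketch: rank shortlisted keywords by a simple competition ratio
# (global search volume divided by number of quoted-phrase web results).
# The figures below are made-up placeholders copied in by hand.
KEYWORDS = {
    # keyword: (global search volume, number of web results for the exact phrase)
    "seo tips": (33100, 2150000),
    "seo guide": (27100, 4870000),
    "search engine optimization": (1500000, 78200000),
}


def competition_ratio(volume, results):
    return volume / float(results) if results else 0.0


ranked = sorted(KEYWORDS.items(),
                key=lambda item: competition_ratio(*item[1]),
                reverse=True)

for keyword, (volume, results) in ranked:
    print("%-30s ratio: %.6f" % (keyword, competition_ratio(volume, results)))
# The keywords at the top of the list are the easier ones to optimize for.
```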

You can now start a SEO campaign for your keywords; however, as you'll see next, it might be much wiser to start an AdWords campaign first.

2 To Ensure that the Keywords You Have Picked Convert Well

After you have picked your keywords, you need to verify that these keywords really work for you – i.e. that they convert properly. No matter how precise you've been when picking your keywords, if you don't test them in practice, you can never know for sure whether they work well or not. You can pick lucrative keywords with high global search volume and low levels of competition and still end up nowhere.

For instance, for this website - webconfs.com we could try optimizing for the keyword "Search Engine Optimization". It could take a year or so with a LOT of effort to reach the first page on Google for "Search Engine Optimization” and still one can never be sure this will happen.

However, let's pretend that this happens – we manage to top Google for "Search Engine Optimization" after a year of hard SEO work. To our greatest disappointment, even the first place for "Search Engine Optimization" on Google didn't bring the expected results, because the bounce rate for this particular keyword turned out to be very high. Since we do not provide SEO services, a lot of people reaching us via "Search Engine Optimization" may NOT be getting what they're looking for. Instead, less popular keywords such as "SEO Tips" or "SEO Guide" might have lower bounce rates and may actually perform better than "Search Engine Optimization" did for us.

The result is not surprising but the price paid is. If we had launched an AdWords campaign, it would have saved a lot of trouble. We could have spent $20-50 on AdWords for "Search Engine Optimization” and it would have taken us a week or less to figure that the bounce rate for this keyword is very high and it makes no sense to do organic SEO for it. These $20-50 on AdWords would have spared a year of wasted SEO efforts.

3 For Getting a Better CTR with Your Existing Rankings


In addition to keyword research, AdWords is a valuable tool for getting a better CTR (Click Thru Rate) with your existing rankings. You might rank well for a given keyword, get a lot of traffic, and still be unable to monetize this traffic because your CTR is low. The reasons for this might be various but inadequate title and description could be a very possible reason.

AdWords can also help you get a better CTR with your existing rankings. For instance, if you run an AdWords campaign and you are satisfied with the conversion/performance, you might want to keep changing your ad title and description until you feel you have reached the maximum CTR for your keywords.

Sure, it might take you a couple of tries till you find the winning combination of a title and a description and you might even lower your CTR in the process but once you find this magical combination of a title and description, just copy them as the title and description for your page in order to maximize your organic search CTR as well.


4 For Geographic Targeting

One more good use of AdWords for SEO is geotargeting. If you bid on traffic from many geographic locations, you can use Google Analytics to compare how different locations convert. It is quite natural to have significant discrepancies in the conversions for the same keyword among the countries.

When you go to Google Analytics and see which countries are converting best, you can invest more effort in them. For instance, you can create local pages for these countries or target the geo-specific keywords with exceptionally good conversion rates.


AdWords is a really valuable tool, and not only for advertisers. It started as a tool for advertisers, but its use is not restricted to them alone. For many publishers and SEO experts AdWords is the most valuable tool, because even a moderate AdWords campaign can give you valuable insights and save you a lot of the time and money you would otherwise spend optimizing for words which don't work for you.

Comments (0)
How to Pick an SEO Friendly Designer - Monday, September 05, 2011
It is very important to hire a SEO-friendly designer because if you don't and your site is designed in a SEO-unfriendly fashion, you can't compensate for this later. This article will tell you how to pick a SEO-friendly designer and save yourself the disappointment of low rankings with search engines.

A Web designer is one of the persons without whom it is not possible to create a site. However, when SEO is concerned, Web designers can be really painful to deal with. While there are many Web designers, who are SEO-proficient, it is still not an exception to stumble upon design geniuses, who are focused only on the graphic aspect of the site. For them SEO is none of their business and they couldn't care less for something as unimportant as good rankings with search engines. Needless to say, if you hire such a designer, don't expect that your site will rank well with search engines.

If you will be doing SEO on your own, then you might not care a lot about the SEO skills of your Web designer, but there are still design issues, as we'll see next, which can affect your rankings very badly. When a designer builds the site against SEO rules, it is often not possible to fix this later with SEO tricks.

When we say that you need to hire an SEO-friendly designer, we presume that you are an SEO pro and know SEO; if you aren't, have a look at the SEO Tutorial and the SEO Checklist first. If you have no idea about SEO, you will hardly be able to select an SEO-friendly designer because you won't know what to look for.

One of the best tests of whether a designer is SEO-friendly is to look at his or her past sites: are they done professionally, especially in the SEO department? If those sites don't exhibit blatant SEO mistakes, such as the ones we'll list in a second, and they rank well, that is a good sign the designer is worth hiring. Even so, after you look at past sites, ask the designer whether he or she also did the SEO for them, because in some cases the client has done most of the optimization work, and that is why the site ranks well.

Here is a checklist of common Web design sins that will make your site an SEO disaster. If you notice any of the following in the past sites your would-be designer has created, just move on to the next designer. These SEO-unfriendly design elements are absolute sins, and unless the client made them do it, no designer who would use the techniques below deserves your attention:

1 Rely heavily on Flash

Many designers still believe that Flash is the best thing since sliced bread. While Flash can be very artistic and make a site look cool (and load forever in the browser), heavily Flash-ed sites are a disaster in terms of SEO. Simple HTML sites rank better with search engines, and as we point out in Optimizing Flash Sites, if the use of Flash is a must, then an HTML version of the same page is mandatory.
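As a rough sketch of what an HTML fallback can look like, the snippet below embeds a Flash movie but keeps real, crawlable HTML content inside the object tag; the file name and text are hypothetical:

  <object type="application/x-shockwave-flash" data="intro.swf" width="800" height="400">
    <!-- Hypothetical fallback: visitors without Flash and search engine spiders see this HTML instead -->
    <h1>Acme Custom Furniture</h1>
    <p>Handmade oak tables and chairs, built to order. <a href="/catalog.html">Browse the catalog</a>.</p>
  </object>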

2 No internal links, or very few links

Internal links are backlinks, and they are very important. Of course, this doesn't mean that all the text on a page must be hyperlinked to all the other pages on the site, but if there are only a couple of internal links per page, this is a missed chance to get backlinks.
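A minimal sketch of what a handful of useful internal links can look like on a page; the page names and URLs are made up for the example:

  <!-- A short "related pages" block adds crawlable internal links with descriptive anchor text -->
  <ul>
    <li><a href="/keyword-research.html">Keyword research basics</a></li>
    <li><a href="/link-building.html">Link building strategies</a></li>
    <li><a href="/seo-checklist.html">SEO checklist for new sites</a></li>
  </ul>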

3 Images, not text for anchors

This is another frequent mistake many designers make. Anchor text is vital in SEO, and when your links lack anchor text, search engines get no textual clue about what the linked page is about. It is true that for menu items and other page elements it is often easier to use an image than text, because with text you can never be sure it will display exactly the same on every user's screen, but since this hurts your site's rankings, you should sacrifice a bit of beauty for functionality.
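To make the trade-off concrete, here is a sketch of the same menu item as an image link versus a text link; the file names and class are hypothetical, and CSS can usually recover most of the visual styling of the text version:

  <!-- Image-only link: apart from the alt attribute, there is no anchor text for engines to read -->
  <a href="/services.html"><img src="btn-services.png" alt="Our services"></a>

  <!-- Text link: the anchor text "SEO consulting services" tells engines what the target page is about -->
  <a href="/services.html" class="menu-item">SEO consulting services</a>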

4 Messy code and tons of code

If you have no idea about HTML, it might be impossible for you to judge whether a site's code is messy or excessive, but clean code is an important criterion for SEO. When the code is messy, it might not be spiderable at all, which can effectively exclude your site from search engines because they won't be able to index it.
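As a rough point of comparison, the two contrived snippets below mark up the same heading; the second, semantic version is far easier for a spider (and a human) to parse:

  <!-- Messy: presentational markup, no structure the engine can interpret as a heading -->
  <font size="5" color="#333333"><b>Spring Sale on Garden Tools</b></font><br><br>

  <!-- Clean: semantic markup, styling left to an external stylesheet -->
  <h1>Spring Sale on Garden Tools</h1>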

5 Excessive use of (SEO non-friendly) JavaScript

As with Flash, search engines don't love JavaScript, especially tons of it. The worst part is that, if it is not coded properly, JavaScript can make your pages (or parts of them) unspiderable, which automatically means they won't be indexed.
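A typical case is navigation: a destination reachable only through script may never be discovered, while a plain anchor always can be. Both lines below are hypothetical illustrations:

  <!-- Risky: the URL exists only inside the script, so a spider may never find the page -->
  <span onclick="window.location='/products.html'">Products</span>

  <!-- Safe: a normal anchor that crawlers can follow even with JavaScript disabled -->
  <a href="/products.html">Products</a>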

6 Overoptimized sites

Overoptimized sites aren't any better than under-optimized ones. In fact, they can be much worse, because keyword stuffing and other techniques for artificially inflating a site's rankings (even when they are not Black Hat SEO) can get you banned from search engines, and that is the worst thing that can happen to a site.

7 Dynamic and other SEO non-friendly URLs

Well, maybe dynamic URLs are not exactly a design issue, but if you are getting a turn-key site - i.e. it is not up to you to upload and configure it and to create the links inside - then dynamic URLs are bad, and you have to ask the designer/developer not to use them. You can rewrite dynamic and other SEO-unfriendly URLs on your own, but that means making dramatic changes to the site, which is hardly the point of hiring a designer.
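For illustration only, here is the kind of difference we mean; the paths and parameter names below are invented:

  <!-- Dynamic URL: several query parameters, nothing descriptive for engines or users -->
  <a href="/catalog.php?cat=12&amp;item=3489&amp;sess=ab12">Oak dining table</a>

  <!-- Rewritten, SEO-friendly URL: static-looking path with meaningful keywords -->
  <a href="/furniture/oak-dining-table/">Oak dining table</a>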

These points are very important, and that is why you need to keep them in mind when choosing an SEO-friendly designer. Some of the items on the list are so bad for SEO (e.g. heavy Flash and JavaScript) that even if the site is a design masterpiece and you promote it heavily, you will still be unable to get decent rankings. SEO-friendliness of design is a necessity, not a whim, and you shouldn't settle for an SEO-unfriendly design – that can be really expensive!
