The Professional’s Guide to Link Building

©2009 SEOmoz

This article serves as a guide to defining and understanding what link building is, why links are important for a website and its rankings, and how to obtain links to one's website. It covers various link building strategies and also provides information on tools that help the link building process. Concluding this article is an appendix of tools for easy access.

Table of Contents

Section I: The Theory and Goals of Link Building for SEO
1. Why Link Building Matters
2. How Search Engines Evaluate Links
3. Structuring a Link Building Campaign
4. Determining a Link's Value
5. The Role of Content in Building Links

Section II: Professional Link Building Strategies
1. Asking for a Link
2. Directories
3. Content-Based Link Building Strategies
4. Back-Linking for Links
5. Press Releases and PR
6. Partnerships and Licensing
7. Buying Links for SEO
8. Reciprocal Links
9. Links from Pages Without Editorial Control
10. Nofollow
11. Helpful Search Terms, Phrases, and Advanced Query Parameters

Section III: Valuable Tools and Search Queries for Link Building
1. The Basics
2. Competitive Backlink Analysis
3. Linkscape
4. Link Diagnosis
5. Other Tools
6. Determining Link Value
7. General Link Building Tools
8. Advanced Tools
9. Recovering Lost Links

Section IV: Tracking Link Campaign Results

Section V: Putting It All Together
1. Starting Strategically
2. Execution
3. Conducting Strategic Reviews
4. Creating a Link Building Culture
5. Never Stopping

Section VI: Conclusion

Section VII: Appendix of Link Building Tools
1. Search Engine Supplied Link Tools
2. Backlink Analysis Tools
3. Firefox Add-Ons
4. Other Useful Link Building Tools

Section I: The Theory and Goals of Link Building for SEO
Link building is the practice of getting other sites to implement hyperlinks back to a web site. This can often be done using a wide variety of strategies, such as:
• Asking for a link
• Giving away content or tools
• Widget development
• Social media campaigns
• PR
• Purchasing links
• And more...

This guide will focus on all of these strategies as well as provide additional link building resources and tools.

Why Link Building Matters
In broad terms, whenever a user enters a query into a search engine, the search engine needs to determine how to return the best results. One key step is to figure out which pages on the web relate in any way to the query. Another key step is to evaluate which of those pages are the most important, or the most authoritative, with regard to that query. One of the biggest factors in that decision is the link profile of the site containing the web page being evaluated, and the link profile of that page itself.

In principle, each link to a web page is seen as a vote for that web page. In simple terms, if two pages are equally relevant to a given search query, the page with the better inbound link profile will rank higher than the other.

However, the world does not reward those who sit back and wait for links to come to them. It is incumbent on the site publisher to go out and tell the world about their site and get people to link to it. Publishers who do not pursue link building are at high risk of losing their search engine traffic, or of never building up their web sites to the point where the traffic they get meets their goals.

A related concept to think about is whether a given link will help the site with its rankings for the long term. There are types of links that publishers can obtain that are against a search engine's terms of service. For example, Google has taken a strong stance against the practice of buying links for the purpose of influencing rankings in the Google index. Yet many people still purchase links, and it can and does work for many of them, at least in the short term. However, because Google actively invests time in finding paid links (and other link schemes it deems to be against its terms of service), even if the links work in the short term, there is no guarantee that they will work in the long term.

This leads to a choice that every publisher must make: whether to pursue short term strategies, such as buying links, that may bring faster results with less effort, or longer term strategies that carry much lower risk.

How Search Engines Evaluate Links
The notion of using links as a way of measuring a site's importance was first made popular by Google, with the implementation of its PageRank algorithm. In simple terms, each link to a web page is a vote for that page. However, votes do not have equal weight. A detailed explanation of how PageRank works can be found here.

Sticking to the letter of the original PageRank algorithm (which has since evolved significantly), a site with higher PageRank casts a more valuable vote. To some degree this is still true. However, it has been complicated by the introduction of two new factors:

1. Relevance. If the link comes from a site on the same topic as the publisher's site (or a closely related topic), that link is worth more than a link from a site with an unrelated topic. Taken to its extreme, think of the relevance of each link being evaluated in the specific context of the search query just entered by the user. If the user enters "used cars in Phoenix" and the publisher's Phoenix used cars page has a link from the Phoenix Chamber of Commerce, that link will reinforce the search engine's belief that the page really does relate to Phoenix. Similarly, a link from a magazine site that has reviewed used car websites will reinforce the notion that the site should be considered a used car site. Taken in combination, these two links will be quite powerful in helping the publisher rank for "used cars in Phoenix".

2. Authority. It is generally accepted that search engines attempt to measure how much they trust a site. If a site is highly trusted, its vote counts for more than that of a site that is not. So how does a site get to be trusted? This has been the subject of much research. One of the more famous papers on the topic is that of Apostolos Gerasoulis and others at Rutgers University on applying link analysis to web search. This paper became the basis of the Teoma algorithm, which was later acquired by AskJeeves and became part of the Ask algorithm. What made this algorithm unique was its focus on evaluating links on the basis of their relevance. Google's original PageRank algorithm did not incorporate the notion of relevance. Google clearly does so today, but Teoma was first to offer a commercial implementation of the idea.

Teoma also introduced the notion of hubs, which are sites that link to most of the important sites relevant to a particular topic, and authorities, which are sites that are linked to by most of the sites relevant to a particular topic. The key concept is that each topic area a user can search on has authority sites specific to that topic. The authority sites for used cars are different from the authority sites for baseball. So if a publisher has a site about used cars, he should get links from web sites that the search engines consider an authority on used cars (or, more broadly, cars). However, the search engines don't disclose which sites they consider authoritative, which makes the publisher's job more difficult.
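The hubs-and-authorities model that Teoma commercialized traces back to Kleinberg's HITS algorithm, and a toy version fits in a few lines. This is only an illustrative sketch: the link graph and domain names below are invented, and real implementations operate over crawls of millions of pages.

```python
# Minimal sketch of the HITS hub/authority iteration.
def hits(graph, iterations=20):
    """graph: dict mapping each page to the set of pages it links to."""
    pages = set(graph) | {p for targets in graph.values() for p in targets}
    hub = {p: 1.0 for p in pages}
    auth = {p: 1.0 for p in pages}
    for _ in range(iterations):
        # A page is a good authority if good hubs link to it.
        auth = {p: sum(hub[q] for q in pages if p in graph.get(q, ())) for p in pages}
        # A page is a good hub if it links to good authorities.
        hub = {p: sum(auth[t] for t in graph.get(p, ())) for p in pages}
        # Normalize so the scores stay bounded across iterations.
        for scores in (auth, hub):
            norm = sum(v * v for v in scores.values()) ** 0.5 or 1.0
            for p in scores:
                scores[p] /= norm
    return hub, auth

# Hypothetical used-car link graph: two directory-style "hubs"
# pointing at overlapping sets of used-car sites.
graph = {
    "hub1.com": {"usedcars-a.com", "usedcars-b.com"},
    "hub2.com": {"usedcars-a.com", "usedcars-b.com", "usedcars-c.com"},
}
hub, auth = hits(graph)
```

Sites linked to by more (and better) hubs come out with higher authority scores, which is exactly the intuition described above.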
The EDU Myth

On a specific note, let's consider for a moment the "EDU myth." Many people believe that a link is better if it comes from an ".edu" (or a ".gov") domain. However, it does not make sense for search engines to look at it so simply. There are many forums, blogs, and other pages on .edu domains that are easily manipulated by spammers to gain links to their sites. For this reason, search engines cannot simply imbue a special level of trust or authority to a site just because it is on an .edu domain.

It is true that .edu domains are often authoritative, but this is a result of the link analysis that establishes a given college or university as an authority on one or more topics. The result is that there can be (and there are) domains that are authoritative on one or more topics in some sections of the site, while another section of the same site is actively being abused by spammers. Search engines deal with this problem by varying their assessment of a domain's authority across the domain. The publisher's http://yourdomain.com/usedcars section may be considered authoritative on the topic of used cars, but http://yourdomain.com/newcars might not be authoritative on the topic of new cars.

Ultimately, every site, and every page on every site, is evaluated for its links on a topic-by-topic basis. Further, each section and page of a site is also evaluated on this basis. A certain link profile gives a page more authority on a given topic, making that page likely to rank higher on queries for that topic, and also making the links that page gives to other sites on that topic more valuable.

A Deeper Look at Trust

One basic way to evaluate the "trust" of a web site is to review its link profile to see which authority sites link to it: more links from authority sites equal more trust. This concept can be refined by considering the authority of the specific page, on the authoritative site, that provides the link. This helps address situations where spammers contact major sites, such as university sites, and get a student to include a link to their site. Since the student page is not an important page, it would not pass much, or any, trust to the site receiving the link.

This simple approach will carry you a long way in thinking about improving the degree to which search engines trust your site. However, one of its limitations is that search engines are constrained to determining authority sites purely algorithmically, and clever spammers can always find ways to work around the limitations of an algorithm. In 2004, Yahoo and Stanford University published a paper titled Combating Web Spam with TrustRank (PDF). The basis of this paper was the notion of using manual human review to identify a small set of seed pages from sites deemed to be the most trusted/authoritative.
Using this approach removes the inherent risk of having an algorithm determine the authoritativeness of a site, with its potential for false positives and false negatives. The trust level of other sites, beyond the manually identified ones, is then evaluated based on how many clicks away they are from the seed sites. One click away earns a lot of trust, two clicks away a bit less, three clicks even less, and so forth. Of course, being one click away from multiple seed sites is the best of all worlds.

The researchers who wrote the TrustRank paper also authored an interesting paper on a concept they call spam mass (PDF). This paper focuses on evaluating the effect of spammy links on a site's (unadjusted) rankings. The greater the impact of those links, the more likely the site itself is spam. Similarly, if the search engine determines that a large percentage of a site's links are purchased, this could be problematic as well.

Expanding on this slightly, one can also think about the notion of "reverse TrustRank": if a site links out to spammy sites, its own trust should be lowered. This theory should provide ample motivation to take care not to link to any bad sites.

It is highly likely that Google, Yahoo, and Live Search all use some form of trust measurement to evaluate web sites, and that this trust metric can be a driving factor in rankings.
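The click-distance idea can be sketched in a few lines. Note the hedge: this is an illustrative simplification, not the actual TrustRank computation (which propagates trust iteratively with a damping factor over the full link graph), and the graph, seed site, and decay factor below are invented.

```python
from collections import deque

# Illustrative sketch: seed sites get full trust, and trust decays
# with each click (link hop) away from a seed.
def distance_trust(graph, seeds, decay=0.5):
    """graph: dict mapping a site to the sites it links to."""
    trust = {}
    queue = deque((seed, 1.0) for seed in seeds)
    while queue:
        site, score = queue.popleft()
        if score <= trust.get(site, 0.0):
            continue  # already reached via a stronger (shorter) path
        trust[site] = score
        for target in graph.get(site, ()):
            queue.append((target, score * decay))
    return trust

# Hypothetical chain: a.com is one click from the seed,
# b.com two clicks, c.com three.
graph = {
    "seed.edu": ["a.com"],
    "a.com": ["b.com"],
    "b.com": ["c.com"],
}
trust = distance_trust(graph, seeds=["seed.edu"])
```

Each additional hop halves the trust passed along, matching the "one click away a lot, two clicks a bit less" pattern described above.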

Structuring a Link Building Campaign
One of the most difficult parts of implementing a link building campaign is deciding what the strategy will be. There are many types of campaigns to choose from, and making those decisions can often be difficult. It is also a critical decision, as publishers will want and need to get the highest possible return on their investment in link building.

There are five major components to a link building campaign. These are:

1. Getting links from authoritative domains
2. Getting links from a volume of domains
3. Getting links to pages other than the home page (also known as "deep links")
4. Local links for local rankings
5. Getting the anchor text needed to drive rankings

Each of these components will be discussed in more detail.

1. Getting Links from Authoritative Domains

The highest authority sites in the publisher's space are the ones from which they should want links. A good way to start structuring a link building campaign is to identify the high authority sites in a given space and then determine what it will take to get a link from them. The first complication is that the search engines provide no direct measurement of a web site's authority. Only limited amounts of data are publicly available, including Google PageRank, as displayed on the Google Toolbar, and total backlinks to a site, as extracted from Yahoo! Site Explorer.

However, one really good way to start the process is to determine whether major government sites, major university sites, or major media sites are potentially authoritative for a given site's topic matter. For example, if the New York Times has a major section of its newspaper devoted to the publisher's topic (or something closely related to it), it is a good bet that it is considered authoritative on the topic. Taking this first step simply depends on the publisher using his knowledge of the market and its major players.

The next step, narrowing down the top authority sites at a more granular level, is harder. One way to approach this is to collect the most likely suspects, say 10 of them, and then see which of those 10 has the most links from the other nine. This can be done by checking one domain at a time, using this relatively simple search query at Yahoo:

linkdomain:potentialauthoritydomain1.com site:potentialauthoritydomain2.com

This command lets the publisher see, simply and quickly, whether potentialauthoritydomain2.com links to potentialauthoritydomain1.com. If potentialauthoritydomain1.com has links from the other nine prospective authority sites that were identified, it's a good bet that the search engines also see it as authoritative.

It should be noted that search engines probably also consider the number of different relevant domains that link to a site as a factor in determining its authority. Identifying how many different domains link to a site may not be that hard with the right tools, but determining the relevance of a large number of links is in fact quite a bit harder. Once this type of analysis is complete, there is still no way to be 100% sure that the identified sites will be seen by the search engines as authoritative. The publisher may identify 10 sites he thinks are authoritative, and perhaps only six of them are actually seen as authoritative by the search engines. But this is about the best a publisher can do in identifying authoritative target sites.

As outlined above, the next major step is for the publisher to figure out what it will take to get these types of sites to link to his site. Is it a particular piece of content, or a special tool? Is it sufficient to publish this great content or tool on the publisher's site, or should it be syndicated in some fashion to the prospective authority site? There are also direct and indirect approaches. Contacting a site directly and asking for a link is a direct method. Using PR or a social media campaign is an indirect method. Either can work as a strategy for getting a link from a given web site. As publishers look at each strategy in turn, there will be a variety of issues they must consider. These will be discussed later on in this guide.
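With more than a couple of candidate domains, the pairwise checks add up quickly (10 candidates means 90 queries). A short script can generate the full list of queries to work through by hand; the domain names below are the same placeholders used above.

```python
# Sketch: generate every pairwise "does source link to target?" query
# in the Yahoo! linkdomain/site syntax described above.
candidates = [
    "potentialauthoritydomain1.com",
    "potentialauthoritydomain2.com",
    "potentialauthoritydomain3.com",
]

queries = [
    f"linkdomain:{target} site:{source}"
    for target in candidates
    for source in candidates
    if source != target
]

for q in queries:
    print(q)
```

For n candidate domains this produces n × (n − 1) queries, one per ordered pair.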
2. Getting Links from a Volume of Domains

It is also extremely valuable to get links from a large number of domains. 100 links from one domain are not nearly as valuable as one link each from 100 domains (assuming the pages involved are otherwise equal). The reason for this is that search engines want to count a link to a site as an endorsement by that site, and they want that endorsement to be an editorial decision by an informed person. 100 links from one domain require only one editorial decision, but one link from each of 100 domains most likely involves 100 editorial decisions. The bottom line is that it's important to get links from many domains, which complements the process of pursuing authoritative links.

As with authoritative links, there are many different strategies one can use to get links in volume. Both direct and indirect means can be used, although direct means will require significant effort over a sustained period of time.

3. Getting Links to Pages Other Than the Home Page (Deep Links)

It is also very useful to get links to pages in a site other than its home page. This is particularly true for larger sites, but it also applies to sites of any size. Publishers should look to get links to each major section of their site. Each page on a web site has the opportunity to rank for different search terms, and supporting those pages with direct links will help them rank better. In addition, these direct links to lower level pages create anchor text opportunities for those pages as well.

In principle, each major theme or topic of the site should be the subject of its own link campaign. It is also useful to get links to sub-pages of each theme. The publisher may not want to launch a focused campaign for each such sub-page, but if the campaign is structured so that most links go to the highest level page of each theme, with some links going to sub-pages, that is a very effective use of link building resources.

4. Local Links for Local Rankings

A publisher with a page related to real estate in Seattle may want that page to rank for search terms such as "Seattle Real Estate". In addition to doing on-page optimization and general link building, the publisher should also look to obtain links from other local businesses. For example, a link from the local chamber of commerce is likely to be helpful in ranking for local search terms. There are many other similar sources, such as local business directories, the local chapter of the Better Business Bureau, local libraries, local businesses, and local clubs. Getting links from these sources reinforces the local theme of the publisher's site or web page. This is equally important, or perhaps even more important, in international settings. If the publisher wants traffic from Google UK, he should plan on getting some links from other sites in the UK.

5. Getting the Anchor Text Needed to Drive Rankings

Anchor text has been rated by leading SEOs as one of the most powerful factors in search rankings. Search engine representatives have also acknowledged that anchor text is a significant ranking factor.
Search engines use anchor text as further evidence of what the page receiving the link is about. Since anchor text is assumed to be chosen by the person giving the link, it is a potentially very powerful factor. If a site has a page selling used Ford cars, and that page has a lot of links pointing to it that use the anchor text "Used Ford Cars," the anchor text is likely to help the page's rankings for related search terms.

Also of note is that most web sites do not need to do anything at all to rank for their company or web site name. The search engines have a very high success rate in delivering the correct result in the top spot for a branded search. It is therefore a bit of a waste to obtain links whose anchor text is the brand name of the company or web site, yet this is one of the most common anchor texts used by sites that link to a publisher's site naturally (although it should be noted that these links still have value).

This is one reason why many SEOs choose to purchase links. When a publisher purchases a link, they normally specify the exact anchor text they want; after all, they are buying an ad, and the ad should say what they want. The existence of this market then

becomes a reason for search engines to place less weight on anchor text, as it makes the signal less reliable for them to use in ranking a site. As with links themselves, the assumption with anchor text is that it is editorially given. It is also possible to implement "white hat" link building campaigns and get the anchor text that you want. The techniques in the referenced article can be quite effective and do not come with the risk of being seen as spam by the search engines.

Other Considerations

There are other factors that one must consider in structuring a link building campaign. These are:

1. "Looking" Natural

Many SEOs talk about the need for a web site's links to "look natural." What they mean by this is that search engines shouldn't perceive the links to be artificially manipulated. For example, if a publisher has links from 10 different domains, every one of them is a site-wide link, and they all use very similar anchor text that is not the name of the web site, it's a good bet that those links are the result of manipulative behavior by the publisher. To dig into this a bit more deeply, most sites are linked to using one of three forms of anchor text:
• The site name
• The site URL
• “Click here” or a similar derivative
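One quick way to keep an eye on this is to tally the anchor text of known inbound links (for example, from a backlink export) and look at the distribution. The snippet below is a minimal sketch; the anchor texts and site name are invented for illustration.

```python
from collections import Counter

# Tally what share of inbound links use each anchor text. A profile
# dominated by one keyword-rich phrase, rather than the natural forms
# listed above (site name, URL, "click here"), is the pattern to avoid.
anchors = [
    "Acme Widgets",              # site name
    "http://acmewidgets.com",    # site URL
    "click here",
    "cheap blue widgets",        # keyword-rich anchor text
    "cheap blue widgets",
    "cheap blue widgets",
]

counts = Counter(anchors)
total = len(anchors)
for text, n in counts.most_common():
    print(f"{text}: {n / total:.0%}")
```

Here half the links share one keyword-rich phrase, which is exactly the skew a publisher would want to investigate.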

This is just a fact of life for links that are naturally given. For that reason, a publisher who gets a very large percentage of links to his site with the exact same keyword-rich anchor text is asking for trouble. This is largely a consideration only for publishers who engage in manipulative schemes to obtain their links. However, publishers not engaging in such practices also have reason to be aware of the issue and understand how it affects their link building efforts.

2. Balance

Search engines look at many different types of signals when ranking sites. As a component of reducing the influence of paid links, one such factor is the variance in the sources of the links and in the anchor text. Publishers that have achieved greater balance in their link profile could potentially be ranked higher than other publishers. This is one key reason for implementing multiple link building strategies at the same time, including approaching different types of target sites. For example, a publisher may choose to concentrate resources on requesting links directly from major universities while also implementing a significant PR effort, or perhaps a campaign

simultaneously targeted at bloggers and traditional media. These types of strategies have the dual benefit of appearing more natural and providing a broader balance to the link profile of the site.

3. Permanence

Another important point to realize is that link building should never stop. An effective link building campaign is a competitive advantage for a publisher. Achieving initial success with a link building campaign (by getting the desired rankings in the search engines) and then stopping only hands the publisher's competition a competitive advantage.

4. Anti-Spam Measures by Search Engines

Search engines have one overriding goal – improving the quality of their index (which leads to increased market share and increased revenues). This is the entire reason for Google's policies regarding link buying for PageRank purposes and other similar techniques. Site owners cannot overlook this when deciding on their link building strategies. Companies that make billions of dollars in revenue every year have a lot of money to invest in protecting their business, so the budget that search engines have for fighting spam is considerable.

That said, public companies such as Google look for scalable structures to their business. Their clear preference is to improve their algorithm to fight spam. It is difficult for them to field large teams of human reviewers to look at one site after another to determine whether it is gaining higher rankings for reasons that conflict with their guidelines (though note that Google does have a team of reviewers based in India that does just that). This is what leads some to say that Google and spammers are engaged in an arms race. Yet even panelists known for black hat expertise stated clearly, in the Black Hat, White Hat panel at SES San Jose 2008, that they use what they consider white hat practices with their clients, and only experiment with black hat techniques on sites they consider throwaway domains.

5. Temporal Factors

Search engines also keep detailed data on when they discover the existence of a new link, or the disappearance of one. They can perform quite a bit of interesting analysis with this type of data. Here are some examples:

• When did the link first appear? This is particularly interesting when considered in relationship to the appearance of other links. Did it happen immediately after you received that link from the New York Times?

• When did the link disappear? Some of this is quite routine, such as links in blog posts that start on the home page of a blog and then get relegated to archive pages over time. However, perhaps it disappeared right after you rolled out a major new section on your site, which could be an entirely different type of signal.

• How long has the link existed? A search engine can potentially count a link for more, or for less, based on how long it has been around. Which way it leans could depend on the authority/trust of the site providing the link, or on other factors.

• Rate of adding links. Did you go from one new link per week to 100 per day? Or vice versa? Such drastic changes in the rate of link acquisition can also be a significant sign.
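As a rough illustration of that last point, a publisher can monitor his own link velocity by bucketing the first-seen date of each new link by week and flagging sharp jumps. The dates and the 3x spike threshold below are invented for illustration.

```python
from collections import Counter
from datetime import date

# Hypothetical first-seen dates for newly discovered links.
first_seen = [
    date(2009, 3, 2), date(2009, 3, 9),     # roughly one link per week
    date(2009, 3, 16), date(2009, 3, 16),
    date(2009, 3, 16), date(2009, 3, 17),   # then a sudden burst
]

# Bucket by (ISO year, ISO week).
per_week = Counter(d.isocalendar()[:2] for d in first_seen)

# Flag any week where acquisition at least triples versus the prior week.
weeks = sorted(per_week)
for prev, cur in zip(weeks, weeks[1:]):
    if per_week[cur] >= 3 * per_week[prev]:
        print(f"Spike: {per_week[prev]} -> {per_week[cur]} links in week {cur}")
```

The same bucketing works in reverse for spotting a sudden drop-off in link acquisition.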

Determining a Link's Value
Putting together a link campaign generally starts with researching sites that would potentially link to the publisher's site, and then determining the relative value of each potential linker. There are several basic factors that go into this, including:

1. The PageRank of the site.

2. The perceived authority of the site. While there is a relationship between authority and PageRank, they do not have a 1-to-1 relationship. Authority relates to how the sites in a given market space are linked to, whereas PageRank measures aggregate raw link value without regard to the market space. So higher authority sites will tend to have higher PageRank, but this is not absolutely the case.

3. The PageRank of the linking page.

4. The perceived authority of the linking page.

5. The number of outbound links on the linking page. This is important because the linking page can vote its PageRank, but each page it links to consumes a portion of that PageRank, leaving less to be passed on to other pages. This can be expressed mathematically:

For a page with passable PageRank n and r outbound links:

Passed PageRank = n / r

Passable PageRank is as defined in the SEOmoz article on PageRank. This is a rough formula, but the bottom line is that the more outbound links a page has, the less valuable a link from that page is.

6. The relevance of the linking page and site.

Organizing this data in a spreadsheet, or at least being consciously aware of these factors, is a must when putting together a link building campaign. One tool that can help with this task is the SEOmoz Page Strength Tool.

In this paper, we are going to assume that the sites get divided into four possible categories:

1. Low Ranking Sites
2. Medium Ranking Sites
3. High Ranking Sites
4. Very High Ranking Sites

The level of effort put into obtaining links from each type of site will be discussed further below. We will also discuss more tools that can be used for evaluating a link's value.
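The n/r rule of thumb above can be illustrated with a few lines of code. The PageRank values here are hypothetical; real passable PageRank is not publicly visible, so this only shows the shape of the approximation.

```python
# Toy illustration of the Passed PageRank = n / r approximation.
def passed_pagerank(passable_pagerank, outbound_links):
    """Rough per-link value a page passes, per the n/r rule of thumb."""
    if outbound_links == 0:
        return 0.0  # a page with no outbound links passes nothing
    return passable_pagerank / outbound_links

# A page with passable PageRank 6 and 3 outbound links passes 2.0 per link;
# the same page with 12 outbound links passes only 0.5 per link.
print(passed_pagerank(6.0, 3))    # 2.0
print(passed_pagerank(6.0, 12))   # 0.5
```

The takeaway matches the text: holding everything else equal, each additional outbound link on a page dilutes the value it passes through any single link.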

The Role of Content in Building Links
In natural link building, the publisher must provide compelling content. Publishers need a good reason to link to another site; it is not something they do frivolously. Superior content or tools are the key to obtaining such links. Aggressive publishers can even let their content strategy be guided by their link building strategy. This is not to say that they should change their business itself for the sake of link building. Normally, however, there are many different types of content a site could produce. The concept is simply to identify the link building targets, determine what content gives the best chance of getting links from those targets, and then tweak the content plan accordingly.

Content is also at the heart of achieving link building nirvana – having a site so good that people discover it and link to it without any effort on the publisher's part. This can be done, but it requires that the publisher create content that truly stands out for the topics the site covers.

Another aspect of content is the decision of where to place it. There are two major alternatives for content placement:

1. On Site

Of course, the content may be placed on the publisher's site. Within the publisher's site, there are other decisions to be made, such as whether the content goes in a special section or is integrated throughout the site. As an example of how this decision may be made, an e-tail site that publishes a large catalog of products may not want all (or some of) its catalog pages laden with a lot of article content. A site like this might build a separate section with all kinds of tips, tricks, and advice related to the products it sells. On the other hand, an advertising-supported site might want to integrate the content throughout the main body of the site.

2. Off Site

A publisher may also choose to place the content on another site. One reason for this would be to provide the content to another site in return for a link. This can be quite an effective link building strategy. Some publishers also host their blogs on other sites, simply because that is where the blog publishing platform resides, and then integrate them into the main site through the cross-linking structure. Note that this is not a strategy we would recommend, but that's the subject of another paper.

Section II: Professional Link Building Strategies
Asking for a Link
Sometimes simple is best. If the site looking for links is a high quality site with unique and/or authoritative content, the publisher may simply need to tell the world about what they have. They can do this via a PR strategy (which we will discuss later) or by simply emailing other publishers.

If there is already a relationship between the publisher requesting the link and the publisher being asked to provide it, this process is pretty easy. The requesting site sends the other party a note with whatever pitch they want to make. This pitch is easy to personalize, and the nature of what is said is likely to be guided by the existing relationship.

When a publisher is in the process of establishing a business relationship with another publisher, it is also relatively easy. Assuming that they are going to sign a contract with this publisher, all that needs to be done is to make linking to the site a term of the agreement. As an extension of this, if the Potential Linking Site is being granted the right to distribute or sell the products of the Link Destination Site, the Link Destination Site can simply require the Potential Linking Site to provide them with a link. Publishers that have a standard contract they use with many partners can derive substantial benefit from adding a link clause to that agreement. Such links still represent endorsements, even though they are contractually required, because no one forced the other party to sign the contract, and they would not be interested in reselling or distributing the products or services of the Link Destination Site unless they were prepared to endorse them.

However, if the Requester does not know the Requestee, it is a very different ballgame. Deciding how to contact the Potential Linking Site, including how much effort to put in, can be summarized with the following table:

Low Ranking Sites
Targets may result from large scale research. Contact is by email and is personalized, but likely in an automated way. These types of sites are relied on to obtain a volume of links. No customized content is developed.

Medium Ranking Sites
Targets may result either from large scale research or from the industry knowledge of the publishers of the Link Destination Site. Contact is by email and is personalized, likely done by a human, but with only low to moderate levels of effort. No customized content is developed.

High Ranking Sites
Targets identified by principals of the business or senior marketing people. Email contact is entirely custom and tailored to the Requestee. Phone calls may also be used in pursuit of these links. Content may be developed just to support the campaign to get links from these types of sites.

Very High Ranking Sites
Targets identified by principals of the business or senior marketing people. Email contact is entirely custom and tailored to the Requestee. Phone calls may also be used in pursuit of these links. Face to face visits may also be involved. Content may be developed just to support the campaign to get links from these types of sites.

Basic Email Pitch

Assuming that the Requester does not know the Requestee, there are a few simple guidelines that should go into the link pitch:

1. Keep it simple and short. The Requestee is receiving an unsolicited email. They are not going to read a 2 page email, or even a 1 page email.
2. Clearly articulate the request. It is an investment to get someone to read an email, and it is critical that the pitch be clear about the desired result.
3. Clearly articulate why the Link Destination Site deserves a link. This, generally speaking, involves pointing at the great content or tools on the site, and perhaps citing some major endorsements.
4. Follow each and every guideline of the CAN-SPAM Act. Unsolicited emails are not illegal as long as they follow the guidelines of the Act. Don't even think about violating them: CAN-SPAM violations can result in substantial fines, and aggravated violations can even lead to jail time.

Creating a Value Proposition for Direct Requests

One of the key points is how the publisher can make their pitch interesting enough for the Potential Linking Site. As noted above, this starts with understanding what content or tools the publisher has on their site that would be of interest to them. (For purposes of clarity: sites do not link to other sites for the purpose of helping those sites make money. They link because they perceive that their users might value the content or tools on the site.)

This relates to the fundamental structure of the web, which is designed around the notion of interlinking related documents. Positioning a site, or a section of a site, as being related to the Potential Linking Site is a requirement of each link building request.

With High Ranking Sites and Very High Ranking Sites, it may be worth spending quite a bit of time and energy on putting together this value proposition. In many cases, it is even worth doing custom content development to increase the perceived value and relevance of the content to the Potential Linking Site.
In all cases, developing a link request value proposition begins with understanding the nature of the Potential Linking Site's content, and then deciding how to match up the content of the Requester's site with it.

Requests Via Social Media Applications

It is also possible to reach out to people through social networks, such as Facebook,

LinkedIn, and Twitter. This is similar to emailing people, with a few important distinctions:

1. Publishers can send out communications to their friends on those networks. Assuming that they have treated this designation (that of friend) with any level of seriousness, instead of "friending" everybody in sight, the communication can be a bit more informal.
2. Publishers can also join groups on these networks related to their market space, and then send messages out to those groups. These groups will provide them with the ability to reach new people with related interests.
3. Messages broadcast through these networks cannot be personalized, so a more general message needs to be tailored for these types of broadcasts.
4. These are social networks. Beware of broadcasting too many messages or poorly targeted messages. Many publishers have made this mistake, become pariahs in these communities, and lost the leverage that these communities bring in the process.
5. Personalized messages can be sent on a 1 to 1 basis as well. In fact, one strategy for approaching a High Ranking Site or a Very High Ranking Site is to make initial contact by friending someone senior who works at the company that publishes such a Potential Linking Site. Then the publisher can develop the relationship with that senior person without asking anything of them. Strategies for doing this involve learning as much about the person as possible and interacting around those interests, and possibly helping them with a thing or two. Once the relationship is established, a more informal approach can be used to introduce the great content on the publisher's site.

Creative Link Request Tactics

Creative link requests can come in a few forms:

1. Make the link request stand out through creative presentation. This requirement might make a publisher think of dancing girls, for example, but this might not work for a woman who might prefer dancing men, and in any event is a bit over the top.
Publishers that pursue the notion of creative presentation need to be careful to not make it look like a commercial. It’s more important to find a creative way to make the point stand out. Keep in mind that the person receiving the email did not wake up this morning dreaming about which sites they were going to link to today. 2. Offer something of unique value. For example, one way to do this is to write a great article and offer it in return for an attribution link (aka Content Syndication). We will talk about this particular tactic and other creative tactics later in this document.

3. Lay the groundwork first. The default approach is to email someone the publisher doesn't know with a request they don't expect, for a task that they really did not have on their list for today, or for that matter, this month. Sounds compelling, doesn't it? Let's provide a few examples to illustrate some alternatives:

o The publisher is a recognized authority in their industry. They speak at industry conferences and write in major periodicals related to their market space. There may not be a personal relationship here, but the email recipient will probably know of the publisher, and this helps with getting links.
o The publisher conducts a training session on a related topic on the premises of the Potential Linking Site.
o The publisher builds a relationship with an important person from the Potential Linking Site. This can be done through meeting at industry conferences, networking through social media, getting an introduction from a friend, and many other methods. This tactic can unfold over many months, with the relationship being developed with no mention of getting a link. Then, after the relationship is fully developed, the publisher highlights some piece of content they have developed.

This may seem like an extensive amount of work, but it might be worth it to get a link on this PageRank 9 site (Nasa.gov):

Or this PageRank 10 site from a small search engine called Google:

In all of these strategies, the key is to develop trust before making the request. In fact, ideally, the publisher never makes the request at all, but the context of the relationship causes the link to be given. It is far easier to get a link when there is trust in place than when there is not.

Directories
Directories can be a great way to obtain links. There are a large number of directories out there, and they may or may not require money in order to obtain a listing. Examples of high quality directories that are free include:
• Librarian's Internet Index
• Open Directory Project (aka DMOZ)

Examples of high quality directories which require a fee are:

• Yahoo! Directory
• Business.com
• Best of the Web

A more comprehensive list of directories is available from Strongest Links.

What Search Engines Want From Directories

The essential factors that the search engines look for are:

1. The fee is paid for an editorial review, not for a link.
2. Editors may at their whim change the location, title, and description of the listing.
3. Editors may reject the listing altogether.
4. Regardless of the outcome, the directory keeps the money (even if the publisher doesn't get a listing).
5. The directory has a track record of rejecting submissions. The inverse of this, which is more measurable, is that the quality of the sites listed in the directory is high.

The following is an extract from my blog post on The Role of Directories in Link Building:

Ultimately, "Anything for a buck" directories do not enforce editorial judgment, and therefore the listings do not convey value to the search engines. To take a closer look at this, let's examine some of the key statements from Yahoo's Directory Submission Terms:

“For web sites that do not feature adult content or services, the Yahoo! Directory Submit service costs US$299 (nonrefundable) for each Directory listing that is submitted. I understand that there is no guarantee my site will be added to the Yahoo! Directory. I understand that Yahoo! reserves the right to edit my suggestion and category placement; movement or removal of my site will be done at Yahoo!'s sole discretion.”

Classifying Directories

We can divide directories into 3 buckets:

1. Directories That Provide Sustainable Links. These are directories that comply with the policies as outlined above. Most likely, these links will continue to pass link juice for the foreseeable future.
2. Directories That Pass Link Juice That May Not Be Sustainable. These are directories that don't comply with the policies as outlined above.
The reason such directories exist is that search engines tend to use an “innocent until proven guilty” approach, so the search engine must proactively make a determination of guilt before a directory’s ability to pass link juice is turned off. Even so, link juice from these types of directories is probably not going to be passed in the long term.

3. Directories That Do Not Pass Link Juice. These are the directories that have already been flagged by the search engines. They do not pass any value. In fact, submission to a large number of them could be seen as a spam signal, although it is unlikely that any action would be taken on this signal alone.

Detecting Directories That Pass Link Juice

The process is relatively simple for directories that pass sustainable links, as defined above. The steps are:

• Investigate their editorial policies and see if they conform to what search engines want.
• Investigate their track record. Do they enforce their policy for real? This may be a bit subjective, but if there are lots of junky links in their directory, chances are that the policy is just lip service. As another check, search on the directory name and see if there is any SEO scuttlebutt about the directory as well.

The process is a bit harder for directories that do not conform to the policies search engines prefer. There are still some things the publisher can do:

• Search on the name of the directory to see if it shows up in the search engine results. If not, definitely stay away from it.
• Take a unique phrase from the home page of the directory and see if that shows in the search engine results. If not, stay away from it.
• Do they have premium sponsorships for higher level listings? A sure signal to search engines about their editorial policies.
• Do they promote search engine value instead of traffic? Another bad signal.
• Evaluate their inbound links. If they are engaged in shady link building tactics, it's a good idea to stay away from them.
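The vetting checklist above can be codified as a simple screening function. The sketch below is illustrative only: the parameter names are invented, and each signal still has to be determined manually through the searches and inspections just described.

```python
# Sketch codifying the manual directory-vetting checklist above.
# Each signal must still be gathered by hand (searching on the name,
# checking indexation, inspecting listings and inbound links); the
# parameter names here are invented for illustration.

def vet_directory(name_in_serps, homepage_text_in_serps,
                  sells_premium_listings, promotes_seo_value,
                  has_shady_inbound_links):
    """Return a list of red flags; an empty list means no warning signs."""
    flags = []
    if not name_in_serps:
        flags.append("directory name absent from search results")
    if not homepage_text_in_serps:
        flags.append("home page text not indexed")
    if sells_premium_listings:
        flags.append("sells premium sponsorships for higher level listings")
    if promotes_seo_value:
        flags.append("promotes search engine value instead of traffic")
    if has_shady_inbound_links:
        flags.append("shady inbound link profile")
    return flags

# A directory that is indexed, but sells premium spots and pitches SEO value:
print(vet_directory(True, True, True, True, False))
```

An empty result does not guarantee the directory passes link juice; it only means none of the obvious disqualifiers were found.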

A good reference article for detecting bad directories is Rand’s article on what makes a good web directory.

Content-Based Link Building Strategies
As outlined before, Potential Linking Sites do not link out to other sites just to help those other sites make money. In the eyes of the Requestee, there needs to be a purpose for giving out the link. Of course, paid links are one way of doing that, but we will discuss that later. For now, let's review eight major ways to use content to build links:

1. Creating a Rich Resource Site

The principle here is simple. Create one of the best resources in a given market space and people will link to it. In an ideal world, the publisher can reach Link Building Nirvana, where the links come to them without their doing anything. However, this is a difficult state to reach.

Ultimately, even if the publisher must take on the burden of promoting their content to get links, the key for the publisher is to establish their site as an expert in its field by publishing truly authoritative content (not authority in the search engine sense discussed previously, but recognized by people in their market as authoritative). This is a serious undertaking in any field, but it is extremely effective, especially if the field is already established.

One way to make the strategy a bit easier is to focus on one particular vertical aspect of the field. For example, if the general market space is “widgets,” there may be an opportunity to become an expert on a vertical aspect, such as “left handed widgets.” A specific area of vertical expertise such as this fictitious example is a very, very effective way to establish a new presence in a crowded market. Once this type of resource has been created, simply “Asking for a Link” as outlined above will often be a very effective strategy.

2. Rifle Shot Content

A related strategy is that of producing a single killer article (or a small set of killer articles). This by itself can divide into two scenarios:

1. Content that is meant to attract the attention of a specific Potential Linking Site. This may come about because someone at the Potential Linking Site lets a need be known to the public. An example of this is the 2007 Web Analytics Shootout report published by Stone Temple Consulting, which was spawned by a blog post by Rand on SEOmoz titled Free Link Bait Idea. Rand identified a need, the need was answered, and links resulted.
2. A single authoritative article that is of enough interest to the community at large that it attracts a lot of attention.

3. Content Syndication

The previous two content based strategies were based on the notion that the content developed would be placed on the publisher's site. This does not have to be the case. It is entirely possible to develop content with the intent of publishing it on someone else's site. In return for providing the article, the author gets a link back to their site. It is also often possible to get targeted anchor text in this scenario. Watch the boundaries here, though, because if the anchor text is unrelated to the article itself, it will not comply with what the search engines want publishers to do for link building techniques.

There are a couple of important points to watch for when syndicating content:

1. The publisher should not distribute articles that are published in the same form on the publisher's own site. Search engines will see this as duplicate content. While the search engines' goal is to recognize the original author of a piece of content, that is a difficult job to do perfectly, and the search engines do sometimes make mistakes. When looking to distribute content published on a site, the best practice is to rewrite the article, add some things to make it a bit different in structure and in its material points, and syndicate that version of the article. Then, if the site publishing the syndicated article ranks highly for key search terms, it is not a problem for the author's site. If a publisher does choose to take an article from their site and distribute it in an unmodified form, they should make sure that the site publishing the article includes a link back to the original article on the publisher's site. This increases the likelihood that the search engine will handle the duplicate content situation correctly by recognizing the original publisher of the article.

2. When considering the syndication of content to a High Ranking Site or a Very High Ranking Site, it makes sense to study the content needs of the site and custom tailor the content to those needs. This practice maximizes the chances of the acceptance of the article by the Potential Linking Site.

One variant of content syndication is to generate articles and then submit them to "article directories." There are many of these types of sites, and frankly some of them are pretty trashy. But there are good ones as well. The distinction between the trashy and good ones is relatively easy to recognize, based on the quality of the articles that they have published. Many of these article directories allow publishers to include links, and these links are followed.
Here are some worthy of consideration:

Directory Site                     PR  Specialty
www.buzzle.com/                    6   General
www.ezinearticles.com              6   General
www.work.com                       6   Business
www.articleblast.com/              5   General
www.articlecity.com/               5   General
www.articlesbase.com               5   General
www.mastersmba.com                 5   General
www.trans4mind.com/                5   Self-Help
www.pandecta.com                   4   E-Business
www.promotenews.com/               4   Web Marketing
www.a1articles.com/                4   General
www.articleclick.com               5   General
www.articlesphere.com              3   General
www.bestmanagementarticles.com     4   Business
www.buildyourownbusiness.biz       4   Business
www.ibizresources.com/             4   Business
www.isnare.com/                    4   General
www.marketingseek.com              4   General
www.redsofts.com/articles/         4   General
www.sbinformer.com/                4   Small Business
www.valuablecontent.com            4   General

Note that even if these do appear to pass link juice, some of them may be disabled on the back end by Google, and there is no way to tell which ones. In general, though, Google is not going to punish a publisher for putting an article in these directories, and many of them may provide helpful links.

4. Social Media Campaigns

Social media sites such as Digg, Reddit, StumbleUpon, Delicious, and others can play a big role in a link building campaign. Becoming “popular” on these sites can bring in a tremendous amount of traffic and links. While social news sites like Digg bring lots of traffic, this traffic is usually of low quality and will have a very low revenue impact on the site receiving it. The real ball game is to get the links that result.

For example, stories that make it to the home page of Digg can receive tens of thousands of visitors and hundreds of links. While many of these links are transient in nature, a significant number of high quality links also result. Better still, articles about topics that also happen to relate to very competitive keywords can end up ranking very well, quite quickly, for those very competitive keywords. The key insight into how to make that happen is to use the competitive keyword in the title of the article itself and in the title of the article submission. These are the two most common elements grabbed by people linking to such articles when selecting the anchor text they use. Similar strategies can work well with other social news sites like Mixx and Propeller.

Sites like Delicious and StumbleUpon are different in structure. Delicious is a tagging (or bookmarking) site used to mark pages on the web that users want to be able to find easily later. StumbleUpon shares some similarities with Delicious, but also offers a content discovery aspect to its service. Both of these sites have “popular” pages for content that is currently hot on their sites.
Getting on those pages can bring lots of traffic to a site. This traffic is of higher quality than the traffic from social news sites. Users coming to a site via tagging sites are genuinely interested in the topic in a deeper way than users whose eye was caught by a snappy article title on a social news site. So while the traffic may be quite a bit lower than from the social news sites, publishers can also earn some quality links in the process. Tagging sites are best used in attempting to reach and develop relationships with major influencers in a market space. Some of these

may link to the publisher, and these are potentially significant links.

Social media sites also allow publishers to create their own profiles. These profiles can, and should, include links back to the publisher's site. Learn more about social media sites here.

5. Getting Links From Social Media Profiles

Some social media sites, such as LinkedIn, allow you to link back to your own sites in your personal profile, and these links are not nofollowed (meaning they pass link juice). Leveraging this can be a great tactic, as it is simple and immediate. In the case of LinkedIn, the process takes a few steps:
• Login to LinkedIn.
• Click "Account & Settings" in the top right corner of the LinkedIn screen.
• Click on "My Profile" to edit it.
• Click "Websites" to edit your websites. This will present you with the ability to edit your additional information (click "edit").
• Add a web site listing. Start by taking the box that says "Choose" and selecting "other." This is what will allow you to specify keyword rich anchor text.
• Then enter in the correct URL and click "Save Changes."
• Next we have to make your websites visible to the public (and the search engines). On the upper right of your screen you will see a link titled "Edit Public Profile Settings." Click on it.
• On the next screen, under Public Profile, make sure that "websites" is checked. This is what tells LinkedIn to display that publicly.
• Click "Save Changes."

Note that the above process was what was required as of early 2009, but the specifics may evolve over time as LinkedIn releases updates.

10 other social media sites that do not nofollow links in their public profiles are:

1. Flickr
2. Digg
3. Propeller
4. Technorati
5. MyBlogLog
6. BloggingZoom
7. Current
8. Kirtsy

9. PostOnFire
10. CoRank

6. Blogging for Links

Blogging can also be quite effective in link development. How effective a blog will be is highly dependent on its content, the market space, and how it is promoted by the publisher. The first thing to realize when starting a blog is that it is a serious commitment. No blog will succeed if it does not publish content on a regular basis. How frequently a blog needs to publish depends on the market space. For some blogs, one post a week is enough. For others, it really needs to be 2-3 times per week, or even more.

Blogging is very much about reputation building as well. Quality content and/or very novel content is a key to success. However, when that first blog post goes up, the blog will not yet be well known, will not likely have a lot of readers, and those that do come by are less likely to link to a little known blog. In short, starting a new blog for the purpose of obtaining links is a process that can take a long time. But it can be a very, very effective tool for link building. It's just that patience and persistence are required.

Here are a few key things to think about when blogging for links:

1. One of the best places to get links to a blog is from other blogs. This is best done by targeting relationships with major bloggers and earning their trust and respect.
2. Be patient when developing relationships with other blogs. Trust and respect don't come overnight, and they certainly don't result from starting the relationship with a request for a link.
3. The publisher should target a portion of their content at the interests of the major bloggers. Over time, this process should turn into links to the publisher's blog from the major bloggers.
4. Other less well-known bloggers will begin to see links on the major blogs and will begin to follow suit.

It is also important to leverage the social nature of the blog. Publishers should try to provide a personalized response to every person who comments on their blog.
One effective way to do this is to send each and every one of them a personalized response by direct email that shows that the comment was read. This helps deepen the interest of the commenter, creates a feeling of personal connection, and increases the chance that the commenter will return and possibly add more comments. Nurturing the dialog on a blog in this fashion helps that dialog grow faster.

7. Widgets

Widgets are also a way of syndicating content to third party sites. They have the

advantage of packaging the content so that it's not seen as duplicate content. This is because they are usually implemented in JavaScript, and the web page publishing the widget calls back to a remote server to fetch the content. The result is that a search engine crawler does not see the content. This also means that any links embedded within a widget are invisible to the search engine. However, it is possible to implement a widget in such a way that it has an HTML wrapper around it with a simple HTML text link in it, which is quite visible to the crawler.

Popular widgets can get adopted by a large number of web sites and can result in a large number of links. Widget campaigns can also result in links to deep pages on the site.

A word of caution is merited. Widgets used for link building should be closely related to the content of the page receiving the link. An example of a campaign that took a different approach is discussed in this post: Another Paid Links Service Disguised As Hit Counter. Four days after this post went up, the sites referenced lost all of their high rankings in Google, and therefore lost most of their traffic. The main reason for this is that the links given with the hit counter were in fact hidden in the <noscript> portion of the hit counter.

Be aware that making the link visible is not enough to make this practice legitimate in the search engines' eyes. Google has confirmed that they consider tactics like the use of unrelated widgets for link building, such as a hit counter, a no-no, even if the link is visible. The underlying reason for this is that if the link is unrelated to the widget, it is pretty unlikely that the link given actually represents an endorsement of the web page receiving the link. The goal of the person installing the widget is to get the benefit of the contents of the widget.
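The HTML wrapper technique described above can be sketched in code. This is only an illustration with invented URLs and names: the embed snippet a publisher hands out contains the JavaScript call that fetches the widget content, plus a plain, crawler-visible text link that describes that content.

```python
# Illustrative sketch (invented URLs and names): build a widget embed
# snippet whose outer HTML wrapper contains a plain text link that search
# engine crawlers can see, while the widget content itself is fetched by
# JavaScript and remains invisible to crawlers.

def build_widget_embed(widget_js_url, link_url, link_text):
    """Return the HTML snippet a publisher would hand to sites adopting
    the widget: the script tag plus a related, visible text link."""
    return (
        '<div class="example-widget">\n'
        '  <script type="text/javascript" src="%s"></script>\n'
        '  <p><a href="%s">%s</a></p>\n'
        '</div>'
    ) % (widget_js_url, link_url, link_text)

print(build_widget_embed(
    "http://widgets.example.com/surf-report.js",
    "http://www.example.com/surf-reports",
    "Daily Surf Reports",
))
```

Note that the visible link here describes the widget's own content, in keeping with the relatedness requirement discussed above; hiding extra links in a <noscript> block is exactly what got the hit counter service penalized.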
On the other hand, using this same strategy where there is a close relationship between the widget and the link given is far more likely to represent a real endorsement, particularly if content from the page receiving the link is what is included in the widget.

8. Discount Programs

The idea here is for the publisher to offer visitors from certain web sites a discount on the publisher's product or a related special offer. The publisher contacts the sites in their industry and tells them that they will give 20% off to all visitors that arrive from their sites. All the other site needs to do is to link back to the publisher's site. This may appeal to the sites, as it allows them to offer real value to their visitors.

However, this is seen by Google as a questionable practice. While it would seem that the discount offer would only be interesting to the third party site if they valued the product (or think their visitors will), Matt Cutts clearly indicated that Google did not want to value such links in this interview.

Back Linking for Links
Back linking, or seeing who links to whom, is one of the oldest methods of putting together a link campaign. If someone links to one publisher’s competitor, there is a

decent chance that they might be willing to link to the publisher themselves. The process starts by identifying sites related to the market space of the publisher, including directly competitive sites and non-competitive authoritative sites. Then, using tools such as the linkdomain: command in Yahoo, you obtain a list of the sites linking to those sites. The precise form of the command is:

linkdomain:targetdomain.com -site:targetdomain.com

The purpose of including "-site:targetdomain.com" at the end is to exclude pages from the target site itself, so that internal links are filtered out of the results. However, this listing from Yahoo is not easily downloaded unless you use a third party tool such as SEOmoz's Linkscape or Link Diagnosis, which we will discuss later in this guide. Note, however, that if you enter the simpler form of the command, linkdomain:targetdomain.com, this will bring you into Yahoo Site Explorer, which provides a result set that looks like this:

One of the nice things about this is that you can download the first 1,000 results into a TSV file that you can pop straight into Excel, which gives you a list that looks like this:

Then you can filter the list by sorting by URL and removing all the internal links (i.e., links from targetdomain.com to targetdomain.com). This will return a list of all the third party links to the site. Once this list is obtained, the next steps are:

1. Capture, at a minimum, the URL of the linking page, the anchor text used, the PR of the linking page, and the PR of the home page of the site containing the linking page. This step is an arduous process unless you use a third party tool to pull that data for you.
2. Analyze the linking pages to determine which ones are worthy of individualized attention. Campaigns can be designed very specifically around these sites, as discussed earlier in this document.
3. Research the linking pages and try to determine the best contact to use in asking them for a link. This work should be done manually for two major reasons:
   o Emailing addresses harvested from web sites by a computer program is a violation of the CAN-SPAM Act (just don't go there).
   o Computer programs may not return the best contact to use, and the contact returned by such programs may in fact not be useful at all.
4. Develop an email campaign using the guidelines provided above.

Performing back link analyses on competitive sites is one great way to pursue this. If a site links to a publisher’s competitor, there is a decent chance that they might link to the publisher. If the content of the publisher is better than that of the competitor, this may even result in the competitor’s link being replaced by the publisher’s. Count that as a double win. It is also interesting to perform back link analyses on authoritative sites. The sites linking to these authority sites have an interest in the market space, and could be a good source of links. As always, publishers need to be careful that they are presenting a good value proposition to the Potential Linking Sites in order to maximize their return.
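The sort-and-filter step described above (removing internal links from a Site Explorer TSV export) is also easy to script instead of doing it by hand in Excel. A minimal sketch, assuming a TSV layout with the linking URL in the first column; the file layout and domain names are invented for illustration:

```python
# Sketch of the Excel filtering step in code: read a TSV export of
# backlinks (assumed layout: linking URL in the first column) and drop
# internal links, i.e. any link whose source page is on the target
# domain or one of its subdomains.

import csv
import io
from urllib.parse import urlparse

def filter_external_links(rows, target_domain):
    """Keep only rows whose linking URL is NOT on target_domain."""
    external = []
    for row in rows:
        host = urlparse(row[0]).netloc.lower()
        if host == target_domain or host.endswith("." + target_domain):
            continue  # internal link: skip it
        external.append(row[0])
    return external

# Simulated two-column TSV export (URL, page title):
tsv_data = (
    "http://www.targetdomain.com/about\tAbout Us\n"
    "http://blog.targetdomain.com/post\tA Post\n"
    "http://www.othersite.com/links\tResources\n"
)
rows = list(csv.reader(io.StringIO(tsv_data), delimiter="\t"))
print(filter_external_links(rows, "targetdomain.com"))
# → ['http://www.othersite.com/links']
```

In practice the rows would come from the downloaded TSV file (`csv.reader(open(path), delimiter="\t")`); the subdomain check mirrors the manual step of treating blog.targetdomain.com links as internal.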

Press Releases and PR
Press releases are one of the staples of a PR campaign. Developing SEO optimized press releases is one very effective technique. These need to be distributed by wire services such as PRWeb and BusinessWire. The key to optimizing a press release is to choose keywords for the title that make it likely to be picked up by major news media editors. Having the correct keywords will align a press release with the search terms the news media editor uses. These terms are not the same as those used in web search, because they relate to search queries by highly knowledgeable people searching through feeds and/or databases of press releases. Designing a press release to get looked at is its own science.

Once a press release is picked up by a news editor, if the content is compelling, there is now a chance that they will write about it. Press exposure like this is awesome. The publisher gets the link from the editor's site, and some of their readers will like the site and link to it too.

Direct Contact of News Media

Another staple of traditional PR is the practice of making direct contact with news editors and writers and getting them interested in a story. In other words, instead of a shotgun approach like a press release, it's a rifle shot approach. In the rifle shot approach, the idea is to conduct a highly customized, specific effort to appeal to the interests of the editor or writer.

In today's environment, news media encompasses traditional magazines and papers, as well as bloggers. Bloggers are part of the media as well. High value blogs can be just as influential as a major news site. Contacting the major players in a highly personalized way makes a lot of sense. Use common relationship building rules, such as

making sure to bring value to the person being contacted. Study the things they like, write about, or are interested in. Use that knowledge to tailor the approach and maximize the chances of success.

Partnerships and Licensing
One of the simplest ways for a publisher to obtain a link is to incorporate it into the business terms of the contracts that they execute. For example, if a business uses distributors or resellers to sell its products, it can have the distributor or reseller link back to the publisher's site. This may seem like a compensated link, but that is at least a debatable point: the third party would not distribute or resell someone's product unless they were willing to endorse it, and linking to the publisher's site is simply one element of that endorsement.

There are two types of related programs that are seen by Google as being "gray hat" in orientation. These are:

1. Affiliate Programs

In an affiliate program, one web site links to another, and the site providing the link gets paid whenever a visitor follows the link to the publisher's site and buys something or completes some other form of conversion. Links in affiliate programs often take a form similar to the following: http://www.yourdomain.com?id=1234. The ID code is used to track visitors from the affiliate to determine when a transaction or conversion occurs.

The basic problem with this is that http://www.yourdomain.com?id=1234 is seen as duplicate content of the original page, http://www.yourdomain.com, which can cause some problems. In addition, Google has publicly indicated that it does not want people to use affiliate programs as a method for getting links that pass PageRank. They have stated this several times, including in an interview with Matt Cutts, who has also indicated that Google does a pretty good job of detecting when a link is an affiliate link.

For publishers that want to push the envelope a little bit, it is possible to implement a 301 redirect from http://www.yourdomain.com?id=1234 to http://www.yourdomain.com. This can be done by having a web application recognize the incoming affiliate link, cookie the user, and then execute the redirect.
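The redirect approach described above can be sketched in a few lines of Python. This is a minimal illustration only; the canonical URL and the id parameter name are hypothetical, and the request-handling wiring around this function would depend on the publisher's web framework.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical canonical page that all affiliate URLs should resolve to.
CANONICAL_URL = "http://www.yourdomain.com/"

def handle_request(url):
    """Decide the response for an incoming request URL.

    An affiliate-tagged URL (?id=1234) gets a 301 back to the canonical
    page, with a cookie set so the affiliate still receives credit for
    any later conversion. Plain URLs are served normally."""
    query = parse_qs(urlparse(url).query)
    affiliate_id = query.get("id", [None])[0]
    if affiliate_id:
        headers = [
            ("Location", CANONICAL_URL),
            ("Set-Cookie", "affiliate_id=%s; Path=/" % affiliate_id),
        ]
        return ("301 Moved Permanently", headers)
    return ("200 OK", [])
```

The 301 consolidates the duplicate-content variants onto one URL, while the cookie preserves the affiliate tracking that the query string used to carry.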
Publishers who use this technique need to recognize, however, that this type of link campaign carries some real risk.

2. Discount Programs

This is a closely related concept, but it removes the element of compensation. The idea here is for the publisher to offer visitors from certain web sites a discount on the publisher's product. Based on this, the publisher can encourage those other sites to put a link back to the publisher's site, noting the availability of the discount. This may appeal to such sites because it allows them to offer real value to their visitors.

However, this is also seen by Google as a questionable practice. While it would seem that the discount offer would only be interesting to the third party site if they valued the product (or thought their visitors would), in the interview referenced above Matt Cutts clearly indicated that Google does not want to value such links.

Buying Links for SEO
One of the more popular techniques is to buy links. It has two significant advantages:

• It's easy. There is no need to sell the quality of the content on the Link Destination Site; the only things that need to happen are determining that the Potential Linking Site is willing to sell, and agreeing on a price.
• Since the link is an ad, the publisher buying it can simply specify the anchor text they want. Anchor text is a powerful ranking signal, and this is one of the major reasons that people engage in link buying.

Google's Policy on Paid Links

The major downside is that buying links for SEO is against Google's Webmaster Guidelines. The policy on paid links can be briefly summarized as follows:

1. Links given in return for compensation should not be obtained for purposes of improving PageRank / passing link juice.
2. A link should be freely given, and the publisher of the Potential Linking Site should be informed of what they are doing. An example of a link where the publisher is not informed is one that is hidden in the NOSCRIPT tag of a JavaScript-based widget.
3. As noted above, Google considers the use of hit counters and Wordpress templates with embedded links, even if they are visible, as spammy as well. The exception is when the publisher distributing the hit counter is in the hit counter business, or the publisher distributing the Wordpress template is in the Wordpress template business.

The key issue that needs to be addressed in these types of link building campaigns is relevance. Google is not saying that publishers should not be able to buy ads on the web. Their policy is that links which are purchased should be purchased only for the traffic and branding value that they bring. Google also recommends that publishers use nofollow on such ads, which means that they will have no SEO value.

On another note, PPC campaigns using AdWords, Yahoo! Search Marketing, etc., are not considered a violation of the policy against paid links. This is because they are easily recognized by the crawler and do not pass link juice.

Methods for Buying Links

There are three major methods for buying links:

1. Direct link advertising purchases. This method involves contacting sites directly and asking them if they are willing to sell text link ads. Many sites have pages that describe their ad sales policies. However, sites that openly say they sell text links are more likely to get caught in a human review, resulting in the link being disabled from passing PageRank by Google.

2. Link brokers. These are companies that specialize in identifying sites selling links and reselling that inventory to publishers looking to buy such links. The major danger here is that link brokers may have a template of some sort for their ads, and a template can be recognized by a spider as being from a particular broker.

3. Charitable donations. Many sites of prominent institutions request charitable contributions, and some of these provide links to larger donors. Search for pages like the one on genomics.xprize.org that links to the organization's supporters. Sometimes these types of links don't cost a ton of money. This tactic is, however, frowned on by Google, and best used with care. One way a publisher can potentially make the tactic more acceptable is to support causes that are related in a material way to their site, although it is not clear that this would be acceptable to Google either.

Finding these types of sites may seem hard, but the search engines can help with this. Aaron Wall wrote a post about how to find donor links a long time ago, and the advice he gave then still holds true. Publishers can search on terms such as these:
o Sponsors
o Donors
o Donations
o "Please visit our sponsors"
o "Thanks to the following donors"

This can be narrowed down further by adding operators such as site:.org or site:.edu to the end of the search command.

Strategies Which Are Not Considered Buying Links

It's worth noting some strategies where money is involved in obtaining a link, yet it is not considered buying a link. Here are some examples:

1. Using a consultant to promote articles on a social media site such as Digg
2. Paying a PR firm to promote a site
3. Paying a link building firm to ask for (as opposed to buy) links

The key point is that these strategies do not compensate the Potential Linking Site itself for the links given, and the links are given freely.

Reciprocal Links
Another common practice is to trade links between web sites. As with buying links, in many cases websites will agree to trade links without regard to the quality of the content on the publisher's site. This illustrates the lack of editorial judgment that often results from a link swap.

Link trading used to be done by publishers in large volume. However, less of it occurs today, as more publishers are aware of the limitations of the technique. In early 2006, Google implemented an update called "Big Daddy" that devalued swapped links involving websites with an unnaturally high number of reciprocal links.

Another factor to be concerned about is the relevance of the site with which the link is traded. Poor relevance may limit the value of the link received. In addition, linking to unrelated sites can dilute the relevance of the publisher's own site.

The bottom line is that a high volume campaign focused on reciprocal links is a bad idea. However, this does not mean a publisher should never swap links; there is certainly a time and a place for it. For example, swapping a link with an authoritative site in the same market space as the publisher can be very beneficial indeed. One rule of thumb is to ask whether the publisher might link to the other site even without getting a link in return. If the answer is yes, then the relevance and authority of the site may justify a link swap.

Links from Pages Without Editorial Control
There are many thousands of places on the web where links can be obtained without any editorial control. Some prime examples of this are:
• Forums
• Blogs
• Guestbooks
• Social media sites
• Wikis
• Discussion boards
• Job posting sites

These types of sites often allow people to leave behind comments or text without any review. A common spam tactic is to write a bot that crawls around the web looking for blogs and forums, and then puts machine-generated comments in them containing a link back to the spammer's site (or their client's site). Most of the forums and blogs that have these comments added to them will either catch and remove them, or will make use of the nofollow attribute on links inside such comments. However, the spammer does not care, because not all of the sites will catch and eliminate the comments or nofollow them. Throwing millions of comments out using a bot costs the spammer very little; if it results in a few dozen links that are not nofollowed, then it is worth their effort.

This is a deep black hat tactic, and one that none of the search engines want publishers to use. It is highly risky, and getting caught is very likely to result in getting banned.

Nofollow
Nofollow is a mechanism that the search engines have agreed should mean that links so tagged will not pass any PageRank or link juice to the site receiving the link. It comes in two forms: the nofollow meta tag and the nofollow attribute.

Nofollow Meta Tag

The nofollow meta tag is implemented by placing code similar to the following in the <head> section of a given web page:

<meta name="robots" content="nofollow">

When this meta tag is seen by the search engines, it tells them that none of the links on the web page should pass any link juice at all to the pages they link to.

Nofollow Attribute

The nofollow attribute is meant to allow a more granular level of control than the nofollow meta tag. Its implementation looks something like this:

<a href="http://www.yourdomain.com/page37.html" rel="nofollow">

When used as an attribute, the only link that is affected is the one contained within the same anchor (in this case, the link to www.yourdomain.com/page37.html). This allows publishers to nofollow only the specific links they choose on a given page.

Nofollow Uses and Scams

One of the most common uses of nofollow is to nofollow all links in comments on a blog or in a forum. This is a common and legitimate use of the tactic, as it prevents spammers from flooding forums and blogs with useless comments that include links to their sites.

However, as might be expected, there are some common scams people implement using nofollow. One simple example is that someone proposes a link swap and provides the return link, but nofollows it. The effect is that what was in fact a swap looks like a clean one-way link from one site to the other. Publishers that engage in a link swap should take care to make sure that the link they receive back is not nofollowed.

Publishers who are actively link building need to be aware of nofollow. Potential Linking Sites that offer only nofollowed links are providing links that have no SEO benefit (although there may be other non-SEO benefits in those links, such as branding and traffic).
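Checking whether a swap partner's return link is nofollowed can be automated. The following is a rough sketch using Python's standard-library HTML parser; the domain and markup in the usage are hypothetical examples.

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Collect (href, is_nofollowed) pairs from a page's HTML."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        # rel can hold several space-separated tokens, e.g. "external nofollow".
        rel_tokens = (attrs.get("rel") or "").lower().split()
        self.links.append((attrs.get("href", ""), "nofollow" in rel_tokens))

def link_back_is_followed(page_html, my_domain):
    """True if the page links to my_domain without rel="nofollow"."""
    audit = LinkAudit()
    audit.feed(page_html)
    return any(my_domain in href and not nofollowed
               for href, nofollowed in audit.links)
```

A publisher could fetch the swap partner's page and run it through `link_back_is_followed` to confirm the return link actually passes link juice.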

Helpful Search Terms, Phrases, and Advanced Query Parameters
The search engines provide a rich array of tools for performing link related research. The following are some search commands, as well as links to additional resources for learning more about query parameters.

1. inanchor:keyword

This command is useful in analyzing the relevance of a particular page based on the links pointing to it. For example, the search "inanchor:seo" returns the pages that have the best inbound links including the word SEO in the anchor text. Publishers can go one step further and look at the pages on a particular website that have the best anchor text for a keyword as follows: "inanchor:keyword site:domain-to-check.com"

This search is great for checking whether a site has unnatural links. If 80% or more of a site's links include the major keyword for its market (and it's not part of the company name), it's a sure bet that they are buying links or doing something similar to steer the anchor text. The operator is very valuable in learning about a competitor's site and its strength in competing for the keyword. There are other related operators as well:

intext:keyword – shows pages that have the phrase in their text
intitle:keyword – shows pages that have the phrase in the page title
inurl:keyword – shows pages that have the phrase in the URL for the page

2. linkdomain:http://www.domain-to-check.com/ -site:domain-to-check.com keyword

This finds "keyword"-related sites that link to domain-to-check.com. The command needs to be executed in Yahoo!, since it is the only search engine that supports the "linkdomain" operator.

3. linkdomain:domain-to-check.com site:.edu "keyword"

This command is very useful in finding the .edu sites that link to a competitor's domain and use a target keyword. It also needs to be executed in Yahoo!. Take a competitor's site and use "resources" for the keyword, and see what happens. It's great for slicing and dicing the backlinks of a site.

4. domain-to-check.com -site:domain-to-check.com, with the &as_qdr parameter

This one is a bit trickier. In Google, perform a search on domain-to-check.com -site:domain-to-check.com. Then add "&as_qdr=d" to the end of the URL on the search results page and reload the page. This will show the mentions that domain-to-check.com has received in the past 24 hours. This command is very useful in evaluating the competitive strength of a site. Trying it for Microsoft.com results in the following:

Publishers who are competing against someone who has received 39,600 mentions in the past 24 hours are in deep trouble, or should plan on ranking somewhat lower than that competitor! Fortunately, most sites get very few mentions in a day. Here are some variants of the parameter that can be used at the end of the URL:

1. &as_qdr=w (past week)
2. &as_qdr=m (past month)
3. &as_qdr=y (past year)

5. linkdomain:competitor1.com linkdomain:competitor2.com -site:yourdomain.com

This command is useful because it shows the publisher who links to two of their competitors but not to them. Sites that link to more than one competitor, but not to the publisher, may well be interested in learning about another site providing information on similar topics, particularly if the publisher's site does one or more things better than the sites they already link to.

6. intext:domain-to-check.com

This command can help a publisher rapidly identify sites that reference "domain-to-check.com" but don't implement that reference as a link. It is very powerful because it can be used to identify "lost" links to the publisher's domain. By identifying sites that reference the publisher's domain, the publisher can contact those sites and ask them to convert the mention into a link. Since these sites are (most likely) already endorsing the publisher's site, the conversion rate can be expected to be reasonably high.

Sources & Additional Resources

Here are some additional resources for learning about search operators and how they can help with a variety of tasks:
• Advanced Search Operators PRO Guide from SEOmoz
• Advanced Link Operation article on Search Engine Journal
• Getting Links From Known, Quality Linkers

Section III: Valuable Tools & Search Queries for Link Building
There are lots of tools available in the market for use in link building. This section will summarize some of the more interesting ones.

The Basics
The first thing a publisher should do is develop an understanding of the links they already have. The three most basic tools for doing that are:

1. Google Webmaster Tools is a powerful first start. With it, publishers can easily download a spreadsheet of all the links Google has in its database (though it may omit some links that Google does not consider significant). Publishers can only use this tool to see the links to their own site. Here is a screenshot of how it looks:

2. Live Search Webmaster Tools is also a great asset. It offers a similar capability for downloading a spreadsheet of the links that Live Search has in its database for a site. Publishers can only use this tool to see the links to their own site. Here is a look at the report:

Live Search Webmaster Tools is limited in its export, however. Publishers can only extract up to 1,000 of their backlinks due to a limitation in the Live Search API (as of Sept. 2008). Live Search does provide some useful filtering capabilities to help publishers work around that limitation.

3. Yahoo! Site Explorer will allow a publisher to see all of their backlinks. Site Explorer will also show the backlinks for any web site, not just the publisher's own, which makes it particularly interesting for evaluating the competition.

For quick and dirty link totals, it is handy to make use of a Firefox plugin known as Search Status. This provides basic link data on the fly with just a couple of mouse clicks. Here is the menu that is shown with regard to backlinks:

Notice also that it offers an option for highlighting nofollow links, as well as many other capabilities. It’s a great tool to help pull numbers like these much more quickly than would otherwise be possible.

Competitive Backlink Analysis
One of the most important things a publisher does is obtain links to his/her site, and a great way to start is to see who links to the competition. Yahoo! Site Explorer is one great tool for doing just that, but its major flaw is that it is hard to extract the data into a spreadsheet or other processable format. It is possible to write a program to extract the data automatically, but Site Explorer will not allow the extraction of more than 1,000 links.

However, more data can be extracted by filtering the output that can be obtained from Yahoo! Search (as opposed to Yahoo! Site Explorer). For example, a publisher might notice in Yahoo! Site Explorer that a site has a lot of .edu links. These can easily be isolated using the following command in Yahoo! Search:

linkdomain:domain-to-check.com site:edu

Once this is done, the publisher can extract all the .edu links for the site. The next command the publisher might try could be:

linkdomain:domain-to-check.com -site:edu

This shows all links except the .edu links. There are many filters of this type that the publisher can apply toward the goal of extracting as many links as possible.
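Once link data has been exported, the same .edu/non-.edu slicing can be reproduced offline. A minimal sketch (the URLs shown in the usage are hypothetical examples):

```python
from urllib.parse import urlparse

def split_backlinks(urls):
    """Split a list of backlink URLs the way the site:edu / -site:edu
    filters slice them: .edu hosts in one bucket, everything else in
    the other."""
    edu, other = [], []
    for url in urls:
        host = urlparse(url).netloc.lower()
        (edu if host.endswith(".edu") else other).append(url)
    return edu, other
```

The same pattern extends to any other filter a publisher might apply, such as splitting out .org and .gov hosts, or grouping links by top-level domain.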

Linkscape
This level of manual effort is, of course, not for the faint of heart. Advanced tools exist to do much of the work for the publisher. One example is SEOmoz's Linkscape, a tool built on a crawl of the web conducted by SEOmoz, as well as a variety of parties engaged by SEOmoz; a complete list of Linkscape's sources is published on the SEOmoz site.

Linkscape is designed to be the tool for SEOs to use in really mapping links across the web. It gets around the 1,000 URL limit imposed by Yahoo! and will return as many links as Linkscape has found, all extractable into a spreadsheet. The following is a sample report:

Included with this report are mozRank and mozTrust scores for each URL/domain listed. The tool also flags redirects, and additional data, such as the Google PageRank scores for the linking page and its domain, is easily obtained.

The beauty of this tool is that it allows SEOs and publishers to collect competitive intelligence on other people's domains, much as they do with Yahoo! Site Explorer today, but without the 1,000 result limit, and with SEOmoz's proprietary mozRank and mozTrust scores included. The importance of mozRank and mozTrust cannot be overstated. The PageRank score published by Google through its toolbar is known not to be a reasonable approximation of the value of a given page; the Google algorithm has simply evolved far past the original simplistic concept of PageRank that Google used in its early days. In addition, all the data can be extracted into a spreadsheet for easy offline processing.

Link Diagnosis
Here is an example of the base Link Diagnosis output for a site:

Each of the subsections of this report can be downloaded into a CSV file for later examination in Excel. Taking the basic backlinks report (top left above) and pulling it into a spreadsheet provides data that looks like this:

Other Tools
Using some method for extracting link data in this fashion is a must for performing competitive backlink analysis. Over at Search Engine Journal, Ann Smarty did a nice write-up of SEO tools, in which she mentioned two other backlink tools that might be of interest:
• Backlink Watch
• Domain Inbound Links Checker

Determining Link Value
Early on in this guide, we established the notion of evaluating sites (and pages) to determine if they were Low Ranking, Medium Ranking, High Ranking, or Very High Ranking. We established the following parameters as being important:

1. The PageRank of the site
2. The perceived authority of the site
3. The PageRank of the Linking Page
4. The perceived authority of the Linking Page
5. The number of outbound links on the Linking Page
6. The relevance of the link

There are other factors that one could take into account. These can include the number of Delicious (and other bookmarking sites) tags a site has, or the total number of citations of the site on the web which are not implemented as links.

One tool that helps with this is the Trifecta Tool, which is accessible via an SEOmoz PRO account. Most SEO firms that engage in link building for their clients will leverage either a tool such as Trifecta or an internally developed tool.
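An internally developed tool of this kind often boils down to a weighted scoring function over the parameters above. The sketch below is a toy model with illustrative, assumed weights (it is not SEOmoz's or anyone else's actual formula): authority and PageRank add value, every extra outbound link on the page dilutes it, and relevance scales the whole score.

```python
def link_value_score(page_pr, domain_pr, page_authority, domain_authority,
                     outbound_links, relevance):
    """Toy link-value model over the six parameters listed above.

    page_pr / domain_pr:        PageRank-style scores (0-10)
    *_authority:                perceived authority scores (0-10, assumed scale)
    outbound_links:             count of outbound links on the Linking Page
    relevance:                  0.0 (unrelated) to 1.0 (highly relevant)
    The 2x weight on authority is an arbitrary illustrative choice."""
    raw = (domain_pr + page_pr) + 2 * (domain_authority + page_authority)
    # Each additional outbound link on the page splits the value further.
    diluted = raw / max(outbound_links, 1)
    return diluted * relevance
```

Even a crude model like this is useful for rank-ordering a list of Potential Linking Sites so that outreach effort goes to the highest-value targets first.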

General Link Building Tools
Some other link building tools include:

1. SoloSEO's Link Search Tool provides an extensive list of links out to valuable advanced Google queries based on the selected search terms.
2. We Build Pages' Search Combination Tool provides a more concise list of advanced queries, but allows the publisher to enter multiple keywords at a time and returns results from Google, Yahoo!, and MSN.
3. Aaron Wall's SEO Book Tools Page has a rich set of tools for link research and beyond.

Advanced Tools
Google Webmaster Tools

It is worth going back to the beginning and looking at Google Webmaster Tools again. Looking at the 404 errors report in Google Webmaster Tools can be very illuminating. Here is an example output from that report:

The URL circled in red is of particular interest. The correct URL is http://www.stonetemple.com/STC_Services.shtml. This type of data in the 404 report is a sure sign that someone has linked to the publisher's site using an incorrect URL. The problem is that the link brings no benefit to the site, as links to a 404 page are not counted. However, this is easily fixed with a 301 redirect from http://www.stonetemple.com/stc_services.shtml to http://www.stonetemple.com/STC_Services.shtml. On an Apache server, this can be implemented in a single line in the .htaccess file:

redirect 301 /stc_services.shtml http://www.stonetemple.com/STC_Services.shtml
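When the 404 report contains many near-miss URLs, matching them against the site's real paths can be scripted. A small sketch, assuming (as in the example above) that the mismatches are purely capitalization errors:

```python
def suggest_redirects(broken_paths, real_paths, domain):
    """Match 404'd paths against real paths, ignoring case, and emit
    Apache 'redirect 301' lines that reclaim the mistyped links."""
    by_lower = {p.lower(): p for p in real_paths}
    lines = []
    for broken in broken_paths:
        target = by_lower.get(broken.lower())
        if target and target != broken:
            lines.append("redirect 301 %s %s%s" % (broken, domain, target))
    return lines
```

Feeding in the 404 export and the site's URL list yields ready-to-paste .htaccess lines such as the one shown above.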

Recovering Lost Links
Many times people reference a publisher's site without implementing the reference as a true link. Mainstream media does this quite a bit: they will cite a source without linking to it. The result is that the user can't click on the reference, and the search engines do not count it as a link to the publisher's site.

Some of the sites that reference a publisher's site might be willing to make the reference a link if they are asked. After all, they are already endorsing the publisher's content by referencing it, and perhaps the failure to implement it as a link was simply an oversight. Finding these types of citations is easy using the following command:

intext:seomoz.org -site:seomoz.org

Once the publisher has executed this command for their site, the next step is to review the sites listed and determine which ones are worth contacting to request a link. This should prove a fairly high-return activity.
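Reviewing the candidate pages can be partly automated: fetch each page and check whether the mention of the domain is a real link or plain text. A sketch using the standard-library HTML parser (the sample markup is hypothetical):

```python
from html.parser import HTMLParser

class HrefCollector(HTMLParser):
    """Gather the href of every anchor tag on a page."""

    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.hrefs.append(dict(attrs).get("href", ""))

def mention_without_link(page_html, domain):
    """True when the page mentions `domain` but never links to it,
    i.e. exactly the citations worth a conversion request."""
    collector = HrefCollector()
    collector.feed(page_html)
    mentioned = domain in page_html
    linked = any(domain in href for href in collector.hrefs)
    return mentioned and not linked
```

Pages flagged by `mention_without_link` become the outreach list for the conversion request.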

Section IV: Tracking Link Campaign Results
Once a link campaign is launched, it is important to track the results, which can be a difficult process. One of the basic ways to do this is to use the linkdomain:yourdomain.com command in Yahoo!. However, filtering through this on an ongoing basis can be tedious. If this technique is used, it becomes important to maintain a database of existing links (which can be largely obtained from the link reports in Google Webmaster Tools and Live Search Webmaster Tools) for ongoing comparisons. Then a tool such as Linkscape or Link Diagnosis can be run to get the current inbound links, and the two lists can be compared; new links will show up in the tool's report but not in the database of existing links.

A more focused approach is a command such as linkdomain:yourdomain.com site:domain-to-check.com, which can be used to see if domain-to-check.com has any links to yourdomain.com. However, this too becomes tedious if the publisher has contacted 100 or more sites.

The other thing that can be done is to make use of the analytics software on the publisher's site. For example, publishers can look at the referrers report on a daily basis and notice when a brand new referrer shows up. The reason that this works (for the most part) is that people generally perform a quick test of a new link when they implement it to make sure it is working, and this shows up as a new referral in the analytics software. As before, it is helpful to maintain a database (or spreadsheet) of all historical referrers to make it easier to recognize when a new one comes in.
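The comparison between the stored link database and the current report is a simple set difference. A minimal sketch, with light normalization so trailing-slash and case variants of the same URL are not flagged as new links (the URLs shown are hypothetical):

```python
def normalize(url):
    # Treat http://site.com/page and http://site.com/page/ as the same link.
    return url.strip().rstrip("/").lower()

def find_new_links(known_links, current_links):
    """Diff the current inbound-link report (e.g. a Linkscape or
    Link Diagnosis export) against the stored database of known links,
    returning only the links that were not there before."""
    known = {normalize(u) for u in known_links}
    return sorted(u for u in current_links if normalize(u) not in known)
```

Run after each report export, this surfaces exactly the newly acquired links, which can then be appended to the database for the next comparison.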

Section V: Putting It All Together
Starting Strategically
It is critical to start strategically when implementing a link building campaign, because it's easy to start down the wrong track and waste a lot of valuable time. Here are some of the major steps for assembling a strategy:

1. Identify Types of Link Targets

There are many possible different ways to classify types of link targets. Some examples include:
• Major media sites
• Social media sites (including Digg, Propeller, Delicious, StumbleUpon, and others)
• Blogs
• Universities and colleges
• Government sites
• Sites that link to your competitors
• Related hobbyist sites

These types of lists can get a lot more granular. There might be seven different categories of blogs that represent good targets for a publisher to pursue, each requiring a somewhat different link building strategy. In the first stage of putting together a link building strategy, publishers should focus on building a list of all the different options.

2. Identify Content Strategies for Each Segment

The next step is to identify what it will take to get links from a particular market segment. Publishers who want to get links from "green" bloggers, for example, will need to create something that appeals to that audience. This step should not be underestimated: without the right content (or tools) to appeal to a given audience, the publisher will not be successful. In addition, understanding the content or tools that need to be developed is critical to understanding the cost of a link building strategy.

Part of this step is also to identify the content packaging. For example, if the content is to be delivered using a widget, resources will need to be allocated to build the widget. Another, simpler question to ask is: will the content be placed on the publisher's site or on a third party site (i.e., syndicated)?

3. Identify Channel Strategies

This relates to the method of reaching the Potential Linking Sites. As defined before, there are indirect and direct means of reaching sites. Examples of indirect means are:
• PR
• Social media sites
• Writing a column in a major media magazine
• Speaking at conferences

There are also many direct means. Some of these include:
• Sending emails
• Calling people
• Sending snail mail
• Content syndication

4. Find Out What the Competition is Doing

The next step is to see what segments the competition is pursuing. This is useful from two perspectives:
• Determine where they are successful and follow in their footsteps
• Find out what segments they have ignored and use that to get ahead

Of course, a major part of this is also nailing down the specific links the competition has and identifying the ones that have the highest value. Publishers should use the competitive review to provide valuable input into finalizing their strategy.

5. Review the Cost of the Various Strategies

How much will it cost to pursue a particular strategy? Will the strategies meet the business goals? Are there brand considerations? What will the cost be in terms of money and resources? These questions must be answered before finalizing. It is no good having the ultimate link building strategy defined if the publisher is unable to pursue it.

6. Finalize the Strategies to Pursue

The final step of the strategic phase is the easiest. Review the information that has been put together and decide on an affordable but effective set of strategies to pursue.

Execution
A world class link building campaign is always a large effort. The resources need to be lined up and focused on the goal. Publishers always experience bumps in the road and difficulties, but it is critical to line up the resources and go for it. It is also critical to be persistent. It is amazing how poorly many companies execute their strategies; publishers that execute ruthlessly inevitably gain an edge over many of their competitors. Of course, if other competitors also execute ruthlessly, it becomes even more important to push hard on link building.

Conducting Strategic Reviews
Link building strategies should evolve in the normal course of business. As campaigns are pursued, lessons are learned, and this information can be fed back into the process. New strategies are also conceived over time, and some of these are great ones to pursue. Also, sometimes the initial strategy goes great for a while but begins to run out of steam. Publishers should have a constant stream of ideas that they are feeding into their link building plans.

Creating a Link Building Culture

Publishers should also train many people within the organization about their link building plan, its goals, and how it will help the business. The purpose of this is to engage the creativity of multiple team members in feeding the stream of link building ideas. The more ideas, the better. The quality of a link building campaign is directly proportional to the quality of the ideas that are driving it.

Never Stopping
Link building is not something you do once, or once in a while. We live in a culture where the search engine plays an increasingly large role in the wellbeing of a business. Consider a business that implements a great link building campaign, gets where it wants to be, and then stops. What happens when its competitors just keep on going? It gets passed and left behind. Publishers who are lucky enough to get in front need to be prepared to fight to stay there. Publishers who are not out in front should fight to get there.

Section VI: Conclusion
Links are the engine that drives search engine rankings. This is not meant to minimize the importance of on-page SEO (also known as "technical SEO"): a poorly optimized website will have trouble getting traffic no matter how many links it gets. But because the search engines use links and their anchor text to determine how important a site is and how relevant it is to a particular topic, links can be thought of as the multiplier of a site's traffic potential.

Think of it in housing terms. When you buy a house, the first thing you should probably do is make sure all the major systems (electrical, heat, plumbing, and so on) are working safely and correctly; you don't start with the kitchen remodeling project or an addition. When you go to sell the house, however, the more visible improvements you have made, such as the remodeling project, will have a direct impact on the sales price. The work you did on the major systems won't have much impact on the price, but it will affect whether you can sell the house at all, because people don't want to move into a house with major systems problems. And so it is with technical SEO and link building: you won't be able to get traffic if your site architecture is broken or there is no real content on the site, but link building drives the site's value.

In addition, think of link building as a marketing function, much like PR. Link building is about getting the word out about the great stuff on your site and in your company. Because of the dynamics of getting someone to link to you, you have to sell them on the quality of what you have, even if you make use of some form of incentives (other than compensation-based incentives).
Where link building differs from PR is that link building has specific goals for the outcome desired from the promotional activities, and a link builder will choose specific types of tactics and targets as a result.

Finally, think of link building as a cost of doing business. You don't want to make the mistake of running a campaign, then stopping, then starting again, then stopping again. All this does is give a more determined competitor the chance to get in front of you and stay there, because they build links continuously and you don't. Don't let them do that. There is so much to be gained by increasing your search traffic that it is an opportunity few can afford to ignore. Go get yourself some links!

Section VII: Appendix of Link Building Tools
Search Engine Supplied Link Tools

• Google Webmaster Central - allows you to download your backlinks directly from Google. You can also add anchor text checking to Google Webmaster Central by using this free Greasemonkey script.
• Yahoo! Site Explorer - allows you to see what pages you have indexed as well as the backlinks to those pages.
• Live Search Webmaster Tools - allows you to see your backlinks as seen by Microsoft. Limits you to downloading only 1,000 of them, but provides search tools to extract more data.

Backlink Analysis Tools
Free Ones:
• LinkDiagnosis - extracts links from Yahoo! Site Explorer, so it appears to still be subject to the 1,000-result limit. Obtains key data such as anchor text.
• Backlink Watch - returns a report showing the backlinks and related anchor text for the site you specify.
• Domain Inbound Links Checker - also pulls backlinks. A little bit cryptic, but provides a nice visual look at how many domains are linking to you.
• Back Link Analyzer - free Windows-based downloadable software.

Paid Ones:

• Linkscape - SEOmoz's backlink tool that provides data based on its own crawl of the web. As a result, it is not limited to 1,000 backlinks as many Yahoo! Site Explorer based tools are.
• Majestic SEO - link anchor index that allows you to download an in depth link profile for any site.
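At their core, the backlink analysis tools above all do the same thing with each page they crawl: pull out every outbound link's target URL, its anchor text, and whether it is nofollowed. The following is a minimal sketch of that idea using only Python's standard-library HTML parser; the HTML snippet is a made-up example page, not output from any of these tools.

```python
# Sketch: extract (href, anchor text, nofollowed?) from a page's links,
# the raw data behind backlink/anchor-text reports.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []      # list of (href, anchor_text, nofollowed) tuples
        self._href = None    # href of the <a> currently being read, if any
        self._rel = ""
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            self._href = attrs.get("href")
            self._rel = attrs.get("rel") or ""
            self._text = []

    def handle_data(self, data):
        if self._href is not None:   # only collect text inside an <a>
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            anchor = "".join(self._text).strip()
            self.links.append((self._href, anchor, "nofollow" in self._rel))
            self._href = None

# A hypothetical page with one normal link and one nofollowed link.
page = ('<p><a href="http://example.com/">great widgets</a> and '
        '<a rel="nofollow" href="http://example.org/">a paid ad</a></p>')
parser = LinkExtractor()
parser.feed(page)
# parser.links -> [('http://example.com/', 'great widgets', False),
#                  ('http://example.org/', 'a paid ad', True)]
```

A real tool would run this over millions of fetched pages and aggregate the results per target domain; the parsing step itself is this simple.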

Firefox Add-Ons

• Search Status - useful Firefox plugin that allows you to access backlink data from the search engines with just a few clicks. Also allows you to see all links which are nofollowed on a page.
• SEO for Firefox - imports useful marketing data (including link counts, with links to source data) right into Google and Yahoo! search results.

Other Useful Link Building Tools
• Trifecta - determine the strength of a page, blog, or domain. Useful in evaluating the value of a link relationship.
• Juicy Link Finder - intended to find links that have authority: old domains with a high PageRank that rank well for the keyword you enter.
• SEO Toolbar - imports link data from various sources and makes it quite easy to compare sites against each other.
• Hub Finder - finds topically related pages by looking at link co-citation.
• Link Suggest Tool - builds a list of words and phrases like "add URL," "directory," "ezine," etc., where you may be able to add links or buy ads.
• SoloSEO's Link Search Tool - allows you to enter a keyword and then builds a list of search queries to help you rapidly find directly related sites that link out to other sites.
• We Build Pages' Search Combination Tool - similar to the SoloSEO tool, but much more flexible (you can provide a list of keywords).
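The last three tools in the list above all work the same way: they combine your keywords with link-prospecting "footprints" such as "add URL" or "directory" to produce ready-made search queries. A minimal sketch of that idea is below; the footprint list and function name are illustrative, not taken from any of these tools.

```python
# Sketch: generate link-prospecting search queries by pairing each keyword
# with common footprint phrases, in the spirit of the SoloSEO and
# We Build Pages query builders.

FOOTPRINTS = ['"add URL"', 'directory', 'ezine', 'intitle:links']

def build_queries(keywords):
    """Return one search query string per (keyword, footprint) pair."""
    return [f'{kw} {fp}' for kw in keywords for fp in FOOTPRINTS]

queries = build_queries(["organic gardening", "compost bins"])
# 2 keywords x 4 footprints = 8 queries, e.g. 'organic gardening "add URL"'
```

Each resulting string can be pasted into a search engine to surface pages that already link out to sites in your niche, which is exactly the prospect list a link builder works from.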
