Search engine optimization (SEO) is the process of improving the visibility of a website or a web page in search engines via the "natural" or un-paid ("organic" or "algorithmic") search results. Other forms of search engine marketing (SEM) target paid listings. In general, the earlier (or higher on the page), and more frequently a site appears in the search results list, the more visitors it will receive from the search engine's users. SEO may target different kinds of search, including image search, local search, video search, academic search, news search and industry-specific vertical search engines. This gives a website web presence. As an Internet marketing strategy, SEO considers how search engines work, what people search for, the actual search terms typed into search engines and which search engines are preferred by their targeted audience. Optimizing a website may involve editing its content, HTML and associated coding, both to increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic. The acronym "SEOs" can refer to "search engine optimizers," a term adopted by an industry of consultants who carry out optimization projects on behalf of clients, and by employees who perform SEO services in-house. Search engine optimizers may offer SEO as a standalone service or as a part of a broader marketing campaign. Because effective SEO may require changes to the HTML source code of a site and site content, SEO tactics may be incorporated into website development and design. The term "search engine friendly" may be used to describe website designs, menus, content management systems, images, videos, shopping carts, and other elements that have been optimized for the purpose of search engine exposure. Another class of techniques, known as black hat SEO or spamdexing, uses methods such as link farms, keyword stuffing and article spinning that degrade both the relevance of search results and the quality of users' experience with search engines. Search engines look for sites that employ these techniques in order to remove them from their indices.

History
Webmasters and content providers began optimizing sites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed to do was submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed. The process involves a search engine spider downloading a page and storing it on the search engine's own server, where a second program, known as an indexer, extracts various information about the page, such as the words it contains and where these are located, as well as any weight for specific words, and all links the page contains, which are then placed into a scheduler for crawling at a later date. Site owners started to recognize the value of having their sites highly ranked and visible in search engine results, creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. The first documented use of the term was by John Audette and his company Multimedia Marketing Group, as documented by a web page from the MMG site from August 1997. The first registered USA copyright of a website containing that phrase is by Bruce Clay, effective March 1997.
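The crawl, index, and schedule cycle described above can be illustrated with a minimal, hypothetical Python sketch (standard library only; the seed URL is a placeholder, not anything from the article): a spider downloads a page, an indexer records which words appear on which URLs, and newly extracted links are queued for a later crawl.

```python
# Minimal sketch of the crawl -> index -> schedule cycle described above.
# Standard library only; the seed URL is a placeholder.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkAndTextParser(HTMLParser):
    """Collects outgoing links and the visible words of one page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.words = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_data(self, data):
        self.words.extend(data.lower().split())

def crawl(seed, max_pages=5):
    queue = deque([seed])   # the "scheduler": pages to crawl later
    index = {}              # word -> set of URLs: a toy inverted index
    seen = set()
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue        # skip pages that fail to download
        parser = LinkAndTextParser()
        parser.feed(html)
        for word in parser.words:   # the "indexer" step
            index.setdefault(word, set()).add(url)
        for link in parser.links:   # extracted links go back to the scheduler
            queue.append(urljoin(url, link))
    return index

# index = crawl("https://example.com/")  # hypothetical seed URL
```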
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using meta data to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches. Web content providers also manipulated a number of attributes within the HTML source of a page in an attempt to rank well in search engines. By relying so much on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. Since the success and popularity of a search engine is determined by its ability to produce the most relevant results for any given search, allowing those results to be false would drive users to other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate.
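To see why such webmaster-controlled signals were easy to game, consider a small sketch (an illustration, not anything prescribed by the article) of a keyword-density calculation, the kind of on-page factor early engines over-weighted:

```python
# Sketch: keyword density is occurrences of a term divided by total words,
# a signal entirely under the page author's control.
import re

def keyword_density(page_text, keyword):
    words = re.findall(r"[a-z0-9']+", page_text.lower())
    if not words:
        return 0.0
    return sum(1 for w in words if w == keyword.lower()) / len(words)

honest = "a short guide to repairing bicycle brakes with photos of each step"
stuffed = "cheap bikes cheap bikes buy cheap bikes online best cheap bikes deals"
print(keyword_density(honest, "cheap"))   # 0.0 for the ordinary page
print(keyword_density(stuffed, "cheap"))  # about a third of all words
```

A page author can push this number as high as desired, which is exactly why engines moved toward factors that are harder to manipulate.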
Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, chats, and seminars. In fact, with the advent of paid inclusion, some search engines now have a vested interest in the health of the optimization community. Major search engines provide information and guidelines to help with site optimization. Google's guidelines are a list of suggested practices Google has provided as guidance to webmasters. Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website, and it also provides data on Google traffic to the website. Bing Toolbox provides a way for webmasters to submit a sitemap and web feeds, allowing users to determine the crawl rate and how many pages have been indexed by their search engine. Yahoo! Site Explorer provides a way for webmasters to submit URLs, determine how many pages are in the Yahoo! index and view link information. Not every interaction has been friendly: Traffic Power sued blogger and SEO Aaron Wall for writing about the ban, and Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.

Methods
Main article: search engine optimization methods

Getting indexed
The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine indexed pages do not need to be submitted because they are found automatically. Some search engines, notably Yahoo!, operate a paid submission service that guarantees crawling for either a set fee or cost per click. Such programs usually guarantee inclusion in the database, but do not guarantee specific ranking within the search results. Two major directories, the Yahoo Directory and the Open Directory Project, both require manual submission and human editorial review. Google offers Google Webmaster Tools, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that aren't discoverable by automatically following links. Search engine crawlers may look at a number of different factors when crawling a site, and not every page is indexed by the search engines. Distance of pages from the root directory of a site may also be a factor in whether or not pages get crawled. Additionally, search engines sometimes have problems with crawling sites with certain kinds of graphic content, flash files, portable document format files, and dynamic content.

Preventing crawling
Main article: Robots Exclusion Standard
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.

Increasing prominence
A variety of methods can increase the prominence of a webpage within the search results. Cross linking between pages of the same website to provide more links to the most important pages may improve its visibility. Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic. Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's meta data, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL normalization of web pages accessible via multiple URLs, using the "canonical" meta tag or via 301 redirects, can help make sure links to different versions of the URL all count towards the page's link popularity score.
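As a sketch of how the Robots Exclusion Standard described under Preventing crawling works in practice, Python's standard urllib.robotparser can decide whether a given crawler may fetch a URL; the robots.txt rules, crawler name, and URLs below are invented for illustration:

```python
# Sketch: a well-behaved crawler consults robots.txt before fetching a page.
# The robots.txt content, crawler name, and URLs are made up.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /cart/
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

for url in ("https://example.com/products/widget",
            "https://example.com/cart/checkout",
            "https://example.com/search?q=widgets"):
    verdict = "crawl" if parser.can_fetch("ExampleBot", url) else "skip"
    print(url, "->", verdict)
# The shopping-cart and internal-search URLs are skipped; the product page is crawled.
```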
White hat versus black hat
Main article: White or black hat
SEO techniques are classified by some into two broad categories: techniques that search engines recommend as part of good design, and those techniques that search engines do not approve of and attempt to minimize the effect of, referred to as spamdexing. Some industry commentators classify these methods, and the practitioners who employ them, as either white hat SEO or black hat SEO. White hats tend to produce results that last a long time, whereas black hats anticipate that their sites will eventually be banned once the search engines discover what they are doing.

An SEO tactic, technique or method is considered white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines, but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the spiders, rather than attempting to game the algorithm. White hat SEO is in many ways similar to web development that promotes accessibility, although the two are not identical. White hat SEO is merely effective marketing, making efforts to deliver quality content to an audience that has requested the quality content. Traditional marketing means have allowed this through transparency and exposure. A search engine's algorithm takes this into account.

Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines, or that involve deception. One black hat technique uses text that is hidden, either as text colored similar to the background, in an invisible div, or positioned off screen. Another method gives a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking. Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms or by a manual site review. One infamous example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices. Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's list.
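As a rough illustration of how cloaking might be noticed (a naive heuristic, not a method described in the article), one can request the same URL with a browser-like and a crawler-like User-Agent and compare what comes back; the URL and user-agent strings are placeholders:

```python
# Sketch: a naive cloaking check. Fetch the same URL with two different
# User-Agent headers; a large difference between the responses can hint that
# visitors and crawlers are being served different content.
from urllib.request import Request, urlopen

def fetch(url, user_agent):
    req = Request(url, headers={"User-Agent": user_agent})
    with urlopen(req, timeout=10) as resp:
        return resp.read()

def looks_cloaked(url):
    as_browser = fetch(url, "Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
    as_crawler = fetch(url, "ExampleBot/1.0 (+http://example.com/bot)")
    # Crude heuristic: flag pages whose response sizes differ by more than 30%.
    bigger = max(len(as_browser), len(as_crawler))
    smaller = min(len(as_browser), len(as_crawler))
    return bigger > 0 and smaller / bigger < 0.7

# print(looks_cloaked("https://example.com/some-page"))  # hypothetical URL
```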
As a marketing strategy
SEO is not necessarily an appropriate strategy for every website, and other Internet marketing strategies can be much more effective, depending on the site operator's goals. A successful Internet marketing campaign may drive organic traffic to web pages, achieved through optimization techniques and not paid advertising, but it may also involve the use of paid advertising on search engines and other pages, building high quality web pages to engage and persuade, addressing technical issues that may keep search engines from crawling and indexing those sites, setting up analytics programs to enable site owners to measure their successes, and improving a site's conversion rate. This includes paid search advertising, which has its own version of SEO called ATO (Ad Text Optimization).

SEO may generate a return on investment. However, search engines are not paid for organic search traffic, their ranking algorithms (such as Google's PageRank) change, and there are no guarantees of continued referrals. Due to this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors. It is considered wise business practice for website operators to liberate themselves from dependence on search engine traffic. A top-ranked SEO blog, Seomoz.org, has suggested, "Search marketers, in a twist of irony, receive a very small share of their traffic from search engines." Instead, their main sources of traffic are links from other websites. (Some trading sites such as eBay can be a special case for this; eBay will announce how and when the ranking algorithm will change a few months before changing the algorithm.) Traditional marketing means have allowed this through transparency and exposure.

International markets
Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches. In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007. As of 2006, Google had an 85-90% market share in Germany. While there were hundreds of SEO firms in the US at that time, there were only about five in Germany. As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise. That market share is achieved in a number of countries. As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable markets where this is the case are China, Japan, South Korea, Russia and the Czech Republic, where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders. Successful search optimization for international markets may require professional translation of web pages, registration of a domain name with a top level domain in the target market, and web hosting that provides a local IP address. Otherwise, the fundamental elements of search optimization are essentially the same, regardless of language.

Search engine optimization methods
Search engine optimization methods are techniques used by webmasters to get more visibility for their sites in search engine results pages.

Getting indexed
The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine indexed pages do not need to be submitted because they are found automatically. Some search engines, notably Yahoo!, operate a paid submission service that guarantees crawling for either a set fee or cost per click. Such programs usually guarantee inclusion in the database, but do not guarantee specific ranking within the search results. Two major directories, the Yahoo Directory and the Open Directory Project, both require manual submission and human editorial review. Google offers Google Webmaster Tools, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that aren't discoverable by automatically following links. Search engine crawlers may look at a number of different factors when crawling a site. Not every page is indexed by the search engines. Distance of pages from the root directory of a site may also be a factor in whether or not pages get crawled.

Content Creation and Linking
Content creation is one of the primary focuses of any SEO's job. Almost all SEOs that provide organic search improvement focus heavily on creating this type of content, or "linkbait". Linkbait is a term used to describe content that is designed to be shared and replicated virally in an effort to gain backlinks. Without unique, relevant, and easily scannable content, users tend to spend little to no time paying attention to a website. Often, webmasters and content administrators create blogs to easily provide this information through a method that is intrinsically viral. However, most forget that traffic generated to blog accounts doesn't point back to their respective domains, so they lose "link juice"; link juice is jargon for links that provide a boost to PageRank and TrustRank. Changing the domain of the blog to a subdomain of the respective domain is a quick way to combat this siphoning of link juice. Other commonly implemented methodologies for creating and disseminating content include YouTube videos and Google Places accounts, as well as Picasa and Flickr photos indexed in Google Images searches. These additional forms of content allow webmasters to produce content that ranks well in the world's second most popular search engine, YouTube.

Other methods
A variety of other methods are employed to get a webpage indexed and shown higher in the results, and often a combination of these methods is used as part of a search engine optimization campaign:
- Cross linking between pages of the same website, and giving more links to the main pages of the website, to increase the PageRank used by search engines.
- Linking from other websites, including link farming and comment spam.
- A backlink from a Web directory.
- Keyword rich text in the webpage and key phrases, so as to match all search queries.
- Adding relevant keywords to a web page's meta tags, including keyword stuffing.
- URL normalization for webpages with multiple URLs, using the "canonical" meta tag (see the sketch below).
- Media content creation, like press releases and online newsletters, to generate incoming links.
- SEO trending based on recent search behaviour, using tools like Google Insights for Search.
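The URL normalization mentioned in the list above can be sketched with Python's urllib.parse; this is an illustrative normalizer (lowercasing, default-port and fragment removal, query sorting), not a canonical algorithm:

```python
# Sketch: collapse trivially different addresses for the same page to one form
# before link counts are aggregated.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

DEFAULT_PORTS = {"http": 80, "https": 443}

def normalize(url):
    parts = urlsplit(url)
    scheme = parts.scheme.lower()
    host = (parts.hostname or "").lower()
    if parts.port and parts.port != DEFAULT_PORTS.get(scheme):
        host = f"{host}:{parts.port}"                   # keep only non-default ports
    path = parts.path or "/"
    query = urlencode(sorted(parse_qsl(parts.query)))   # stable parameter order
    return urlunsplit((scheme, host, path, query, ""))  # fragment dropped

print(normalize("HTTP://Example.com:80/shoes?size=9&color=red#reviews"))
print(normalize("http://example.com/shoes?color=red&size=9"))
# Both print: http://example.com/shoes?color=red&size=9
```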
Gray hat techniques
Gray hat techniques are those that are neither really white nor black hat, and some of these gray hat techniques may be argued either way. These techniques might have some risk associated with them. A very good example of such a technique is purchasing links: while Google is against the sale and purchase of links, there are people who subscribe to online magazines, memberships and other resources for the purpose of getting a link back to their website, in addition to appearing in organic search results. The average price for a text link depends on the perceived authority of the linking page. The authority is sometimes measured by Google's PageRank, although this is not necessarily an accurate way of determining the importance of a page.

Another widely used gray hat technique is a webmaster creating multiple "micro-sites" which he or she controls for the sole purpose of cross linking to the target site. Since it is the same owner of all the micro-sites, this is a violation of the principles of the search engine's algorithms (by self-linking), but since ownership of sites is not traceable by search engines, it is impossible to detect, especially when using separate Class-C IPs, and therefore the micro-sites can appear as different sites.

Black hat techniques
Main article: spamdexing

Spamdexing
In computing, spamdexing (also known as search spam, search engine spam or web spam) is the deliberate manipulation of search engine indexes. It involves a number of methods, such as repeating unrelated phrases, to manipulate the relevance or prominence of resources indexed in a manner inconsistent with the purpose of the indexing system. Some consider it to be a part of search engine optimization, though there are many search engine optimization methods that improve the quality and appearance of the content of web sites and serve content useful to many users. Many search engines check for instances of spamdexing and will remove suspect pages from their indexes. Also, people working for a search-engine organization can quickly block the results-listing from entire websites that use spamdexing, perhaps alerted by user complaints of false matches. Common spamdexing techniques can be classified into two broad classes: content spam (or term spam) and link spam.

History
The rise of spamdexing in the mid-1990s made the leading search engines of the time less useful. The earliest known reference to the term spamdexing is by Eric Convey in his article "Porn sneaks way back on Web," The Boston Herald, May 22, 1996, where he said: The problem arises when site operators load their Web pages with hundreds of extraneous terms so search engines will list them among legitimate addresses. The process is called "spamdexing," a combination of spamming — the Internet term for sending users unsolicited information — and "indexing."

Content spam
These techniques involve altering the logical view that a search engine has over the page's contents.

Keyword stuffing
Keyword stuffing involves the calculated placement of keywords within a page to raise the keyword count, variety, and density of the page. This is useful to make a page appear to be relevant for a web crawler in a way that makes it more likely to be found. Search engines use a variety of algorithms to determine relevancy ranking. Some of these include determining whether the search term appears in the META keywords tag, others whether the search term appears in the body text or URL of a web page. They all aim at variants of the vector space model for information retrieval on text collections. Example: a promoter of a Ponzi scheme wants to attract web surfers to a site where he advertises his scam. He places hidden text appropriate for a fan page of a popular music group on his page, hoping that the page will be listed as a fan site and receive many visits from music lovers.
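A toy version of the vector space model referenced above shows why ranking on raw term frequency rewards keyword stuffing; the query and documents are invented:

```python
# Sketch: documents and a query as term-count vectors compared by cosine
# similarity. Repeating a keyword inflates the score for naive rankers.
from collections import Counter
from math import sqrt

def vector(text):
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

query = vector("tickets concert")
honest = vector("official site for concert dates and tickets for the tour")
stuffed = vector("tickets tickets tickets concert tickets concert tickets cheap tickets")

print(round(cosine(query, honest), 2))   # about 0.41
print(round(cosine(query, stuffed), 2))  # about 0.88, from repetition alone
```

As the article notes below, most modern engines explicitly check for this kind of inflation.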
Older versions of indexing programs simply counted how often a keyword appeared, and used that to determine relevance levels. Most modern search engines have the ability to analyze a page for keyword stuffing and determine whether the frequency is consistent with other sites created specifically to attract search engine traffic. Also, large webpages are truncated, so that massive dictionary lists cannot be indexed on a single webpage.

Hidden or invisible text
Unrelated hidden text is disguised by making it the same color as the background, using a tiny font size, or hiding it within HTML code such as "no frame" sections, alt attributes, zero-sized DIVs, and "no script" sections. People screening websites for a search-engine company might temporarily or permanently block an entire website for having invisible text on some of its pages. However, hidden text is not always spamdexing: it can also be used to enhance accessibility.

Meta-tag stuffing
This involves repeating keywords in the meta tags, and using meta keywords that are unrelated to the site's content. This tactic has been ineffective since 2005.

Doorway pages
"Gateway" or doorway pages are low-quality web pages created with very little content, which are instead stuffed with very similar keywords and phrases. They are designed to rank highly within the search results, but serve no purpose to visitors looking for information. A doorway page will generally have "click here to enter" on the page, or it will redirect the user to other sites.

Scraper sites
Scraper sites are created using various programs designed to "scrape" search-engine results pages or other sources of content and create "content" for a website. The specific presentation of content on these sites is unique, but is merely an amalgamation of content taken from other sources, often without permission. Such websites are generally full of advertising (such as pay-per-click ads). It is even feasible for scraper sites to outrank original websites for their own information and organization names.

Article spinning
Article spinning involves rewriting existing articles, as opposed to merely scraping content from other sites, to avoid penalties imposed by search engines for duplicate content. This process is undertaken by hired writers or automated using a thesaurus database or a neural network.

Link spam
Link spam is defined as links between pages that are present for reasons other than merit. Link spam takes advantage of link-based ranking algorithms, such as Google's PageRank, which gives websites higher rankings the more other highly ranked websites link to them. These techniques also aim at influencing other link-based ranking techniques such as the HITS algorithm.

Link farms
Link farms are tightly-knit communities of pages referencing each other, also known humorously as mutual admiration societies.

Hidden links
Hidden links are hyperlinks placed where visitors will not see them, in order to increase link popularity. Highlighted link text can help rank a webpage higher for matching that phrase.

Sybil attack
A Sybil attack is the forging of multiple identities for malicious intent, named after the famous multiple personality disorder patient "Sybil" (Shirley Ardell Mason). A spammer may create multiple web sites at different domain names that all link to each other, such as fake blogs (known as spam blogs).

Spam blogs
Spam blogs are blogs created solely for commercial promotion and the passage of link authority to target sites. Often these "splogs" are designed in a misleading manner that will give the effect of a legitimate website, but upon close inspection will often be written using spinning software or be very poorly written, with barely readable content. They are similar in nature to link farms.

Link-building software
A common form of link spam is the use of link-building software to automate the search engine optimization process.
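The link-based ranking these techniques try to exploit can be sketched with a small power-iteration PageRank over an invented link graph (a simplified illustration, not Google's production algorithm):

```python
# Sketch: power-iteration PageRank on a toy graph. Pages in a tightly-knit
# cluster that link to each other accumulate rank, which is what link farms
# and Sybil-style networks of sites try to exploit.
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {}
        for p in pages:
            incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - damping) / len(pages) + damping * incoming
        rank = new
    return rank

# "a" and "b" form a small mutual-linking cluster; "c" only links outward.
links = {"a": {"b"}, "b": {"a"}, "c": {"a", "b"}}
for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))   # the mutually linked pages outrank "c"
```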
Buying expired domains
Some link spammers monitor DNS records for domains that will expire soon, then buy them when they expire and replace the pages with links to their pages. See Domaining. However, Google resets the link data on expired domains. Some of these techniques may be applied for creating a Google bomb, that is, to cooperate with other users to boost the ranking of a particular page for a particular query.

Cookie stuffing
Cookie stuffing involves placing an affiliate tracking cookie on a website visitor's computer without their knowledge, which will then generate revenue for the person doing the cookie stuffing. This not only generates fraudulent affiliate sales, but also has the potential to overwrite other affiliates' cookies, essentially stealing their legitimately earned commissions.

Page hijacking
Page hijacking is achieved by creating a rogue copy of a popular website which shows contents similar to the original to a web crawler, but redirects web surfers to unrelated or malicious websites.

Using world-writable pages
Main article: forum spam
Web sites that can be edited by users can be used by spamdexers to insert links to spam sites if the appropriate anti-spam measures are not taken. Automated spambots can rapidly make the user-editable portion of a site unusable. Guest books, forums, blogs, and any site that accepts visitors' comments are particular targets, and are often victims of drive-by spamming, where automated software creates nonsense posts with links that are usually irrelevant and unwanted.

Spam in blogs
Spam in blogs is the placing or solicitation of links randomly on other sites, placing a desired keyword into the hyperlinked text of the inbound link.

Comment spam
Comment spam is a form of link spam that has arisen in web pages that allow dynamic user editing, such as wikis, blogs, and guestbooks. It can be problematic because agents can be written that automatically randomly select a user edited web page, such as a Wikipedia article, and add spamming links.

Wiki spam
Wiki spam is a form of link spam on wiki pages. The spammer uses the open editability of wiki systems to place links from the wiki site to the spam site. The subject of the spam site is often unrelated to the wiki page where the link is added. In early 2005, Wikipedia implemented a default "nofollow" value for the "rel" HTML attribute. Links with this attribute are ignored by Google's PageRank algorithm. Forum and Wiki admins can use these to discourage Wiki spam.

Referrer log spamming
Referrer spam takes place when a spam perpetrator or facilitator accesses a web page (the referee) by following a link from another web page (the referrer), so that the referee is given the address of the referrer by the person's Internet browser. Some websites have a referrer log which shows which pages link to that site. By having a robot randomly access many sites enough times, with a message or specific address given as the referrer, that message or Internet address then appears in the referrer logs of those sites that have referrer logs. Since some Web search engines base the importance of sites on the number of different sites linking to them, referrer-log spam may increase the search engine rankings of the spammer's sites. Also, site administrators who notice the referrer log entries in their logs may follow the link back to the spammer's referrer page.
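The rel="nofollow" mechanism mentioned under Wiki spam can be illustrated with a short sketch of a parser that separates links a crawler would count from links it would ignore; the HTML snippet is invented:

```python
# Sketch: honoring rel="nofollow" on user-submitted links so that they pass
# no ranking credit, as wikis and blog platforms began doing around 2005.
from html.parser import HTMLParser

class FollowableLinks(HTMLParser):
    def __init__(self):
        super().__init__()
        self.followed = []
        self.ignored = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        rel = (attrs.get("rel") or "").lower().split()
        (self.ignored if "nofollow" in rel else self.followed).append(href)

html = ('<p>See <a href="/docs">the docs</a> and '
        '<a rel="nofollow" href="http://spam.example/pills">a comment link</a>.</p>')
parser = FollowableLinks()
parser.feed(html)
print("counted for ranking:", parser.followed)   # ['/docs']
print("ignored (nofollow):", parser.ignored)     # ['http://spam.example/pills']
```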
Programmers have developed a variety of automated spam prevention techniques to block or at least slow down spambots.
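One common measure of this kind is a hidden "honeypot" form field that humans never see but naive spambots fill in; a minimal sketch, with invented field names and thresholds:

```python
# Sketch: two cheap spambot signals on a comment form. The "website_url"
# field is hidden from humans with CSS, so any value in it suggests a bot;
# an implausibly fast submission is a second hint. Names are illustrative.
def looks_like_spambot(form_data, rendered_at, submitted_at, min_seconds=3.0):
    if form_data.get("website_url"):              # honeypot field was filled in
        return True
    if submitted_at - rendered_at < min_seconds:  # submitted inhumanly fast
        return True
    return False

print(looks_like_spambot({"comment": "Nice post!", "website_url": ""}, 0.0, 25.0))       # False
print(looks_like_spambot({"comment": "buy pills", "website_url": "http://x"}, 0.0, 0.4))  # True
```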