Search engine optimization
From Wikipedia, the free encyclopedia

Search engine optimization (SEO), considered by many to be a subset of
search engine marketing, is a term used to describe a process of improving the volume or quality of traffic to a
web site from search engines, usually in "natural" ("organic" or "algorithmic")
search results. Those efforts may also be seen in more narrow vertical search engines involving areas such as local search. Many site owners and consultants engaging in SEO attempt to pursue qualified visitors to a site, and the quality of visitor traffic can be measured by how often a visitor using a specific keyword phrase leads to a desired
conversion action, such as making a purchase, viewing or downloading a certain page, requesting further information, signing up for a newsletter, or taking some other specific action.
In a broad sense, SEO is
marketing by understanding how search
algorithms work and what human visitors might search for, to help match those visitors with sites offering what they are interested in finding. Creating web pages with SEO in mind does not necessarily mean creating content more favorable to algorithms than human visitors. Some SEO efforts may involve optimizing a site's coding, presentation, and structure, without making very noticeable changes to human visitors, such as incorporating a clear hierarchical structure to a site, and avoiding or fixing problems that might keep search engine indexing programs from fully spidering a site. Other, more noticeable efforts, involve including unique content on pages that can be easily indexed and extracted from those pages by search engines while also appealing to human visitors.
The term SEO can also refer to "search engine optimizers," a term adopted by an industry of
consultants who carry out optimization projects on behalf of clients, and by employees of site owners who may perform SEO services in-house.
Search engine optimizers often offer SEO as a stand-alone service or as a part of a larger marketing campaign. Because effective SEO can require making changes to the source code of a site, it is often very helpful when incorporated into the initial development and design of a site, leading to the use of the term "Search Engine Friendly" to describe designs, menus,
content management systems and
shopping carts that can be optimized easily and effectively.
SEO targets the algorithms of popular search engine companies.
History
Origin: Early search engines
Webmasters and content providers began optimizing sites for search engines in the mid-
1990s, as the first search engines were cataloging the early
Web.
Initially, all a
webmaster needed to do was submit a page, or
URI, to the various engines which would send a
spider to "crawl" that page, extract links to other pages from it, and return information found on the page to be
indexed.
[1] The process involves a search engine spider downloading a page and storing it on the search engine's own server, where a second program, known as an
indexer, extracts various information about the page, such as the words it contains, where they are located, any weight given to specific words, and all the links the page contains, which are then placed into a scheduler for crawling at a later date.
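The following minimal Python sketch illustrates this crawl-and-index flow; the example URL and all parsing details are illustrative assumptions rather than any engine's actual implementation.

```python
# Illustrative sketch only: one "spider" fetch plus the "indexer" extraction step.
from collections import Counter
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class PageParser(HTMLParser):
    """Collects the visible words and outgoing links of a single page."""

    def __init__(self):
        super().__init__()
        self.words, self.links = [], []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_data(self, data):
        self.words.extend(data.lower().split())


def crawl_and_index(url):
    """Spider step: download the page. Indexer step: record word counts
    (a crude 'weight') and collect outgoing links to schedule for later crawling."""
    html = urlopen(url).read().decode("utf-8", errors="ignore")
    parser = PageParser()
    parser.feed(html)
    term_counts = Counter(parser.words)
    outlinks = [urljoin(url, link) for link in parser.links]
    return term_counts, outlinks


if __name__ == "__main__":
    counts, links = crawl_and_index("http://example.com/")  # placeholder URL
    print(counts.most_common(5))
    print(links)
```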
Site owners started to recognize the value of having their sites highly ranked and visible in search engine results, creating an opportunity for both "white hat" and "black hat" SEO practitioners. Indeed, by
1996,
spam messages touting SEO services could be found on Usenet.
[2][3] The earliest known use of the phrase "search engine optimization" was a spam message posted on
Usenet on July 26, 1997.
[4]
At first, search engines were supplied with information about pages by the webmasters themselves. Early versions of search
algorithms relied on webmaster-provided information such as the keyword
meta tag, or index files in engines like
ALIWEB. Meta tags provided a guide to each page's content, but indexing pages based upon meta data proved less than reliable, because some webmasters abused meta tags by including irrelevant keywords to artificially increase page impressions for their website and to increase their ad revenue.
Cost per thousand impressions was at the time the common means of monetizing content websites. Inaccurate, incomplete, and inconsistent meta data in meta tags caused pages to rank for irrelevant searches, and to fail to rank for relevant searches.[5]
Search engines responded by developing more complex ranking
algorithms, taking into account additional factors including:
Text within the title element
Domain name
URL directories and file names
HTML tags: headings, emphasized (<em>) and strongly emphasized (<strong>) text
Term frequency, both in the document and globally, often misunderstood and mistakenly referred to as Keyword density
On page keyword proximity
On page keyword adjacency
On page keyword sequence
Alt attributes for images
Text within NOFRAMES tags
Web content development
Sitemaps
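A toy Python sketch of how a handful of the on-page factors listed above might be combined into a single relevance score is shown below; the weights and the page structure are purely illustrative assumptions, not the factors or weights used by any actual engine.

```python
# Toy relevance scoring over a few of the on-page factors listed above.
# The weights are arbitrary illustrative values.
def on_page_score(query_terms, page):
    """page: dict with 'title', 'headings', 'body', and 'url' strings."""
    score = 0.0
    body_words = page["body"].lower().split()
    for term in (t.lower() for t in query_terms):
        if term in page["title"].lower():
            score += 3.0                 # text within the title element
        if term in page["headings"].lower():
            score += 2.0                 # heading / emphasized tags
        if term in page["url"].lower():
            score += 1.0                 # domain, URL directories, file names
        if body_words:                   # term frequency within the document
            score += body_words.count(term) / len(body_words)
    return score


page = {
    "title": "Search engine optimization",
    "headings": "History of early search engines",
    "body": "search engine optimization aims to improve search traffic",
    "url": "http://example.com/search-engine-optimization.html",
}
print(on_page_score(["search", "optimization"], page))
```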
No major search engine currently states that it considers meta keywords in its ranking algorithms, the way that AltaVista did in the late 1990s. The actual value of meta keywords is not readily known, however, because the search engines keep the details of their ranking methods secret; using meta keywords in web pages may therefore be of little value, though some sites continue to do so. For example, the source code of Wikipedia pages shows that meta keywords are used. The "description" meta tag is, however, claimed by most SEO experts to be more important, and is recommended by Yahoo! in its search indexing help page[6].
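The sketch below (assumptions only; the sample tags are invented) shows how an indexer might read these self-reported keywords and description meta tags, which is also why they are so easy for a page author to abuse.

```python
# Minimal, illustrative reader for the self-reported meta tags discussed above.
from html.parser import HTMLParser


class MetaTagParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.keywords, self.description = [], ""

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        name = (attrs.get("name") or "").lower()
        content = attrs.get("content") or ""
        if name == "keywords":
            self.keywords = [k.strip() for k in content.split(",") if k.strip()]
        elif name == "description":
            self.description = content


parser = MetaTagParser()
parser.feed('<meta name="keywords" content="recipes, chef, cooking">'
            '<meta name="description" content="A cooking site.">')
print(parser.keywords)     # ['recipes', 'chef', 'cooking'] -- whatever the author claims
print(parser.description)  # 'A cooking site.'
```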
Web content providers also manipulated a number of attributes within the HTML source of a page in an attempt to rank well in search engines.[7]
By relying so much upon factors exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their SERPs showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This led to the rise of a new kind of search engine.
More sophisticated ranking algorithms
Google brought a new concept to evaluating web pages. This concept, called PageRank, has been important to the Google algorithm from the start.[8] PageRank is an algorithm that weights a page's importance based upon the quantity and quality of incoming links. PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are more valuable than others, as a higher PageRank page is more likely to be reached by the random surfer.
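A minimal power-iteration sketch of this random-surfer idea is given below; the four-page link graph is invented for illustration, and the damping factor of 0.85 is the value suggested in the original PageRank paper.

```python
# Illustrative power-iteration PageRank over a tiny invented link graph.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:                        # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank


graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
print(pagerank(graph))  # "C" ends up highest: it receives the most (and best-ranked) inlinks
```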
The PageRank algorithm proved very effective, and Google began to be perceived as serving the most relevant search results. On the back of strong word of mouth from programmers, Google became a popular search engine. Off-page factors such as PageRank and hyperlink analysis were considered as well as on-page factors to enable Google to avoid the kind of manipulation seen in search engines focusing primarily upon on-page factors for their rankings.
Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, an earlier engine that also used off-page factors, and these methods proved similarly applicable to gaining PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale, and an online industry emerged that focused on selling links designed to improve PageRank and link popularity. Because they also drive human visitors, links from pages with higher PageRank sell for more money.
A proxy for the PageRank metric is still displayed in the Google Toolbar, though the displayed value is rounded to the nearest integer, and the toolbar is believed to be updated less frequently than the value used internally by Google. In 2002 a Google spokesperson stated that PageRank is only one of more than 100 algorithms used in ranking pages, and that while the PageRank toolbar is interesting for users and webmasters, "the value to search engine optimization professionals is limited" because the value is only an approximation.[9] Many experienced SEOs recommend ignoring the displayed PageRank.[10]
Google, and other search engines, have over the years developed a wider range of off-site factors for use in their algorithms. The Internet was reaching a vast population of non-technical users who were often unable to use advanced querying techniques to reach the information they were seeking, and the sheer volume and complexity of the indexed data had grown far beyond that of the early days. Combined with increases in processing power, this led search engines to develop predictive, semantic, linguistic and heuristic algorithms. Around the same time as the work that led to Google, IBM had begun work on the Clever Project[11], and Jon Kleinberg was developing the HITS algorithm.
A search engine may use hundreds of factors in ranking the listings on its SERPs; the factors themselves and the weight each carries can change continually, and algorithms can differ widely, so a web page that ranks #1 in one search engine could rank #200 in another, or even in the same search engine a few days later.
Google, Yahoo, Microsoft and Ask.com do not disclose the algorithms they use to rank pages. Some SEOs have carried out controlled experiments to gauge the effects of different approaches to search optimization. Based on these experiments, often shared through online forums and blogs, professional SEOs attempt to form a consensus on what methods work best, although consensus is rarely, if ever, actually reached. SEO-focused communities are, in some respects, anti-collaborative, as the very nature of SEO requires establishing a significant competitive advantage over other practitioners. For this reason, those disclosing the greatest number of tips and algorithmic nuances are rarely the most skilled. As the community selects against full disclosure, due to market pressure, the information available to the public should not be interpreted as anything but the most well-known and historically-known practices.
SEOs widely agree that the signals that influence a page's rankings include:[12][13][14]
Keywords in the title tag.
Keywords in links pointing to the page.
Keywords appearing in visible text.
Link popularity.
PageRank of the page (for Google).
Keywords in heading tags (H1, H2, and H3) on the page.
Links from one page to inner pages.
Placement of a punch line at the top of the page.
There are many other signals that may affect a page's ranking, indicated in a number of patents held by various search engines, such as historical data[15].
More than just concern for algorithms
Search engine optimization often involves more than just rankings. Improving the quality of a page's search listing can lead more users to select that page. Factors that may improve search listing quality include good copywriting, such as an attention-grabbing title, an interesting description, and a domain and URL that reinforce the legitimacy of the site. Some commentators have noted that domains with many hyphens look spammy and may discourage click-throughs.[16][17]
Relationship between SEO and search engines
The first mentions of Search Engine Optimization do not appear on Usenet until 1997, a few years after the launch of the first Internet search engines. The operators of search engines recognized quickly that some people from the webmaster community were making efforts to rank well in their search engines, and even manipulating the page rankings in search results. In some early search engines, such as Infoseek, ranking first was as easy as grabbing the source code of the top-ranked page, placing it on your website, and submitting a URL to instantly index and rank that page.
Due to the high value and targeting of search results, there is potential for an adversarial relationship between search engines and SEOs. In 2005, an annual conference named AirWeb[18] was created to discuss bridging the gap and minimizing the sometimes damaging effects of aggressive web content providers.
Some more aggressive site owners and SEOs generate automated sites or employ techniques that eventually get domains banned from the search engines. Many search engine optimization companies, which sell services, employ long-term, low-risk strategies, and most SEO firms that do employ high-risk strategies do so on their own affiliate, lead-generation, or content sites, instead of risking client websites.
Some SEO companies employ aggressive techniques that get their client websites banned from the search results. The Wall Street Journal profiled a company that allegedly used high-risk techniques and failed to disclose those risks to its clients.[19] Wired reported the same company sued a blogger for mentioning that they were banned.[20] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[21]
Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences and seminars. In fact, with the advent of paid inclusion, some search engines now have a vested interest in the health of the optimization community. All of the main search engines provide information and guidelines to help with site optimization: Google's, Yahoo!'s, MSN's, and Ask.com's. Google has a Sitemaps program[22] to help webmasters learn if Google is having any problems indexing their website, and it also provides data on Google traffic to the website. Yahoo! has Site Explorer, which provides a way to submit URLs for free (as MSN and Google do), determine how many pages are in the Yahoo! index, and drill down on inlinks to deep pages. Yahoo! has an Ambassador Program[23] and Google has a program for qualifying Google Advertising Professionals[24].
Getting into search engines' databases
Today's major search engines, by and large, do not require any extra effort to submit to, as they are capable of finding pages via links on other sites.
However, Google and Yahoo offer submission programs, such as Google Sitemaps, for which an XML type feed can be created and submitted. Generally, however, a simple link from a site already indexed will get the search engines to visit a new site and begin spidering its contents. It can take a few days or even weeks from the acquisition of a link from such a site for all the main search engine spiders to begin indexing a new site, and there is usually not much that can be done to speed up this process.
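As a sketch, a minimal XML feed of the kind such submission programs accept can be generated as follows; the listed URLs are placeholders, while the urlset/url/loc elements and namespace come from the public Sitemaps protocol.

```python
# Minimal Sitemaps-protocol feed generator; the listed URLs are placeholders.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def build_sitemap(urls):
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")


print(build_sitemap([
    "http://example.com/",
    "http://example.com/about.html",
]))
```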
Once the search engine finds a new site, it uses a crawler program to retrieve and index the pages on the site. Pages can only be found when linked to with visible hyperlinks. However, some search engines, such as Google, are starting to read links created within Flash.
Search engine crawlers may look at a number of different factors when crawling a site, and many pages from a site may not be indexed by the search engines until they gain more PageRank, links or traffic. Distance of pages from the root directory of a site may also be a factor in whether or not pages get crawled, as well as other importance metrics. Cho et al.[25] described some standards for those decisions as to which pages are visited and sent by a crawler to be included in a search engine's index.
A few search engines, such as Yahoo!, operate paid submission services that guarantee crawling for either a set fee or on a cost-per-click (CPC) basis. Such programs usually guarantee inclusion in the database, but do not guarantee specific rankings within the search results.
Blocking robots
Main article: robots.txt
Webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots.
When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled.
Pages typically prevented from being crawled include login specific pages such as shopping carts and user-specific content such as search results from internal searches.
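The sketch below, using Python's standard-library robots.txt parser, illustrates how a well-behaved crawler honors such rules; the example rules and URLs are illustrative assumptions.

```python
# Illustrative robots.txt handling with the standard-library parser.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /cart/
Disallow: /search
""".strip().splitlines()

rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "http://example.com/products.html"))   # True: allowed
print(rp.can_fetch("*", "http://example.com/cart/checkout"))   # False: crawler should skip
print(rp.can_fetch("*", "http://example.com/search?q=shoes"))  # False: internal search results
```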
Classifications
SEO techniques are classified by some into two broad categories: techniques that search engines recommend as part of good design, and those techniques that search engines do not approve of and attempt to minimize the effect of, referred to as spamdexing. Most professional SEO consultants do not offer spamming and spamdexing techniques amongst the services that they provide to clients. Some industry commentators classify these methods, and the practitioners who utilize them, as either "white hat SEO", or "black hat SEO".[26] Many SEO consultants reject the black and white hat dichotomy as a convenient but unfortunate and misleading over-simplification that makes the industry look bad as a whole. The comparison of white hat to black hat (spamdexing) methods is analogous to "positioning" compared to "guerilla marketing", with the latter spoiling the reputation of marketing as a whole.
Preferred "White hat" methods
An SEO tactic, technique or method is considered "White hat" if it conforms to the search engines' guidelines and/or involves no deception. As the search engine guidelines[27][28][29][30][31] are not written as a series of rules or commandments, this is an important distinction to note. White Hat SEO is not just about following guidelines, but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see.
White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the search engines' spiders, rather than attempting to game the system. White hat SEO is in many ways similar to web development that promotes accessibility[32], although the two are not identical.
Spamdexing "Black hat" methods
Main article: Spamdexing
"Black hat" SEO are methods to try to improve rankings that are disapproved of by the search engines and/or involve deception. This can range from text that is "hidden", either as text colored similar to the background or in an invisible or left of visible div, or by redirecting users from a page that is built for search engines to one that is more human friendly. A method that sends a user to a page that was different from the page the search engined ranked is Black hat as a rule. One well known example is Cloaking, the practice of serving one version of a page to search engine spiders/bots and another version to human visitors.
Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms or by a manual review of a site.
One infamous example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for the use of deceptive practices.[33] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's index.
SEO and marketing
There is a considerable body of SEO practitioners who see search engines as just another visitor to a site, and who try to make the site as accessible to those visitors as to any other visitor who comes to its pages. They often see the white hat/black hat dichotomy mentioned above as a false dilemma. The focus of their work is not primarily to rank the highest for certain terms in search engines, but rather to help site owners fulfill the business objectives of their sites. Indeed, ranking well for a few terms among the many possibilities does not guarantee more sales. A successful Internet marketing campaign may drive organic search results to pages, but it may also involve the use of paid advertising on search engines and other pages, building high quality web pages to engage and persuade, addressing technical issues that may keep search engines from crawling and indexing those sites, setting up analytics programs to enable site owners to measure their successes, and making sites accessible and usable.
SEOs may work in-house for an organization, or as consultants, and search engine optimization may be only part of their daily functions. Their knowledge of how search engines function often comes from interacting and discussing the topic on forums, through blogs, at popular conferences and seminars, and from experimentation on their own sites. Few college courses cover online marketing from an e-commerce perspective in a way that keeps up with the changes the web sees on a daily basis.
SEO, as a marketing strategy, can often generate a good return. However, because the search engines are not paid for the traffic they send from organic search, and because the algorithms used can and do change, there are no guarantees of success, either in the short or long term. Due to this lack of guarantees and certainty, SEO is often compared to traditional public relations (PR), with PPC advertising closer to traditional advertising. An increase in visitors is analogous to an increase in foot traffic in retail advertising. Increased traffic may be detrimental to success if the site is not prepared to handle it or if visitors are generally dissatisfied with what they find. In either case, increased traffic does not guarantee increased sales or success.
While endeavoring to meet the guidelines posted by search engines can help build a solid foundation for success on the web, such efforts are only a start. SEO is potentially more effective when combined with a larger marketing campaign strategy. Despite SEO's potential to respond to the latest changes in market trends, SEO alone reactively follows market trends rather than proactively leading them. Many see search engine marketing as a larger umbrella under which search engine optimization fits, but many who focused primarily on SEO in the past are incorporating more and more marketing ideas into their efforts, including public relations strategy and implementation, online display media buying, web site transition SEO, web trends data analysis, HTML e-mail campaigns, and business blog consulting, making SEO firms more like ad agencies.
In addition, whilst SEO can be considered a marketing tactic unto itself, it is often considered by industry experts to be a single part of a greater whole.[citation needed] Marketing through other methods, such as viral marketing, pay-per-click, new media marketing and other related means, is by no means irrelevant, and indeed can be crucial to maintaining a strong search engine rank.[citation needed] The part of SEO that simply ensures content relevancy and attracts inbound link activity may also be enhanced through broad target marketing methods such as print, broadcast and out-of-home advertising.
SEO and Social Media Optimization (SMO)
The rise of social media and social networking created, as a side effect, a whole new form of marketing, called social media optimization (SMO) or "social media marketing" (SMM). SMO is often confused with SEO.
SMO is related to SEO, but the two are not the same, and a different skill set is needed for each. SMO is more closely related to buzz marketing or word-of-mouth marketing, with some positive SEO side effects.
Legal issues
In 2002, SearchKing filed suit in an Oklahoma court against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted an unfair business practice. This may be compared to lawsuits that email spammers have filed against spam-fighters, as in various cases against MAPS and other DNSBLs. In January 2003, the court granted summary judgment in Google's favor.[34] In March 2006, KinderStart.com, LLC filed a first amended complaint against Google and also attempted to include potential members of the class of plaintiffs in a class action.[35] The plaintiff's web site was removed from Google's index prior to the lawsuit, and the amount of traffic to the site plummeted.
References
^ Finding What People Want: Experiences with the WebCrawler The Second International WWW Conference Chicago, USA, October 17-20, 1994, written by Brian Pinkerton
^ Comment by Dan Thies, January 11, 2007
^ Example Email from Google Groups
^ Usenet post mentioning SEO, July 26, 1997
^ Metacrap: Putting the torch to seven straw-men of the meta-utopia, written by Cory Doctorow, Version 1.3: 26 August 2001
^ Yahoo! Search Help - Search Indexing, Accessed February 18, 2007
^ What is a tall poppy among web pages?, Proceedings of the seventh conference on World Wide Web, Brisbane, Australia, 1998, written by Pringle, G., Allison, L., and Dowe, D.
^ Brin, Sergey and Page, Larry, The Anatomy of a Large-Scale Hypertextual Web Search Engine, Proceedings of the seventh international conference on World Wide Web 7, 1998, Pages: 107-117
^ PageRank Uncovered
^ WebmasterWorld.com - search engine optimization forum
^ The Clever Project, May 4, 2006
^ Search Engine Watch - Search Engine News and Forums. Organizer of SES (Search Engine Strategies) Conferences.
^ Search Engine Ranking Factors frequently updated by Rand Fishkin, SEOMoz.org
^ Google Ranking Factors - SEO Checklist updated frequently, Vaughn's One-Page Summaries
^ Information Retrieval Based on Historical Data, Google Patent Application, October 10, 2005
^ How URLs Can Affect Top Search Engine Rankings by John Heard, April 24, 2006, MarketPosition.com
^ Hyphen Filter SEOChat Thread, May 2004
^ AirWeb Adversarial Information Retrieval on the Web, annual conference and workshop for researchers and professionals
^ Startup Journal (Wall Street Journal), 'Optimize' Rankings At Your Own Risk by David Kesmodel at The Wall Street Journal Online, September 9 2005
^ Wired Magazine, Legal Showdown in Search Fracas, September 8, 2005, written by Adam L. Penenberg
^ Cutts, Matt, Confirming a penalty, published February 2, 2006, at the Matt Cutts blog
^ Google Web Master Central, formerly known as Google Sitemaps
^ Ambassador Program by Yahoo! Search Marketing
^ Google Advertising Professionals, a Program by Google AdWords, Google's Pay-Per-Click Advertising Service
^ Efficient crawling through URL ordering by Cho, J., Garcia-Molina, H. , 1998, published at "Proceedings of the seventh conference on World Wide Web", Brisbane, Australia
^ Goodman, Andrew, SearchEngineWatch, Search Engine Showdown: Black Hats vs. White Hats at SES
^ Ask.com Editorial Guidelines
^ Google's Guidelines on SEOs
^ Google's Guidelines on Site Design
^ MSN Search Guidelines for successful indexing
^ Yahoo! Search Content Quality Guidelines
^ Andy Hagans, A List Apart, High Accessibility Is Effective Search Engine Optimization
^ Ramping up on international webspam by Matt Cutts, published February 4, 2006, at Matt Cutts Blog
^ Google replies to SearchKing lawsuit, James Grimmelmann at LawMeme (research.yale.edu), January 9, 2006
^ (PDF) KinderStart.com, LLC, et al v. Google, Inc., C 06-2057 RS, filed March 17, 2006 in the Northern District of California, San Jose Division.
See also
Affiliate marketing
Free content
Internet marketing
Landing Pages
SEO contest
Search engine marketing
Spamdexing
Web syndication
Wikipedia:Search engine optimization
Big Daddy Google
SEO Organizations
Organization of Search Engine Optimization Professionals
Search Marketing Association - North America
Search Engine Marketing Professional Organization (SEMPO)
Notable SEOs
See: Category:Search engine optimization consultants
Search Engine Representatives
Matt Cutts
Jeremy Zawodny
Retrieved from "http://en.wikipedia.org/wiki/Search_engine_optimization"