SEO | SEO ON PAGE | ABOUT SEO | HOW TO DO SEO | WHAT IS SEO


What is SEO?
 SEO stands for "search engine optimization." It is the process of getting traffic from the "free," "organic," "editorial" or "natural" listings on search engines. All major search engines, such as Google, Yahoo! and Bing, have such results, where web pages and other content such as videos or local listings are shown and ranked based on what the search engine considers most relevant to users. Payment isn't involved, as it is with paid search ads.
                         
Who Uses SEO:
If a website is currently ranked #10 on Google for the search phrase "how to make egg rolls" but wants to rise to #1, this website needs to consider SEO. Because search engines have become more and more popular on the web, nearly anyone trying to get seen on the web can benefit from a little SEO.

  History of Search Engines:
   In the early days of Internet development, its users were a privileged minority and the amount of available information was relatively small. Access was mainly restricted to employees of various universities and laboratories who used it to access scientific information. In those days, the problem of finding information on the Internet was not nearly as critical as it is now.

   Site directories were one of the first methods used to facilitate access to information resources on the network. Links to these resources were grouped by topic. Yahoo was the first project of this kind, opened in April 1994. As the number of sites in the Yahoo directory inexorably increased, the developers of Yahoo made the directory searchable. Of course, it was not a search engine in its true form because searching was limited to those resources whose listings were put into the directory. It did not actively seek out resources, and the concept of SEO was yet to arrive.

   Such link directories have been used extensively in the past, but nowadays they have lost much of their popularity. The reason is simple – even modern directories with lots of resources only provide information on a tiny fraction of the Internet. For example, the largest directory on the network is currently DMOZ (or Open Directory Project). It contains information on about five million resources. Compare this with the Google search engine database containing more than eight billion documents.

   The WebCrawler project started in 1994 and was the first full-featured search engine. The Lycos and AltaVista search engines appeared in 1995, and for many years AltaVista was the major player in this field.

   In 1997 Sergey Brin and Larry Page created Google as a research project at Stanford University. Google is now the most popular search engine in the world.

   Currently, there are three leading international search engines – Google, Yahoo and MSN Search. They each have their own databases and search algorithms. Many other search engines use results originating from these three major search engines, and the same SEO expertise can be applied to all of them. For example, the AOL search engine (search.aol.com) uses the Google database, while AltaVista, Lycos and AllTheWeb all use the Yahoo database.




                                                  SEO GUIDELINES

 If you’re new to the world of search engine optimization, you may not know where to start. You may be fascinated with the idea of doing business around the world, but the reality is that you have to be seen before you can sell. This article covers ten basic facts to keep in mind about how the Internet and search engines work when you’re busy building a web site.

With the advent of computers and the Internet, a whole new meaning was given to the term "globalization." Websites came into being, and what was initially seen as a sales gimmick for large-scale companies is today taken as an essential sales strategy, even for local small-scale businesses. Almost everyone in the commercial sector is aware that websites are essential if one is looking to succeed and grow in their business sphere. It is no wonder, then, that everyone from Coke to a small-scale bulb producer in Holland boasts a website.

Business people have realized that depriving your business of a website can translate to depriving your business of clients. The entire globe is a consolidated market. All one needs to do is to set up an efficient web site and tap potential customers worldwide.

This is where Search Engine Optimization steps in. Search engine optimization is the art of developing web content around certain keywords that help attract traffic to the site. There is no doubt that there are several other ways of diverting traffic toward your website, such as pay-per-click ads, link advertising, and so forth. However, it has been observed that search engine optimized content development is one of the most effective as well as easiest methods of increasing web traffic.

A few facts about SEO development will help you understand the importance of the strategy as well as learn how to develop appropriate search engine friendly content.

Most prospects come via search engines – Regardless of how easy it is to remember your web site's name, it is probable that up to 90 percent of the traffic it sees arrives via search engines. Most people use keywords to look for providers, and then compare the names that come up. Therefore, it is imperative that you do not lose out on these prospects, who may not have heard of your company but are looking for products that you sell. Ignoring search engine optimization means losing out to competitors who sell similar products.

Well-known search engines work the best – There are several search engines out there, but ask around and most people will stop after naming Google and Yahoo. These search engines are very well-known, therefore they are the most used, and they're set up to handle the traffic you'd expect from such popular sites. Big-name search engines will ensure that the host site is functional 99 percent of the time. A host site that is frequently down repels prospects, which automatically translates to less traffic for the website being hosted. So the trick is to select search engines that are popular and keep their sites operational almost always.

Most users use more than one keyword to search – Another important fact about SEO is that keyword selection must be approached from the point of view of the users. It is a known fact that most users use a search phrase rather than a single word. It is therefore imperative to create content that is not merely keyword rich but rather key-phrase rich. Instead of using a keyword as common as, say, "shoes," it would be better to use "red shoes," "cheap shoes," "leather shoes" and so on. Statistics show that 32.58% of users use two-word phrases when using a search engine.

Tracking the user – The golden rule of commercial success is to deliver in accordance with the client's needs. If you know what the customer wants, you are much better placed to make the sale. Similarly, search engine optimization can be mastered if you know what the prospect is looking for and the means they adopt for this search.

Interestingly, there are several methods that allow you to track your target customers. Tracking will enable you to assess which search engines are being used by your prospects and what types of keywords they are employing in this search. Armed with this knowledge, you shall be able to tame the SEO demon easily. Develop content around the popular keywords that prospects are using to search for products similar to yours. Once the content has been developed, place it on your site and make sure it is indexed by the search engines that the prospects are using the most.

Optimizing for multiple search engines – Choosing to optimize for a single search engine is like placing all your eggs in one basket. While it is true that one must try to optimize for a popular search engine, this does not mean that you can afford to ignore all the others (though that is becoming more and more true with Google’s increasing dominance). A more coherent SEO strategy would be one where several search engines are considered as means to deliver the site to the prospect. Each person visiting your site has the potential to convert into a client. Therefore the more search engines you use for this purpose the more traffic you add to your site.

Check and recheck – Once you have created search engine optimized content and posted it, you cannot simply put your feet up and expect conversions. You must check and recheck to confirm that the content is actually working. Tracking comes in handy for this once more. E-commerce support software that lets you analyze how many hits the site received, what keywords the prospects used, which search engines they employed, how long the prospects stayed on the site, which pages they viewed, how many prospects converted, and what services they were looking for is a must. The answers to these questions will help you develop a better website by summing up the results of your SEO efforts, so you can adjust your strategy accordingly.

Update constantly- Time is never still; it is in constant flux, and this holds true for the Internet too. Things on the net change continuously. What was good enough last night may be beaten by your competition by tomorrow morning. If you hope to maintain your standing and even prevail, you must at all times keep pace with this change. It is therefore necessary to update your content often. Maintaining your website is one of the most important aspects of search engine optimization. Continuing with an outdated site, content or keywords will fail to attract prospects.

High rank in the search engine results – Most users rarely go beyond the first or second page of the search results. Unless you maintain a high position on the search engine results pages (SERPs), your chances of attracting prospects via the search engine will be greatly diminished. It is very important that your site comes up as high as possible on the results page. For this, an effective keyword density will need to be adopted, along with the use of popular keywords and phrases.

Appropriate keyword density – Keyword density is the number of times a keyword appears in a web page's content relative to the total word count, usually expressed as a percentage. It is one of the factors search engines use in deciding whether the page is relevant to the searcher's query, and it influences where the page ranks.

However, this does not mean that only the keyword density matters while the quality of the content may be completely ignored. Remember, unless your content is good, even if you do manage to pull in prospects using keywords, you will not be able to convert them. Since the final sale is the backbone of any commercial venture, you cannot ignore the quality of the content at any time.

Today there are highly intelligent search engines being developed which do not fall for content that is merely rich in keywords. Some search engines might even penalize web sites that are too keyword-rich, figuring that they contain nonsense. So keep in mind that search engines will look at other factors to judge whether your page is relevant to a particular search (Google’s algorithms are said to examine 50 factors or more). On average, a keyword density of three to four percent is considered safe.

Keyword placement - Keyword placement is as important as keyword density. Placing keywords carefully in the right areas rather than simply bunching them all together will go a long way to ensuring that the search engines see your content as being of good quality. On average, it is safe to say that ensuring that the keyword appears once in the title, once in the first and last sentence of the content and once every 100 to 150 words in the body of the content is fairly good keyword placement. At no point should the keyword override the quality of the content. Remember, most readers can be lost or captured by the first few sentences of any piece of writing.
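The placement heuristics above can be checked mechanically. The sketch below is a rough linter for them; the function name is made up here, and the thresholds (keyword in the title, in the first and last sentence, roughly once per 100 to 150 words) are this article's rules of thumb, not anything a search engine documents.

```python
import re

def check_keyword_placement(title, body, keyword):
    """Rough check of the placement heuristics described above.
    All thresholds are the article's rules of thumb."""
    kw = keyword.lower()
    sentences = re.split(r'(?<=[.!?])\s+', body.strip())
    words = body.split()
    report = {
        "in_title": kw in title.lower(),
        "in_first_sentence": kw in sentences[0].lower() if sentences else False,
        "in_last_sentence": kw in sentences[-1].lower() if sentences else False,
        # One occurrence per 100-150 words is the suggested body frequency.
        "body_occurrences": body.lower().count(kw),
        "suggested_range": (len(words) // 150, max(1, len(words) // 100)),
    }
    return report
```

Running it on a draft page quickly shows which of the placement rules the copy already satisfies and which still need an edit.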

While the ten pointers listed above are the basic facts about SEO content creation, this is not to say that they are the only steps one needs to follow. Building on these pointers and adding your own special qualities will help you to create high quality content and therefore generate additional traffic for your website.

Common search engine principles:
   To understand seo you need to be aware of the architecture of search engines. They all contain the following main components:

   Spider - a browser-like program that downloads web pages.

   Crawler - a program that automatically follows all of the links on each web page.

   Indexer - a program that analyzes web pages downloaded by the spider and the crawler.

   Database - storage for downloaded and processed pages.

   Results engine - extracts search results from the database.

   Web server - a server that is responsible for interaction between the user and other search engine components.

   Specific implementations of search mechanisms may differ. For example, the Spider+Crawler+Indexer component group might be implemented as a single program that downloads web pages, analyzes them and then uses their links to find new resources. However, the components listed are inherent to all search engines and the seo principles are the same.
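As a toy illustration of the Spider+Crawler+Indexer group implemented as a single program, the sketch below walks a tiny in-memory "web" (a hypothetical dict standing in for real HTTP downloads) and builds an inverted index mapping each word to the pages that contain it.

```python
import re
from collections import deque

# Toy "web": page URL -> HTML, standing in for real HTTP fetches.
PAGES = {
    "/": '<a href="/a">A</a> welcome page about seo',
    "/a": '<a href="/">home</a> seo tips and keyword advice',
}

def crawl_and_index(start):
    """Combined spider+crawler+indexer as described above: download a
    page, extract its links, queue unseen ones, and build an inverted
    index (word -> set of URLs containing it)."""
    index, seen, queue = {}, set(), deque([start])
    while queue:
        url = queue.popleft()
        if url in seen or url not in PAGES:
            continue
        seen.add(url)
        html = PAGES[url]                      # spider: "download" the page
        for link in re.findall(r'href="([^"]+)"', html):
            queue.append(link)                 # crawler: follow all links
        text = re.sub(r"<[^>]+>", " ", html)   # indexer: strip the markup
        for word in text.lower().split():
            index.setdefault(word, set()).add(url)
    return index

index = crawl_and_index("/")
```

The database component here is just the `index` dict, and a results engine would then rank the URL sets this index returns.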

   Spider  This program downloads web pages just like a web browser. The difference is that a browser displays the information presented on each page (text, graphics, etc.) while a spider does not have any visual components and works directly with the underlying HTML code of the page. You may already know that there is an option in standard web browsers to view source HTML code.

   Crawler  This program finds all links on each page. Its task is to determine where the spider should go either by evaluating the links or according to a predefined list of addresses. The crawler follows these links and tries to find documents not already known to the search engine.

   Indexer  This component parses each page and analyzes the various elements, such as text, headers, structural or stylistic features, special HTML tags, etc.

   Database  This is the storage area for the data that the search engine downloads and analyzes. Sometimes it is called the index of the search engine.

   Results Engine  The results engine ranks pages. It determines which pages best match a user's query and in what order the pages should be listed. This is done according to the ranking algorithms of the search engine. It follows that page rank is a valuable and interesting property, and any SEO specialist is most interested in it when trying to improve a site's search results. In this article, we will discuss the SEO factors that influence page rank in some detail.

   Web server  The search engine web server usually contains an HTML page with an input field where the user can specify the search query he or she is interested in. The web server is also responsible for displaying search results to the user in the form of an HTML page.

Internal ranking factors
   Several factors influence the position of a site in the search results. They can be divided into external and internal ranking factors. Internal ranking factors are those that are controlled by seo aware website owners (text, layout, etc.) and will be described next.

   A page consisting of just a few sentences is less likely to get to the top of a search engine list. Search engines favor sites that have a high information content. Generally, you should try to increase the text content of your site in the interest of seo. The optimum page size is 500-3000 words (or 2000 to 20,000 characters).

   As the amount of page text increases, search engine visibility also increases, because the page becomes more likely to match occasional and accidental search queries. This factor sometimes results in a large number of visitors.

Learn How Search Engine Submission Results In Even More Site Traffic
Search engine submission can be considered a method to market your site and calls for the straightforward submission of a site’s URL to many search engines. This used to be the favored method of getting websites listed. But many search engines presently make use of different ways to seek out webpages. Although this is the case, there are a couple of good reasons why you must do search engine submission for your webpage.

If you have a brand-new web-based business, there is a big possibility that you also have a new webpage to support it. It's better to submit a website than to wait around for it to be noticed by spiders. Submitting your site will enable it to be found by people via basic search queries.

The next reason why site submission is a good way to advertise your website involves search engines' updates: when a search engine is updated and you've sent your site's URL to it, it will return your website again as part of the search results.

There are two ways to submit a webpage. You can submit it one page at a time or utilize a sitemap to submit all the site's contents as well as links in a single instance. But the easiest way to execute site submission is to just send in your webpage's home page. If your webpage is made well, search engines will start listing it soon after it's submitted. The idea is to make your webpage appear among the best search results. The sites that get the best spots will have more traffic, since people regularly consider just the leading results and select a site to visit in accordance with those rankings.

A webmaster has to optimize a website to be placed in the top 10 listings. The factors that have to be taken into consideration to make a website get noticed include the hierarchy structure of the webpage, keyword placement and keyword density.

In the year 2004, the top search engines acquired the capability to find new websites immediately. This is done by having an automatic indexer locate links coming from other sites. Backlink services could be used to make certain that you will have a lot of backlinks to your site, which would then be located by most search engines and also have an effect on your rankings.

These days, site submission is necessary only if a new website is rolled out. Believe it or not, this might violate the Terms of Service agreements of the major search engines. In the event that a search engine determines that you've violated its Terms of Service, it will bar your webpage from being added to its search results. Regardless, a webmaster can make use of all the submission methods in the international market as long as he or she goes through each and every Terms of Service agreement; making use of all of the popular search engines will yield the best results for a website.

Despite the presence of spiders, lots of international search engines still need to have sites submitted to them. This ensures that website owners can reach an incredible number of internet users in a variety of markets. You can use applications to help you submit your site, and there are also many companies that will help you make certain that your webpage is indexed correctly.

 Number of keywords on a page:
   Keywords must be used at least three to four times in the page text. The upper limit depends on the overall page size – the larger the page, the more keyword repetitions can be made. Keyword phrases (word combinations consisting of several keywords) are worth a separate mention. The best seo results are observed when a keyword phrase is used several times in the text with all keywords in the phrase arranged in exactly the same order. In addition, all of the words from the phrase should be used separately several times in the remaining text. There should also be some difference (dispersion) in the number of entries for each of these repeated words.

   Let us take an example. Suppose we optimize a page for the phrase "seo software" (one of our SEO keywords for this site). It would be good to use the phrase "seo software" in the text 10 times, the word "seo" 7 times elsewhere in the text and the word "software" 5 times. The numbers here are for illustration only, but they show the general SEO idea quite well.

 Keyword density and seo:
   Keyword page density is a measure of the relative frequency of the word in the text expressed as a percentage. For example, if a specific word is used 5 times on a page containing 100 words, the keyword density is 5%. If the density of a keyword is too low, the search engine will not pay much attention to it. If the density is too high, the search engine may activate its spam filter. If this happens, the page will be penalized and its position in search listings will be deliberately lowered.

   The optimum value for keyword density is 5-7%. In the case of keyword phrases, you should calculate the total density of each of the individual keywords comprising the phrases to make sure it is within the specified limits. In practice, a keyword density of more than 7-8% does not seem to have any negative seo consequences. However, it is not necessary and can reduce the legibility of the content from a user’s viewpoint.
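Keyword density as defined above is straightforward to compute. The function below counts occurrences of a word (or a whole multi-word phrase) and divides by the total word count; note that the 5-7% target quoted above is this article's guideline, not a published search engine rule.

```python
def keyword_density(text, keyword):
    """Keyword density as defined above: occurrences of the keyword
    divided by the total word count, expressed as a percentage.
    Multi-word phrases count each occurrence of the whole phrase."""
    words = text.lower().split()
    phrase = keyword.lower().split()
    n = len(phrase)
    # Slide a window of phrase length over the text and count matches.
    hits = sum(words[i:i + n] == phrase for i in range(len(words) - n + 1))
    return 100.0 * hits / len(words) if words else 0.0
```

For example, a word used 5 times on a 100-word page gives a density of exactly 5%, matching the worked figure above.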

Location of keywords on a page:
   A very short rule for seo experts – the closer a keyword or keyword phrase is to the beginning of a document, the more significant it becomes for the search engine.

 Text format and seo:
   Search engines pay special attention to page text that is highlighted or given special formatting. We recommend:

   - Use keywords in headings. Headings are text highlighted with the «H» HTML tags. The «h1» and «h2» tags are most effective. Currently, the use of CSS allows you to redefine the appearance of text highlighted with these tags, which means that «H» tags are used less often nowadays, but they are still very important in SEO work.

   - Highlight keywords with bold fonts. Do not highlight the entire text! Just highlight each keyword two or three times on the page. Use the «strong» tag for highlighting instead of the more traditional «B» bold tag.

Title Tag:
   This is one of the most important tags for search engines. Make use of this fact in your seo work. Keywords must be used in the TITLE tag. The link to your site that is normally displayed in search results will contain text derived from the TITLE tag. It functions as a sort of virtual business card for your pages. Often, the TITLE tag text is the first information about your website that the user sees. This is why it should not only contain keywords, but also be informative and attractive. You want the searcher to be tempted to click on your listed link and navigate to your website. As a rule, 50-80 characters from the TITLE tag are displayed in search results and so you should limit the size of the title to this length.
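A quick way to enforce the 50-80 character guideline is to lint your titles before publishing. The sketch below (the function name and the exact bounds are taken from this article's rule of thumb, not from any official specification) flags titles that are too long, too short, or missing their keywords.

```python
def check_title(title, keywords, min_len=50, max_len=80):
    """Check a TITLE tag against the guidelines above: roughly 50-80
    characters, with the page's keywords present in the text."""
    issues = []
    if len(title) > max_len:
        issues.append(f"title is {len(title)} chars; may be truncated in results")
    if len(title) < min_len:
        issues.append(f"title is {len(title)} chars; room for more description")
    for kw in keywords:
        if kw.lower() not in title.lower():
            issues.append(f"keyword {kw!r} missing from title")
    return issues  # empty list means the title passes all checks
```

An empty return value means the title fits the displayed length and carries its keywords; anything else is a prompt to rewrite.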

 Keywords in links:
   A simple seo rule – use keywords in the text of page links that refer to other pages on your site and to any external Internet resources. Keywords in such links can slightly enhance page rank.

 ALT Attributes in Images:
   Any page image has a special optional attribute known as "alternative text." It is specified using the HTML «ALT» attribute. This text will be displayed if the browser fails to download the image or if image display is disabled in the browser. Search engines save the value of image ALT attributes when they parse (index) pages, but do not use it to rank search results.

   Currently, the Google search engine takes into account text in the ALT attributes of those images that are links to other pages. The ALT attributes of other images are ignored. There is no information regarding other search engines, but we can assume that the situation is similar. We consider that keywords can and should be used in ALT attributes, but this practice is not vital for seo purposes.

 Description Meta tag:
   This is used to specify page descriptions. It does not influence the seo ranking process but it is very important. A lot of search engines (including the largest one – Google) display information from this tag in their search results if this tag is present on a page and if its content matches the content of the page and the search query.

   Experience has shown that a high position in search results does not always guarantee large numbers of visitors. For example, if your competitors' search result description is more attractive than the one for your site then search engine users may choose their resource instead of yours. That is why it is important that your Description Meta tag text be brief, but informative and attractive. It must also contain keywords appropriate to the page.

 Keywords Meta Tag:
   This Meta tag was initially used to specify keywords for pages but it is hardly ever used by search engines now. It is often ignored in seo projects. However, it would be advisable to specify this tag just in case there is a revival in its use. The following rule must be observed for this tag: only keywords actually used in the page text must be added to it.

One page – one keyword phrase:
   For maximum seo try to optimize each page for its own keyword phrase. Sometimes you can choose two or three related phrases, but you should certainly not try to optimize a page for 5-10 phrases at once. Such phrases would probably produce no effect on page rank.

 Seo and the Main page:
   Optimize the main page of your site (domain name, index.html) for the word combinations that are most important. This page is the most likely to get to the top of search engine lists. My SEO observations suggest that the main page may account for up to 30-40% of the total search traffic for some sites.

 Session identifiers:
   Search engines do have algorithms for consolidating mirrors and pages with the same content, so sites with session IDs should be recognized and indexed correctly. However, it is difficult to index such sites and sometimes they may be indexed incorrectly, which has an adverse effect on SEO page ranking. If you are interested in SEO for your site, I recommend that you avoid session identifiers if possible.

 Redirects:
   Redirects make site analysis more difficult for search robots, with resulting adverse effects on seo. Do not use redirects unless there is a clear reason for doing so.

Hidden text, a deceptive seo method:
   Hidden text is not really a mistake but a deliberate attempt to deceive search engines using illicit SEO methods. Hidden text (where the text color coincides with the background color, for example) allows site owners to cram a page with their desired keywords without affecting page logic or visual layout. Such text is invisible to human visitors but will be seen by search robots. The use of such deceptive optimization methods may result in the banning of the site: it could be excluded from the index (database) of the search engine.

  Link importance (citation index, link popularity):
   You can easily see that simply counting the number of inbound links does not give us enough information to evaluate a site. It is obvious that a link from www.microsoft.com should mean much more than a link from some homepage like www.hostingcompany.com/~myhomepage.html. You have to take into account link importance as well as number of links.

   Search engines use the notion of citation index to evaluate the number and quality of inbound links to a site. Citation index is a numeric estimate of the popularity of a resource expressed as an absolute value representing page importance. Each search engine uses its own algorithms to estimate a page citation index. As a rule, these values are not published.

  Link text (anchor text):
   The link text of any inbound site link is vitally important in search result ranking. The anchor (or link) text is the text between the HTML tags «A» and «/A» and is displayed as the text that you click in a browser to go to a new page. If the link text contains appropriate keywords, the search engine regards it as an additional and highly significant recommendation that the site actually contains valuable information relevant to the search query.

How to Analyze Keyword Competition
Competitive analysis for keywords involves assessing the amount of competition for each keyword and the strength of the competition. While the amount of competition for a given niche is hard to quantify except in broad, subjective terms like "somewhat competitive" or "extremely competitive," the amount of competition for a given keyword is easy to quantify. We define the amount of competition for a keyword as the number of pages that are indexed for the term in Google. This number is sometimes referred to as the index count. The keyword "personal trainer," entered into Google surrounded by quotes, shows 38.5 million results in Google's index.

This figure is shown directly under the search field on the Google search results page: "About 38,500,000 results (0.44 seconds)." The exact number of search results will naturally depend on which data center Google accesses, but it will be roughly the same. When "online personal trainer" is searched, only 372,000 results are found. You'll find lower result numbers for longer-tail keywords to be a fairly consistent pattern: the more terms a keyword adds to the same root term, the fewer pages will be indexed.
  
Be sure to put quotes around the phrase being checked. For counting indexed search results, quotation marks are necessary to distinguish variations between keywords. For instance, "red toasters" shows 92,300 results, while "toasters red" shows 53,600 results. Just like looking up a term in the Keyword Tool without setting it to Exact Match, looking up the index count in Google will yield misleading information if you forget the quotes.

Take 10 of the good keywords from your spreadsheets, and enter each of them with quotes into Google, e.g.:

    "online personal trainer": 775,000 results
    "online personal fitness trainer": 387,000 results
    "virtual personal trainer": 53,000 results

Fewer results mean lower competition. All things being equal, it takes less time to outrank 53,000 other pages for a keyword than it would to outrank 775,000 pages. Depending on the strength of the competing pages, a keyword that needs to beat out fewer than 60,000 pages can get a top-ten ranking in one to three months. What exactly does "strength" mean? There are a couple of ways to assess the strength of the competition: one that works in conjunction with index count, and one that ignores index count.
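Putting the numbers above into code makes the triage mechanical. The snippet below sorts the example keywords by index count (the figures are the hypothetical snapshots quoted in this section) and flags those under the roughly 60,000-page threshold the text suggests for a realistic one-to-three-month ranking effort.

```python
# Index counts from the examples above (hypothetical snapshot figures).
index_counts = {
    "online personal trainer": 775_000,
    "online personal fitness trainer": 387_000,
    "virtual personal trainer": 53_000,
}

# Fewer indexed pages means less competition, so sort ascending
# to surface the easiest targets first.
by_competition = sorted(index_counts, key=index_counts.get)

# Flag keywords under the ~60,000-page threshold the text suggests
# can reach a top-ten ranking in one to three months.
quick_wins = [kw for kw in by_competition if index_counts[kw] < 60_000]
```

With a real keyword spreadsheet, the same two lines turn a column of index counts into a prioritized worklist.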

The first method is a little more labor intensive, but provides a slightly more reliable time frame for when to expect a good ranking. The second method is simply easier. It should be pointed out that no known method of competition analysis provides a magic formula for knowing exactly how long or how many links are required to get to the top of Google for a given keyword. The main objective of analyzing competition is to distinguish keywords that are worth pursuing from keywords that are time sinks unlikely to give a good return on investment.

1. How Search Engines Work
The first basic truth you need to know to learn SEO is that search engines are not humans. While this might be obvious for everybody, the differences between how humans and search engines view web pages aren't. Unlike humans, search engines are text-driven. Although technology advances rapidly, search engines are far from intelligent creatures that can feel the beauty of a cool design or enjoy the sounds and movement in movies. Instead, search engines crawl the Web, looking at particular site items (mainly text) to get an idea what a site is about. This brief explanation is not the most precise because as we will see next, search engines perform several activities in order to deliver search results – crawling, indexing, processing, calculating relevancy, and retrieving.

First, search engines crawl the Web to see what is there. This task is performed by a piece of software called a crawler or a spider (or Googlebot, as is the case with Google). Spiders follow links from one page to another and index everything they find on their way. Bearing in mind the number of pages on the Web (over 20 billion), it is impossible for a spider to visit a site daily just to see whether a new page has appeared or an existing page has been modified; sometimes crawlers may not visit your site for a month or two.

What you can do is check what a crawler sees on your site. As already mentioned, crawlers are not humans, and they do not see images, Flash movies, JavaScript, frames, password-protected pages, or directories, so if you have tons of these on your site, you'd better run a spider simulator to check whether these goodies are viewable by the spider. If they are not viewable, they will not be spidered, not indexed, not processed – in a word, they will be non-existent for search engines. After a page is crawled, the next step is to index its content. The indexed page is stored in a giant database, from which it can later be retrieved. Essentially, indexing is the process of identifying the words and expressions that best describe the page and assigning the page to particular keywords. It would not be possible for a human to process such amounts of information, but search engines generally deal with this task just fine. Sometimes they might not get the meaning of a page right, but if you help them by optimizing it, it will be easier for them to classify your pages correctly and for you to get higher rankings.
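
The crawl-then-index process described above can be illustrated with a toy inverted index; the pages and their text here are entirely made up:

```python
import re
from collections import defaultdict

# Toy pages standing in for documents a crawler has already fetched.
pages = {
    "https://example.com/": "SEO is search engine optimization",
    "https://example.com/egg-rolls": "how to make egg rolls at home",
}

# Indexing: map every word to the set of pages containing it,
# mirroring the "assign the page to particular keywords" step.
index = defaultdict(set)
for url, text in pages.items():
    for word in re.findall(r"[a-z]+", text.lower()):
        index[word].add(url)

print(index["seo"])  # {'https://example.com/'}
```

Real search engines store far richer data (positions, weights, link information), but the lookup structure is the same idea: keyword in, list of pages out.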

When a search request comes in, the search engine processes it – i.e., it compares the search string with the indexed pages in the database. Since it is likely that more than one page (in practice, millions of pages) contains the search string, the search engine starts calculating the relevancy of each page in its index to the search string.

There are various algorithms for calculating relevancy. Each of these algorithms assigns different relative weights to common factors like keyword density, links, or meta tags. That is why different search engines return different results pages for the same search string. What is more, it is a known fact that all major search engines – Yahoo!, Google, Bing, etc. – periodically change their algorithms, and if you want to stay at the top, you also need to adapt your pages to the latest changes. This is one reason (the other being your competitors) to devote permanent effort to SEO.

The last step in a search engine's activity is retrieving the results. Basically, it is nothing more than displaying them in the browser – the endless pages of search results, sorted from the most relevant to the least relevant sites.
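
A drastically simplified stand-in for the processing and retrieval steps: count how many query terms each indexed page contains, then sort from most to least relevant. All page contents below are invented:

```python
# Toy "index": page name -> page text (all invented for illustration).
pages = {
    "a.html": "egg rolls recipe with step by step photos",
    "b.html": "history of the egg in art",
    "c.html": "rolls royce maintenance guide",
}
query = "egg rolls recipe".split()

def relevancy(text):
    """Crude relevancy: how many query terms appear in the page."""
    words = set(text.split())
    return sum(term in words for term in query)

# Retrieval: sort pages from most to least relevant.
results = sorted(pages, key=lambda u: relevancy(pages[u]), reverse=True)
print(results[0])  # a.html
```

Real relevancy algorithms weigh many more signals (links, keyword placement, density), which is exactly why two engines can order the same pages differently.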

2. Differences Between the Major Search Engines
Although the basic principle of operation is the same for all search engines, the minor differences between them lead to major changes in result relevancy. Different factors matter to different search engines. There was a time when SEO experts joked that Bing's algorithms were intentionally made the exact opposite of Google's. While this might hold a grain of truth, it is a matter of fact that the major search engines like different things, and if you plan to conquer more than one of them, you need to optimize carefully.

There are many examples of the differences between search engines. For instance, for Yahoo! and Bing, on-page keyword factors are of primary importance, while for Google links are very, very important. Also, for Google, sites are like wine – the older, the better – while Yahoo! generally has no expressed preference for sites and domains with tradition (i.e., older ones). Thus it might take more time for your site to mature enough to be admitted to the top in Google than in Yahoo!.

                                                    Search Engine Submissions
To put it simply, search engine submission is the act of a webmaster submitting a website directly to a search engine. It is not strictly necessary to submit pages manually, since spiders are capable of finding web pages on their own.

Website submission is typically done for new websites that might otherwise take time to get indexed by the search engines. Webmasters also submit pages to make sure their sites are kept up to date in the respective search engines.

Submission process
Webmasters can submit a few web pages at a time or submit the entire site at once with a sitemap. In most cases, however, only the home page needs to be submitted: provided it is well optimized and properly designed, the search engines can crawl the whole site starting from it.
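
A sitemap is just an XML file listing the URLs you want crawled. A minimal sketch following the sitemaps.org protocol, with placeholder URLs and dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2012-06-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://www.example.com/about.html</loc>
  </url>
</urlset>
```

Only `<loc>` is required per URL; `<lastmod>` and `<changefreq>` are optional hints to the crawler.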

Most websites want to be listed in the popular search engines, and the pages that appear high in the results get far more visibility and user clicks than results on the lower pages. Most search engine users never bother to scroll beyond the first page of results, which makes the fight for a place in the top-ten results hotly contested among websites.

To secure the best position in search engine results, webmasters must optimize their web pages – a process called search engine optimization. The placement and density of keywords, the page design, the navigation links, and the number of web pages are some of the factors that can influence a website's ranking.

Web Site Promotion & Search Engine Submission Guidelines
Website promotion by way of proper search engine submission is essential for most types of websites, but not all. Which types require it? We answer this question by listing the different methods of website promotion (search engines being but one of these) and discussing how important each is to each type of website.

   Major search engines and directories
    Successful ranking (first 3 pages) in one or more of the major search engines for one or more popular search phrases, AND/OR being listed in one of the major internet directories (Yahoo, LookSmart, DMOZ).

   An Overview of the Basic Rules
It’s an ongoing effort to get indexed and stay indexed. You want your rankings to improve over time – the goal is to be within the top 30 search results in at least one of the major search engines for your main keyword phrase; otherwise you really are not ranked at all.

There are five (5) ways to improve or maintain your rankings:

    Keyword Optimization
    Link Popularity
    The more you are linked to by other websites that have keywords similar to yours, the more “popular” you are and the higher you rank for those keywords. This can only be expected to improve gradually over time – the higher your rankings, the more you’re seen, which increases your chances of being linked to by others, which increases your popularity factor, which in turn improves your rankings.

    Re-submit Correctly
    Some search engines, like Google, say that you do not need to re-submit, especially if nothing on your website has changed, because they will find your site anyway when they periodically spider the entire Web. However, others (us included) say that it is a good idea to do so every 45 days if nothing has changed and every 3 weeks if keywords or content have changed significantly.

    “… you can repromote your site to the search engines as often as you want, but more than once a month or so isn’t going to do you much good. I recommend re-promoting your home page and other important pages to the search engines ‘Once a Month, whether they need it or not.’”

    Also, you never re-submit to the directories (Yahoo!, DMOZ, LookSmart) once you’ve been admitted and never more than once every 90 days while you’re still trying to get in.

Prepare Your Web pages

 Determine the most effective primary and secondary keyword phrases

Ideally, this was completed in Step 1 before composing the page content. If not, adjust the content now accordingly.

Google External Keyword Tool
This tool can be used to find how frequently specific keyword phrases were searched for in the last month. It shows the keyword phrases users are actually entering into the search engines and ranks them by frequency. Throw a few keywords at it and see what it can tell you. Select the most popular, yet still relevant, phrases as the main keyword phrases of your web page(s). Do this analysis for every page you consider important and unique enough to warrant submission to the search engines. Usually this is only your index page.

In addition, the following links provide a more sophisticated analysis of what constitutes well selected primary and secondary keyword phrases …

Finding the Perfect Keywords
Expert Sumantra Roy factors in the number of existing pages which are already using a particular keyword phrase with the frequency data provided above to produce an “effectivity” index for determining the best keywords.
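
Roy's exact formula is not reproduced here, but the commonly cited form of his Keyword Effectiveness Index divides the square of a phrase's search frequency by the number of competing pages; the figures below are placeholders:

```python
# Keyword Effectiveness Index in its commonly cited form:
# monthly searches squared, divided by competing pages.
# Higher KEI = popular phrase with little competition.
def kei(monthly_searches, competing_pages):
    return monthly_searches ** 2 / competing_pages

# Same popularity, different competition (made-up numbers):
# the less-contested phrase wins.
print(kei(1000, 53_000) > kei(1000, 775_000))  # True
```

Squaring the search count rewards popularity, while dividing by the index count penalizes crowded keywords – the same trade-off the frequency and competition analyses above make separately.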

Focus Your Keywords

Successful web site promotion…

The Keyword Susser

Provides a convenient interface to the GoTo.com database of user-entered keywords. Allows you to submit a list of keyword phrases and analyzes their popularity.

Keywords Database Sites

Determine the most popular keywords in use at these FREE sites.

 Optimize the page(s) for search engine success by incorporating the primary and secondary keyword phrases into the page content

Ideally, this was completed in Step 1 (Web Site Design) before composing the page’s content. If not, adjust the content now accordingly.

Power up Your Traffic with Search Engines

Next, incorporate the main keyword phrases cleverly into your page's meta tags. The following links provide the rules for this process.

Search Engine Optimization FREE!

“when it comes to getting ranked by search engines, the only tags that matter are TITLE, and the META tags KEYWORDS and DESCRIPTION…”
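
Taken literally, that advice amounts to a `<head>` section like the following (the title, keywords, and description are invented for illustration):

```html
<head>
  <title>Online Personal Trainer | Example Fitness</title>
  <meta name="keywords" content="online personal trainer, virtual personal trainer">
  <meta name="description" content="One-on-one online personal training plans from certified trainers.">
</head>
```

The TITLE doubles as the clickable headline in the results page, and the DESCRIPTION often becomes the snippet shown under it, so both should carry your primary keyword phrase naturally.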

Which search engines should you submit to? The answer is – all of them. However, you cannot submit to every search engine in the world manually (it would take forever) and you cannot rely exclusively on auto submission services for all of your submission requirements because they are not as reliable as manual submissions (some major engines will not even acknowledge them). The solution, then, is to manually submit to the major search engines – the ones that are currently the most popular – and auto submit to everything else.

Now, to complete the analysis:
    Select the top 10 search engines from the statistical reports and consider these the major search engines. They alone account for about 72% of the total search engine activity for a given month.

    For each of these top 10 search engines determine their underlying “feed” engines (look at Search Engine Alliances Chart and Search Engine Relationship Chart). Some search engines use only their own indexes for results while others use an assortment of other search engines indexes as “feeds”. Write these all down for each of the top 10 and cross out any duplicates. This distilled “feed” list is now your manual submission list.
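
Crossing out duplicates is a simple set operation. The feed relationships below are illustrative only, not a current relationship chart:

```python
# Map each top engine to its underlying "feed" engines
# (illustrative relationships, not a current chart).
feeds = {
    "AOL Search": ["Google"],
    "HotBot": ["Inktomi"],
    "MSN Search": ["Inktomi", "LookSmart"],
    "Google": ["Google"],
}

# Collect every feed engine once: this is the manual submission list.
manual_list = sorted({engine for engines in feeds.values() for engine in engines})
print(manual_list)  # ['Google', 'Inktomi', 'LookSmart']
```

Submitting to the three distilled feeds covers all four front-end engines, which is the whole point of working from the feed list rather than the top-10 list itself.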

Do the required manual and auto submittals
    Manually submit just your index page to every search engine on the “feed” list that you have created above. Do this by going to each search engine home page and finding the link for ‘add url’ or ‘submit url’ or ‘add site’ or something similar. Enter the required information. The actual links to these “manual submission” pages are included in the example below.

    Auto submit just your index page to EVERY OTHER search engine not on the “feed” list. The top 10 only account for about 72% of the total search engine activity for a given month. This is not enough. You want to be in the upper 90th percentile in terms of search engine saturation. But this would take forever, because you would have to submit to the top 50 (or thereabouts), and each would have to be analyzed and submitted to manually. This is why using an automatic submission service to complete the task is the most logical thing to do. The auto submission service that we use is:

    “… If you have a website you want to promote, you should check out SelfPromotion.com. It’s a resource for do-it-yourselfers where you can learn to prepare your pages for the search engines, then use a sophisticated url submission robot to submit your webpages to all the important search engines and directories. You’ll also find tutorials about website promotion, submitting to yahoo, and much more. Best of all, you can use the site for free — if you like it, pay what YOU think it’s worth! The guy who runs it has reinvented tipping! ”

    We paid a modest fee to establish a paid account that automatically resubmits our pages every 45 days for one year and transmits email notification each time this occurs.


    Auto submit all of your other important pages, if any, to every search engine in the world INCLUDING those in the top 10 “feed” list. Re-submit every 45 days, just to be sure, at a rate of no more than one page every 25 hours per search engine.

Monitor Your Rankings
Track Your Search Engine Rankings Monthly

SERanker FREE
The perfect tool to track your website's ranking over time. You want to be ranked in the first three (3) pages of a search engine's results for your preferred keyword phrases. If you did everything correctly in keyword selection and web page optimization (see above), and you submitted everything correctly, your ranking should steadily improve over time, particularly if other websites start linking to yours (thereby increasing your website's popularity).
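
However you obtain the ordered result list for a keyword, finding your site's position in it is straightforward; the URLs below are hypothetical:

```python
from urllib.parse import urlparse

def ranking(results, domain):
    """Return the 1-based position of the first result on `domain`, or None."""
    for position, url in enumerate(results, start=1):
        if urlparse(url).netloc.endswith(domain):
            return position
    return None  # not ranked in the results checked

# Hypothetical ordered results for some keyword phrase.
serp = [
    "https://competitor-one.com/trainers",
    "https://www.example.com/trainer.html",
    "https://competitor-two.com/",
]
print(ranking(serp, "example.com"))  # 2
```

Recording this number monthly per keyword gives you exactly the ranking-over-time trend the tool above is meant to show.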

                                                    SEO using two methods

On-Page Optimization: On-page SEO refers to all SEO activities performed on the website that needs to be ranked. They include:

 •    Domain Optimization

•    Keyword Analysis

•    Website Analysis


•    Competitor Website Analysis

•    Title Tag

•    Meta Tagging

•    Heading Tag

•    Alt Tags

•    Content Optimization

•    Internal Linking

•    W3C Markup Validation

•    Keyword Density

•    301 Redirection

•    Reporting client

•    Webmaster Tools Setup [ Google, Yahoo and Bing ]

•    Google Analytics
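
One of the on-page items above, keyword density, is easy to compute: occurrences of the keyword phrase divided by the total word count. A small sketch with placeholder text:

```python
import re

def keyword_density(text, phrase):
    """Fraction of the page's words that belong to occurrences of `phrase`."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(words[i:i + n] == phrase_words
               for i in range(len(words) - n + 1))
    return hits * n / len(words)

# Placeholder page text for illustration.
text = "online personal trainer plans from a certified online personal trainer"
print(f"{keyword_density(text, 'online personal trainer'):.0%}")  # 60%
```

A density this high only looks reasonable in a ten-word snippet; on a real page, stuffing the phrase until it dominates the text reads as spam to both users and engines.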

Off-Page Optimization: Off-page SEO refers to any activity done to get links pointing to our site. They include:

•    Search Engine Submission

•    Directory Submission

•    Forum Posting

•    Social Book-marking

  
•    Article submission

•    Creating WordPress blogs, HubPages, Squidoo lenses

•    Creating Profiles in Social Networking [ Facebook, Twitter, Linkedin etc.,]

•    Posting Press Releases, Yellowpages and Classifieds

•    Comment Posting in related category

•    Link Exchange [ One Way and Reciprocal ]

•    Google Group Discussion

•    Yahoo Group Discussion

•    Yahoo Answers

•    Google Sitemap Creation [ xml sitemap ]

•    Yahoo Sitemap Creation [ txt sitemap ]

•    Robots.txt Creation [ Robots.txt ]

 •    Youtube submission

•    Hotfrog listing

•    Google places listing

•    Events promotion

•    Linkedin profile creation

•    Google+ profile creation

•    Twitter streaming

•    Web2.0 Technologies

•    Yahoo Flickr

•    Local classified posting
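
Two of the off-page items above, the robots.txt file and the XML sitemap, go together: a minimal robots.txt (paths and domain are placeholders) can point crawlers at your sitemap:

```text
User-agent: *
Disallow: /private/

Sitemap: https://www.example.com/sitemap.xml
```

The file must live at the root of the domain (e.g. `https://www.example.com/robots.txt`); `Disallow` keeps spiders out of sections you do not want indexed, while the `Sitemap` line tells them where to find everything you do.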