Search engines are ultimately what bring your site to the notice of prospective customers. It is therefore worth knowing how these SEs actually work and how they deliver information to the user initiating a search.
There are basically two types of search engines. The first is powered by robots, called crawlers or spiders; the second relies on human-edited directories.
Search engines use spiders to index websites. Once you submit your website pages to a search engine by completing its required submission page, the search engine crawler will index your entire site. A ‘spider’ is an automated program run by the search engine system. The spider visits a web site, reads the content on the actual site and the site's Meta tags, and follows the links that the site connects to. The spider then returns all that information to a central depository, where the data is indexed. It will visit each link you have on your website and index those sites as well. Some bots will only index a certain number of pages on your site, so don’t create a web site with 500 pages!
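The crawl-and-index loop described above can be sketched in a few lines of Python. This is an illustrative model, not a real spider: the page contents, Meta keywords and links live in an in-memory dictionary (a real crawler would fetch them over HTTP), and the page cap mirrors the fact that some bots only index a certain number of pages per site.

```python
from collections import deque

# Hypothetical in-memory "web": URL -> (content, meta keywords, outgoing links).
PAGES = {
    "/home":     ("welcome to our shop", "shop,deals", ["/products", "/about"]),
    "/products": ("cheap widgets for sale", "widgets", ["/home"]),
    "/about":    ("about our company", "company", []),
}

def crawl(start, max_pages=100):
    """Visit pages breadth-first, record content and Meta tags, follow links."""
    index, queue, seen = {}, deque([start]), {start}
    while queue and len(index) < max_pages:  # some bots cap pages per site
        url = queue.popleft()
        content, meta, links = PAGES[url]
        index[url] = {"content": content, "meta": meta}  # to central depository
        for link in links:
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return index

print(sorted(crawl("/home")))  # all three pages reached via links from /home
```

Lowering `max_pages` shows why a very large site may be only partially indexed: the spider simply stops once it hits its limit.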
The spider will periodically return to the sites to check for any information that has changed. The frequency with which this happens is determined by the moderators of the search engine.
A spider is almost like a book: it contains the table of contents, the actual content, and the links and references for all the websites it finds during its search. It may index up to a million pages a day.
Examples: Excite, Lycos, AltaVista and Google.
When you ask a search engine to locate information, it actually searches through the index it has created rather than the Web itself. Different SEs produce different rankings because not every search engine uses the same algorithm to search through the indices.
One of the things a search engine algorithm scans for is the frequency and location of keywords on a web page, though it can also detect artificial keyword stuffing, or spamdexing. The algorithms also analyze the way pages link to other pages on the Web. By checking how pages link to each other, an engine can determine both what a page is about and whether the keywords of the linked pages are similar to the keywords on the original page.
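The link-analysis idea can be sketched with a simplified PageRank-style iteration. To be clear, this is a textbook simplification, not any engine's actual algorithm: the three-page link graph, the damping factor of 0.85 and the iteration count are all illustrative assumptions.

```python
# Hypothetical link graph: page -> list of pages it links to.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}

def pagerank(links, damping=0.85, iterations=50):
    """Iteratively share each page's rank across its outgoing links."""
    n = len(links)
    ranks = {page: 1.0 / n for page in links}
    for _ in range(iterations):
        new = {page: (1 - damping) / n for page in links}
        for page, outgoing in links.items():
            share = ranks[page] / len(outgoing)  # split rank among out-links
            for target in outgoing:
                new[target] += damping * share
        ranks = new
    return ranks

ranks = pagerank(links)
# Page C, linked to by both A and B, ends up with the highest rank.
```

The intuition matches the paragraph above: a page that many other pages link to accumulates more rank, so links act as votes about what a page is worth.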
Search Engine Optimization is the process of choosing the most appropriate targeted keyword phrases related to your site and ensuring that your site ranks highly in search engines, so that when someone searches for those specific phrases your site is returned at the top. It basically involves fine-tuning the content of your site along with the HTML and Meta tags.
The most popular search engines are Google, Yahoo, MSN Search, AOL and Ask Jeeves. Search engines keep their methods and ranking algorithms secret, both to get credit for finding the most valuable search results and to deter spam pages from clogging those results. A search engine may use hundreds of factors when ranking listings, and the factors themselves and the weight each carries may change continually.
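The "hundreds of weighted factors" idea can be illustrated with a weighted-sum sketch. Real engines keep their factors and weights secret; the three factor names and the weights below are invented purely for illustration.

```python
# Hypothetical ranking factors and weights (assumptions, not any engine's real values).
WEIGHTS = {"keyword_relevance": 0.5, "inbound_links": 0.3, "page_freshness": 0.2}

def score(page_factors):
    """Combine per-factor scores (each in 0..1) into a single ranking score."""
    return sum(WEIGHTS[f] * page_factors.get(f, 0.0) for f in WEIGHTS)

page_a = {"keyword_relevance": 0.9, "inbound_links": 0.2, "page_freshness": 0.5}
page_b = {"keyword_relevance": 0.6, "inbound_links": 0.9, "page_freshness": 0.8}

print(score(page_a), score(page_b))  # -> 0.61 vs 0.73: page_b outranks page_a
```

Shifting the weights reorders the results, which is one reason the same page can rank very differently across engines, and why rankings drift as engines continually adjust their weightings.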
Algorithms differ so widely that a web page ranking #1 in one search engine can rank #200 in another. New sites need not be "submitted" to search engines to be listed: a simple link from a well-established site will get the search engines to visit the new site and begin to spider its contents. It can take anywhere from a few days to several weeks after such an established site links to the new site for all the main search engine spiders to start visiting and indexing it.