Search Engine Optimizer
Search engine optimization (SEO) is the process of improving the volume and quality of traffic to a web site from search engines via "natural" ("organic" or "algorithmic") search results. Usually, the earlier a site is presented in the search results, or the higher it "ranks", the more searchers will visit that site. SEO can also target different kinds of search, including image search, local search, and industry-specific vertical search engines.
As a marketing strategy for increasing a site's relevance, SEO considers how search algorithms work and what people search for. SEO efforts may involve a site's coding, presentation, and structure, as well as fixing problems that could prevent search engine indexing programs from fully spidering a site. Other, more noticeable efforts may include adding unique content to a site, ensuring that content is easily indexed by search engine robots, and making the site more appealing to users.
The leading search engines, Google, Yahoo! and Microsoft, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search-engine-indexed pages do not need to be submitted because they are found automatically. Some search engines, notably Yahoo!, operate a paid submission service that guarantees crawling for either a set fee or cost per click. Such programs usually guarantee inclusion in the database, but do not guarantee specific ranking within the search results. Yahoo!'s paid inclusion program has drawn criticism from advertisers and competitors. Two major directories, the Yahoo! Directory and the Open Directory Project, both require manual submission and human editorial review. Google offers Google Sitemaps, for which an XML feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links.
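For reference, a minimal feed in the sitemaps.org protocol, the XML format that Google Sitemaps accepts, looks like the sample below; the URL and dates are placeholders only.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2010-04-22</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>

Only the loc element is required; lastmod, changefreq and priority are optional hints to the crawler.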
Search engine crawlers may take a number of different factors into account when crawling a site, and not every page is indexed by the search engines. The distance of a page from the root directory of a site may also be a factor in whether or not it gets crawled.
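As a rough illustration of the directory-distance factor, a page's depth below the site root can be estimated by counting the path segments in its URL. The C# helper below is a hypothetical sketch, not part of any real crawler:

// Hypothetical helper: estimates crawl depth as the number of path
// segments below the site root, e.g. /a/b/page.html -> depth 3.
using System;

class CrawlDepth
{
    static int DepthFromRoot(Uri pageUrl)
    {
        // Uri.Segments includes the leading "/" as its own segment,
        // so subtract one to count only the real path segments.
        return pageUrl.Segments.Length - 1;
    }

    static void Main()
    {
        Console.WriteLine(DepthFromRoot(new Uri("http://www.example.com/a/b/page.html"))); // prints 3
    }
}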
The proposed project is a web application that will check submitted pages against known search engine algorithms and make suggestions to improve their ranking. The ranking algorithm of each search engine will be stored in XML files. The algorithms will be updated regularly by studying the ranking patterns of the search engines.
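A minimal sketch of the rule-checking idea is shown below in C#, matching the ASP.NET choice later in this post. The rules.xml file name, the rule schema and the single title-length check are illustrative assumptions, not a fixed design:

// Sketch only: assumes a hypothetical rules.xml of the form
//   <rules engine="google">
//     <rule id="title-length" max="65">Keep the title under 65 characters</rule>
//   </rules>
using System;
using System.Text.RegularExpressions;
using System.Xml.Linq;

class RankingChecker
{
    static void Main()
    {
        string html = System.IO.File.ReadAllText("page.html"); // the submitted page
        XDocument rules = XDocument.Load("rules.xml");         // per-engine rule file

        // Example check: compare the page title's length to the limit
        // stored in the rule file and emit the rule's suggestion text.
        Match title = Regex.Match(html, @"<title>(.*?)</title>",
                                  RegexOptions.IgnoreCase | RegexOptions.Singleline);
        foreach (XElement rule in rules.Root.Elements("rule"))
        {
            if ((string)rule.Attribute("id") == "title-length" && title.Success &&
                title.Groups[1].Value.Length > (int)rule.Attribute("max"))
            {
                Console.WriteLine("Suggestion: " + rule.Value);
            }
        }
    }
}

Keeping the rules in XML means each engine's rule file can be updated without recompiling the application, which fits the plan to revise the algorithms as ranking patterns change.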
The system will also generate sitemap XML files that are compatible with Google and Yahoo!.
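A sketch of how such a file could be generated with LINQ to XML follows; the hard-coded URL list is a placeholder for pages the application would discover by crawling the submitted site:

// Sketch: writes a sitemaps.org-format sitemap.xml, the format both
// Google and Yahoo! accept. The page list here is illustrative only.
using System;
using System.Xml.Linq;

class SitemapGenerator
{
    static void Main()
    {
        XNamespace ns = "http://www.sitemaps.org/schemas/sitemap/0.9";
        string[] pages = { "http://www.example.com/",
                           "http://www.example.com/about.html" };

        XElement urlset = new XElement(ns + "urlset");
        foreach (string page in pages)
        {
            urlset.Add(new XElement(ns + "url",
                new XElement(ns + "loc", page),
                new XElement(ns + "lastmod", DateTime.UtcNow.ToString("yyyy-MM-dd"))));
        }
        new XDocument(urlset).Save("sitemap.xml");
    }
}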
The proposed system is developed using ASP.NET; it could also be developed in J2EE or PHP.