Thank you to those of you who have accidentally stumbled onto this blog. This blog is a collection of articles from blogs I have visited; if anything here offends, please accept my apologies. To see the original article, you can click its link. Thank you!



The Alexa and Google PageRank Phenomenon

Posted by Bli Made


About a week ago I decided to remove the Alexa widget from this blog. At the time this blog ranked somewhere around 300,000. Now... its Alexa rank has slid to 445,017, a fairly drastic drop. Is the blog becoming less popular? According to the recorded statistics, its average daily visits have actually gone up. So can we conclude that the Alexa badge really does affect Alexa rank? Hmm... it may be too early to draw that conclusion; there are plenty of other factors, such as other websites' Alexa ranks improving rapidly. A few comments came in on my article about ways to improve your Alexa rank. Back then I listed five easy ways (now down to four) to improve it. In my experience those methods work well enough to take you from unranked to around the 500,000 mark, but beyond that, five ways are apparently not enough. You need a thousand!

The Alexa rank is mysterious enough, but Google PR (PageRank) turns out to hold a million mysteries of its own. I had just complained on Wiellyam's blog about my blog's PR never going up when I was startled by the PR4 label Google pinned on it after updating its PageRank. Google runs this update roughly once every three months. Thinking back, what had I actually done? Not much, it turns out. I just tried to keep writing consistently and to show up in the comments on neighboring blogs. I never hoped for or dwelt on PR, because to me raising Google PageRank is an investment: you act now, and only three months later do you see the result. So do the best and let it flow...

To close, thank you to my fellow Indonesian bloggers whose support helped this blog's Google PR move up a grade. And one small message: don't lean too hard on rankings, whether Google PageRank or Alexa rank. There is something far more important, "blogging quality," which only you can measure for yourself.

http://www.andaka.com/fenomena-alexa-dan-page-rank-google.php


20 Quick Ways to Increase Your Alexa Rank

Posted by Bli Made

Alexa.com, a subsidiary of Amazon.com, is a website which provides information on traffic levels for websites. The Alexa rank is measured according to the number of users who’ve visited a website with the Alexa toolbar installed.

In this article, I’ll examine the importance of the Alexa Rank as it relates to site monetization while briefly discussing some of the weaknesses involved in using Alexa ranking as a reliable traffic measure for any website.

Lastly, I’ve also included an extensive list of twenty methods and strategies you can use to increase your Alexa Rank dramatically in the short and long run.

What is the Alexa Rank?

Put simply, the Alexa Rank is a ranking system based on the level of traffic each website receives from visitors who have the Alexa toolbar installed.

See Alexa’s definition of the Alexa Traffic Rank:

The traffic rank is based on three months of aggregated historical traffic data from millions of Alexa Toolbar users and is a combined measure of page views and users (reach). As a first step, Alexa computes the reach and number of page views for all sites on the Web on a daily basis.

The main Alexa traffic rank is based on the geometric mean of these two quantities averaged over time (so that the rank of a site reflects both the number of users who visit that site as well as the number of pages on the site viewed by those users).
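To make that arithmetic concrete, here is a small sketch of how such a combined score could be computed. The function and the sample numbers are illustrative only; Alexa's actual pipeline is not public.

    import math

    def combined_traffic_score(daily_reach, daily_pageviews):
        """Geometric mean of reach and page views for each day, averaged
        over the measurement window (e.g. three months). Illustrative
        only -- not Alexa's actual computation."""
        daily = [math.sqrt(r * p) for r, p in zip(daily_reach, daily_pageviews)]
        return sum(daily) / len(daily)

    # A wide-but-shallow site versus a smaller, stickier one:
    print(combined_traffic_score([10000, 12000], [15000, 18000]))  # ~13,472
    print(combined_traffic_score([3000, 3500], [60000, 70000]))    # ~14,534

Sites would then be sorted by this score, so a higher combined score corresponds to a lower (better) rank number.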

Why would you want to increase your Alexa rank?

Webmasters, advertisers and ad networks use your blog’s Alexa rank as a gauge to determine the worth of a link on your website. If you depend on link or site selling as a form of monetization you’ll definitely want to increase your Alexa rank, because it’ll increase your bargaining power when it comes to ad pricing.

ReviewMe, Text Link Ads and Sponsored Reviews are just three of the networks which base your ad selling strength on Alexa Ranks.

Problems with the Alexa Rank

Alexa ranking is heavily skewed towards websites which have a large webmaster/tech audience. This is because webmasters or web savvy audiences are much more likely to have the Alexa toolbar installed than websites whose visitors are unaware of Alexa.

As such, many have indicated that Alexa is a vastly inaccurate method of measuring a website’s reach, traffic and potential. I don’t disagree.

Alexa is a silly way to measure web traffic but unfortunately, in an imperfect world Alexa is still heavily used by webmasters and ad networks when measuring the value of advertising on your website.

I understand the defects of Alexa’s ranking system and I’m not going to go into more detail about it here. What’s primarily important to me is that the Alexa Rank has become a central element in site monetization strategies.

I’m not concerned with the utility and value of Alexa itself, but with its perceived importance in the eyes of potential advertisers.

Dosh Dosh’s Alexa Rank

Alexa Traffic Graph for http://www.doshdosh.com

Since moving to my own domain (from Blogspot) in the middle of January 2007, Dosh Dosh has moved from a rank of around 3 million to the current Alexa rank of 21,709 within two months.

The growth has been consistent and I think most of it was due to the fact that the content on Dosh Dosh is oriented towards webmasters. Another plausible reason is that overall daily traffic for Dosh Dosh has been growing steadily day by day.

The increase in Alexa Rank was also partially due to the fact that I’m active in several webmaster forums, notably Digital Point which sends me some visitors every day. Getting stumbled and receiving thousands of visitors in a day has also undoubtedly helped to increase Dosh Dosh’s Alexa Rank.

How do I get started with Alexa?

There are two easy ways to start using Alexa. If you are using Internet Explorer, visit this page and download the Alexa Toolbar. If you’re using Firefox, download the SearchStatus extension which displays the Alexa Rank, Google PageRank as well as other useful features.

I highly recommend that you use Firefox and SearchStatus instead of Alexa toolbar, which I find to be more bulky and less useful.

Can one actually game or manipulate the Alexa Ranking?

I believe that there are methods which will allow you to easily bring an Alexa ranking in the millions down to the 100,000 level. However, bringing it past the 10,000 or 1,000 mark is a considerably more difficult process, because of the stiff competition among websites.

Some have adamantly stated that there are no proven ways to game Alexa, while others have claimed that auto-surfs and scripts do work to some degree.

I’m not going to take any sides because I can’t guarantee that auto-surfs or other artificial methods will have similar effects for every blog.

The easiest way to know if any of the tips mentioned below really work is to actually try them for yourselves and monitor the results.


20 Ways to Increase your Alexa Rank

Here is a collection of methods you can use to boost your Alexa Rank. Most of these tips come from fellow webmasters I know who claim to have seen positive results through their experiments with the Alexa Rankings.

Some of the other tips were derived from articles and sources, which I have duly referenced at the end of this post.

Do these tips work? According to some, yes, they definitely do. But do note that most of them require active effort of some sort, and hence they will work as long as you are consistently performing specific actions.

To increase your Alexa rank in the long run, I would highly recommend that you focus on developing quality content which attracts and maintains a large audience, instead of purely focusing on artificially increasing your Alexa Rank.

Great link-worthy content will lead to a natural increase in site traffic and is an excellent way to passively increase your Alexa rank.

It is important to emphasize that you should devote most of your effort to growing your site audience alongside an integrated implementation of any of the tips below.

  1. Install the Alexa toolbar or Firefox’s SearchStatus extension and set your blog as your homepage. This is the most basic step.
  2. Put up an Alexa rank widget on your website. I did this a few days ago and receive a fair amount of clicks every day. According to some, each click counts as a visit even if the toolbar is not used by the visitor.
  3. Encourage others to use the Alexa toolbar. This includes friends, fellow webmasters as well as site visitors/blog readers. Be sure to link to Alexa’s full explanation of their toolbar and tracking system so your readers know what installing the toolbar or extension entails.
  4. Work in an Office or own a company? Get the Alexa toolbar or SS Firefox extension installed on all computers and set your website as the homepage for all browsers. Perhaps it will be useful to note that this may work only when dynamic or different IPs are used.
  5. Get friends to review and rate your Alexa website profile. Not entirely sure of its impact on rankings but it might help in some way.
  6. Write or Blog about Alexa. Webmaster and bloggers love to hear about ways to increase their Alexa rank. They’ll link to you and send you targeted traffic (i.e. visitors with the toolbar already installed). This gradually has effects on your Alexa ranking.
  7. Flaunt your URL in webmaster forums. Webmasters usually have the toolbar installed. You’ll get webmasters to visit your website and offer useful feedback. It’s also a good way to give back to the community if you have useful articles to share with others.
  8. Write content that is related to webmasters. This can fall in the category of domaining and SEO, two fields in which most webmasters will have the Alexa toolbar installed. Promote your content on social networking websites and webmaster forums.
  9. Use Alexa redirects on your website URL. Try this: http://redirect.alexa.com/redirect?www.doshdosh.com . Replace doshdosh.com with the URL for your website. Leave this redirected URL in blog comments as well as forum signatures. This redirect will count a unique IP address once a day, so clicking it multiple times won’t help. There is no official proof that redirects positively benefit your Alexa Rank, so use them with caution. (A small sketch of building these redirect URLs follows this list.)
  10. Post in Asian social networking websites or forums. Some webmasters have suggested that East Asian web users are big Alexa toolbar fans, judging by the presence of several Asia-based websites in the Alexa Top 500. I suggest trying this only if you have the time or capacity to do so.
  11. Create a webmaster tools section on your website. This is a magnet for webmasters who will often revisit your website to gain access to the tools. Aaron Wall’s webpage on SEOTools is a very good example.
  12. Get Dugg or Stumbled. This usually brings massive numbers of visitors to your website and the sheer amount will have a positive impact on your Alexa Rank. Naturally, you’ll need to develop link worthy material.
  13. Use Pay-per-Click campaigns. Buying advertisements on search engines such as Google or Exact Seek will help bring in traffic. Doubly useful when your ad is highly relevant to webmasters.
  14. Create an Alexa category on your blog and use it to include any articles or news about Alexa. This acts as an easily accessible resource for webmasters or casual search visitors while helping you rank in the search engines.
  15. Optimize your popular posts. Got a popular post that consistently receives traffic from the search engines? Include a widget/graph at the bottom of the post, link to your Alexa post or use Alexa redirection on your internal URLs.
  16. Buy banners and links for traffic from webmaster forums and websites. A prominent and well displayed ad will drive lots of webmaster traffic to your website, which can significantly boost your rank.
  17. Hire forum posters to pimp your website. Either buy signatures in webmaster forums or promote specific articles or material in your website on a regular basis. You can easily find posters for hire in Digital Point and other webmaster forums.
  18. Pay Cybercafe owners to install the Alexa toolbar and set your website as the homepage for all their computers. This might be difficult to arrange and isn’t really a viable solution for most. I’m keeping this one in because some have suggested that it does work.
  19. Use MySpace. This is a little shady so I don't recommend it unless you're really interested in artificially inflating your Alexa Rank. Use visually attractive pictures or banners and link them to your redirected Alexa URL. This will be most effective if your website has content that is actually relevant to the MySpace crowd.
  20. Try Alexa auto-surfs. Do they work? Maybe for brand new sites. I think they are mostly suitable for new websites with a very poor Alexa rank. Note that there may be problems when you try to use auto-surfs alongside contextual ads like AdSense. They also aren't a long-term solution for improving your Alexa Rank, so I suggest using them with caution.
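As promised in tip 9, here is a minimal sketch of how you might wrap your own URLs in Alexa's redirect format before leaving them in comments or signatures. The doshdosh.com address is just the example URL from that tip; substitute your own site.

    ALEXA_REDIRECT_PREFIX = "http://redirect.alexa.com/redirect?"

    def alexa_redirect(url):
        """Wrap a plain URL in the Alexa redirect format from tip 9.
        Alexa counts one unique IP per day, so repeated clicks on the
        same link won't help."""
        # The example in the tip passes the bare host, without a scheme.
        bare = url.replace("https://", "").replace("http://", "")
        return ALEXA_REDIRECT_PREFIX + bare

    print(alexa_redirect("http://www.doshdosh.com"))
    # -> http://redirect.alexa.com/redirect?www.doshdosh.com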

Resources on Alexa Rank

Several of the tips listed above were taken from Aaron Wall’s article on Alexa Ranking. This post by Aaron will also give you some insight on Alexa’s webmaster bias.

A lower Alexa number means a greater level of traffic, and the traffic drops off logarithmically. You can fake a good Alexa score using various techniques, but if it shows your rankings in the millions then your site likely has next to no traffic.

Alexa by itself does not mean that much, but it simply provides a rough snapshot of what is going on. It can be spammed, but if a site has a ranking in the millions then it likely has little traffic.

Peter Norvig writes about Alexa Toolbar and the Problem of Experiment Design. He examines some problems with Alexa as a traffic measuring tool:

But one bias they don’t really comment on is the selection bias: the data would be good if it truly represented a random sample of internet users, but in fact it only represents those who have installed the Alexa toolbar, and that sample is not random.

The samplees must be sophisticated enough to know how to install the toolbar, and they must have some reason to want it. It turns out that the toolbar tells you things about web sites, so it is useful to people in the SEO (Search Engine Optimization) industry, so it overrepresents those people.

Google’s Matt Cutts compares his website against Ask.com and explains why his website enjoys such an impressive Alexa ranking:

One possible source of skewing in Alexa data is a bias toward webmaster-y sites. Alexa shows how popular web sites are, so it’s natural that webmasters install the Alexa toolbar.

Some do it just so that their normal day-to-day visits around the web (including their own site) are added to Alexa’s stats. The net effect is that webmaster-related sites are going to look more important to Alexa.

Marketing Scoop has some tips on improving the Alexa Rank. (Thanks Beth!)

There is also a very long thread on Digital Point in which webmasters discuss how Alexa Ranks can be gamed or manipulated through scripts and auto-surfs. Worth a read.

There you have it… twenty ways to boost your Alexa Rank and increase your site’s monetization potential.

What do you think of Alexa? Have you tried increasing your Alexa Rank by any of these methods?


http://www.doshdosh.com


What Is SEO

Posted by Bli Made

SEO, more familiarly known as search engine optimization, is a marketing technique in internet business that aims to bring in targeted visitors through search engines such as Google, Yahoo, Altavista, and others.

Through this series of optimizations, a website is expected to reach the best possible position in the search results, the best position being the one people visit most often for a given keyword.

Usually the positions people visit most lie between pages 1 and 3. If a visitor hasn't found the information they were after by those pages, they will switch to a different keyword.

That is why reaching those positions is a must when using this SEO technique. Better still is making the top 10, let alone ranking number 1, which will certainly bring in plenty of visits.


http://tipsseo.net


On-Page SEO Tips

Posted by Bli Made

Some of the things you need to do in this on-page optimization work are placing your target keywords according to the commonly accepted rules: for instance, putting a keyword in the title and in the H1 to H6 headings, and bolding or italicizing it within the content, and so on.

When placing those keywords, you should combine them with other words so that visitors know more specifically what information you are offering.

Take this post's own headline: it promises SEO tips, and adding "on page" tells visitors that the page or post they are about to open covers on-page SEO tips and nothing else.

This way the visitors you attract are better targeted, because they know more precisely what to expect, especially through the word combinations you tie to the keyword.

As the one producing the information, you should also deliver content that matches the headline. This avoids disappointing visitors, which could leave them unwilling to ever visit your site again because the content didn't live up to the headline.

http://tipsseo.net

Off-Page SEO Tips

Posted by Bli Made

After finishing the previous work, so that search engines consider our site genuinely worth showing in the results for a given keyword, the next step is to optimize outside the site itself: off-page SEO.
This is the activity widely credited with the biggest role in raising a site's position in the SERP (search engine results page). Being good internally is not enough; the external side has to measure up as well.
For our off-page SEO to be judged high quality by the search engines, our site needs to appear on sites they already consider high quality, which means getting links from them. In off-page SEO this is called planting backlinks, or link building.
One way is to leave comments on those popular sites and blogs. Beyond that, for the search engines to consider the backlinks we build high quality, we should plant them on sites whose content is relevant to our own.

Old-School SEO Techniques You Should No Longer Use

Posted by Bli Made

Some people still use old-school SEO techniques in the present era. The one in question is submitting their sites to search engines.
Using this SEO technique is perfectly legitimate, but you would do well to leave it behind. Consider the following simple logic.
Think about it: when you submit your link to several search engines, you are asking those engines to add your website to their databases. At the same time, most people who use this old-school technique get impatient for their site to be indexed, so they set about building backlinks.
Your site certainly won't appear (be indexed) the moment you register it; there is a waiting period before you can be sure it has really been indexed.
This is where the inconsistency arises: when the search engine robot goes to add your website to its database, it grows suspicious. Why does a freshly built website already have links everywhere? That is the consequence of the earlier step, planting backlinks before the search engine robot even knows the site exists.
The search engine concludes your website is cheating; it may even treat it as SPAM, and it will certainly be rated poorly.
As a consequence, indexing may take longer, or they may quarantine your website for some period (the sandbox), or other things may happen that hurt your site.
So the best solution in this SEO tip is to do it naturally: don't register your site with the search engines; simply plant links on other sites, and even that must not be overdone, to avoid the sandbox.
This way the search engine will "feel embarrassed" that its database is missing something, namely your site, even though your site is already popular among other sites through the backlinks you planted naturally.
Seeing that your site is already popular, they will hurry to add it to their database and then weigh whether it deserves a top position in the results for a given keyword. At that stage it also depends on the optimization you do going forward.
You can study these SEO tips, tricks and techniques in more detail at: TOP 10 Google search engine.

http://tipsseo.net

SEO Tips for Keeping Your Website's Position Stable in the Search Engines

Posted by Bli Made

This SEO tip looks at how to keep our website in a stable position in the search engines, because quite a few SEO practitioners' sites reach the coveted position only to find it temporary; before they know it, the site has been flung to a very distant page.
Have you often experienced this too?

Be honest: if so, there are probably a few things you have overlooked. I won't go into great length in this tip, because if nothing gets in the way I will put together a more complete guide.
Among the things to watch is the website's age. A very young website is usually vulnerable to being bumped, but this is usually only temporary; Google often does it deliberately to test your site.

Beyond that, the off-page SEO you do must be as natural as possible. When planting backlinks, for example, plant them slowly. That is, don't build backlinks in bulk all at once, especially when your site is newborn; seeing that, Google will naturally grow suspicious and angry, penalizing your site to a position far below where it was.
How could it not? A site only weeks old already has dozens or even hundreds of backlinks, before Google's robot has even reached it. Of course they become suspicious, and a penalty is the punishment.

The same goes for choosing a web hosting service: pick one with a stable server and 99.9% uptime, such as Hosting Murah TOP. That way you won't disappoint the search engines by being down at the moment Google's spider arrives.


http://tipsseo.net

SEO (Search engine optimization)

Posted by Bli Made

Search engine optimization (SEO) is the process of improving the visibility of a web site or a web page in search engines via the "natural" or un-paid ("organic" or "algorithmic") search results. Other forms of search engine marketing (SEM) target paid listings. In general, the earlier (or higher on the page), and more frequently a site appears in the search results list, the more visitors it will receive from the search engine. SEO may target different kinds of search, including image search, local search, video search and industry-specific vertical search engines. This gives a web site a web presence.
As an Internet marketing strategy, SEO considers how search engines work and what people search for. Optimizing a website may involve editing its content and HTML and associated coding to both increase its relevance to specific keywords and to remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic.
The acronym "SEO" can refer to "search engine optimizers," a term adopted by an industry of consultants who carry out optimization projects on behalf of clients, and by employees who perform SEO services in-house. Search engine optimizers may offer SEO as a stand-alone service or as a part of a broader marketing campaign. Because effective SEO may require changes to the HTML source code of a site, SEO tactics may be incorporated into web site development and design. The term "search engine friendly" may be used to describe web site designs, menus, content management systems, images, videos, shopping carts, and other elements that have been optimized for the purpose of search engine exposure.
Another class of techniques, known as black hat SEO or spamdexing, uses methods such as link farms, keyword stuffing and article spinning that degrade both the relevance of search results and the user experience of search engines. Search engines look for sites that employ these techniques in order to remove them from their indices.


History
Webmasters and content providers began optimizing sites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed to do was submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[1] The process involves a search engine spider downloading a page and storing it on the search engine's own server, where a second program, known as an indexer, extracts various information about the page, such as the words it contains and where these are located, as well as any weight for specific words, and all links the page contains, which are then placed into a scheduler for crawling at a later date.
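To make that spider/indexer/scheduler pipeline concrete, here is a toy sketch in Python. Every name in it is ours for illustration; real crawlers add politeness delays, robots.txt checks, deduplication and far sturdier parsing.

    from collections import deque
    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen

    class LinkExtractor(HTMLParser):
        """The 'extract links to other pages' step: collect <a href> values."""
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self.links += [v for k, v in attrs if k == "href" and v]

    def crawl(seed_url, max_pages=10):
        """Toy spider plus indexer: download a page, record which words
        appear on it, and put its outgoing links in a queue (the
        'scheduler') to be crawled later."""
        index = {}                                  # word -> set of URLs
        queue, seen = deque([seed_url]), {seed_url}
        while queue and max_pages > 0:
            url = queue.popleft()
            max_pages -= 1
            try:
                html = urlopen(url, timeout=5).read().decode("utf-8", "ignore")
            except OSError:
                continue
            for word in html.split():               # crude indexer stand-in
                index.setdefault(word.lower(), set()).add(url)
            extractor = LinkExtractor()
            extractor.feed(html)
            for link in extractor.links:
                absolute = urljoin(url, link)
                if absolute not in seen:            # schedule a later crawl
                    seen.add(absolute)
                    queue.append(absolute)
        return index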
Site owners started to recognize the value of having their sites highly ranked and visible in search engine results, creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997.[2]
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using meta data to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[3] Web content providers also manipulated a number of attributes within the HTML source of a page in an attempt to rank well in search engines.[4]
By relying so much on factors such as keyword density which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. Since the success and popularity of a search engine is determined by its ability to produce the most relevant results to any given search, allowing those results to be false would drive users to other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate.
Graduate students at Stanford University, Larry Page and Sergey Brin, developed "backrub," a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[5] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random surfer.
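The random-surfer idea can be made concrete with a few lines of power iteration. This is the textbook formulation, not Google's production algorithm; the damping factor of 0.85 is the commonly cited value.

    def pagerank(links, damping=0.85, iterations=50):
        """links maps each page to the list of pages it links to.
        Returns the stationary probability that a random surfer, who
        follows a link with probability `damping` and jumps to a random
        page otherwise, is found on each page."""
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1.0 - damping) / n for p in pages}
            for page, outgoing in links.items():
                if not outgoing:               # dangling page: spread evenly
                    for p in pages:
                        new_rank[p] += damping * rank[page] / n
                else:
                    share = damping * rank[page] / len(outgoing)
                    for target in outgoing:
                        new_rank[target] += share
            rank = new_rank
        return rank

    # Page A is linked by both others, so it earns the highest rank:
    print(pagerank({"A": ["B"], "B": ["A"], "C": ["A"]}))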
Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[6] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[7]
By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. Google says it ranks sites using more than 200 different signals.[8] The leading search engines, Google and Yahoo, do not disclose the algorithms they use to rank pages. Notable SEO service providers, such as Rand Fishkin, Barry Schwartz, Aaron Wall and Jill Whalen, have studied different approaches to search engine optimization, and have published their opinions in online forums and blogs.[9][10] SEO practitioners may also study patents held by various search engines to gain insight into the algorithms.[11]
In 2005 Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged in users.[12] In 2008, Bruce Clay said that "ranking is dead" because of personalized search. It would become meaningless to discuss how a website ranked, because its rank would potentially be different for each user and each search.[13]
In 2007 Google announced a campaign against paid links that transfer PageRank.[14] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[15] As a result of this change, using nofollow causes PageRank to evaporate. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript, thus permitting PageRank sculpting. Several further solutions have been suggested, including the use of iframes, Flash and JavaScript.[16]
In December 2009 Google announced it would be using the web search history of all its users in order to populate search results.[17]
Real-time-search was introduced in late 2009 in an attempt to make search results more timely and relevant. Historically site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[18] This new approach to search places importance on current, fresh and unique content.



Relationship with search engines

By 1997 search engines recognized that webmasters were making efforts to rank well in their search engines, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as Infoseek, adjusted their algorithms in an effort to prevent webmasters from manipulating rankings.[19]
Due to the high marketing value of targeted search results, there is potential for an adversarial relationship between search engines and SEO service providers. In 2005, an annual conference, AIRWeb, Adversarial Information Retrieval on the Web,[20] was created to discuss and minimize the damaging effects of aggressive web content providers.
SEO companies that employ overly aggressive techniques can get their client websites banned from the search results. In 2005, the Wall Street Journal reported on a company, Traffic Power, which allegedly used high-risk techniques and failed to disclose those risks to its clients.[21] Wired magazine reported that the same company sued blogger and SEO Aaron Wall for writing about the ban.[22] Google's Matt Cutts later confirmed that Google did in fact ban Traffic Power and some of its clients.[23]
Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, chats, and seminars. In fact, with the advent of paid inclusion, some search engines now have a vested interest in the health of the optimization community. Major search engines provide information and guidelines to help with site optimization.[24][25][26] Google has a Sitemaps program[27] to help webmasters learn if Google is having any problems indexing their website; it also provides data on Google traffic to the website. Google's guidelines are a list of suggested practices Google has provided as guidance to webmasters. Yahoo! Site Explorer provides a way for webmasters to submit URLs, determine how many pages are in the Yahoo! index and view link information.[28]


Methods

Getting indexed
The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search-engine-indexed pages do not need to be submitted because they are found automatically. Some search engines, notably Yahoo!, operate a paid submission service that guarantees crawling for either a set fee or cost per click.[29] Such programs usually guarantee inclusion in the database, but do not guarantee specific ranking within the search results.[30] Two major directories, the Yahoo! Directory and the Open Directory Project, both require manual submission and human editorial review.[31] Google offers Google Webmaster Tools, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that aren't discoverable by automatically following links.
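For instance, a minimal XML Sitemap for submission can be generated with a few lines of Python. The example.com URLs are placeholders; the urlset namespace is the standard one from the sitemaps.org protocol.

    from xml.sax.saxutils import escape

    def build_sitemap(urls):
        """Emit a minimal XML Sitemap (sitemaps.org protocol) listing
        the pages you want crawlers to find."""
        entries = "\n".join(
            f"  <url><loc>{escape(u)}</loc></url>" for u in urls
        )
        return (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n"
            "</urlset>"
        )

    # Hypothetical pages that aren't reachable by following links:
    print(build_sitemap(["http://example.com/", "http://example.com/archive"]))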
Search engine crawlers may look at a number of different factors when crawling a site. Not every page is indexed by the search engines. Distance of pages from the root directory of a site may also be a factor in whether or not pages get crawled.

Preventing crawling
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots. When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.
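As a sketch of how a polite crawler honors these rules, Python's standard urllib.robotparser can parse a site's robots.txt and answer per-URL questions. The example.com paths and the MyCrawler user agent are placeholders.

    from urllib.robotparser import RobotFileParser

    # robots.txt in the root directory is the first file a crawler reads.
    parser = RobotFileParser()
    parser.set_url("http://example.com/robots.txt")
    parser.read()

    # Ask whether a given user agent may crawl specific paths, e.g. the
    # shopping-cart and internal-search pages webmasters typically block:
    print(parser.can_fetch("MyCrawler", "http://example.com/cart"))
    print(parser.can_fetch("MyCrawler", "http://example.com/search?q=seo"))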

Increasing prominence
A variety of methods can increase the prominence of a webpage within the search results. Cross linking between pages of the same website, to provide more links to the most important pages, may improve its visibility.[35] Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic.[35] Adding relevant keywords to a web page's meta data, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL normalization of web pages accessible via multiple URLs, using the "canonical" meta tag[36] or via 301 redirects, can help make sure links to different versions of the URL all count towards the page's link popularity score. A sketch of that normalization idea follows.
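Here is a minimal sketch of URL normalization: collapse the common variants of the same page into one canonical form so inbound links are not split among them. The specific rules below are typical illustrative choices, not a standard (removesuffix needs Python 3.9+).

    from urllib.parse import urlsplit, urlunsplit

    def normalize(url):
        """Collapse common variants (scheme/host case, default port,
        trailing slash, fragment) so links to different spellings of
        the same page all count toward one canonical URL."""
        scheme, netloc, path, query, _fragment = urlsplit(url)
        netloc = netloc.lower().removesuffix(":80")
        path = path.rstrip("/") or "/"
        return urlunsplit((scheme.lower(), netloc, path, query, ""))

    # All three variants map to the same canonical URL:
    for u in ["HTTP://Example.com:80/page/",
              "http://example.com/page",
              "http://example.com/page#section"]:
        print(normalize(u))     # http://example.com/page each time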

White hat versus black hat
SEO techniques are classified by some into two broad categories: techniques that search engines recommend as part of good design, and those techniques that search engines do not approve of and attempt to minimize the effect of, referred to as spamdexing. Some industry commentators classify these methods, and the practitioners who employ them, as either white hat SEO, or black hat SEO.[37] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites will eventually be banned once the search engines discover what they are doing.[38]
An SEO tactic, technique or method is considered white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[24][25][26][39] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines, but about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see.
White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the spiders, rather than attempting to game the algorithm. White hat SEO is in many ways similar to web development that promotes accessibility,[40] although the two are not identical.
White hat SEO is merely effective marketing: making the effort to deliver quality content to an audience that has requested it. Traditional marketing has achieved this through transparency and exposure, and search engine algorithms such as Google's PageRank take this into account.
Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines, or involve deception. One black hat technique uses text that is hidden, either as text colored similar to the background, in an invisible div, or positioned off screen. Another method gives a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking.
Search engines may penalize sites they discover using black hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One infamous example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices.[41] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's list.

As a marketing strategy
SEO is not necessarily an appropriate strategy for every website, and other Internet marketing strategies can be much more effective, depending on the site operator's goals.[43] A successful Internet marketing campaign may drive organic traffic, achieved through optimization techniques and not paid advertising, to web pages, but it also may involve the use of paid advertising on search engines and other pages, building high quality web pages to engage and persuade, addressing technical issues that may keep search engines from crawling and indexing those sites, setting up analytics programs to enable site owners to measure their successes, and improving a site's conversion rate.[44]
SEO may generate a return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. (Some trading sites, such as eBay, are a special case: eBay announces how and when its ranking algorithm will change a few months before the change takes effect.) Due to this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.[45] It is considered wise business practice for website operators to free themselves from dependence on search engine traffic.[46] The top-ranked SEO blog Seomoz.org[47] has suggested that "search marketers, in a twist of irony, receive a very small share of their traffic from search engines." Instead, their main sources of traffic are links from other websites.

International markets
Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[49] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[50] As of 2006, Google had an 85-90% market share in Germany.[51] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[51] As of June 2008, the marketshare of Google in the UK was close to 90% according to Hitwise.[52] That market share is achieved in a number of countries.[53]
As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable markets where this is the case are China, Japan, South Korea, Russia and the Czech Republic where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders.
Successful search optimization for international markets may require professional translation of web pages, registration of a domain name with a top level domain in the target market, and web hosting that provides a local IP address. Otherwise, the fundamental elements of search optimization are essentially the same, regardless of language.

Legal precedents
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[54][55]
In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. Kinderstart's web site was removed from Google's index prior to the lawsuit and the amount of traffic to the site dropped by 70%. On March 16, 2007 the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend, and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.
