Monday, February 8, 2016

The Right Domain Name

I either make my domain names define themselves exactly or I think of a creative way to state their purpose. Your best bet is to think of a name that is an extension of who you are.

A short and easy to remember URL is friendly to word-of-mouth marketing. Would you feel better referring a friend to seobook.com or seo-search-engine-optimization-marketing-e-book-book.com? Which would be easier to remember and say? If you make a message hard to spread, then it will probably spread at a slower rate.

Oftentimes it is worth buying multiple domain names, even if you do not use them all. By securing multiple domain names, you can use some of your secondary domains to cover similar, thematically related topics AND prevent competitors from purchasing the names.

Within six months of my starting the SEOBook.com website, someone was already spamming me trying to sell me SEOBooks.com. I should have spent the additional $8 to register that domain from the start. You also may want to buy a generic name and the domain name that matches your business, and direct them both to the same location.


I do not recommend buying multiple domain names exclusively for deceptive practices. Many of my sites are about SEO, but you can break ideas down to their core and make useful sites in less competitive markets.

For example, one site I own is Search-Marketing.info. This site is similar to the contents of this e-book, although the site is somewhat dated. That site is not a well-branded name. I had many concepts on that site that later were extracted and made into their own sites:

  1. I had a blog at Search-Marketing.info; the blog was not successful. I moved the blog from that site to SeoBook.com, and it has likely become one of the top half dozen most popular blogs in the SEO industry.
  2. I had a directory list on Search-Marketing.info. I decided to turn that list into a directory of directories, and created that idea at DirectoryArchives.com.
  3. I listed some bad SEO practices on my Search-Marketing.info; I decided to turn that idea into BlackHatSEO.com.


Each of the last three sites occasionally spikes in popularity and helps give me a multi-brand approach. I would not be nearly as successful if I kept all of those ideas inside my first site.

Spinning out micro domains lets you try to be humorous or different without having as much impact on your root brand as saying and doing the same things on your main business site would.

There are also tactical business reasons for using multiple sites. For example, if what you are doing might get you sued, it may make sense to put it on a non-income-generating site to try to make it easier to get free legal help if the lawsuit is bogus.


Some businesses will require brand development to become successful. Being a branded SEO makes it far easier to charge a fair rate for my services than if I was unbranded. My original website (http://www.search-marketing.info) is really a weak brand and was a huge mistake.

I like the idea of creating things that I think add long-term value to the web, so I usually opt for branded names over generic names.

There is more than one way to skin a cat, and the same idea can be said for picking a domain name. If you aim to extract long-term profits and want to make the site you are working on become your career, then you want to pick a name that is not overly generic.

Before you pick a name or start building sites, you should decide what your goals are for the site. If you are unsure what type of site you want to make or why you want to make it, you may want to participate in web communities to find problems that need solutions and create a personal site until you find what you want to do.


Note: This is an advanced SEO technique that most webmasters do not need to use.

With so many pages on the web, quality will usually win over quantity. That being said, sometimes it will make sense to have multiple, similar websites covering slightly different topics. Doing this can help you create topically authoritative inbound links to different sites in your network and give you a multi-branded approach to marketing.

However, you want to make sure your sites are all different and unique. If your sites are extremely similar, then they may receive a spam penalty or have their nepotistic link popularity discounted. Even worse, if you interlink them all, all of your sites could be penalized at the same time.

Those using strong brands and good ideas can usually do well without creating a topical network. If you create a topical network expressly to deceive search engines, then you are taking a risk and your sites may get removed from the search indexes. In addition, some search engine relevancy algorithms, such as Google’s current algorithm, tend to favor one authoritative domain over using many smaller similar domains.

Many of the more aggressive techniques are used by people who create crash-and-burn domain names. They use a site until it gets penalized and then move on to a new one. They actually start building up multiple other sites and networks before the first one even gets penalized. If your brand and domain name are important to you, then use caution to protect them.

How you wrap/package/sell the content is important. Many blog networks seem to be able to get away with murder right now simply because they are called blog networks. Other publishers that have tried similar network approaches have been banned for it. Over time, how blogs are treated may change, though, and any way you slice it, you still need to get links from outside your network.

Keep the following in mind when developing a website network:

  1. Make unique sites. Make sure each site is unique enough that it can stand on its own merit.
  2. Only cross-link the sites where it is logical. Blogs that are part of a blog network might be considered legitimate cross-linking if it does not look like the linking was primarily done to spam the engines.
  3. Use various hosts. This way, if one of your sites goes down, not all of your sites are down. Also, some search algorithms can devalue links that come from sites hosted on the same C block (the same first three octets of the IP address). Some hosts also provide a random C-block IP address for each of your sites for a rather reasonable price on a single account.
  4. Get inbound links from external sources. Register your sites with directories and other topical sites to make sure you have plenty of inbound links into your link network. This will help prevent your sites from looking like an isolated island or link farm.
  5. Use various link sources. Each of your sites should have many unique link sources outside of your network.
  6. Do not interlink hundreds of domains together unless you are actively trying to get penalized.
  7. If you are creating and interlinking sites exclusively to manipulate search results, then you stand a good chance of eventually being penalized.
  8. You probably do not want to use the same WHOIS data on a large number of sites if the sites are made with deceptive intent. Additionally, you may want to register sites at a variety of registrars so that there is no discernible pattern. If you register a ton of your domains via proxy and cross-link them, that too can look somewhat suspicious.
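The C-block point in item 3 is easy to check mechanically. As a rough sketch (the IP addresses below are made-up examples, not real hosting data), two IPv4 addresses share a "C block" when their first three octets match, i.e., when they fall in the same /24 network:

```python
from ipaddress import IPv4Address, IPv4Network

def same_c_block(ip_a: str, ip_b: str) -> bool:
    """Return True if two IPv4 addresses fall in the same /24 ('C block')."""
    # strict=False lets us build the /24 network from a host address directly
    net_a = IPv4Network(f"{ip_a}/24", strict=False)
    return IPv4Address(ip_b) in net_a

# Links between sites hosted on the same /24 may be devalued:
print(same_c_block("64.233.160.5", "64.233.160.200"))  # True: same C block
print(same_c_block("64.233.160.5", "64.233.161.5"))    # False: different C block
```

Resolving each of your domains and running its IP through a check like this is a quick way to see how clustered your network looks from the outside.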



Have a Site for Each Language


Many hucksters sell some translational cross-submission products that are complete bunk. Essentially you pay them money to accept your check and nothing more. Search engines typically do not translate text on the fly when people search—it is too computationally expensive.

If you use automated software to copy your text into a different language, it is likely to read clumsily and turn people off. You are better off not having the text on the web if you do not have a person fluent in the language proof the final page copy.

If you have significant content and target audiences in different languages, then usually you will want a different subdomain or site for each language or major market. This will make it easier to get links from the different geographic or ethnic communities you are interested in without losing the focus of your site. It also makes it easy for engines to understand the clear divisions in your site.
  
Subdomains are best to use if you have a small amount of content about each market. If you have significant content for each market, it may make more sense to create a site for each market.

If your site is in an exceptionally competitive category and you have many links that would be hard to replicate, then you might want to use folders (or possibly subdomains) on your main site for each language instead of trying to build up the link popularity of many different sites.

U.K. English is much different from English in the United States. Even within the same language you may need multiple versions to cater to different dialects, customs, and tastes.

If your main site is a thin e-commerce site, or if it lacks authority, it might make sense to keep most of your authoritative content together with the low authority content. But if your site has great authority, then using a subdomain for some of your content should allow you to get more listings in the search results.

An advantage of placing some of your content on a subdomain is that if you have a strong brand and the subdomain is also authoritative, then when people search for your brand some search engines will feature both your domain and the subdomain at the top of the search results, and searchers will have to scroll a long way down to find competing sites or sites that are critical of your brand.

Any more than about two or three words in a domain name and it becomes less memorable. Some of the most memorable websites do a great job of branding by creating their own word: eBay, Skype, PayPal, Yahoo!, Expedia, Slashdot, Fark, Travelocity, Google…

However, most successful businesses are soft innovations; they may not be able to afford the time, money, and effort required to create, brand, and add a new word to our language. You can create a name that is well-related to something people already know. It is easier to market corn sugar than it is to market fructose.

If you are not going to develop a strong brand, then using keywords in your domain name may give you a competitive advantage in search results. Having your keywords in your domain name can increase click-through rates on search engine listings and paid ads as well as make it easier to get keyword rich descriptive inbound links.

If your brand is exceptionally strong and your content quality is second to none you still can rank well in search results after enough related resources reference your site, even if most references do not mention the keywords you want to rank for. Google’s search algorithms have moved toward pushing natural authority sites even if they do not have much relevant anchor text.
  
A keyword-rich domain name will make it easy to get inbound link text with your primary keywords in them, but don’t forget that your domain name also plays a role in your branding. Your domain name should have your branding in mind as it can help reinforce the ideals of your brand.

On the web there are many different business ideas or business models. If low cost is your business model, then you will find people who are willing to work for half your wage that will slash throats to get by on razor thin margins. It is not a way to enjoy life.

Someone can always do your job cheaper. For example, Google turned labeling images into a game. Now thousands of people are labeling images, for free, to improve Google's image search relevancy. You can take a look at this process by visiting http://images.google.com/imagelabeler/.

Branding is one of the most important parts of building any website or web-based business, and it is what allows you to establish healthy profit margins. Every Monday, Rob Frankel holds free branding clinics on his website. I recommend going to at least one of them and asking a question or two. He also wrote a great branding book by the name of The Revenge of Brand X.


The site is not seriously launched yet, but a cool domain name for a site about sleeping might be something like LikeABaby.com. Using a creative name makes it easier to build a memorable brand than just focusing on keyword phrases.


A guy I met did not have a large marketing budget, but wanted to market a video clip idea. I thought that it would be a great idea to use the viral nature of blogs to market the initial product (i.e., let the bloggers market the product for us by word-of-mouth, because word-of-mouth spreads like wildfire). I came up with the name BlogFlix.net. The site later went under after some technical errors, but within a few months of being finished, it was featured on popular sites such as Smart Mobs.


Choosing a Domain Name


Many web-based businesses fail because they do not have a functional business model. Before you even choose a name for your site you should know your target audience, what you intend to sell to them, and what will make your business idea unique or different than everything else that is already on the market.

You can still make significant profits without being sure what you want to sell if you can solve large problems and make life easier for a group of targeted people. At any level, you still have to know your goals and the reasons why you are creating a site. What makes your site different than the millions of sites already published?


My first few sites failed because they had no functional business models. They added little value to the web. That is not to say that I didn't learn from them, because I did, but they led me to creating this one. If you are uncertain of yourself, don't be afraid to create multiple channels just to try them out. If you know you want to do something for the long run, you may want to spend a bit of time watching the marketplace before pouring too much effort into making a huge site that is hard to update.

SEOBook.com has done exceptionally well, and it has even revived the value and business models of some of my other sites. The single biggest thing I have going for me is the social currency my blog has created.

Some people think it is incredibly important to have keywords in a domain. People purchase domains like look-4-buy-cheap-discount-viagra-online-pharmacy.com. This is a horrible domain name!

An exact matching domain name in a competitive market can be seen by Google as a signal of quality since acquiring one would either indicate that you were early to the market or paid a premium for the URL. Some strong .com names sell for millions of dollars, while associated .net and .org domains can range from a few hundred dollars to a hundred thousand dollars.

If your site exhibits a number of strong brand related characteristics, such as high search volume for brand related keywords, high clickthrough rate for core brand related searches, many repeat visitors, or relevant matching URL and/or anchor text, there is a good chance that Google will place a set of Sitelinks in the search results for brand related search queries.

Keywords in the domain name may help some (as people tend to link to websites using their official names as the link text), but if I were going to create a long-term business, I would put brand above keyword-rich, unless you can find a name that exactly matches your core keywords or one that lets you leverage both assets.

If you are creative, you usually can get keywords in the domain while keeping it short, memorable, fairly brandable, and free of hyphens.

Here is an article about the effect of domain names on anchor text (http://www.search-marketing.info/newsletter/articles/domain-name.htm). Since I originally wrote the article, Google has gotten a lot better at detecting natural linkage data, so it is important to ensure you mix your inbound link text. Tips on mixing anchor text are located in the link building sections of this e-book.


Direct marketing mail campaigns usually peak in effectiveness around the third exposure to a marketing message. Many shoppers look around. If you want them to come back, you want to have a domain name that will stick in their heads. It can have keywords in it, but the thing you want more than anything else is a name that sticks.

If you can choose between a domain with a dash and one without, you are probably better off going without the dash; it looks more professional and is most likely more memorable.

It is branding suicide to only have users find your site via search engines. If you are hoping to make sales on the first view in search engines, you need strong copywriting and usability.

If you are just using quick-buck-lead-generation websites then you may want to use a keyword-rich hyphenated domain for the small benefit it may offer, but in most cases, I do not recommend a hyphenated domain name for long-term websites.


People will forget the words in a domain name that is exceptionally long. Another problem with exceptionally long URLs is that they get cut off in e-mails and some other data transmission types. If you make the idea hard to spread, then you limit your site’s potential income.


What if you could get some of the benefits of a long keyword rich domain while still using a shorter and easy to brand domain name? You can!

If you use a short, branded domain, you can still include your keywords in your page title and logo to help control how some people link to your site. For example, a company like PayPal can register the domain PayPal.com but put the words payment solutions or online payments in their logo near the word PayPal. Some people will reference them with those words as part of the company name.

Search engines associate words that occur near one another. For example, Google showed the following as suggested advertisements.

On the far right notice how Google realized that my name and the name of my strongest brand are related.

If you can work your name or your company name into the default topical vocabulary of a search, you will have a strong advantage over your competitors.
  
Some regional-based search engines or indexes will only list sites that are registered in their country code. If your site exclusively or primarily caters to a specific country, then you will most likely want to register a domain using the local country code.

Some search engines will still show your site in regional based search results if your site is hosted within that country, has links to and from other local sites, and/or has your address and phone number in the page text, but many directories are extremely picky and will only list regional domains.

As search progresses, localization of results will become more common. Some of the major search engines already give sites a regional ranking boost based upon where the site is hosted and the domain extension.

If you develop a regional domain (.co.uk, for example), I also suggest buying the .com version of your domain, if it is available, and forwarding it to the regional domain you registered. By buying the .com version and forwarding it to your site, you retain traffic you might otherwise lose when people forget to include your region-specific domain extension as they type the website address directly into the address bar.
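The forwarding described above amounts to a permanent (301) redirect from the .com alias to the regional domain. A minimal sketch of the routing decision, assuming the hypothetical hosts example.com and example.co.uk:

```python
CANONICAL_HOST = "example.co.uk"                    # hypothetical regional domain
ALIAS_HOSTS = {"example.com", "www.example.com"}    # hypothetical .com aliases

def redirect_target(requested_host: str) -> tuple:
    """Return (status, host): a permanent 301 to the regional domain for any alias."""
    if requested_host in ALIAS_HOSTS:
        return (301, CANONICAL_HOST)
    # The canonical host (or anything else) is served normally
    return (200, requested_host)

print(redirect_target("example.com"))    # (301, 'example.co.uk')
print(redirect_target("example.co.uk"))  # (200, 'example.co.uk')
```

In practice you would configure the same rule at your registrar or web server; using a 301 (rather than a 302) tells engines the .com is an alias, not a separate site.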

I prefer to use a .com version of a URL over other generic TLDs. People may assume your site is a .com even if it is .net, .biz, .org, or .info when they go to look for it on the web. If all they remember is your domain name, they may type your domain name followed by ‘.com’ because ‘.com’ is the default TLD in most people’s heads. If you don’t own the .com version as well, you are giving some of your hard-earned traffic to a competitor.

If you are running a charity or organizational website a .org may be seen by some people as a sign of credibility.

It is a good idea to place your business location on your web pages. If you are in a country where the search technology is primitive, local searchers will frequently add the country or city name to their searches, and if you have them on your pages you stand to be returned as a relevant result for more searches.



SEO Feedback Loop


The effects of SEO do take time to kick in. At any given time, considering how dynamically the web changes, there will be some holes in search algorithms that make certain SEO techniques exceptionally effective.


I have spoken with current search engine engineers at major search engines regarding this e-book. I also have spoken with database programmers who later became some of the world's most technically advanced SEOs. Some of these programmers have told me what some would consider tricks that work really well, but they only work really well because few people know about them.

I do not try to promote the latest search spamming techniques in this e-book for the following reasons:

  1. They are the most likely to quickly change. Some things that are cutting-edge and effective today can become ineffective and actually hurt you tomorrow.
  2. Some of them can be damaging to your brand.
  3. Aggressive techniques are some of the most likely techniques to get your site banned.
  4. Some things are told to me in secret, and if they are made openly available to everyone (including search engine engineers, some of whom have read this e-book), then they lose their value, and I lose my friends and resources.
  5. I do not have a lot of experience with exceptionally aggressive promotional techniques, as I have not needed them to rank well in most of the markets I have worked in.
  6. People who use aggressive techniques are not evil or bad, but I cannot possibly put accurate, current, useful, and risky information out to everyone in an e-book format and expect it to not cause problems for some people.
  7. To me, effective web promotion is balancing risk versus reward. SEOBook.com got on the first page of Google for SEO within nine months of making the site, with less than $5,000 spent on promotion. Most sites do not need to use overly aggressive and risky promotional techniques. SEO works so well because most sites on the web do not actively practice effective SEO.

Meta Tags 2016

Meta tags were used to help search engines organize the Web. Documents listed keywords and descriptions that were used to match user queries. Initially these tags were somewhat effective, but over time, marketers exploited them and they lost their relevancy.

People began to stuff incredibly large amounts of data (which was frequently off topic) into these tags to achieve high search engine rankings. Porn and other high-margin websites published meta tags like “free, free, free, free, Disney, free.” Getting a better ranking simply meant you repeated your keywords a few more times in the meta tags.
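Early engines read those stuffed tags more or less verbatim. A small sketch using Python's standard html.parser shows how a crawler of that era could extract them (the page snippet is a made-up example in the spirit of the "free, free, free, Disney, free" tags above):

```python
from html.parser import HTMLParser

class MetaTagParser(HTMLParser):
    """Collect <meta name=... content=...> pairs the way early engines read them."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if "name" in d and "content" in d:
                self.meta[d["name"].lower()] = d["content"]

page = ('<html><head>'
        '<meta name="keywords" content="free, free, free, Disney, free">'
        '<meta name="description" content="Totally free stuff!">'
        '</head></html>')
p = MetaTagParser()
p.feed(page)
print(p.meta["keywords"])  # the stuffed keyword list the engine trusted at face value
```

Since nothing validated the tags against the visible page, repeating a keyword a few more times was all it took to move up, which is exactly why the signal collapsed.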


It did not help anything that during the first Web bubble stocks were based on eyeballs, not profits. That meant that people were busy trying to buy any type of exposure they could, which ended up making it exceptionally profitable to spam search engines to show off topic random banners on websites.

The Internet bubble burst. What caused such a fast economic recovery was the shift from selling untargeted ad impressions to selling targeted leads. This meant that webmasters lost much of their incentive for trying to get any kind of traffic they could. Suddenly it made far greater sense to try to get niche-targeted traffic.

In 1998, Overture pioneered the pay-per-click business model that almost all major search engines now rely on. Google AdWords enhanced the model by adding a few more variables to the equation, the most important being ad click-through rate (CTR) as a factor in the ad ranking algorithm.

Google extended the targeted advertisement marketing by delivering relevant contextual advertisements on publisher websites via the Google AdSense program.

More and more ad spending is coming online because it is easy to track the return on investment. As search algorithms continue to improve, the value of having well-cited, original, useful content increases daily.


Instead of relying exclusively on page titles and meta tags, search engines now index the entire page contents. Since search engines can view entire pages, hidden inputs (such as meta tags) have lost much of their importance in relevancy algorithms.


The best way for search engines to provide relevant results is to emulate a user and rank the page based on the same things users see and do (Do users like this website? Do they quickly hit the back button?), and on what other people are saying about the document (For example, does anybody link to this page or site? Who is linking to it? What is the link text? And so on.).

  
Search engines make billions of dollars each year selling ads. Most search engine traffic goes to the free, organically listed sites. The ratio of traffic distribution is going to be keyword dependent and search engine dependent, but I believe about 85% of Google’s traffic clicks on the organic listings. Most other search engines display ads a bit more aggressively than Google does. In many of those search engines, organic listings get around 70% of the traffic. Some sites rank well on merit, while others are there due exclusively to ranking manipulation.

In many situations, a proper SEO campaign can provide a much greater ROI than paid ads do. This means that while search engine optimizers—known in the industry as SEOs—and search engines have business models that may overlap, they may also compete with one another for ad dollars. Sometimes SEOs and search engines are friends with each other, and, unfortunately, sometimes they are enemies.

When search engines return relevant results, they get to deliver more ads. When their results are not relevant, they lose market share. Beyond relevancy, some search engines also try to bias the search results to informational sites such that commercial sites are forced into buying ads.

I have had a single page that I have not actively promoted randomly send me commission checks for over $1,000. There is a huge sum of money in manipulating search results. There are ways to improve search engine placement that go with the goals of the search engines, and there are also ways that go against them. Quality SEOs aim to be relevant, whether or not they follow search guidelines.

Many effective SEO techniques may be considered somewhat spammy. Like anything in life, you should make an informed decision about which SEO techniques you want to use and which ones you do not (and the odds are, you care about learning the difference, or you wouldn't be reading this).

You may choose to use highly aggressive, “crash and burn” techniques, or slower, more predictable, less risky techniques. Most industries will not require extremely aggressive promotional techniques. Later on I will try to point out which techniques are which.


Some sites are penalized for using overtly deceptive techniques. In any business, SEO included, there will be different risk levels.

Search engines try hard not to flag false positives (labeling good sites as spam), so there is usually a fair amount of slack to play with, but many people also make common mistakes, like incorrectly using a 302 redirect, not using specific page titles on their pages, or allowing spiders to index multiple URLs with the same content. If you are ever in doubt about whether you are making technical errors, feel free to search a few SEO forums or ask me.
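The duplicate-content mistake in particular, letting spiders index several URLs for the same page, can be reduced by normalizing every URL to a single canonical form. A minimal sketch (the normalization rules here are illustrative, not exhaustive, and the domain is a made-up example):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url: str) -> str:
    """Collapse common duplicate-content variants of a URL into one canonical form."""
    scheme, netloc, path, query, _ = urlsplit(url)
    netloc = netloc.lower()                  # hostnames are case-insensitive
    if netloc.startswith("www."):
        netloc = netloc[4:]                  # pick one: bare domain over www
    for index_name in ("index.html", "index.htm", "index.php"):
        if path.endswith(index_name):
            path = path[: -len(index_name)]  # /index.html and / are the same page
    if not path:
        path = "/"
    return urlunsplit((scheme, netloc, path, query, ""))  # drop fragments

print(canonical_url("http://www.Example.com/index.html"))  # http://example.com/
```

Linking internally to only the canonical form, and 301-redirecting the variants to it, keeps spiders from splitting one page's link popularity across several URLs.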

The search engines aim to emulate users. If you design good content for users and build a smart linking campaign, eventually it will pay off.

New aggressive techniques pop up all the time. As long as they are available, people will exploit them. People will force the issue until search engines close the loophole, and then people will find a new one. The competitive nature of web marketing forces search engines to continuously improve their algorithms and filters.

In my opinion, the ongoing effort of keeping up with the latest SEO tricks is usually not worth it for most webmasters. Some relational database programmers and people with creative or analytical minds may always be one step ahead, but the average business owner probably does not have the time to dedicate to keeping up with the latest tricks.

Tying ethics to SEO techniques is a marketing scam. Either a technique is effective, or it is not. There is nothing unethical about being aggressive. You probably do not want to take big risks with domains you cannot afford to have blacklisted, but there is nothing wrong with owning a few test sites.


Some sites that are not aggressively promoted still fall out of favor on occasion. Even as a webmaster following Google's guidelines, you cannot expect Google to owe you free traffic. You have to earn it by making others cite your website.


Origins of the Web

The Web started off behind the idea of the free flow of information as envisioned by Tim Berners-Lee. He was working at CERN in Europe. CERN had a somewhat web-like environment in that many people were coming and going and worked on many different projects.

Tim created a site that described how the Web worked and placed it live on the first server at info.cern.ch. Europe had very little backing or interest in the Web back then, so U.S. colleges were the first groups to set up servers. Tim added links to their server locations from his directory known as the Virtual Library.

Current link popularity measurements usually show that college web pages have higher value than most other pages do. This is simply a function of the following:

  1. The roots of the WWW started in lab rooms at colleges. It was not until the mid-to-late 1990s that the Web became commercialized.
  2. The web contains self-reinforcing social networks.
  3. Universities are pushed as sources of authority.
  4. Universities are heavily funded.
  5. Universities have quality controls on much of their content.



The Web did not have sophisticated search engines when it began. The most advanced information gatherers of the day primitively matched file names. You had to know the name of the file you were looking for to find anything. The first file that matched was returned. There was no such thing as search relevancy. It was this lack of relevancy that led to the early popularity of directories such as Yahoo!.

Many search engines such as AltaVista, and later Inktomi, were industry leaders for a period of time, but the rush to market and lack of sophistication associated with search or online marketing prevented these primitive machines from having functional business models.

Overture was launched as a pay-per-click search engine in 1998. While the Overture system (now known as Yahoo! Search Marketing) was profitable, most portals were still losing money. The targeted ads they delivered grew in popularity and finally created a functional profit generating business model for large-scale general search engines.


As the Internet grew in popularity, people realized it was an incredibly cheap marketing platform. Compare the price of spam (virtually free) to direct mail (~$1 each). Spam fills your inbox and wastes your time.

Information retrieval systems (search engines) must also fight off aggressive marketing techniques to keep their search results relevant. Search engines market their problems as spam, but the problem is that they need to improve their algorithms.

It is the job of search engines to filter through the junk to find and return relevant results.

There will always be someone out there trying to make a quick buck. Who can fault some marketers for trying to find holes in parasitic search systems that leverage others’ content without giving any kickback?


Though I hate to quote a source I do not remember, I once read that one in three people believe the top search result is the most relevant document relating to their search. Imagine the power associated with people finding your view of the world first. Whatever you are selling, someone is buying!


Sometimes good things happen to you and sometimes the competition gets lucky. Generally the harder you work, and the more original and useful your site is, the more often you will get lucky.


As easy as it is to get syndicated with useful, interesting, and unique information, it is much harder to get syndicated with commercial ideas, especially if the site does not add significant value to a transaction. Oftentimes, links associated with commercial sites are business partnerships.

Many people do well to give information away and then attach a product to their business model. You probably would have never read this e-book if I did not have a blog associated with it. On the same note, it would also be significantly easier for me to build links to SEOBook.com if I did not sell this e-book on it.

Depending on your skills, faults, and business model, it is sometimes best to make one site your official voice and sell on another, or to add the commercial elements only after the site has gained notoriety and trust. Without knowing you, it is hard to advise which road to take, but if you build value before trying to extract profits, you will do better than if you do it the other way around.


If my site were positioned as being about search and I wrote an e-book or book about power searching, it would be far easier for me to get links than by running a site about SEO. For many reasons, the concept of SEO is hated in many circles. The concept of search is much easier to link to.

Sometimes by broadening, narrowing, or shifting your topic it becomes far easier for people to reference you.


As the Web grew, content grew faster than technology did. The primitive nature of search engines promoted the creation of content, but not the creation of quality content. Search engines had to rely on the documents themselves to state their purpose. Most early search engines did not even use the full page content, relying instead on the page title and document name to match results. Then meta tags came along.

Search Interface


The search algorithm and search interface are used to find the most relevant document in the index based on the search query. First the search engine tries to determine user intent by looking at the words the searcher typed in.

These terms can be stripped down to their root level (e.g., dropping ing and other suffixes) and checked against a lexical database to see what concepts they represent. Terms that are a near match can help you rank for closely related terms. For example, using the word swims could help you rank well for swim or swimming.
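As a rough illustration of suffix stripping, here is a minimal sketch in Python. This is not the real Porter stemmer (which applies ordered rules with measure conditions); it only trims a few common suffixes and undoes consonant doubling, which is enough to reproduce the swims/swimming example above.

```python
def naive_stem(word):
    """Crude suffix stripping to illustrate stemming.
    Only handles a few common English suffixes."""
    word = word.lower()
    for suffix in ("ing", "ed", "es", "s"):
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            word = word[: -len(suffix)]
            # undo consonant doubling, e.g. "swimming" -> "swimm" -> "swim"
            if len(word) >= 2 and word[-1] == word[-2] and word[-1] not in "aeiou":
                word = word[:-1]
            break
    return word

print(naive_stem("swims"))     # swim
print(naive_stem("swimming"))  # swim
print(naive_stem("swim"))      # swim
```

All three forms reduce to the same root, which is how a search engine can treat them as one concept.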

Search engines can try to match keyword vectors with each of the specific terms in a query. If the search terms occur near each other frequently, the search engine may understand the phrase as a single unit and return documents related to that phrase.


WordNet is the most popular lexical database. At the end of this chapter there is a link to a Porter Stemmer tool if you need help conceptualizing how stemming works.


Some search engines, such as Google and Yahoo!, have toolbars and systems like Google Search History and My Yahoo!, which collect information about a user. Search engines can also look at recent searches, or what the search process was for similar users, to help determine what concepts a searcher is looking for and what documents are most relevant for the user’s needs.

As people use such a system it takes time to build up a search query history and a click-through profile. That profile could eventually be trusted and used to

  1. aid in search personalization;
  2. collect user feedback to determine how well an algorithm is working; and
  3. help search engines determine whether a document is of decent quality (e.g., if many users visit a document and then immediately hit the back button, the search engines may not continue to score that document well for that query).
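The back-button signal in item 3 can be sketched as a simple "quick bounce" rate over a click log. The log format and field names here are invented for illustration; they are not any search engine's actual schema.

```python
def bounce_rate(click_log, doc_id, quick_seconds=5):
    """Share of clicks on doc_id where the user returned to the
    results page within quick_seconds. A high value could signal
    that the page did not satisfy the query."""
    clicks = [c for c in click_log if c["doc"] == doc_id]
    if not clicks:
        return 0.0
    quick = sum(1 for c in clicks
                if c.get("returned_after") is not None
                and c["returned_after"] <= quick_seconds)
    return quick / len(clicks)

log = [
    {"doc": "a", "returned_after": 2},
    {"doc": "a", "returned_after": None},   # user stayed on the page
    {"doc": "a", "returned_after": 60},
    {"doc": "b", "returned_after": 1},
]
print(round(bounce_rate(log, "a"), 2))  # 0.33
```

A ranking system could demote documents whose bounce rate stays high for a query, exactly the behavior the MSN engineers describe below.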


I have spoken with some MSN search engineers and examined a video about MSN search. Both experiences strongly indicated a belief in the importance of user acceptance. If a high-ranked page never gets clicked on, or if people typically quickly press the back button, that page may get demoted in the search results for that query (and possibly related search queries). In some cases, that may also flag a page or website for manual review.

As people give search engines more feedback and as search engines collect a larger corpus of data, it will become much harder to rank well using only links. The more satisfied users are with your site, the better your site will do as search algorithms continue to advance.

Real-Time versus Prior-to-Query Calculations

In most major search engines, a portion of the relevancy calculations are stored ahead of time. Some of them are calculated in real time.

Some processes that are computationally expensive and slow, such as calculating overall inter-connectivity (Google calls this PageRank), are done ahead of time.
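To see why this calculation is done offline, here is a minimal power-iteration sketch of the published PageRank idea over a tiny link graph. Every iteration touches the entire graph, which is why it cannot be recomputed per query at web scale. This is a textbook simplification, not Google's production system.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iterative PageRank over a link graph given as
    {page: [pages it links to]}."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = rank[p] / len(outs)
                for q in outs:
                    new[q] += damping * share
            else:  # dangling page: spread its rank everywhere
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
```

In this toy graph, page "c" ends up with the highest rank because both "a" and "b" link to it, and the ranks sum to 1 across the graph.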

Many search engines have different data centers, and when updates occur, they roll from one data center to the next. Data centers are placed throughout the world to minimize network lag time. Assuming it is not overloaded or down for maintenance, you will usually get search results from the data centers nearest you. If those data centers are down or if they are experiencing heavy load, your search query might be routed to a different data center.


Search engines such as Google and Yahoo! may update their algorithm dozens of times per month. When you see rapid changes in your rankings, it is usually due to an algorithmic shift, a search index update, or something else outside of your control. SEO is a marathon, not a sprint, and some of the effects take a while to kick in.

Usually, if you change something on a page, it is not reflected in the search results that same day. Linkage data also may take a while to have an effect on search relevancy as search engines need to find the new links before they can evaluate them, and some search algorithms may trust links more as the links age.

The key to SEO is to remember that rankings are always changing, but the more you build legitimate signals of trust and quality, the more often you will come out on top.


The more times a search leads to desired content, the more likely a person is to use that search engine again. If a search engine works well, a person does not just come back, they also tell their friends about it, and they may even download the associated toolbar. The goal of all major search engines is to be relevant. If they are not, they will fade (as many already have).


Search engines make money when people click on the sponsored advertisements. In the search result below you will notice that both Viagra and Levitra are bidding on the term Viagra. The area off to the right displays sponsored advertisements for the term Viagra. Google gets paid whenever a searcher clicks on any of the sponsored listings.

The white area off to the left displays the organic (free) search results. Google does not get paid when people click on these. Google hopes to make it hard for search engine optimizers (like you and me) to manipulate these results, both to keep relevancy as high as possible and to encourage people to buy ads.

Later in this e-book we will discuss both organic optimization and pay-per-click marketing.


