Meta Tags 2016
Meta tags were used to help search engines organize the
Web. Documents listed keywords and descriptions that were used to match user
queries. Initially these tags were somewhat effective, but over time, marketers
exploited them and they lost their relevancy.
People began to stuff incredibly large amounts of data
(which was frequently off topic) into these tags to achieve high search engine
rankings. Porn and other high-margin websites published meta tags like “free,
free, free, free, Disney, free.” Getting a better ranking simply meant you
repeated your keywords a few more times in the meta tags.
It did not help that, during the first Web bubble, stocks were
valued on eyeballs rather than profits. People were busy buying any type of
exposure they could, which made it exceptionally profitable to spam search
engines in order to show off-topic banners on random websites.
The Internet bubble burst. What drove the quick economic
recovery was the shift from selling untargeted ad impressions to selling
targeted leads. Webmasters lost much of their incentive to chase any kind of
traffic they could get; suddenly it made far greater sense to pursue
niche-targeted traffic.
In 1998, Overture pioneered the pay-per-click business
model that nearly all major search engines rely on. Google AdWords enhanced the
model by adding a few more variables to the equation, the most important being
the factoring of ad click-through rate (CTR) into the ad ranking algorithm.
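The effect of factoring CTR into ad ranking can be sketched with a toy calculation. The bids, CTR values, and the simple bid-times-CTR score below are illustrative assumptions, not the actual AdWords formula, which uses many more signals:

```python
# Two hypothetical ads competing for the same keyword.
ads = [
    {"advertiser": "A", "bid": 2.00, "ctr": 0.01},  # high bid, rarely clicked
    {"advertiser": "B", "bid": 1.00, "ctr": 0.05},  # lower bid, clicked often
]

# Score each ad by bid * expected CTR (a simplification of the early
# AdWords model: the engine earns per click, so expected revenue per
# impression is what matters, not the raw bid).
for ad in ads:
    ad["rank_score"] = ad["bid"] * ad["ctr"]

ranked = sorted(ads, key=lambda ad: ad["rank_score"], reverse=True)
print([ad["advertiser"] for ad in ranked])  # B outranks A despite the lower bid
```

The point of the design is that a compelling, relevant ad can beat a bigger budget, which aligns the advertiser's incentive to write relevant ads with the engine's incentive to maximize revenue per impression.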
Google extended targeted advertising by
delivering relevant contextual advertisements on publisher websites via the
Google AdSense program.
More and more ad spending is coming online because it is
easy to track the return on investment. As search algorithms continue to
improve, the value of having well-cited, original, useful content increases
daily.
Instead of relying exclusively on page titles and meta
tags, search engines now index the entire page contents. Since search engines
can view entire pages, hidden inputs (such as meta tags) have
lost much of their importance in relevancy algorithms.
The best way for search engines to provide relevant
results is to emulate a user and rank the page based on what users see and do
(Do users like this website? Do they quickly hit the back button?) and on what
other people are saying about the document (For example, does anybody link to
this page or site? Who is linking to it? What is the link text? And so on.).
Search engines
make billions of dollars each year selling ads. Most search engine traffic goes
to the free, organically listed sites. The ratio of traffic distribution is
going to be keyword dependent and search engine dependent, but I believe about
85% of Google’s traffic clicks on the organic listings. Most other search
engines display ads a bit more aggressively than Google does. In many of those
search engines, organic listings get around 70% of the traffic. Some sites rank
well on merit, while others are there due exclusively to ranking manipulation.
In many
situations, a proper SEO campaign can provide a much greater ROI than paid ads
do. This means that while search engine optimizers—known in the industry as
SEOs—and search engines have business models that may overlap, they may also
compete with one another for ad dollars. Sometimes SEOs and search engines are
friends with each other, and, unfortunately, sometimes they are enemies.
When search
engines return relevant results, they get to deliver more ads. When their
results are not relevant, they lose market share. Beyond relevancy, some search
engines also try to bias the search results to informational sites such that
commercial sites are forced into buying ads.
I have had a
single page that I have not actively promoted randomly send me commission
checks for over $1,000. There is a huge sum of money in manipulating search
results. There are ways to improve search engine placement that align with the
goals of the search engines, and there are ways that work against them. Quality SEOs aim to be relevant,
whether or not they follow search guidelines.
Many effective
SEO techniques may be considered somewhat spammy.
Like anything
in life, you should make an informed decision about which SEO techniques you
want to use and which ones you do not (and the odds are, you care about
learning the difference, or you wouldn’t be reading this).
You may choose
to use highly aggressive, “crash and burn” techniques, or slower, more
predictable, less risky techniques. Most industries will not require extremely
aggressive promotional techniques. Later on I will try to point out which
techniques are which.
Some go as far as
using overtly deceptive techniques. In any business such as SEO, there will be
different risk levels.
Search engines
try hard to avoid false positives (labeling good sites as spam), so there is
usually some slack to play with. But many people make common
mistakes, like incorrectly using a 302 redirect, not using specific page
titles on their pages, or allowing spiders to index multiple URLs with the same
content. If you are ever in doubt whether you are making technical errors, feel free
to search a few SEO forums or ask me.
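One of those mistakes, letting spiders index many URLs that all serve the same content, can be mitigated by normalizing URLs to a single canonical form. A minimal sketch in Python follows; the specific rules (stripping "www.", dropping the query string, trimming the trailing slash) are illustrative assumptions, and a real site should choose rules that match how its URLs actually vary:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Collapse common duplicate-URL variants into one canonical form."""
    parts = urlsplit(url.lower())
    # Strip a leading "www." so www and bare-domain URLs match.
    host = parts.netloc[4:] if parts.netloc.startswith("www.") else parts.netloc
    # Trim the trailing slash (but keep "/" for the site root).
    path = parts.path.rstrip("/") or "/"
    # Drop the query string and fragment entirely (assumes they never
    # change the page content, e.g. session IDs or tracking parameters).
    return urlunsplit((parts.scheme, host, path, "", ""))

urls = [
    "http://example.com/page/",
    "http://www.example.com/page?sessionid=42",
]
print({canonicalize(u) for u in urls})  # both collapse to one canonical URL
```

In practice the same idea is applied server-side with a 301 redirect from every variant to the canonical URL, so spiders consolidate the duplicates instead of splitting their ranking signals across them.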
The
search engines aim to emulate users. If you design good content for users and build a smart linking campaign, eventually
it will pay off.
New aggressive
techniques pop up all the time. As long as they are available, people will
exploit them. People will force the issue until search engines close the
loophole, and then people will find a new one. The competitive nature of web
marketing forces search engines to continuously improve their algorithms and
filters.
In my opinion,
the ongoing effort of keeping up with the latest SEO tricks is usually not
worth it for most webmasters. Some relational database programmers and people
with creative or analytical minds may always be one step ahead, but the average
business owner probably does not have the time to dedicate to keeping up with
the latest tricks.
Tying ethics
to SEO techniques is a marketing scam. Either a technique is effective, or it
is not. There is nothing unethical about being aggressive. You probably do not
want to take big risks with domains you cannot afford to have blacklisted, but
there is nothing wrong with owning a few test sites.
Some sites
that are not aggressively promoted still fall out of favor on occasion. Even as a
webmaster following Google's guidelines, you cannot expect Google to owe
you free traffic. You have to earn it by getting others to cite your website.