Predicting Search Engine Algorithm Changes
By: John Metzler
With moderate
search engine optimization knowledge, some common sense, and a resourceful
and imaginative mind, one can keep his or her web site in good standing with
search engines even through the most significant algorithm changes. The
recent Google update of October/November 2005, dubbed "Jagger", is what
inspired me to write this, as I saw some web sites that previously ranked in
the top 20 results for extremely competitive keywords suddenly drop down to
the 70th page. Yes, the ebb and flow of search engine rankings is nothing to
write home about, but when a web site doesn't regain many ranking spots
after such a drop, it can tell us that the SEO done on the site may have had
some long-term flaws. In this case, the SEO team had not done a good job
predicting the direction a search engine would take with its algorithm.
Impossible to predict, you say? Not quite. The ideas behind Google's
algorithm come from the minds of fellow humans, not supercomputers. I'm not
suggesting that it's easy to "crack the code," so to speak, because the actual
math behind it is extremely complicated. However, it is possible to
understand the general direction a search engine algorithm will take by
keeping in mind that any component of SEO that can be manipulated to an
abnormal extent will eventually be weighted less and finally rendered
obsolete.
One of the first such areas of a web site that started to get abused by
webmasters trying to raise their rankings was the keywords meta tag. The tag
allows a webmaster to list the web site's most important keywords so the
search engine knows when to display that site as a result for a matching
search. It was only a matter of time until people started stuffing the tag
with irrelevant words that were searched for more frequently than relevant
words in an attempt to fool the algorithm. And they did fool it, but not for
long. The keywords meta tag was identified as an area that was too
susceptible to misuse and was subsequently devalued to the point where the
Google algorithm today doesn't even recognize it when scanning a web page.
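For readers who never used it, the tag was a single line of HTML in a page's
head section. Here is a small Python sketch, using only the standard library,
of how mechanically its contents can be read (and, by extension, stuffed);
the sample markup, including the irrelevant "cheap tickets" entry, is
invented for illustration:

    # Sketch: reading the keywords meta tag with Python's standard-library
    # HTML parser. The sample page markup below is hypothetical.
    from html.parser import HTMLParser

    class KeywordsMetaExtractor(HTMLParser):
        def __init__(self):
            super().__init__()
            self.keywords = []

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "meta" and (attrs.get("name") or "").lower() == "keywords":
                # Split the comma-separated list the webmaster supplied.
                content = attrs.get("content") or ""
                self.keywords = [k.strip() for k in content.split(",")]

    page = '<html><head><meta name="keywords" content="NFL jerseys, football, cheap tickets"></head></html>'
    extractor = KeywordsMetaExtractor()
    extractor.feed(page)
    print(extractor.keywords)  # ['NFL jerseys', 'football', 'cheap tickets']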
Another early tactic, now all but obsolete, is repeating keywords at the
bottom of a web page and hiding them by changing the color of the text to
match the background color. Search engines noticed that this text was not
relevant to the visitor and red-flagged sites that employed this method of
SEO.
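Search engines have never published exactly how they flag hidden text, but
the core check is easy to imagine: compare an element's text color to the
page background. The Python sketch below is a deliberately simplified guess
at that heuristic; the inline-style markup and the white-background
assumption are mine, and real engines analyze rendered pages far more
thoroughly:

    # Toy heuristic for the hidden-text trick: flag inline-styled text whose
    # color matches the page background. This only illustrates the comparison.
    import re

    def hidden_text(html, background="#ffffff"):
        # Capture (color, text) pairs from elements with an inline color style
        # and keep those whose color equals the assumed background color.
        pattern = re.compile(r'style="[^"]*color:\s*(#[0-9a-fA-F]{6})[^"]*">([^<]*)<')
        return [text for color, text in pattern.findall(html)
                if color.lower() == background.lower()]

    sample = '<p style="color: #ffffff">NFL jerseys NFL jerseys NFL jerseys</p>'
    print(hidden_text(sample))  # ['NFL jerseys NFL jerseys NFL jerseys']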
This information is quite basic, but the idea behind those algorithm
shifts of several years ago is still relevant today. With the Jagger
update in full swing, people in the SEO world are taking notice that
reciprocal links may very well be going the way of the keywords meta tag
(i.e., extinct). Webmasters across the world have long been obsessed with link
exchanges and many profitable web sites exist offering services that help
webmasters swap links with ease. But with a little foresight, one can see
that link trading's days are numbered, as web sites have obtained
thousands of incoming links from webmasters who may have never even viewed
the web site they are trading with. In other words, web site popularity is
being manipulated by excessive and unnatural use of an SEO method.
So with keyword meta tags, keyword stuffing within content, and now link
exchanges simply a part of SEO history, what will be targeted in the future?
Well, let's start with what search engines currently look at when ranking a
web site and go from there:
On-page Textual Content
In the future, look for search engines to utilize ontological analysis of
text. In other words, not only will your main keywords play a factor in your
rankings, but so will the words that relate to them. For example, someone trying to
sell NFL jerseys online would naturally mention the names of teams and star
players. In the past, algorithms might have skipped over those names, deeming
them irrelevant to a search for "NFL jerseys." But in the future, search
engines will reward those web sites with a higher ranking than those that
excessively repeat just "NFL jerseys." With ontological analysis, web sites
that speak of not only the main keywords but other relevant words can expect
higher rankings.
The Conclusion: Write your web site content for your visitors, not search
engines. Sites with more naturally written content can expect to see better
results in the future.
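To make the scoring idea concrete, here is a back-of-the-envelope Python
sketch that rewards breadth of related vocabulary over raw keyword
repetition. The related-term list is invented; a real engine would derive
such associations statistically from its index:

    # Sketch of related-term scoring: reward breadth of topical vocabulary
    # instead of raw keyword repetition. The related-term set is a
    # hypothetical stand-in for associations learned from a corpus.
    import re

    def topical_score(page_text, keyword, related_terms):
        words = re.findall(r"[a-z]+", page_text.lower())
        keyword_hits = sum(1 for w in words if w == keyword.lower())
        related_hits = sum(1 for w in words if w in related_terms)
        # Cap credit for repeating the keyword; related terms add real value.
        return min(keyword_hits, 3) + related_hits

    related = {"quarterback", "touchdown", "stadium", "season", "team"}
    stuffed = "jerseys jerseys jerseys jerseys jerseys jerseys"
    natural = "jerseys worn by your team all season, from quarterback to stadium staff"
    print(topical_score(stuffed, "jerseys", related))  # 3 -- repetition is capped
    print(topical_score(natural, "jerseys", related))  # 5 -- related words rewarded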
Offering Large Amounts of Content
This can frequently take the form of dynamic pages. Even now, search engines
can have a difficult time with dynamic content on web sites. These pages
usually have lengthy URLs consisting of numbers and characters such as &, =,
and ?. The common problem is that the content changes so frequently on these
dynamic pages that the page becomes "old" in the search engine's database,
thus leaving searchers with results that contain outdated information. Since
many dynamic pages are created by web sites displaying hundreds or thousands
of products they sell, and the number of people selling items on the
Internet will obviously increase in the coming years, you can expect that
search engines will improve their technology and do a better job indexing
dynamic content in the future.
The Conclusion: Put yourself ahead of the game if you are selling products
online and invest in database and shopping cart software that is SEO-friendly.
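In practice, "SEO-friendly" is largely a URL question: expose each product at
a static-looking, keyword-bearing address instead of a raw query string. A
minimal Python sketch of slug generation follows; the store paths and product
data are made up:

    # Sketch: rewriting a dynamic product URL as a crawlable, keyword-bearing
    # slug. The store layout and product data here are hypothetical.
    import re

    def slugify(name):
        # Lowercase, replace runs of non-alphanumerics with hyphens.
        return re.sub(r"[^a-z0-9]+", "-", name.lower()).strip("-")

    product = {"id": 1042, "name": "Official NFL Team Jersey (Home)"}

    dynamic_url = f"/store/item.php?cat=7&id={product['id']}"
    friendly_url = f"/store/{slugify(product['name'])}-{product['id']}"

    print(dynamic_url)   # /store/item.php?cat=7&id=1042
    print(friendly_url)  # /store/official-nfl-team-jersey-home-1042

However the software implements it, the point is that the friendly version
carries the product keywords in the path itself and never changes shape, so a
spider can index it with confidence.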
Incoming Links
Once thought to be a very difficult thing to manipulate, incoming links to
one's web site have been abused by crafty SEOs and webmasters the world
over. It is finally at a point where Google is doing a revamp of what
constitutes a "vote from [one site to another]" as they explain it in their
webmaster resources section. Link exchanges are now worth less than ever,
to the point where the only real value in obtaining them is to
make sure a new web site gets crawled by search engine spiders.
Over the years, many web sites reached the top spot for competitive keywords by
flexing their financial muscle and buying thousands of text links pointing
to their site with keywords in the anchor text. Usually these links would
appear like advertisements along sidebars or navigation areas of web sites.
Essentially this was an indirect way of paying for high Google rankings,
something which Google is no doubt trying to combat with each passing
algorithm update. One school of thought is that, from a visual point of view,
different areas of a web page will be weighted differently. For example, if a
web site adds a link to your site within the middle of their page text, that
link should count for more than one at the bottom of the site near the
copyright information.
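To put a rough number on that idea, here is a toy Python model in which a
link's value depends on the page region it sits in. The region labels and
weights are entirely invented, purely to illustrate the concept:

    # Toy model of position-weighted links: a link in the main body copy
    # counts for more than one in a footer. Regions and weights are invented.
    SECTION_WEIGHTS = {"content": 1.0, "sidebar": 0.4, "footer": 0.1}

    def link_value(links):
        # links: (page_region, anchor_text) pairs found pointing at a site.
        return sum(SECTION_WEIGHTS.get(region, 0.5) for region, _ in links)

    incoming = [
        ("content", "great guide to NFL jerseys"),  # editorial, in-text link
        ("footer", "NFL jerseys"),                  # paid-looking footer link
        ("footer", "NFL jerseys"),
    ]
    print(link_value(incoming))  # 1.2 -- one in-text link beats two footer links

Under these made-up weights, a single editorial link in body copy outweighs
several purchased footer links, which is exactly the direction this update
seems to point.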
This brings up the value of content distribution. By writing articles,
giving away free resources, or offering something else of value to people,
you can create a significant amount of content on other web sites that will
include a link back to your own.
The Conclusion: It all starts with useful content. If you are providing your
web site visitors with useful information, chances are many other sites will
want to pass it along to their own visitors. SEO doesn't start with trying to cheat the algorithm;
it starts with an understanding of what search engines look for in a quality
web site.
About the Author:
An expert in organic SEO, John Metzler has held executive positions
in the search engine marketing industry since 2001. He is the President of
FreshPromo, a Canada-based SEO firm, and serves American clients through
www.SEOTampa.com.