By now the entire world has been alerted to the fact that the Google algorithm and its companion robots are roving around the web, re-indexing and prioritizing every website they come across.
What does this mean to you? As I understand it, the Google universe is sick and tired of awful, commercial-laden websites with no intrinsic value. These days it’s all too common for scam, spam, and phishing sites to outrank thoughtful, informative sites that have something to offer.

In order to rid the Web of these scam, spam, and phishing sites, it was necessary to create and enforce a Web code of conduct, so to speak. Google’s first attempt at reining in the madness was to eliminate the use of the meta keywords tag. Web designers use keywords to associate their sites with pertinent, and supposedly unique, words that portray what their web presence is all about.
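To make that concrete, here is a minimal sketch, using Python’s standard-library html.parser, of how an indexing robot could read the keywords a designer declared in a page’s head. The sample page, its keyword list, and the KeywordExtractor class are purely hypothetical, invented for illustration.

```python
from html.parser import HTMLParser

# Hypothetical sample page: the <meta name="keywords"> tag is the
# honor-system label designers used to say what a site is about.
SAMPLE_PAGE = """
<html>
  <head>
    <title>Niagara Tire Shop</title>
    <meta name="keywords" content="tires, winter tires, wheel alignment, Niagara">
  </head>
  <body><p>Plain old tires.</p></body>
</html>
"""

class KeywordExtractor(HTMLParser):
    """Collects the content of any <meta name="keywords"> tags it sees."""

    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "keywords":
            # Split the comma-separated keyword list, trimming whitespace.
            self.keywords.extend(
                word.strip()
                for word in attrs.get("content", "").split(",")
                if word.strip()
            )

parser = KeywordExtractor()
parser.feed(SAMPLE_PAGE)
print(parser.keywords)  # ['tires', 'winter tires', 'wheel alignment', 'Niagara']
```

The whole scheme hinges on the words in that tag actually matching the page, which is exactly the honor system discussed next.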

In the early nineties, robust keywords garnered your site a premium ranking in the search engines, which is what all companies, designers, and site owners want. Speaking from a designer’s point of view, we want you, the viewer, to find and enjoy our work. The keyword system relied on the integrity of designers to use only words germane to the website.

So much for the honor system: thousands of marketing and so-called Search Engine Optimization (SEO) companies emerged overnight, promising page-one listings in the search engines for a price. Those that paid reputable firms got the top spots, and those that didn’t, well, you know, got lost in the shuffle and wound up on page 696,430. Not much of a chance of being seen, is there?

Research indicates that most people don’t wander past page three. The real tragedy of this Black Hat Marketing is that we, the surfing public, wind up with scams or just plain bad information. Cheating by associating irrelevant terms quickly became the norm: a tire company drops in a tantalizing Led Zeppelin snippet that lures music lovers to its site, where they find nothing but plain old tires.

Search engines are the backbone of all our Web queries; without them, finding anything in the Cyber Sea would be like finding that proverbial needle in a haystack. Google’s answer to this problem is re-indexing the web with a new algorithm that can magically separate the pertinent from the useless. The truth is, so many factors go into making that search engine magic that only the coders at Google know all the criteria that produce the results they think we all want.

Web 2.0 is in full swing, and I’m sure you’ve all noticed the annoying ads are still present; now they’re personalized to your viewing habits.

Web designers and marketing gurus alike are mandated to play nice, or the almighty Google will ban all who violate the no-Black-Hat-Marketing rule. How they propose to do that is anyone’s guess. What mystifies me is this: how do you ban the evildoers from the Web?

The last and most important change Web 2.0 is ushering in, the one that will make do-it-yourself web designers a thing of the past, is 100 percent compliant code. For the novice and experienced designer alike, the coding that makes websites work has standards that were set up to achieve uniform behavior across all browsers. The fact that code errors affect indexing hasn’t hit home until now.

The W3C maintains a code checker that will validate Web code and show the errors that can torpedo even the best of websites. Ironically, fewer than two percent of the websites I check in the validator pass, including Yahoo and, you guessed it, Google itself. Can you say hypocrites?
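If you want to see where your own site stands, the checker can also be queried programmatically. Below is a minimal sketch, assuming the public Nu HTML Checker at https://validator.w3.org/nu/ accepts doc and out=json query parameters and returns its findings as a list of messages; the example URL, the User-Agent string, and the exact response field names are assumptions, so verify against the service’s documentation before relying on it.

```python
import json
import urllib.parse
import urllib.request

def validate(url):
    """Ask the W3C Nu HTML Checker to validate a page and summarize the errors.

    Assumes the checker accepts GET requests of the form
    https://validator.w3.org/nu/?doc=<page-url>&out=json and returns a JSON
    object with a "messages" list; adjust if the service's API differs.
    """
    query = urllib.parse.urlencode({"doc": url, "out": "json"})
    request = urllib.request.Request(
        "https://validator.w3.org/nu/?" + query,
        # A descriptive User-Agent; some services reject the default one.
        headers={"User-Agent": "validator-check-sketch/0.1"},
    )
    with urllib.request.urlopen(request, timeout=30) as response:
        report = json.load(response)

    errors = [m for m in report.get("messages", []) if m.get("type") == "error"]
    print(f"{url}: {len(errors)} error(s)")
    for message in errors[:5]:  # show only the first few
        print(" -", message.get("message"))

if __name__ == "__main__":
    validate("https://example.com/")  # hypothetical page to check
```

Run it against a few favorite sites and you will likely see the same thing I do: error counts in the dozens, even for the big names.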

Any errors mean the robots that index the web will pass your site over, yet the major players that don’t have compliant code still get page-one hits. The demise of those colorful, quirky websites has arrived; what was different and cool has given way to pay-for-play and cookie-cutter.

You now stand a one-in-a-googol chance (a 1 followed by 100 zeros, the number Google takes its name from) of ever seeing a page-one hit.