Search Engine Optimization: Alive or Dead?
As always, the rules and role of search engine optimization (SEO) are in a state of flux. Why? Because Google continues to battle search engine spammers and others who try to "game" its system. But our sense is that this year (2013) the tide has finally turned against the SEO community. Here's why we and others in the website business think so.
A short history lesson
Back in the early days, pre-2000 and pre-Google, search engines like Alta Vista struggled to keep up with the growth in websites. The search algorithms weren't sophisticated or fast enough to handle the explosion (minor by today's standards) in website development. So, to speed the process, they relied on web developers to describe the sites they built.
At the top of the HTML code that underlies most websites are several places where website designers can enter information about the website. Collectively they're known as meta tags. The main tags are Title (the name of the page), Description (a brief recap of what the page is about), and Keywords (words and phrases someone might search in order to find the page). Together, these tags told search engines how to index or categorize a given web page.
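For illustration, here is what those tags look like near the top of a typical page's HTML. The business name and wording below are invented, but the tag names are the real ones:

```html
<head>
  <!-- The Title tag: the name of the page -->
  <title>Smith Realty | Homes for Sale in Springfield</title>

  <!-- The Description tag: a brief recap of what the page is about -->
  <meta name="description"
        content="Browse current home listings in Springfield from Smith Realty.">

  <!-- The Keywords tag: search phrases a visitor might use
       (the first tag the search engines stopped trusting) -->
  <meta name="keywords"
        content="real estate, Springfield homes, houses for sale">
</head>
```

Note that none of this text appears on the page itself, which is exactly why it was so easy to abuse.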
Of course, web-smart people quickly realized that tags invisible to the user created a situation that less-than-ethical operators could exploit. So by 2000 search engines began distancing themselves from these tags. First they stopped using the Keywords tag, because it was the easiest to game.
As the big search engines of the day began to develop alternatives to meta tags - like actually indexing the text on the page - the "black hat" SEO community really kicked into gear. First they used ploys like invisible text and blank images with invisible links. Search engines went into overdrive to plug the holes that allowed the spammers to succeed. But the reality of the situation was that every time the search engines tweaked their algorithms the spammers just found more holes.
Along comes Google
During its first few years Google turned the entire web industry on its ear by solving two big problems simultaneously. First, it created a search engine that could actually keep up with the pace of website launches. This was huge: at its best, Alta Vista indexed less than half of all websites. Second, it eliminated, or so it hoped, search engine spammers by basing rankings on links rather than spammable text. Simply put, Google recorded the in-bound links to a web page (links from other websites to a particular page on your site). Google reasoned that the more other webmasters found your page worth linking to, the better your content must be. Later, it found a way to measure the "quality" of those in-bound links. Why? Because spammers created things like "link farms" that could deliver hundreds of in-bound links to a site overnight.
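The link-based idea can be sketched in a few lines of code. What follows is a toy version in the spirit of Google's original PageRank: a page's score comes from the scores of the pages linking to it, so a link from a well-linked page counts for more than a link from an obscure one. The page names and link structure are invented, and Google's real algorithm is, of course, far more elaborate:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Score pages by in-bound links.

    `links` maps each page to the list of pages it links to.
    Each page passes a share of its own score along its out-bound
    links; the damping factor models a surfer who sometimes jumps
    to a random page instead of following a link.
    """
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outbound in links.items():
            if not outbound:
                continue
            share = damping * rank[page] / len(outbound)
            for target in outbound:
                new_rank[target] += share
        rank = new_rank
    return rank

# A made-up three-page web: "home" gets the most link weight.
links = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
ranks = pagerank(links)
```

Notice there is nothing here a page author can type into their own HTML to improve their score; that is precisely what made link-based ranking harder to spam than meta tags, until link farms arrived.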
Quality Links and more
With link farms blackballed from Google, spammers then turned to link exchanges. Google countered with a tweak that determined the relevancy of a link. If a link to a page about real estate came from a website about laundry detergent, it didn't count. Spammers then started to create hundreds of thousands of "directory sites" - websites that only had links to other sites for content. And Google spent quite a few years getting rid of those.
Content, Content, Content
Finally, Google began downplaying its in-bound link measurements in favor of visible content (the text on a page) as a greater source of good rankings. It began really rewarding solid content that was updated frequently. Of course, this gave rise to content churning. Writers - we use that term very loosely - were assigned to find and copy content from other websites to beef up a client's site. Google found a way to determine first publication - in other words, which website published an article first - so unique content could be rewarded and its duplication punished. Spammers then began hiring people who could change an article just enough to make the algorithm think it was unique. Google countered, again and again...
Early in 2013 Google announced that it was dropping all use of meta tags. No longer would its spiders even pick up the Title tag. Then it let it be known that it would begin penalizing sites for using SEO tricks of the trade, like stuffing keywords into hyperlinks, bolding keywords, and so on. In other words, Google can now tell when a website has been professionally optimized, and it plans to punish the site's owner. In much the same way, in-bound links have not only been de-emphasized; now any attempt to "force" the natural linking process between related sites will be punished.
Now we're back to content, content, content. Google rewards high quality, original, fresh content. This content includes text, images and video. The more the better. The more frequently the content is updated the better.
For you, the website owner, this should be a blessing - unless of course you've been paying through the nose for SEO work. Why is it better? For one thing, the new system is fair. It no longer punishes new websites by forcing them to wait for other webmasters to discover and link to them. It's also much better for web users, because great content on the desired subject is what we're all looking for. And finally, it's better for web developers like us. We can build good clean websites, research and write strong copy and, in the end, be rewarded for our efforts. No games. No spamming. No search engine optimization.