The Mender

Metamend's Search Engine Optimization Newsletter
Issue 39

Advice and techniques for the promotion of your web site plus Search Engine Optimization News and valuable resources for the serious on-line marketer. Happy New Year!




SEARCH ENGINES AND RELATED INDUSTRY NEWS

WEB SITE MARKETING

Survey shows "organic" search engine optimization still most important marketing method for website owners
October 28, 2003

SEARCH ENGINE NEWS

Better search results than Google?
Next-generation sites help narrow Internet searches
CNN.com

Retailers Rise in Google Rankings as Rivals Cry Foul
By Lisa Guernsey

WEB SITE DESIGN

Extensible Markup Language (XML) - What is it?

XForms Basics
Contributed by Harish Kamath
Dev Shed
Today, XML is most definitely in the mainstream, and proving its mettle by making all kinds of new and unique applications possible (witness the success of Amazon.com's AWS service, or the Google APIs, both based on XML technology).

W3C Director Tim Berners-Lee to be Knighted by Queen Elizabeth
Web Inventor recognized for contributions to Internet development

WEB SITE CONTENT

Drupal is an open-source platform and content management system for building dynamic web sites offering a broad range of features and services...



WHAT'S NEW @ METAMEND

Changes at Google

Last November, the search engine Google adjusted the way it weighs, measures, and ranks web sites, and it seems everyone is talking about the changes. In the search engine optimization world it has been the news of the year, so we've dedicated this issue of The Mender to the topic. Rest assured, Metamend is aware of the changes and is monitoring the situation very carefully. We are also conducting independent testing to ascertain scientifically whether changes to our SEO procedures are necessary. Our clients will be the first to know, and we'll keep you abreast of our findings. The articles below provide some keen insights into what we have discovered thus far.



New Affiliate Members

Metamend would like to welcome more than forty new web sites and their owners to our affiliate program. Thanks for signing up! Would you like to earn additional revenue simply by recommending Metamend's services to others? Click here to join.



ADMINISTRATOR'S CORNER: Something has happened to Google!
by Richard Zwicky

Sometime in mid-to-late November, many web sites stopped appearing in the search engine rankings for the terms their owners considered most important, or had most commonly been found under. Some cried that the targeted sites were almost exclusively English-language or e-commerce sites, and that Google wanted to drive e-commerce sites to buy AdWords, boosting its revenues and making its IPO more attractive. Quite honestly, if reports of Google's existing revenues are even close to accurate, they are not too worried. That doesn't mean I would buy the stock, just that there is no way Google would sacrifice the integrity of its results for a short-term financial windfall. Search engines whose results decline in quality quickly fade from popularity; Altavista went through three bad months, and they never recovered.

It's estimated that over 100 million searches per day run through Google, directly or through its partners, making Google the main source of referrals for most businesses online. Naturally, many people have seen this as an opportunity and have become professionals at finding ways to abuse Google's algorithms, thereby generating undeservedly high rankings for web sites that are largely irrelevant to a query.

Webmasters and SEO experts had long been saying that Google was packed with irrelevant results thanks to 'Google bombing' and assorted techniques designed to drive a web site's link popularity through the roof and push it to the top of the search engines, regardless of the site's quality. The people at Google actually listened to the complaints and did something about it: they made some algorithmic updates to the search engine, putting a new filter in place to target the most heavily spammed areas of the Internet.

The filter was not without flaws. People quickly set out to test it and found that it could be defeated by adding an exclusionary term to the query. An exclusionary term narrows a search by ruling something out: if you wanted only vegetables that were not green, you would type "vegetables -green" into the search box. To defeat the new filter, you had to exclude a nonsense term like -abczxy2wyfjs, in other words, something nobody would ever include on a web page. Searching with a garbage exclusion returned unfiltered results; in other words, results that might contain your missing web site, but also all the spam sites. From a practical point of view, this discovery was meaningless. How many members of the general public will actually search this way? It hardly matters anyhow; Google fixed the glitch in December.
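To make the mechanics concrete, here is a toy model of what an exclusionary term does to a result set. This is a hypothetical sketch for illustration only, not how Google's index or filter actually worked:

```python
def search(items, include, exclude=None):
    """Toy model of an exclusionary query: keep items that mention the
    include term, then drop any that also mention the exclude term."""
    results = [item for item in items if include in item]
    if exclude:
        results = [item for item in results if exclude not in item]
    return results

docs = ["green vegetables guide", "root vegetables recipes", "green energy news"]

# "vegetables" alone matches the first two documents;
# "vegetables -green" keeps only the one without "green".
print(search(docs, "vegetables", "green"))  # ['root vegetables recipes']
```

Because -abczxy2wyfjs matched nothing, the exclusion removed nothing, yet its mere presence switched Google into the unfiltered code path.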

The reality is that the new filter may have hurt a lot of web sites' search engine positions. In attempting to improve overall search results, it eliminated a lot of good, decent-quality web sites at the worst possible time of year: the Christmas shopping season.

It may have been bad timing, and that may have been a mistake, but the rationale behind the update is logical. The results have been mixed. On-topic web sites that feature frequently spammed terms like "software" and "drugs" have been hurt, but only for search queries that include those key terms; overall they continue to perform well. The filter on these commonly spammed words is obviously deliberate: the people at Google are trying to improve the quality of its results. Unfortunately, a byproduct is that Google is making itself less relevant as a resource on the Internet, at a time when not only is a contemplated IPO arriving quickly, but the competition is heating up considerably. On one side, Yahoo! will soon replace Google's results with the combined and improved resources of Inktomi, FAST (Alltheweb.com), and Altavista; take the best elements of all three engines and there's no question which will have more accurate results. On the other side, Microsoft will unleash its new monster search engine (MSE) on MSN later this year. The people in Redmond have been hard at it for a while now; it should be interesting.

Therefore, the question for Google becomes: what to do about it? It's a real mess, and Google's integrity is on the line. Google doesn't want to become another footnote in Internet history; it wants to help power the web.

What about you? What should you be doing to your web site to compensate for these changes? There's strength in good, solid content and proper optimization. The proper use of your key terms throughout your web site will get noticed, just as it always has. However, overuse (spam) or misuse, inadvertent or not, will also be picked up. Word density and the text of internal links may also play a role. Alt text appears to be analyzed only on images that link; don't stuff it with repetitive terms.

One search trick of interest relates to geographical terms. If you enter a place name followed by a keyword, e.g. "Sonoma winery", you get web sites discussing the subject. However, if you search for a keyword followed by a place name, e.g. "florist Victoria", you get florists in Victoria. This change may herald the start of Google's experiments with geo-locational search abilities.

Naturally, people are very suspicious of the latest Google update, as they are of any major change. This is just Google's way of policing the web and trying to ensure relevant search results. But as with many new policing schemes, things often go too far at first.

The best thing we can recommend is honesty. Don't try to spam the search engines, because they are constantly hunting for spammers. Focus on good content and the proper optimization of your site. Occasionally an algorithm update will have adverse side effects; the search engines don't want that any more than you do. If you were not spamming and you were affected, you can bet that lots of other people were too. The search engines know this and are constantly working to fix it.



MARKETING: Getting Googley Eyed!
by Robert K. McCourty

Hello Google! You probably don't know this, but we've been helping you succeed for the past four years. You see, we're a search engine optimization company dedicated to ensuring proper indexing within your search engine. Well, actually, ALL of the top search engines, but everyone wants to be ranked highly on yours, so we pay very close attention to your mood swings and temperament in order to keep you satisfied.

As it stands, our clients do rather well within your listings, which raises the question: why did you change the way your results are produced? In your attempts to ban spam sites from your index, you've let in a -LOT- of unsavory elements, and they are ranking pretty high on your charts. I suspect you folks will be plugging these holes as time marches on. I certainly hope so.

I've also noticed that many of the top results which appear to come from different domains are actually just affiliate sites with disguised front pages; click on any link within the site and you are redirected to 'the mother ship.' Musicians Friend and Music123.com are examples of this type of 'doorway' marketing. I assume these sites will eventually be weeded out as well, but for now they permeate your results. At best it's a nuisance for the searcher; at worst, it's quickly diluting the relevance of your results. Be careful, Google! The Internet is a fickle place.

For all you site owners out there, here are three observations and theories I've been researching recently, relating to how Google has adjusted the way it produces search results.

One: Word Frequency Filters. Certain words seem to trigger a filter in Google and eliminate sites which were once on the list. The word 'software', for example, seems to do this for certain sites but not for others. It seems to depend upon the word's density: how many times it is used within the text of the page and how it is used within the sentence structure, i.e., repetition. Some words will not trigger the filter no matter how many times they are repeated. Other words will trigger it at a stand-alone word count of a mere three or four; more than that trips the filter. Which brings us to observation number two.
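As a rough illustration of that kind of check, you could count a term's stand-alone frequency and density in a page's visible text. This is a hypothetical sketch of a density test, not Google's actual algorithm, and the sample page text is invented:

```python
import re

def words_of(text):
    """Split visible text into lowercase word tokens."""
    return re.findall(r"[a-z0-9']+", text.lower())

def term_count(text, term):
    """Count stand-alone occurrences of a term (case-insensitive)."""
    return words_of(text).count(term.lower())

page = ("Our software is easy to use. Download the software today. "
        "This software suite includes free software updates.")

count = term_count(page, "software")
density = count / len(words_of(page))
print(count)              # 4
print(round(density, 2))  # 0.24
```

Under the theory above, a page like this one, where a single term makes up nearly a quarter of the text, would be a prime candidate for filtering on that term.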

Two: Phraseology. Although this is just a personal theory that I will be investigating further in the near future, it seems the way certain words are used within specific phrases can affect Google's search results. The frequency of an exact phrase also has a bearing on things. It's sort of a combined allowance on keyword density: certain keywords are OK on their own (used sparingly) and also OK within certain specific phrases in the content, again with limits on repetition. But if the keyword is spread out across different text phrases, the frequency filter is not triggered as readily.
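One way to see how repetitive a page's phrasing is (again a hypothetical sketch for illustration, not Google's method, with invented sample text) is to count how often each exact two-word phrase recurs:

```python
import re
from collections import Counter

def phrase_counts(text, n=2):
    """Count how often each exact n-word phrase occurs in the text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrases = [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]
    return Counter(phrases)

page = ("Buy cheap software here. Cheap software deals daily. "
        "Our cheap software beats all other software prices.")

counts = phrase_counts(page, 2)
print(counts["cheap software"])  # 3
```

By this measure, "cheap software" repeats three times in a short passage, exactly the sort of exact-phrase repetition the theory suggests a filter would notice, whereas varying the phrasing would spread the keyword across many different bigrams.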

Remember, Google recently purchased a company working on phraseology and natural-language analysis. It may be a little early for that software to be entirely integrated within Google's search algorithms, but they may be testing the waters.

Ed. NOTE:
Metamend will soon be introducing technology that not only absorbs web site content through a multilingual thesaurus tool (just another of our little inventions), but can also suggest terms people may want to have on their sites to increase their relevance. We think the ability to tell people what is NOT on their site but -should- be is kind of a cool tool. We trust you will agree.

Three: The Vagueness Factor. Taking the two points above into consideration, it is logical and somewhat interesting to note that the more specific you are with your search terms in Google, the better your results. Use a three-, four-, or five-word query and you stand a much better chance of finding what you were looking for. It wasn't always like that. You used to be able to be 'vague' in your searches and simply type in "used car dealers"; this would bring you a list of commercial operations which sold used cars. Now, however, a somewhat vague term like this gives you back DMOZ category listings, national automobile dealer networks, and other resources from which to choose. A lot more work for the lazy searcher.

Type in an extremely specific query, however, such as "used car dealers, Honda, Memphis", and you'll find exactly what you want. This raises two important points. A) Google is, for lack of a better term, "training" the general searching public to be much more specific when looking for something. A neat trick; I like it. B) Google seems to be moving toward integrating location-based search results. Geographic addresses are becoming increasingly relevant to have on your web site. Check your web site to ensure your full address appears there, on several pages. Also, if your site is indeed location-specific, for example a bank or gas station, remember the first two points as well and add content that uses the location name in several different sentence phrasings, e.g. "Here are directions to our gas station in Memphis!" or "When visiting Memphis, our gas station is easy to find." or "Our gas station is conveniently located in Memphis, just off the Interstate."

The point is that by combining location, keywords, and phraseology within your content (without overdoing the repetition), you'll stand a much better chance of being found in Google's newly adjusted search results.

Another ED NOTE:
I won't go into all the details here but if you are a Metamend Client, you should already know we have anticipated location-based searching as a growing trend and have in fact developed and implemented address extraction technology within our SEO services. You can read more about it here.



METAMEND CLIENT OF THE MONTH: Hemp and Company

Let's get back to the basics. Check out the long wearing quality of hemp clothing and learn more about industrial hemp products.



METAMEND SEARCH ENGINE OPTIMIZATION SOLUTIONS PRODUCE RESULTS!

Search for "hemp clothing company" or "Canadian hemp products" on Google and look for "Hemp and Company," a Metamend client since July 2002.



WANT SOME FREE ADVERTISING FOR YOUR WEB SITE?

This space provided free to all Metamend clients. Just ask. We'll be happy to include you in a future issue.



Did you find the information in this issue useful? Feel free to pass it along to a friend or drop us a line at Feedback Form.

Or, contact:
Todd Hooge
Director of Operations
Metamend Automated Website Promotion
Toll Free: 1-877-307-2701


Metamend: an innovative website optimization and marketing service which automatically corrects keywords and meta tags, increasing website relevance and popularity within search engines.