The Mender: Metamend's Search Engine Optimization Newsletter
Advice and techniques for promoting your web site, plus search engine optimization news and valuable resources for the serious online marketer.
Welcome to "The Mender" Issue 34
SEARCH ENGINES AND RELATED INDUSTRY NEWS
<WEB SITE MARKETING>
Linking Matters is a free resource that helps web site owners, managers and consultants to run an effective linking strategy. The report contains step-by-step instructions as well as extensive links to articles and resources.
<SEARCH ENGINE NEWS>
Google: An engine of change
POWERFUL TOOL HAS ALTERED WORK, SOCIAL HABITS, INCREASED PRIVACY CONCERNS
By Mary Anne Ostrom and Matt Marshall
Mercury News, Posted on Mon., May. 05, 2003
Changes Forthcoming at Google
There have been a few events of note to watch at the leading search engine this month. First, it was discovered that Google is now operating nine data centers worldwide, up from eight. The latest one can be tested at www-fi.google.com, and has a different database from the other data centers. This will probably change after the next Google Dance. The second item of note is that Google appears to be releasing some major algorithm changes as part of this next 'Dance', which should start being felt in the next week.
<WEB SITE DESIGN>
How Important Is The Look 'n' Feel Of Your Website? *Link No Longer Valid
By Gerry McGovern
Contributing Writer - WebProNews.com
<WEB SITE CONTENT>
Corporations seek better search results
By Paul Festa
Staff Writer, CNET News.com
April 28, 2003
** Special note - Metamend clients already benefit from search analytics techniques as described in the article above.
WHAT'S NEW AT METAMEND?
New SEO Technology Enhancements
We have recently made two changes to our technology to improve SEO procedures and help search engines list your site faster. First, you will notice upon your next mend that your keyword tag will be shortened. We have refined our keyword generation process to enhance the 'weight' of your top keywords, giving many engines a stronger sense of your site's relevance from fewer keywords. Better weighting results in better relevance.
Second, the same theory will be applied to your description tag. By paying close attention to search engine trends, we have observed that although many engines can read descriptions of up to 1,000 characters in length, most of the important robots and spiders which scan your site will stop reading this tag at a much shorter length. We have therefore reduced the total character length of the description to place it firmly within the 'best scenario' readability range of the engines. The purpose is to ensure your entire site description gets read and absorbed, and the change gives your site exposure to the greatest possible range of robots and spiders. It's a good thing!
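The shortening logic amounts to cutting a description at a word boundary within a safe limit. A minimal sketch in Python, assuming a limit of 150 characters (the actual limits engines honor vary, and the sample text is invented):

```python
MAX_DESC_LEN = 150  # assumed "safe" length; real engine limits vary

def trim_description(text: str, limit: int = MAX_DESC_LEN) -> str:
    """Shorten a meta description to at most `limit` characters,
    cutting at a word boundary so the tag still reads cleanly."""
    if len(text) <= limit:
        return text
    cut = text.rfind(" ", 0, limit + 1)  # last space within the limit
    return text[:cut if cut > 0 else limit].rstrip()

# Invented sample description, longer than the limit we pass below.
desc = ("Metamend provides automated search engine optimization, keyword "
        "generation and meta tag correction services for web sites of "
        "every size, large or small.")
print(trim_description(desc, 80))  # whole words only, never past 80 chars
```

The point of cutting at a space rather than a hard character count is that a spider which truncates mid-word gains nothing from the dangling fragment.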
New Reseller: WeDoHosting.com
WeDoHosting.com Inc. is one of Western Canada's largest shared and managed web hosting providers. Hosting professionally since 1999, WeDoHosting.com currently operates and manages over 120 shared and dedicated web hosting servers. With cost-competitive products and industry leading support performance, WeDoHosting.com continues to experience steady growth, allowing the company to introduce enhanced additional services on an ongoing basis. Read the full press release.
New Channel Partner: Falcon Software
Falcon-Software offers a complete range of web-based design and animation services, including prototype development, site architecture planning, advanced programming capabilities, dynamic multimedia production and product marketing services. Falcon-Software takes great pride in its collaborative approach and proven techniques in defining, developing and delivering high-impact web-based solutions and designing award-winning websites for companies across North America.
ADMINISTRATOR'S CORNER: The Deep-Web
by Richard Zwicky
Searching on the Internet can be compared to searching for extraterrestrials with a single telescope and a computer. While a great deal of information may be learned, there is still a wealth of information that lies deeper, or in a light frequency range you cannot see, and that wealth of information is missed. Physicists estimate that up to 95% of the universe is comprised of 'dark matter'. They can't see it, and can't prove it exists, but it has to be there for the universe to hold together according to the laws of physics. Fortunately, we can see the 'Deep Web'. It's just that the search engines cannot.
The reason the search engines cannot see it is simple: most of the information on the Internet is disconnected from any other web site, or contained within dynamically generated sites, and in either case the search engines cannot find it, or cannot properly utilize it. In the case of the disconnected web, someone built a web site and keeps adding documents, but never submitted it, never resubmitted it, never optimized it, never re-optimized it as search engine algorithms changed, never re-optimized it for content changes, and, lastly, never asked anyone else to link to it. Such a site may also receive links only from pages which are not themselves linked to by any site within the visible web. Pretty complicated, really.
More likely, though, pages are invisible because they are in a format that is unfriendly to the search engines: they are completely graphical, built entirely in Flash, framed, or dynamically generated.
How The Search Engines Work
Search engines create their indices by spidering, or crawling, web pages. To be read, a page must either be submitted, or linked to from another page that has already been spidered and indexed. That's only part of the battle, though. To be indexed, the same web page must be in a format that the search engines can retrieve time and time again, and many types of dynamic web pages cannot be. Here's why. Have you ever noticed that some web pages have a different URL every time you visit? Visit some you know. Now delete your cookies (in MS Explorer go to Tools > Internet Options > "Delete Cookies"). Now, in the same browser window, revisit one of the URLs. The URL you reach will have changed.
Search engine robots do not accept cookies. Some dynamic, cookie-based sites do not admit visitors unless they accept cookies. Others allow them, but in the case of a search engine, it may not matter. Here's why: the search engine's business model is based on getting its customers (searchers) to their destination quickly and efficiently. At its most simplistic, it does this by indexing the Internet and bookmarking every page it deems relevant, so that when someone queries it, it can serve that response. Part of the bookmarking process is to check that the page continues to exist, and that's where the problem starts. The engine needs to be able to send clients to exactly the same spot every time. With dynamic pages it just can't, simply because from its perspective the URL changes on each visit. So the search engine sees that website as a failure and refuses to deliver traffic to those pages. It will still read through the content and factor it into the overall site score, but if you have a large web site, it's likely that details from deep within the website will be missed, or not viewed as important with respect to the whole.
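The stable-URL problem can be made concrete with a short sketch. Assuming a site appends a session parameter such as sessionid to every URL (the parameter names and URLs below are hypothetical), two visits to the same page only map to the same bookmark after the session noise is stripped:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical session-style parameters a dynamic site might append.
SESSION_PARAMS = {"sessionid", "sid", "PHPSESSID"}

def normalize(url: str) -> str:
    """Strip session-style query parameters so the same page
    always reduces to the same, stable URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in SESSION_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

# Two visits to the "same" page, each handed a fresh session id:
a = normalize("http://example.com/catalog?item=42&sessionid=AAA111")
b = normalize("http://example.com/catalog?item=42&sessionid=ZZZ999")
print(a == b)  # both reduce to http://example.com/catalog?item=42
```

An engine that cannot perform this kind of normalization simply sees two different, unstable URLs, which is exactly why it declines to send traffic there.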
As for framed pages, the search engines see none of the content within the frame. They can only see what is within the source code (click 'View > Source'). If it's not there, they won't find it. Flash sites and graphics-only sites are just as bad: a picture is worth a thousand words to you and me, but not a single one to a computer. So essentially, it boils down to the fact that search engines cannot properly "see" or retrieve content from the Deep Web: those pages do not exist until they are created dynamically as the result of a specific search, and they do not exist for any longer than it takes to display the page.
Search Engine Optimization and The Deep Web
Making this deep web visible to the search engines is part of the search engine optimization process. For many of our e-commerce clients, it's a key part - all of a sudden their products and services are visible - something they never experienced before. Because search engine bots cannot probe beneath the surface, the Deep Web remains hidden. The Deep Web is different from the rest of the Internet, in that most of the Deep Web content is stored in searchable databases that only produce results dynamically in response to a direct request. If the most coveted commodity of the Information Age is indeed information, then the value of Deep Web content is immeasurable.
How Big is the Deep Web?
It is estimated that the Deep Web contains up to 500 times the information visible on the regular Internet! That means that while the largest search engine has indexed 1.4 billion web pages (Google has 1.4 billion web pages, 900 million images, and 900+ million Usenet newsgroup postings in its index), there are over 700 billion potential documents out there waiting to be indexed. It's easy to see how this happens. Every day, Hewlett-Packard adds one page per employee to its web site. With 100,000 employees worldwide, its site alone grows by over 36 million documents per year. The search engines are not keeping up. Here are some stats on the Deep Web:
- The Deep Web is the fastest growing segment of information on the Internet.
- Deep Web sites tend to have copious amounts of content.
- Deep Web content is highly focused and relevant.
- More than 50% of Deep Web content resides in specialty databases.
- Password protected, and secure content is not considered to be part of the Deep Web.
Within the strict context of the Web, most users are aware only of the content presented to them via search engines. 85% of web users find websites for the first time via the search engines. A similar number use search engines to find needed information, yet nearly as high a percentage cite the inability to find desired information as one of their biggest frustrations. Without optimization, your website may never get found. At its most basic, this means that if no one links to your website, it may never be found, or attached to the fabric of the net.
MARKETING: An Introduction to Location-Based Web Site Marketing
by Robert McCourty
Momentum is quickly gathering for location-based navigation among search engines and mobile device makers. Web search technology companies including FAST Search & Transfer and Google are working with wireless phone companies to power mobile searches that can benefit from locally targeted results. In addition, wireless Internet devices are increasingly equipped with GPS technology that can track the user's physical whereabouts. Both parties want to localize Web searching to make it more relevant for the searcher and, hopefully, draw more regional advertising dollars. AlltheWeb today can tailor results to the searcher's country. It's only a matter of time before search results will be narrowed down to city-block size or smaller.
GIS latitude and longitude coordinates within web sites will soon become an absolute necessity for everyone performing e-commerce via the Internet. This data may relate to the physical location of the web site, where the site is being served from (if applicable), or where the actual business represented by the site is physically located. There may also be multiple locations and multiple sets of coding involved: if, for example, you have a franchise with multiple locations (e.g., Starbucks), each location will probably need a page of its own with the correct corresponding locational data.
Obviously a web site is a stable beast. It sits on a server somewhere and doesn't move much, so at first glance it may not seem plausible to need GIS locational data tucked into the web site's source code. On the contrary: one thing the web site represents is the business's physical location(s), and if people are going to find services and products based purely upon location, shouldn't you, at the very least, tell them where you are and how to get there?
Let's look at one example of how location-based web site marketing may be utilized.
You are vacationing in a new city for the first time. Once you get settled into your hotel room, you pull out your hand-held wireless device, log onto the web and search for "Italian Food in San Francisco." Five hundred results come back. So you click the new "location-based" feature on your device, which triggers a GPS satellite feed to pinpoint your exact location within the city. Now only ten Italian restaurants relevant to your physical location (whose marketing firms were smart enough to code their clients' web sites with GIS data) show up in the search results. Guess which restaurants did not show up? The other four hundred and ninety. Starting to get the picture?
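Under the hood, the restaurant scenario is a distance filter over published coordinates. A minimal sketch of the idea (the restaurant names and coordinates here are invented, and a real service would also rank and paginate):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/long points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean Earth radius

# Hypothetical restaurants whose sites publish GIS coordinates.
restaurants = [
    ("Luigi's",        37.7989, -122.4098),
    ("Trattoria Nord", 37.8044, -122.2712),  # across the bay
    ("Pasta Palace",   37.7952, -122.4028),
]

# Searcher's GPS position (downtown San Francisco).
here = (37.7946, -122.3999)

nearby = [name for name, lat, lon in restaurants
          if haversine_km(here[0], here[1], lat, lon) <= 2.0]
print(nearby)  # only the restaurants within 2 km make the cut
```

Sites without coordinates simply never enter the candidate list, which is exactly why the other four hundred and ninety restaurants vanish from the results.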
From a marketer's perspective, the advent of location-based marketing brings many benefits:
- A captured target. The consumer is already in or near your place of business. Remember, this is location-based marketing. A customer is much more likely to come through your door if a competitor's store is a twenty-minute drive away but your store happens to be right around the corner from where they are standing.
- Increased impulse buying. Real-time delivery of advertising prompting the benefits of immediate response. Example: come in within the next 30 minutes and receive 20% off your meal.
- Development of one-to-one relationship marketing. Consumer purchasing history can be examined, thereby enhancing future marketing messages.
- Direct marketing spending effectiveness. True targeting of promotional materials. Materials are delivered electronically and on demand, as required. No hard copy waste or excess printing inventory.
- Psychological Nurturing. The consumer 'feels like a somebody,' building brand recognition and loyalty.
- Increased return on investment (ROI). Repeat or additional consumer purchases during a visit.
You should also be prepared for your marketing budget to go through some changes as the location-based phase of Internet marketing begins to expand. Flexibility will be key. Additional costs, such as subscribing to a "streaming" service to feed your site into those search results, should be included. This aspect of service delivery could prove to be an escalating expense if based on a 'price-per-clickthrough' scenario.
Industry standards and the methods of serving out GIS, web-based locational data are still in the developmental phases, but given the speed of technology, it's a safe bet full implementation will come sooner rather than later. Give yourself and/or your clients a competitive edge. Find out if your site is ready for wireless searching. Today's science fiction is tomorrow's science fact. "Thank you for your wireless reservation at Luigi's! Your table, and your online bonus of a complimentary chilled bottle of wine, will be waiting for you upon arrival." Hmmm, a person could get used to this in a hurry!
METAMEND CLIENT OF THE MONTH: School House Teaching Supplies
Looking for school supplies? You've come to the right place for all your classroom needs. School House Teaching Supplies' new online store offers an ever-growing catalog of products and supplies. One-stop online shopping for teachers, schools and parents.
METAMEND SEARCH ENGINE OPTIMIZATION SOLUTIONS PRODUCE RESULTS!
Try searching for "School Supplies Store" or "Teaching Supplies Store" on Google and look for schoolhouseteaching.com. Metamend Client since March 2003.
Metamend: an innovative website optimization and marketing service which automatically corrects keywords and meta tags, increasing website relevance and popularity within search engines. Want some free advertising for your website? This space is free to all Metamend clients. Just ask. We'll be happy to include you in a future issue. Let the promotion begin.
Contact us via our Feedback Form.
Director of Operations
Metamend Automated Website Promotion
Toll Free: 1-877-307-2701