Number One In Your Niche

Search engine optimization news and updates from Priya Shah, author of Number One In Your Niche.

Monday, October 24, 2005

 

Splogs + Scraping + AdSense = Fraud

By Jim Hedger, StepForth Placement Inc. (c) 2005


The other day, an article appeared in Search Engine Journal suggesting webmasters monetize their sites using Google AdSense. While the article neglected to mention Yahoo Search Marketing's alternative advertising program for webmasters, the idea of using one's website as a commercial medium (where possible or practical) makes good sense and can provide a minor side-income. Such minor side-incomes are often the first ingredients in the gravy craved by all small business owners.


Since the advent of AdSense, the grassroots distribution program for Google's AdWords, several webmasters have built businesses out of taking content from other people's websites and using that content to build pages designed specifically to attract ad-clicks.


As the average commission earned by sites running AdSense-generated advertising is approximately $20/month, webmasters working this type of scheme need to create hundreds, if not thousands, of pages to make a living.
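The arithmetic is stark. A quick sketch, where the $20/month figure is the average cited above and the $2,000/month income target is purely hypothetical:

```python
# Rough arithmetic behind the splog scheme described above.
avg_per_site = 20       # approximate AdSense earnings, dollars/month (figure cited above)
monthly_target = 2000   # hypothetical income goal, dollars/month

sites_needed = monthly_target / avg_per_site
print(sites_needed)  # 100.0 -- and each "site" means many scraped pages
```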


In order to create those pages and attract ad-clicking visitors, content must be created, begged, borrowed, or, most commonly, simply stolen. The resulting sites, known as splogs (spam blogs), exist only to game Google in one way or another: mostly for money, but also for increased search rankings or as a means of manipulating search spiders.


Splogs most often get their content by scraping: sending an electronic copying bot to take everything it sees and recreate it on an unlimited number of instant documents. By running advertising generated through the AdSense program, the owners of the splogs make money when visitors click on the ads. As a result, literally millions of instant sites have sprung up over the past twelve months, most of them free-hosted blogs containing content scraped from the original sites.


Before continuing, I would like to make it clear that there are several publications that request permission to reprint content. That's ok. Chances are, this article is being read in one of those publications. Online business runs on such agreements.


Splogs are bad business and the practice is finally getting the notice it deserves. Several search heavyweights have weighed in on Splogs over the past two weeks and a flame-war (the virtual equivalent of fisticuffs) broke out between members of two well-known SEO/SEM forums. As a result, the practice of producing AdSense revenues from stolen content on spammy sites got a little bit harder, starting today.


Matt Cutts, Google's spam fighter and quality assurance czar, has taken an obvious and positive interest in splogs. In the SEO/SEM community, Cutts' name is as widely known as Page's, Brin's, or even Gates's. Cutts is "the man" when it comes to explaining the state of Google's various indexes and how they work. He is referred to as the Chief Spam Fighter at Google. In a posting to his Gadgets, Google, and SEO blog earlier today, Cutts invites Google users to report splogs displaying AdSense-driven advertising.


"If you run across a site that you consider spammy and it has AdSense on it, click on the "Ads by Goooooogle" link and click "Send Google your thoughts on the ads you just saw". Enter the words spamreport and jagger1 in the comments field."


The name "Jagger1" is the reference name given to the Google algorithm update currently shuffling Google's search results.


Splog fraud is a big problem for Google and a growing concern for the other major search advertising providers, such as Yahoo Search Marketing and MSN. It is also a problem for others working on the Internet. The way content is taken from one site and replicated to dozens of others can cause no end of technical and financial issues for honest webmasters.


Content, incidentally, is not always limited to what the viewer sees on the screen. Stolen content often includes source-code and as anyone familiar with code can tell you, there's a lot of domain and document specific information embedded in source-code.


Over at Search Engine Journal, a funny posting shows how one poorly executed scrape made an honest webmaster afraid of being branded a click-fraud artist by Google. After scraping the site, the splog-artist apparently forgot to remove the AdSense code from the stolen content.


That's how the honest webmaster found out he had been stolen from. He was moved to contact Google before his AdSense account status was affected. If the webmaster hadn't been paying attention, he might have been badly branded by Google, burned by someone else's scam.


That's not the only way that scrapers could adversely affect honest webmasters, however. The content webmasters create, or have created for them, is the attraction that prompts visitors to their sites. Attracting lots of site visitors is a pretty important step to making money from AdSense or the Yahoo Publishing Network. If someone is stealing that content, they are also stealing potential visitors. For the webmaster, that content represents investment. For the content creator, it represents product. Either way, the scraping of content is theft.


The stolen product is then used to create what is essentially duplicate content on another site. Duplication of content can have an adverse effect on the search engine placement of all documents containing the similar items.


Imagine losing your placements because someone else took the material you laboured over. Fortunately, Google's historic record of documents is fairly good at determining which source first displayed specific content.
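Google's duplicate detection is not public, but the general idea of spotting a scrape can be sketched with simple word-shingle overlap (a toy illustration, not the real algorithm):

```python
def shingles(text, k=3):
    """Break a text into its set of k-word shingles (overlapping word runs)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=3):
    """Jaccard overlap of two texts' shingle sets: 1.0 = identical, 0.0 = disjoint."""
    sa, sb = shingles(a, k), shingles(b, k)
    return len(sa & sb) / len(sa | sb) if (sa | sb) else 0.0

original = "honest webmasters labour over unique content for their readers"
scraped = "honest webmasters labour over unique content for their readers"
print(similarity(original, scraped))  # 1.0 -- a verbatim scrape is trivially detectable
```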


Search engines have several other reasons to be concerned about splogs. As many of them are created using the free-blog software offered and hosted by most of the major search engines, the proliferation of so many splogs consumes a lot of resources.


They also gum up search results with sites not actually relevant to search engine users. Lastly, they devalue the legitimate uses of blogs as communications and marketing tools, which might lead future blog readers or users away from the growing blogosphere.


Citizen publishing is seen as a major revenue source for both Google and Yahoo. Having invested so much time, energy and money into the establishment of blogs, the major search engines would be loath to let their investments go the way of the dodo without a fight.


Now that the web development community is talking about the issue in earnest, some forms of protection might evolve. As it currently stands, there is little a webmaster can do to protect his or her content from being stolen for profit. You can use Copyscape (http://www.copyscape.com) to see if your material has been nabbed, but after doing that, there is little one can do except write angry letters to the thief and a lawyer.


Google is inviting users and webmasters to report splogs running AdSense whenever they are seen. In a just universe, not only would the AdSense accounts of those scrapers be closed, their bank accounts would be emptied after Google sued them for fraud.


Jim Hedger is a writer, speaker and search engine marketing expert based in Victoria BC. Jim writes and edits full-time for StepForth and is also an editor for the Internet Search Engine Database. He has worked as an SEO for over 5 years and welcomes the opportunity to share his experience through interviews, articles and speaking engagements. He can be reached at "jimhedger@stepforth.com"


Friday, October 21, 2005

 

Chasing the Search Engines' Algorithms: Should you or Shouldn't you?

It's a common occurrence. SEOs often spend countless hours trying to "break" a search engine's algorithm.


"If I could just crack Google's algo, my pages would soar to the top of the rankings!"


Let's look at some flaws in this way of thinking.


1. Picture the Google engineers and tech folks turning the algo dial as soon as you "think" you have "cracked" the algo. Your rankings may fall, and you would have to figure out what's working with the engine right now. In other words, your rankings may never be long term.


2. Instead of spending all of this time trying to impress a search engine with a perfect page, why not impress your true target audience . . . your customers? Has Google, MSN, or Yahoo! Search ever bought anything from you? They're not your target audience. Your customers are your target audience. Write your pages and content for them.


3. When you expend so much of your energy chasing algorithms, you often focus on only a few elements that influence ranking: those elements that are working right now and that you hope will give your pages the best chance for success. Google is said to use over 100 elements to determine ranking and relevancy. Some are more important than others. But focusing on just one or two "main" elements and discounting the rest can prove disastrous to a Web site.


A different approach . . .


Wouldn't you rather achieve top rankings and keep them there, and have those rankings equate to sales and money in your back pocket?


After all, isn't it ultimately the sales you're after, as opposed to just the rankings? If those rankings don't equate to traffic that equates to sales, you lose, any way you look at it.


Five Basic Steps for Achieving Top Rankings without Chasing Algorithms


1. Forget about the search engines. Yes, you heard me correctly. The search engines aren't and never will be your "ideal target audience." They don't buy your goods and services. They're not who you should be trying to please with your Web pages and site. Instead, write your Web page content for your target audience.


2. Don't ever forget the basics. No matter what's happening in the algorithms, continue using your main keyword phrase prominently in your title tag, META description and keyword tags, link text, body, heading tags, and so forth. That way, when the algo dial is turned, you won't have to make changes to all of your pages. You'll always be ready.


3. Focus your keyword-containing tags and body text on one keyword phrase only. Each page should be focused on one keyword phrase, and each page should have its own unique tags.


4. Write well-crafted content for your Web pages, and add new content on a regular basis. If content is king, context is queen. Focus on your keyword phrase, synonyms and related words, and surrounding text. Use a program like ThemeMaster if you need help determining those supporting words.


5. Remember that both on-page and off-page factors are important. Don't sacrifice one for the other. On-page factors are your tags, body text, prominence, relevance, etc. Off-page factors are link popularity (quality and number of your inbound links) and link reputation (what those inbound links "say" about your Web page when they link to you).


What about search engine research? Isn't it important?


It's crucial.


Let me give you an example. At the beginning of this year, pages began falling out of Google's index. The forums were alive with speculation about why, and about what to do.


Through research, we determined this was a compliance issue. When code is compliant, search engine spiders can crawl the content more easily.


The solution? Make sure you use a DOCTYPE tag and an ISO Character Set Statement at the top of every Web page.


For example:


<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">


<META HTTP-EQUIV="content-type" CONTENT="text/html; charset=ISO-8859-1">


If you didn't know about the compliance issue, you could have made changes to your Web pages that didn't need to be made and wasted countless hours trying this or that, only to come up dry.
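As a quick self-check, a page can be tested for those two declarations with simple pattern matching (a minimal sketch; a real validator such as the W3C's does far more):

```python
import re

def has_doctype_and_charset(page: str) -> bool:
    """Check a page for the two declarations above: a DOCTYPE and a charset."""
    has_doctype = re.search(r"<!DOCTYPE\s+html", page, re.IGNORECASE) is not None
    has_charset = re.search(r"charset\s*=\s*[\w-]+", page, re.IGNORECASE) is not None
    return has_doctype and has_charset

compliant = ('<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">\n'
             '<META HTTP-EQUIV="content-type" CONTENT="text/html; charset=ISO-8859-1">')
print(has_doctype_and_charset(compliant))                      # True
print(has_doctype_and_charset("<html><body></body></html>"))   # False
```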


Research helps to make sure you remain on top of what's happening in the search engine industry. It's what sets you apart from other SEOs. You make your decisions based on research and facts, versus speculation and theory.


In Conclusion . . .


"Take it from someone who has been in this business for nine years and studies the algorithms closely - don't chase the algorithms. You say that you have a #2 ranking for a certain keyword phrase that alone is bringing your site 550 visitors per day? Great. In the time that you have spent gaining that ranking, I have written 285 pages of unique content, obtained 821 links, etc., and collectively I am getting over 1,300 visitors per day," says Jerry West of WebMarketingNow.


In other words, by focusing on more than just chasing algorithms, you have the potential of having a much more successful Web site.


Robin Nobles conducts live SEO workshops in locations across North America. She also teaches online SEO training. Localized SEO training is now being offered through the Search Engine Academy.


Copyright 2005 Robin Nobles. All rights reserved


Friday, October 14, 2005

 

Blogs And SEO: Theme Relevance

Blogs provide search engine optimization (SEO) power by their very design. A blog's strength lies in keyword-rich content and in an abundance of inbound links.


Blogs can become even more SEO-oriented when their authors think in terms of relevance.


Whether you look at the most recent Google results for incoming links, Google PageRank levels, or positions in the search engine results pages (SERPs), there is one common denominator. That overarching similarity lies in the value that Google is placing on theme- and topic-related content. The watchword of the day in the land of Google, as already noted, is relevance.


The Google algorithm, which is the mathematical calculation that determines a site's position in the search results for any given keyword or phrase, has been revised to stress the importance of relevance. That importance not only affects the search results that are found in the SERPs, but in Google's PageRank and link valuation formulas as well.


With the need for more relevant content and linking partners, website owners are looking for a solution. What they need is theme- and topic-relevant content, plus linking partners who are also theme- and topic-related. While there are several ways to achieve relevance, one such method is adding a blog component to your website.


Blogs are regularly updated postings of information, usually related to the theme of a website, and they include incoming and outgoing links on the same topics. Once thought to be merely online journals and diaries, blogs have moved far beyond the personal realm and into the world of business and information.


Blogs are becoming an important component in many business owners' toolboxes for marketing, public relations, and search engine optimization. The benefits of blogs for increasing visitor traffic numbers are countless...or at least a very high number.


Businesses in almost every industry can benefit from the blog boost in the search engines. There is little doubt that most websites will receive a healthy injection of relevance simply by posting regularly to a business blog.


Since the posts will be on topics related to the overall website theme, and incoming links will arrive from similarly themed blogs, relevance is an obvious and natural result.


In the early days of the internet, it was thought that related sites would link naturally to one another. Little thought was given to placements in the various search engines at first. In fact, the early internet wasn't really SEO oriented at all, and users cared about good content and natural links. Fortunately for everyone, those are still the keys to effective blogs and websites today.


It was generally agreed that good content would attract natural related incoming links, from similarly themed websites. In fact, what was being described was what is currently happening with blogs. People simply link to interesting and informative content. Bloggers are very free and generous linkers, especially to blog posts that are related to their own areas of interest. There's that relevance factor in play again.


Blog posts are generally written on one topic, though often they touch on two, three, or even more related topics. If the topics are not related, they become so by virtue of there being a large number of posts on those formerly unrelated topics.


On occasion, the unrelated topics will even appear in the same post. The important aspect of the blog is the overall development of powerful theme relevance. A blogger who writes on several topics can even make the blog theme relevant in more than one subject area. By adding several posts on alternate topics, expanded theme relevance is possible.


Bloggers are also free and generous linkers, and understand the value of relevant content instinctively. By regularly linking to interesting blog posts, bloggers provide value to their own readers, by offering them the best of other bloggers. These added incoming links provide additional Google PageRank, as well as boosts in the search engine rankings from the link.


The clickable links are often rich in the receiving blog's most important keywords, and are contextually surrounded by theme-related content as well. This combination of strong link anchor text and theme-relevant content gives the blog exactly what the search engines are seeking.


The Google algorithm could have easily been written with blogging in mind. The algorithm is even more blog powered with the additional emphasis on theme relevance. After all, bloggers sought and created relevant content from the very beginning, giving them a head start in the relevance arena.


Theme relevance in Google is just one more great reason to start a blog for your online or offline business.


Wayne Hurlbert provides insightful information about marketing, promotions, search engine optimization and public relations for websites and business blogs on the popular Blog Business World.


Thursday, October 06, 2005

 

How Blogs And RSS Boost Your Search Engine Visibility

Copyright © 2005 Priya Shah


Marketers have found that blogs are excellent tools for communicating with their audience. Anyone who has something to sell or an idea to promote can benefit from using blogs.


Search engine marketers especially favour blogs because they have a number of features that make them the darling of search engines.


1. Fresh, Updated, Relevant Content


When you write a good blog about a theme that you’re passionate about and post to it frequently, you’re creating fresh, keyword-rich content that search engines love.


2. Natural, One-Way Links


Search engines view links to your site as a recommendation of your site content. More links pointing to your site or blog boosts your visibility and search engine rankings.


Google gives more weight to natural, one-way incoming links, and blogs make it easy to get two types of one-way links to your site.


- Similarly Themed Blogs


A well-written, authoritative blog, with unique content, is likely to get linked to from a number of other bloggers writing on similar topics. These are natural links that are viewed highly and given more weight by search engines like Google.


- RSS Feed Syndication


Blogs, and the RSS feeds built into them, help you build valuable one-way links to your site by syndicating your content online.


3. Get Indexed Within Hours


When you post to a blog it “pings” a number of services that list blogs. This notifies the service that your blog has been updated.


Search engines like Google give more weight to blogs that are updated regularly. It is possible to get your pages indexed in Google and other search engines within hours of writing your first blog post.


Compared to the time it takes to index a website (days, even weeks), you can see why blogs are better search engine optimisation tools than static websites.


Indirect SEO Benefits of Blogs


Besides the SEO benefits, a well-written, authoritative blog can also create publicity and branding for you, which prompts even more people to read and link to your blog.


Priya Shah is a partner in the search engine optimization firm, SEO & More. Request the detailed version of this whitepaper Boost Your Search Engine Visibility With Blogs And RSS here.


This article may be reprinted as long as the resource box is left intact and all links are hyperlinked.


Friday, September 23, 2005

 

Black Hat Search Engine Optimization the White Hat Way

Black Hat

A fairly common black hat search engine optimization tactic is to build multiple websites on a general theme. The sites are then cross-linked to other sites in the same network, and also include one-way links to the primary site with varying anchor text.


The whole aim is to give one or more sites a huge boost in search engine results pages (SERPs), and for it to also benefit from additional traffic flowing from the various network sites.


People undertaking such methods generally create the websites with automated tools, use content scraped from other people's websites, and build most of the sites with no purpose other than to drive traffic to the primary site.

White Hat

First it is important to understand a little about linking structure. I am not going to go into excessive detail.

It is widely understood that internal linking on any website can account for as much as 50% of the PageRank attributed to any single page within a site. How your pages are linked together, for which terms, and whether links are reciprocated all play an important role in the calculation.

Suppose I told you that there are hundreds of websites on the internet, themselves holding very high PageRank on multiple terms, whose owners would be willing to create a niche portal within their pages, highly optimized for your website, niche and keywords. That is something you would probably be willing to pay for.

We are not talking about a simple directory site. We are talking about high quality content pages that will pass on PageRank to your site, plus a central hub, similar to a home page, that benefits from all the content pages linking to it and that in turn also points directly to your website.

Of course:-

- You will have complete control over the content of each of these sites
- You will be provided with an interface for managing the site's contents
- You will be able to add content whenever you like, on almost any subject
- They will even act as brokers to encourage other people to create hubs which will also point directly to your website
- Most of these sites have powerful linking structures, that magnify the value of your content, and the links both to your website, and to the central hub.

This is all "white hat". You will never be penalised by the search engines for using this tactic, and it is lasting! Some of these hubs will disappear, but many more will appear to replace them, so your traffic hubs will remain a fixture.

Is this something you would pay for? You can get this highly powerful promotion of your website for free!

Simply write and submit articles to article directories.

Every day I see questions on multiple marketing forums along the lines of:-

"Does article marketing really work?"
"I submitted an article 2 weeks ago and my search engine results have stayed the same, why?"
"When I submit an article, how long until I will see traffic to my website?"

Describing exactly how this all works in words is very difficult, but let's look at a very simple math formula.

1 x 1 x 1 x 1 x 1 = 1

It is not very impressive, is it?

You have to remember however that an individual article you publish gains incoming links in a number of ways.

- Snippets from your article will appear on the pages of other articles in the same niche
- You will have a link in the main topic
- You will most likely have a link for some time in the RSS feed
- If a website uses that RSS feed for content, the article on the directory site will gain at least a temporary link, and quite often a permanent one

So we may be looking at more like

1.3 x 1.3 x 1.3 x 1.3 x 1.3 ≈ 3.71

Some of the numbers however are going to be bigger or smaller, depending on the authority of the page linking to the article, the number of links from that page etc.

You might well have to use addition rather than multiplication for many aspects of a real formula.
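The toy formula can be run directly; a minimal sketch, where the 1.3 boost per factor is this article's own illustrative guess, not a measured value:

```python
from math import prod

# Five illustrative per-link factors: snippet links, the main topic link,
# the RSS-feed link, syndicated copies, and so on (values are guesses).
no_links = [1.0, 1.0, 1.0, 1.0, 1.0]
modest_boosts = [1.3, 1.3, 1.3, 1.3, 1.3]

print(prod(no_links))                 # 1.0  -- nothing compounds
print(round(prod(modest_boosts), 2))  # 3.71 -- small boosts multiply up
```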

What is important, however, is that not only does each individual article you publish gain PageRank, but so does your author profile.

Let's take some examples.

These are the current top 5 article authors listed at Ezine Articles

Lance Winslow 2029 Articles
Jeff Herring 340 Articles
Tim Gorman 306 Articles
John Mussi 303 Articles
Dennis Siluk 286 Articles

Now do a search for any of those author names in Google.

Nearly every result has a reference to their Ezine Articles profile within the top 3 positions.

This isn't true in every case. Well known (and popular for good reason) internet marketer Willie Crawford's profile only appears at the bottom of the first page, but he has hundreds of links pointing to his popular websites, and has a baseball player competing for ranking.

Profile pages concentrate and magnify the linking benefit of every article you publish, thus the links from a profile page carry a lot of weight.


Some author bio pages allow a lot of customization. Most allow you to have some text (which can be keyword targeted), along with website links. A few even allow you to set anchor text for every link in your profile.

Thus to answer all the questions I see every day on various marketing forums.

- Yes, article marketing does work.


- The more articles you submit, the more effect you will have from using articles as one form of marketing. A hub with a single page carries very little weight. A hub with 10, 20 or even 100 articles will carry an immense amount of weight, and having lots of hubs pointing to your websites will have a massive effect on search engine results.


- You might see an immediate burst of traffic within a few days of submitting an article; however, article marketing is a short-, medium- and long-term solution.

Short term it can be a fast route to having a website spidered by search engines.


Medium term, you will gain some exposure within your niche as other sites and ezines publish your content. Many of them don't write about your topic every day.


Long-term is really up to you. The more quality articles you write, the larger your hubs will become. Large article hubs pick up traffic from a larger variety of search engine traffic, but also make your author bio more prominent, thus magnifying the value of external links placed there.

Andy Beard has worked in Sales, Marketing and Localization for the last 15 years, primarily in the computer games industry.  He publishes his articles with the services of Article Marketer.


Sunday, September 04, 2005

 

The Blog and the Nature of Natural Linking

A lot of people are talking, and few of them know: the soul of a search engine was created in the blog. ;-)


There has been a lot of talk about Natural Linking. From Oakland (Ask Jeeves), Sunnyvale (Yahoo!) and Mountain View (Google), California, all of the search engine companies are talking more publicly about "Natural Linking".


WHY IS LINKING IMPORTANT?


The whole concept of linking, in the eyes of the search engine companies, is that when Site A links to Site B, Site A is making a personal recommendation of Site B. Because Site A is willing to put its reputation on the line to share the story of Site B, the search engines have determined that Site B must be of higher value than a Site C that has earned no such recommendation.


Google established its PageRank system a few years back based on this conceptual idea. Over the past few years, the other search engine companies have begun to adopt the linking model in their attempts to catch up with Google's lead in the marketplace.


Because the search engine companies want to provide the best possible results to their users for a particular search, they have all embraced the concept of link counting to determine the value of the sites that they recommend to their users.


WHAT IS NATURAL LINKING?


The idea behind "Natural Linking" is that you can have, for example, five people linking to the same site, each giving the site a recommendation via the hyperlinked text.


The hyperlinked text is any text that appears between the <a href="..."> and </a> tags. When used in an HTML document, the hyperlinked text in the viewable webpage becomes a live clickable link like this: http://BloggerSupport.Com.


When viewing this link from within the HTML coding, it will look like this:
<a href="http://BloggerSupport.Com">http://BloggerSupport.Com</a>
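Extracting that hyperlinked text is straightforward; here is a minimal sketch using Python's standard html.parser (an illustration, not any search engine's actual crawler code):

```python
from html.parser import HTMLParser

class AnchorTextParser(HTMLParser):
    """Collect (href, hyperlinked text) pairs from an HTML fragment."""
    def __init__(self):
        super().__init__()
        self.links = []       # finished (href, text) pairs
        self._href = None     # href of the <a> currently open, if any
        self._text = []       # text pieces seen inside that <a>

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text)))
            self._href = None

parser = AnchorTextParser()
parser.feed('<a href="http://BloggerSupport.Com">http://BloggerSupport.Com</a>')
print(parser.links)  # [('http://BloggerSupport.Com', 'http://BloggerSupport.Com')]
```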


Each webmaster who decides to link to an individual website has a different idea and reasoning as to why his or her visitors should look at the site they are recommending. As a result, each webmaster will outline their reasoning within their links to your website.


In our example, we have five Site A's pointing to Site B:


Example Site One will put the following on their site:
<a href="http://BloggerSupport.com">Technical Support for Your Blog</a>


Example Site Two will put the following on their site:
<a href="http://BloggerSupport.com">Another Blogging Site</a>


Example Site Three will put the following on their site:
<a href="http://BloggerSupport.com">Blogging Articles, Tips and Tricks</a>


Example Site Four will put the following on their site:
<a href="http://BloggerSupport.com">Technical Services for Bloggers</a>


Example Site Five will put the following on their site:
<a href="http://BloggerSupport.com">Support Services for Bloggers</a>


Each webmaster in this example has shown their users why they should visit BloggerSupport.com. In doing so, each of them has shown their link using their own descriptive text. It is this "descriptive text" that the search engines view as "natural links".
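Tallying the words across those five descriptive links shows the keyword picture an engine could assemble from anchor text alone (a toy sketch using the example links above):

```python
from collections import Counter

# The five descriptive link texts from the example above.
anchor_texts = [
    "Technical Support for Your Blog",
    "Another Blogging Site",
    "Blogging Articles, Tips and Tricks",
    "Technical Services for Bloggers",
    "Support Services for Bloggers",
]

# Tally how often each word appears across all inbound anchor text.
word_counts = Counter(
    word.strip(",.").lower()
    for text in anchor_texts
    for word in text.split()
)
print(word_counts["bloggers"], word_counts["blogging"], word_counts["support"])
```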


So, "Natural Links" are links that are created by individual webmasters and not by Site B's owner. In the eyes of the search engine programmers, these links are likely to be a more accurate representation of the content that appears on a website. And the search engine masters understand that a stranger is usually more honest in his representations than a webmaster trying to promote his own website.


WHAT DOES NATURAL LINKING HAVE TO DO WITH BLOGS?


Over the last few years, you have read many an article from people pitching the importance of the blog in the search engine optimization game. But, do you know why blogs have become so important to the search engine companies?


Natural Linking.


When all is said and done, the text within a link has been given more weight in the search engines than the real content on a webpage.


I heard the guffaws in the audience. I can see the look of utter disbelief on your faces as you sit in front of your computer staring incredulously at my comments. ;-)


But, wait. I can actually prove this to you.


Come back when you are done, but do click this link to view the search results for the Worst President


Did you notice the results in search result #1? Did you go to the webpage to see if you could find the word "worst" in the text? I did too. And guess what. I could not see the word "worst" in the text either.


The same search with a slight variation, searching for the Worst President in History puts the same page in search result #4.


THIS IS THE TRUE POWER OF BLOGS


The search engine companies put far more value in the natural link text than they put in the terms that show up within a webpage! The bloggers are the ones who have said that George W. Bush is the worst president in history. And, I will bet that they get a great chuckle every time someone like me points out their accomplishments.


The reason why blogs are the best new resource for the search engines is that blogs use natural linking far more often than regular website pages do.


THE CHALLENGES FACING US NOW


As we move forward to promote our websites today and tomorrow, we must keep in mind the need for natural linking. The challenge is to communicate our sales messages in such a way that links to our websites are perceived by the search engines as natural links.


As we face this challenge, we should consider giving other people more leeway in how they post links to our websites. If you are using reciprocal links or paid advertising, you should by all means give the person showing your ad several choices of advertisement.


If you are using reprint articles to promote your online business, you should find a way to offer publishers and webmasters multiple article resource boxes, or simply give them freer rein in developing another natural link to your website.


Good luck in your linking endeavor.


Copyright © 2005 Bill Platt
The Phantom Writers


Bill Platt owns http://thePhantomWriters.com. He specializes in getting your reprint articles distributed to at least 17,000 publishers and webmasters seeking good content. Pre-written ghosted articles are now available for purchase. If you need help setting up your blog, please review his blogging services.


 

Search Engine Wars - Quality Searches Vs Quantity

It is no secret that Google and Yahoo are locked in a continuous battle to win our hearts and convert us all, but is converting someone really a matter of quantity or quality?

Let's take a look at some top keyword searches and compare them across some of the search engines online. I will outline a few things for each search result:

1) Search Engine
2) Number of results found
3) Quality & content of the top 10 sites
4) What you find going beyond the first 10 pages

Each section will get ranked out of 10 points for quality (information taken on August 26, 2005).

=
Starting with my all-time favorite search term: "INTERNET MARKETING"
=

Google.com
- 99,000,000 results
- 10/10 on quality
- 10/10 Past 10 pages still delivers top quality results

Yahoo.com
- 281,000,000 results
- 7/10 on quality. There's no reason to list a "Hotel Marketing Firm" or a "Building Websites" site, and the #1 spot was reserved for Yahoo!'s own marketing.
- 9/10 Past 10 pages, it is very generic for business, not specific.

MSN.com
- 93,661,176 results
- 10/10 on quality
- 10/10 Past 10 pages still delivers high quality results. I am surprised and give MSN two thumbs up for their attention to detail.

=
Moving onto the search term for "BUSINESS NEWS"
=

Google.com
- 627,000,000 results
- 10/10 on quality
- 10/10 Beyond 10 pages delivers high quality & local news centers.

Yahoo.com
- 1,260,000,000 results (wow)
- 10/10 on quality
- 9/10 Beyond 10 pages. There are still some sites that should never be there.

MSN.com
- 381,631,054 results
- 8/10 on quality - Some results aren't related at all, and some are brand-new sites.
- 10/10 beyond 10 pages. Their results seem to tighten up and get better.

Let's now take some more "local" search results to see deeper and more targeted results...

=
Moving on to the search term for "WASHINGTON UNIVERSITIES".
I hope to find the "University of Washington" come up #1 or thereabouts.
=

Google.com
- 22,700,000
- 10/10 on quality (www.washington.edu is #1)
- 10/10 beyond 10 pages. The results still deliver "university" topics.

Yahoo.com
- 143,000,000 results
- 9/10 on quality - The top spot is reserved for Yahoo!'s page on Washington University in St. Louis. There seems to be a strong battle going on here over which university to list.
- 10/10 beyond 10 pages. Content is specific and relevant.

MSN.com
- 34,442,536 results
- 9/10 on quality - Again, another battle going on, and washington.edu is at #10, teetering on the verge of page 2.
- 10/10 beyond 10 pages. All related to Washington & University living

Let's now get even more specific than that... For this search term I will be using something very local that I can relate to and give a better analysis.

=
Moving on to the search term for "HAMILTON ONTARIO CANADA".
=

Google.com
- 3,700,000 results
- 10/10 on quality - Great job
- 10/10 beyond 10 pages - Anything goes but is directly related to Hamilton.

Yahoo.com
- 14,900,000 results
- 10/10 on quality - Almost the same as Google but a couple of different choices.
- 9/10 beyond 10 pages - Some results get there by keyword stuffing their pages.

MSN Local *New (http://search.msn.com/local/)
- 1,280,405 results
- 9/10 on quality - Getting some random placements & keyword stuffing
- 10/10+ beyond 10 pages - Many local companies well listed in the results.

= = = = = = = = = = = = = =

Let's now take a look at these results as a total. Out of a possible score of 80 points, here are the total scores for each:

Google - 80 points
MSN - 76 points
Yahoo - 73 points

Total search results delivered:

Google - 752,400,000
Yahoo - 1,698,900,000
MSN - 511,015,171

And people wonder why Google is the king of search?! But wait: without my even noticing it as I totaled the final point standings, MSN came in second place! In my study, I tried to be as neutral as possible.

How is it that MSN and Google have less than half the number of results Yahoo shows, yet outrank Yahoo on quality?
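As a quick sanity check, the point totals and result counts above can be reproduced with a throwaway script; the figures below are simply transcribed from the four comparison tables:

```python
# Per-engine scores from the four searches above, in order:
# (quality, beyond-10-pages) for each of the four search terms.
scores = {
    "Google": [10, 10, 10, 10, 10, 10, 10, 10],
    "Yahoo":  [7, 9, 10, 9, 9, 10, 10, 9],
    "MSN":    [10, 10, 8, 10, 9, 10, 9, 10],
}

# Reported result counts for the same four searches.
results = {
    "Google": [99_000_000, 627_000_000, 22_700_000, 3_700_000],
    "Yahoo":  [281_000_000, 1_260_000_000, 143_000_000, 14_900_000],
    "MSN":    [93_661_176, 381_631_054, 34_442_536, 1_280_405],
}

for engine in scores:
    print(engine, "-", sum(scores[engine]), "points,",
          sum(results[engine]), "total results")
```

Running this confirms the totals quoted below (Google 80, MSN 76, Yahoo 73 points).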

In conclusion:

Even though Yahoo delivered a report stating that it has over 19 billion items in its index, what does that matter when it is still trying to figure out how to deliver all that content?


MSN and Google seem to know exactly what to do with their results; nothing seems "off the wall" at all. If Yahoo is to step up and be crowned the search king, I really think they need to refine the sheer quantity of results they return so that it matches the quality of those results.

Yahoo has major potential to overtake Google, but they still have a lot of work ahead of them, and by the time they figure things out, Google may step up its game even further and wow us all on some other search plateau.

About The Author:

Martin Lemieux is the president of the Smartads Advertising Network. Smartads is dedicated to helping you expose your business online and offline.


International: http://www.smartads.info
Article Submission Website (Beta): http://www.article99.com
Post Comments Here: http://forum.article99.com/viewtopic.php?t=24

Copyright © 2005 Smartads Advertising Network


 

Torpedo and Sink the Ship SS Search Engine Rankings

I was recently contacted by one of my best clients, who asked what I thought of his decision to make a major change to one of his highly ranked pages. His initial concern was that the visitor-to-sale conversion ratio was low. At almost one percent it was only just below normal, but I'm always happy when a client wants to improve. Conversion and rankings, though, are very different beasts, and his concern was focused on the former to the total exclusion of the latter.


As his SEO, I should have realized that the top rankings of this already-optimized page were in danger when his first sentence referred to the existing "dusty, tired old page that just isn't getting enough sales." That page had been optimized for search engines only about six months previously, and went from page 10 or so (invisible) of the Search Engine Results Pages (SERPs) to the top three on the first page of all three major search engines virtually overnight, after a few tweaks to gain traction from a popular movie's reference to his product.


The page had been up for several years before the movie's release without gaining substantial web sales of that same product, but our optimization six months ago led to a leap in sales and consistently improving page visits after the theatrical release. Sales plateaued over time, though, and slowly decreased once the movie that had mentioned his product transitioned to DVD. Somehow he hadn't foreseen that decrease and wanted to continue the level of sales he had enjoyed while the movie mention was fresh.


To achieve the continued sales, though, he wanted to completely replace the page text with new material he'd been given by the product's manufacturer. As is the case with marketing material provided by many companies, keyword density was non-existent; the emphasis was on slick new photos covered with stylized, graphical text. Those keywords couldn't be repeated in any page text, since they had already been embedded in the image graphics several times.


What to do? I suggested creating an entirely NEW page with the manufacturer-provided information, linked within his site menu on each page and from the sitemap. While maintaining the old page for its top rankings in the search engines, we could simply use internal linking to keep the search engines crawling that (old, dusty) fully optimized page. That way we would still rank in the top five for that page and its coveted keywords, and serve the new conversion-focused page to site visitors from the menu links.


For some reason, though, the client insisted on using the existing filename for the new content and moving the old content to a NEW filename! Why? Because then he wouldn't have to have his programmer change a script that loaded a rotating banner onto a select few highly trafficked pages. The programmer costs too much to change a few lines of code for a profitable product page?


This tactic meant that we would completely lose the existing rank on the search engine crawlers' next visit after the new page was posted. I was convinced that we could regain the rank, but only over time and with substantial extra work. The cost to the client of getting a new page into the top five on the SERPs was going to exceed the cost of reprogramming the banner rotation script. But he insisted we use the new manufacturer-provided (image-only) content on the old filename. OK, I relented.


The web designer wanted to place the new manufacturer-provided page in an iframe and embed the old page text in noframes tags, making it visible to search engines but not to visitors. A silly idea, and a borderline spam technique that could drop our top-five rankings off the charts. I dug my heels in and refused that idea.


The client then suggested simply keeping the previous metatags and title tag to maintain the ranking. Sorry, that simply won't work. If it did, we'd return to the bad old days of simplistic keyword stuffing in those (no longer) magical metatags. I started to wonder... "Am I here as an SEO only to stop designers from using spam techniques, programmers from having to write new code, and clients from doing absurd keyword stuffing in metatags?"


No, you actually have to use carefully crafted, keyword-rich text on the visible page, and NOT embedded in graphics files as text painted across photos with Photoshop or Illustrator. Search engines can't read text on images, and image "alt" text in the HTML is no longer useful in SEO: it was so badly abused by simplistic optimizers chasing ranking gains that the search engines began to ignore it in their ranking algorithms.


The new page may initially see sales increases thanks to the pretty new photos (there is zero text on that new page), but after a long series of email exchanges and a final phone discussion over the ranking issues, the client proceeded with the change anyway. I normally don't hope for poor rankings on client pages, but since this one runs counter to every fiber of my SEO being, I'm actually looking forward to the torpedo striking, the rankings sinking off the charts, and the client paying attention to his SEO's advice.


The old page is still showing up in cached pages at the search engines, so they haven't yet crawled the new version. I will dutifully point out the sinking of the venerable "SS Search Engine Ranking" ship next week when Googlebot revisits this client site and finds all that text has disappeared from his previously #1 ranked page and suggest to him that he review his WebTrends traffic reports to see that it has settled to the bottom of the ocean.


I guess I better get busy finding a way to rank the previous (old optimized) page on the brand new shiny filename. Won't he be surprised to learn that most of his sales come from that (newly named) "old dusty page" within a few weeks?


Copyright September 3, 2005 Mike Banks Valentine


Have you done anything to torpedo and sink your ship "SS Search Engine Rankings" lately? Call me at 562-572-9702 if you need a salvage operation to raise that venerable ship from the bottom of the vast search engine rankings ocean. http://www.seoptimism.com/SEO_Contact.htm Mike Banks Valentine operates the article distribution site http://Publish101.com and a Small Business Ecommerce Tutorial for Web Entrepreneurs at http://WebSite101.com


Monday, August 29, 2005

 

Do Not Ever Link To A Site Without Doing This First!

Links are a crucial part of attaining high search rankings, but you must be very careful about to whom you link. I'm going to help you develop a simple link strategy for your website that will help you decide which sites to link to so you're making your way up the search engine rankings and not accidentally hurling yourself backwards.


So the natural questions are:


Should I link to everyone I can find?


Should I allow everyone to link to me?


Should I get one of those "link to 2,000 sites for $10" things?


The answer to all of the above is NO!


Develop A Link Strategy


We're going to do an easy-to-digest version of what a search engine optimizer would do if you were to hire one. There are many reasons for having your site professionally optimized which would take up articles in themselves. This is one of the attack strategies for determining optimum, quality links for your site.


Step 1 Where are your competitors linked?


Don't arbitrarily find random sites that you like and link to them. A little bit of research goes a long way. Start with your competitors. Type the keywords for your site into a search engine. You have them, right? This is the list of keyword phrases you want to rank number one for when someone types them into a search. Who appears in the top 10 positions? They're your direct competition, doing something right or they wouldn't be coming up first. So let's look underneath, see how they got it to work, and save you a year of work!


Go back to your search engine and type "link:http://www.competitorsite.com". Up will pop a long list of sites that have a direct link to your competitor. Do this for your top 10 competitors. Do you see any trends in those results? Do you see any similar sites, or perhaps directory listings? Take some notes. A spreadsheet or a few sheets of loose-leaf paper is helpful for analyzing what you've uncovered. You should end up with a good, solid list of the links that are helping your competitors rank high!
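Once you've copied the `link:` results into a file, the cross-referencing can be automated. A minimal sketch; the competitor sites and linking domains here are made up purely for illustration:

```python
from collections import Counter

# Hypothetical backlink data: for each competitor, the set of domains
# found via a "link:http://www.competitorsite.com" search.
backlinks = {
    "pumpkin-king.com": {"pumpkindirectory.com", "veggierecipes.net",
                         "harvestblog.org"},
    "gourd-world.com":  {"pumpkindirectory.com", "harvestblog.org",
                         "cars4less.com"},
    "patch-o-rama.com": {"pumpkindirectory.com", "veggierecipes.net"},
}

# Tally how many of your competitors each domain links to.
tally = Counter(domain for links in backlinks.values() for domain in links)

# Domains linking to two or more competitors are the trend you're after:
# your best prospects for a link of your own.
prospects = [d for d, n in tally.most_common() if n >= 2]
print(prospects)
```

Here `pumpkindirectory.com` links to all three competitors, so it tops the prospect list; one-off links like the car dealer fall out automatically.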


Step 2 Search for similar themed sites.


Look at your keyword list again. Do these words appear prominently on any of the pages you have listed so far? Narrow down your list to sites that have at least the same theme as, or content related to, yours. Even your competition will have quite a few odd links.


If there are 300 links to a site that sells pumpkins, it's natural to have a car dealer or an airline in there too. Chances are they were so pleased with their pumpkin purchase that they added the site to their own web page. You can disregard these right away.


Take your list and look at the potential link site for similarities to your topic. If in the pumpkin market your competition links to a site that tells all about how to cook pumpkin seeds, see if you can find other sites that tell how to make pumpkin pie, make jack-o-lanterns AND cook pumpkin seeds. Make a list of these sites as potentially better ones.


Step 3 Look at the Google Page Rank


You can find a page's Page Rank by looking at your Google toolbar if you have it installed, or by going to a site like http://www.top25web.com/pagerank.php. The actual importance of Google Page Rank to Google searches in particular seems to depend on whom you talk with. It shouldn't be the make-or-break factor, but it can help you choose between several similar sites if you're unsure which one to go with.


Page Rank is more of a relative scale of the number and quality of links to a site: the more and better the links, the higher the number. It's not unheard of for a link from a site with a Page Rank of 6 or 7 to boost a low score by a couple of numbers. This seems, at least in Google's case, to get a site indexed much faster. And the faster you're indexed, the faster you can start climbing.


Step 4 How many sites link to your selections so far and how good are they?


The more links a site has pointing to it, the more important it appears to be to the search engines. Say you're looking at company ABC to put a link on their site to yours. Let's first see how many sites link to company ABC (just as you looked at your competition). We know search engines place more weight on linking sites that have content similar to yours, and the big search databases now seem to know what kind of content is on those linking pages.


Using the pumpkin model, if your potential target is teeming with 100 inbound links from gambling, girls, horses, moons, leprechauns and horoscopes, then throw it on the trash pile fast, even if it has a Page Rank of 8 (very rare).


Find the site that has 10 good, quality links to it from a pumpkin farmer, a vegetable recipe blog, Halloween and Thanksgiving festivities, how the first settlers used the pumpkin to build houses, etc., and has a Page Rank of 5. This is the better choice. Quality links, with themes and content related to your site and keywords, outweigh a quantity of random, useless links.
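The judgment call in Step 4 can be phrased as a toy scoring rule: related inbound links count heavily, unrelated links count for nothing, and Page Rank acts only as a tiebreaker. The weights and site names here are pure invention to illustrate the reasoning, not anything the engines publish:

```python
def link_value(site):
    # Related links dominate; unrelated links add nothing;
    # Page Rank is a mild tiebreaker. (Illustrative weights only.)
    return site["related_links"] * 10 + site["pagerank"] * 2

candidates = [
    # 100 inbound links, almost none on-topic, but Page Rank 8.
    {"name": "leprechaun-links.com",
     "total_links": 100, "related_links": 2, "pagerank": 8},
    # Only 10 inbound links, all pumpkin-related, Page Rank 5.
    {"name": "pumpkin-pals.com",
     "total_links": 10, "related_links": 10, "pagerank": 5},
]

best = max(candidates, key=link_value)
print(best["name"])
```

The on-topic site with a tenth of the links wins, which is exactly the trade-off the step describes.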


Step 5 Your final list


Don't think you can do this in an hour, or a day! It's quite a bit of work just to find good potential targets. Here's a bonus… when you have finished your first wave of link possibilities, here's a great way to give it a solid foundation.


Find some relevant directories to list with. Directories of a given theme will have many, many similar links pointing to them. Directories are usually considered to have authority. (It's not uncommon to have to pay a fee for the good ones anyway.) Try to find a directory of pumpkin farms and pumpkin recipes to build your other links upon.


Another bonus, and a mistake to avoid at all costs: do NOT link to link farms, free-for-all sites, or any site that will give you 1,000 links for $10. These are not directories but collections of completely unrelated links that exist solely to try to boost search engine rankings. Search engines ban many of these sites, and being listed alongside a banned site could get your own site banned; then you're doomed. The only way to succeed is to build your links honestly and strategically, with a plan and a method.


Note: Search engines give more weight to one-way links than to reciprocal links, i.e. links that point to your site without asking for one in return. The easiest way to get these is to buy them. This plan will work for all the different kinds of links you can get.


So now you have some potential sites to link to. In the next article, learn how to phrase your link for maximum effectiveness. Choosing the sites to link with is only the first half… the quality of the words that make up the link's content, called anchor text, is just as crucial! Hint: using the same link text on every website is a very bad idea. See you soon!


About the Author: John Krycek is the owner and creative director of http://www.themouseworks.ca. Read additional articles on identity, web and graphic design and logo creation in easy, non-technical, up front English!


 


 

How to Submit Your Site to Directories

Unlike submitting to search engines, submitting your site to directories and niche portals usually involves a lot more than simply typing in your URL. You often have to start by researching the various topic categories to find the most appropriate area to submit to. Then you generally have to provide some detailed information about your site, its content, your company and your contact details.


When selecting the most appropriate Directory category to submit your site to, conduct a search for your main keyword phrase and view the various related categories. Study the sites listed within these categories and choose the category that is the most relevant to or closely related to your site content. Some directories like ODP have specific Category Descriptions you should read before submitting, to ensure you have chosen the most relevant topic for your site.


Another way to choose your category is to search for sites belonging to your direct competitors. It is likely that the category they are listed in will be the most relevant to your site.


If your site targets or discusses a specific regional market, you will need to submit to a regional category. For example, if my site was about rental cars for hire in Sydney, Australia, I would need to submit it to the regional Yahoo! category and not the general Yahoo! rental car category.


I find it useful to submit a slightly different description of my client’s sites for each directory submission. That way, I can gauge which descriptions are more effective in terms of encouraging people to click and also which directories are providing my clients with the most traffic. Many directories feed their database results to other engines and directories, so if I have a description unique to each directory and I see that description pop up on other search sites, I know it is the result of that original directory submission and immediately recognize the value of that original submission.
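That per-directory tracking can be as simple as a lookup table: record the unique description submitted to each directory, and when a description surfaces on another search site, match it back to its source. A sketch with invented directories and descriptions:

```python
# Hypothetical per-directory descriptions for one client site.
descriptions = {
    "Yahoo! Directory": "Tailored adventure travel and vacation packages "
                        "to New Zealand.",
    "DMOZ":             "New Zealand adventure holidays, day tours and "
                        "extreme sports.",
    "BOTW":             "Luxury New Zealand travel: kite surfing, "
                        "corporate trips and day tours.",
}

def trace_origin(seen_text):
    """If a description pops up on some other search site, identify the
    directory whose database feed it must have come from."""
    for directory, desc in descriptions.items():
        if desc in seen_text:
            return directory
    return None

print(trace_origin("Listing: New Zealand adventure holidays, day tours "
                   "and extreme sports."))
```

Seeing the second description elsewhere tells you the DMOZ listing is the one being syndicated, and therefore which original submission earned its keep.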


Remember that directory editors don’t care about your site’s ranking in their search results. If they are reviewing a site submission that contains an obviously keyword stuffed title and description, they are unlikely to find it appealing or beneficial for inclusion in their database! Always make sure your submission details are relevant, interesting and accurate. Try to highlight your site’s benefits for the visitor and unique content that makes it stand out from others in the same category. If your site sounds just like a cookie-cutter version of others of the same topic, there is no incentive for the editor to include it.


Submitting to the Yahoo! Directory


There are a couple of sites where you want to take extreme care and do advance research when submitting your site. One of these is the Yahoo! Directory. The way you submit your site to Yahoo! can make or break your site’s ultimate ranking in the Directory and if you’re not careful, could also cost you USD 299 for nothing.


With Yahoo!’s huge market share and popularity worldwide, I believe it’s vital that your site is listed in Yahoo!’s Directory. The best way to get listed quickly is by paying the fee for Express Submission. Yahoo! Express is an expedited fee-based site suggestion service for web sites submitted to the Yahoo! directory. A member of Yahoo!'s editorial staff will look at your site, consider your suggestion and respond to you within 7 business days.


Important: Payment does not guarantee inclusion in the directory, site placement, or site commentary. It only guarantees that Yahoo! will respond to your suggestion within seven business days, by either adding or denying the site.


The secret to obtaining excellent results via your Yahoo! submission is to choose the most appropriate category and include a carefully-crafted description that contains your main keyword phrase/s without being too verbose. For those of you offering a Yahoo! submission service to clients, be sure to charge a generous admin fee for your expertise in researching the category and writing the description for your client – a successful Yahoo submission can pay dividends for your client for years.


Example of a successful site description for Yahoo!:


ABC VIP Adventures - offers tailored adventure travel and vacation packages to New Zealand including day tours, exotic corporate trips, luxury travel packages, kite surfing, and extreme sports.


Example of an unsuccessful site description for Yahoo!:


ABC Travel – we are the best! We are the only company to contact for your vacation. Call now!


The latter does not use the actual company name, plus it contains lots of hype but no keywords and few clues as to what the site is about. In this case, the Yahoo! editor would have to visit the site submitted and come up with their own description and it’s doubtful the edited description will be something the submitter would be happy with.


Submitting to the Open Directory


Another Directory where submission is critical is the Open Directory. DMOZ is run entirely by volunteers and your site submission must be hand-reviewed by one of these volunteers before it can be considered for inclusion. DMOZ is extremely under-staffed (I know this because I’m a DMOZ editor!) and it can take 6 or more months before your submission is reviewed – you must be patient. When submitting to DMOZ, make sure you follow the directory submission guidelines above and prepare to wait, wait and wait some more.


Procedure to follow for a successful DMOZ Submission:


1) Submit your site
2) Wait for 3 months
3) Send a follow-up email to the category editor
4) Wait for 3 months
5) Send an escalation email to the editor of the category above yours
6) Wait for 3 months
7) Ask for assistance in the Open Directory Public Forum (http://resource-zone.com/)
8) Wait for 1 month
9) Send an escalation email to DMOZ senior staff and post to various forums seeking help


Rules of Submission


1) Do it once: Despite the hype, there is NEVER a need to resubmit to a search engine or directory unless your site is dropped entirely (which is a very rare occurrence).


2) Do it properly: Be very thorough when submitting, especially to directories. Take the time to research and locate the most appropriate category for your site.


3) Be brief: Don’t waffle on about your site in the description field. Get to the point and describe your site in a short sentence or two.


4) Be accurate: Don’t try to trick potential visitors by using vague or misleading descriptions about your products or services.


5) Be relevant: There is a fine line to tread between relevance and keyword optimization when creating your site descriptions for submissions. Try not to cross it by using descriptions over-stuffed with keywords.


6) Be humble: “Best Web Site in the World!!!!” is not going to convince anyone and may earn you the wrath of search engine editors.


7) Be patient: Search engines and directories can take up to 6 months to index and list your site. Re-submitting won’t help things and could result in your site being shoved to the bottom of the review pile.


So that wraps up the directory submission process. It can be time consuming, but taking a little bit of time and care with your submissions can pay dividends for your site for years to come.


About the Author:


Article by Kalena Jordan, one of the first search engine optimization experts in Australia, who is well known and respected in the industry, particularly in the U.S. As well as running her own SEO business Web Rank, Kalena manages Search Engine College, an online training institution offering instructor-led short courses and downloadable self-study courses in Search Engine Optimization and Search Engine Marketing.


 

How To Build Link Popularity With Your Own Affiliate Scheme

If you have your own product then one of the most effective ways of generating sales is to encourage other people to promote your product for a share of the profit. For the right level of reward, lots of affiliates will work hard to promote your product to their subscribers and from their own web sites.


As well as being a great way of making sales, one of the other key benefits that this strategy offers you is in building incoming links to your web site.


Just as a matter of interest, check the link popularity of Clickbank.com and you will see that there are literally thousands upon thousands of incoming links from other websites.


That's why it's important that you set up your own affiliate management system, to ensure that your affiliates use links that point directly to your website rather than through a third-party affiliate site such as Clickbank.com. Why give your link popularity away?


Not only does having your own affiliate management system help you to build your link popularity, it also means that you are able to build a list of affiliates. As I'm sure you know, anyone can register with Clickbank as an affiliate and they can promote your product completely without your knowledge.


It's much better for you to have complete control over who promotes your products, how they promote them and where they link to.


There are several products that will allow you to manage and control your own affiliate scheme. When making your choice, ensure that the features include linking directly to your own domain.


Copyright John Taylor PhD August 2005 - All rights reserved.


About The Author: To learn more about affiliate management scripts, I strongly recommend that you visit http://www.Link-Advantage.com


 

Anatomy Of A Reciprocal Linking Campaign

Reciprocal linking means forming partnerships with other sites who place a link from their Web pages to yours. You then give them a similar link in return.


When you look for people to swap links with, make sure that you don't reduce the quality or content of your own site. You don't want users to click straight through without reading your content; you want them to take action on your site rather than leave empty-handed.


One way to stop them from running away too quickly is to create a "Webmasters Resource Page" and link to that page from your homepage. This doesn't take away from the content on your homepage and the links are just one click away rather than being buried deep within the site, giving value to your partners.


In any case, you want to be sure that your site is more than just a page full of links. If your site contains more links than content, it will look like a link farm and it will certainly not be attractive to webmasters, search engines or users.


Picking your partner


Your link partners should be sites your target market will visit. Think about your product and its subject area and brainstorm to determine where people interested in your product might be looking online.


For example, if you're trying to shift your book about blackjack strategy, it makes sense that the people visiting online casinos would make great customers. Online casinos then could be good partners.


Identify top-ranked, high-quality casino sites and find the email address, telephone number and snail-mail address of their webmasters. You can also identify your competitors and see where they trade links. After all, why reinvent the wheel when you can use your competitors' hard work!


Seven Top Tips For Requesting Reciprocal Links...


1. Before you contact webmasters, place a link to their site on your resource page to assure them that you will actually provide a quality link.


2. Create a subject line that will encourage them to read your message rather than deleting it - you don't want them to think you're spamming them. (Something about their site or product is sure to capture their attention; they will open it, thinking you're a potential customer.) Hint - subscribe to their ezine and then reply using the ezine subject line as the subject of your reply.


3. Begin your message by talking about your visit to their site and what you found interesting about it. Detail your product or service in one line and ask them to exchange links with you.


4. Tell them in detail where you have placed their link, include the precise URL, tell them where to find their link on the page and emphasize that it is only one click away from your homepage.


5. Tell them that if you don't hear back from them within a specific number of days, you will consider that to be a negative response and will remove their link from your site. Give them enough time to respond, but don't leave it open-ended.


6. Sending a reciprocal link request by email is becoming less and less effective due to spam filters and the high volume of email received by webmasters of busy sites. Try sending your request on a postcard or, better still, make a phone call.


7. Tell the webmaster how they will benefit from the reciprocal linking arrangement - explain what's in it for them and use your selling and persuasion skills!


Copyright John Taylor PhD August 2005 - All rights reserved.


About The Author: To learn more information about Reciprocal linking I strongly recommend that you visit http://www.Link-Advantage.com


Friday, August 26, 2005

 

Banned By Google And Back Again

The date: 29th July 2005. The time: early morning.


I got out of bed and fired up my PC. Opened my browser to check my site. Had a look at the third-party Google toolbar plugin on said browser (Firefox). It showed grey. Ice formed in my stomach. I opened my bugged version of Internet Explorer: my PageRank was 0.


By now I was frantic. I went to http://www.google.com and typed in 'site:www.tigertom.com': no pages listed. I did this for two other satellite sites of mine: ditto. What had happened? TigerTom.Com had been banned by Google.


I went to the WebmasterWorld forum, and found out the awful truth. Google was doing one of its periodic updates of its algorithm, and had filtered out my sites completely. Further research there, and a bit of soul-searching, revealed why. I had too many pseudo-directory pages with auto-generated external links.


Snippets from search engine results were used as descriptions of said links. Said links were run though a redirect script. These are hallmarks of pseudo-directories and 'AdSense scraper'* sites. Google is reportedly trying to filter these from its 'SERPs'**.


I say reportedly, because Google doesn't announce these purges. They are inferred. To compound my sins, these pages were also effectively doorway pages. The theory was that legitimate sites had been hit as 'collateral damage'.


I say theory, in that Google rarely comments on individual cases. It won't tell you exactly why your site was banned. I guess this is for reasons of time, and to give no clues to spammers.


In my case the ban was justified for my two satellite sites; while not looking like spam, they were effectively doorway sites. My main site was different. It had offending pages, but was mostly a diverse labour of seven years; a personal site on steroids.


Google bans sites algorithmically: a site that fits their 'spammer' profile gets dropped via software from their index automatically. Real spammers shrug their shoulders and move on; honest webmasters write emails begging for mercy. Like me.


I did some searching via Google, to find out how to do a re-inclusion request. Here's how:


1. First, you check your site is truly gone, by going to http://www.google.com and typing 'site:www.yourdomain.com' without the quotes. If it returns no pages at all ...


2. You check Google's webmaster guidelines at http://www.google.com/webmasters/guidelines.html. These are not really guidelines; you should treat them as iron-clad rules.


3. You stop the offending content from being web-accessible, permanently. If you're comfortable administering an Apache web server, you can:

- Send a 410 'Gone' response to requests for the offending pages, or

- CHMOD them to 600, which will return a 403 'Forbidden' response, or

- Move them to a different directory if you need to keep them, or

- Just delete them.


Don't try to be clever. Just get rid of them.
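The options above are Apache-side fixes. Purely as an illustration of the same "permanently gone" idea (my own sketch, not the author's setup), here is a minimal Python WSGI app that answers 410 for a hypothetical list of retired URLs:

```python
# A minimal WSGI sketch of step 3: permanently answer 410 'Gone' for
# retired URLs. The paths listed here are hypothetical examples.
GONE = {"/scraped-page-1.html", "/scraped-page-2.html"}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path in GONE:
        start_response("410 Gone", [("Content-Type", "text/plain")])
        return [b"This page has been permanently removed."]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"OK"]
```

A 410 tells crawlers the page is gone for good, rather than temporarily missing, which is the signal you want here.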


4. You go to http://www.google.com/support/bin/request.py, tick the relevant boxes, and type 'Re-inclusion request' in the subject box of the form.


4a. You add the complete URL of your site i.e. http://www.naughtydomain.com,


4b. You state that you have read the webmaster guidelines above,


4c. You admit what you did wrong; simply, succinctly, with no carping or special pleading.


Don't try to be clever. Don't argue. Don't lie. Don't waffle.


Google has cached copies of your site. When an engineer checks your site, he'll look for the offending content, and compare it against their cache. He'll spend about two minutes on it; don't give him a reason to continue to exclude you.


5. You ask for re-inclusion.


6. You wait. In my case, it took about a week; a long, unpleasant, fretful week. I sent follow-up emails saying what I was doing, and a fax, and was going to write letters if that didn't work. That was probably excessive.


Once you have a ticket number, that's all that should be necessary. They emailed a standard reply saying "the problem had been passed to their engineers". That's good. I understand they send no reply to spammers.


A week later my site was back in. Lesson learnt. To make sure I'm not so vulnerable again, I'm splitting my content to different sites, on the principle of 'best not to have all your eggs in one basket'.


Have I learnt anything from this? Yes. Have more than one site as your 'money-maker'. Spend less time on search engine optimisation and more on traditional marketing.


Come up with a unique selling proposition that compels people to link to your site. Easy(!)


About The Author: T. O' Donnell (http://www.ttfreeware.co.uk/) is an ecommerce consultant and curmudgeon living in London, UK. His latest project is an ebook on getting a loan in the UK. His blog can be read at http://www.ttblog.co.uk/


Wednesday, August 24, 2005

 

Rock Your Rank With a Dynamite Text Link at Yahoo Directory

Yahoo Directory Explodes Rankings


Last week a client called me excitedly exclaiming that their Google PageRank had jumped a notch and their targeted keyword term now ranked #23 (up from #45) for their competitive search phrase. I asked the client if he'd been notified by Yahoo that his site was now included in the index after we had submitted it three weeks ago. "Yes," he said, "but why are you changing the subject?"


"I'm not changing the subject. Inclusion in Yahoo Directory is the most likely reason for the jump in both your search position and your PageRank. Remember when you doubted the value of inclusion in the Yahoo Directory and I pushed for submission anyway? Now you know why I insisted."


That seemingly expensive Yahoo Directory listing has one little known benefit to your website. It is the most important and valuable text link you could ever purchase. That one link from one source will do your ranking more good than any other single link (except possibly the Open Directory).


Many webmasters look at potential traffic referred from the Yahoo Directory as the determining factor for submissions, when that is not the best reason for inclusion - it's the link value that matters above all else in this case.


I've had several SEO clients see a leap in ranking for targeted search terms a week or so after that Yahoo Directory link goes live for them. Many clients have argued with me about the value of that Yahoo Directory text link. But at $299, the yearly fee is cheaper than many of those commercial text link ads sites and does far more for your ranking in search engines OTHER than Yahoo.


Why? It's purely the value of that link. Search engines know that only sites of a certain quality level will submit and get accepted into Yahoo Directory. They know that serious businesses will pay that yearly fee, while marginal or hobby sites will not pay that $299 every year. Surely there is some level of value assigned in search algorithms to inclusion in the Yahoo Directory.


It's a little known technique for gains in ranking which is based purely on empirical observation over time. But the result of inclusion in the Yahoo Directory is the same for every client, every time - their PageRank ratchets up and targeted search terms suddenly take a big jump just a week or two after inclusion. The same is true of inclusion in the free Open Directory Project at http://DMOZ.org .


I've seen client sites jump from positions on page three at MSN search to top 5 positions on page one of the MSN SERPs following inclusion in the Yahoo Directory. Now we are submitting this same client to the Open Directory for the 5th time in as many months, hoping for elusive editors to add the site, leading to another jump in search position and PageRank if and when they get around to adding the site to the DMOZ database.


You do know that Google uses that Open Directory listing in their own directory, don't you? It's worth submitting and resubmitting until they finally include your site. It really is worth the trouble to keep trying, no matter how long they ignore your submissions.


One caveat always applies to Directory submissions! They must be done with great care applied to keyword phrases used in the site description. That single line of text you submit in the "Site Description" text box on the submission page will strongly affect your keyword phrase ranking in OTHER search engines for a very long time.


Take care in crafting a keyword rich and effective description for your site. I always request that clients either have me submit for them or use text I've written for them in that description. If you do it badly, it will be re-written by an editor who cares more for categorization than keyword rankings. Be Careful!


Before you run off begging for reciprocal links from slick webmasters or purchasing text links of dubious value from text link outfits, submit to Yahoo Directory and pay the $299 (or $24.92 monthly) for the most undervalued text link available. Then swallow your pride and re-submit to the Open Directory until they finally include your site.


Rock your rank with dynamite text links! Yahoo Directory and the Open Directory Project.


Copyright © August 23, 2005 by Mike Banks Valentine


Mike Banks Valentine operates http://Publish101.com Free Web Content Distribution for Article Marketers and Provides content aggregation, press release optimization and custom web content for Search Engine Positioning http://www.seoptimism.com/SEO_Contact.htm RSS: http://RealitySEO.com/atom.xml


Thursday, August 04, 2005

 

How Keyword Density, Frequency, Prominence And Proximity Affect Search Engine Rankings

By Michael Wong © 2005


In this article, I explain the difference between keyword density, frequency, prominence and proximity, and how they affect search engine rankings.


Keyword Density


Keyword density refers to the ratio (percentage) of keywords contained within the total number of indexable words within a web page.


The preferred keyword density ratio varies from search engine to search engine. In general, I recommend using a keyword density ratio in the range of 2-8%.


You may like to use this real-time keyword analysis tool to help you optimize a web page's keyword density ratio.
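As a back-of-envelope alternative to an online tool, keyword density can be sketched in a few lines of Python; the word tokenization and the choice to count every word of every matched phrase toward the ratio are simplifying assumptions of mine:

```python
import re

def keyword_density(text, phrase):
    """Occurrences of `phrase` as a percentage of all words in `text`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Slide a window over the text, counting exact phrase matches.
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    # Every word of every match counts toward the keyword share.
    return 100.0 * hits * n / len(words)
```

For instance, in an eight-word passage where a two-word phrase appears twice, this reports 50% - far above the 2-8% range recommended here, so you'd thin the copy out.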


Keyword Frequency


Keyword frequency refers to the number of times a keyword or keyword phrase appears within a web page.


The theory is that the more times a keyword or keyword phrase appears within a web page, the more relevance a search engine is likely to give the page for a search with those keywords.


In general, I recommend that you ensure that the most important keyword or keyword phrase is the most frequently used one in a web page.


But be careful not to abuse the system by repeating the same keyword or keyword phrases over and over again.
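To check that the phrase you care about really is the most frequent one on the page, you can count n-word phrases with Python's Counter; the naive tokenization is an illustrative choice of mine, not how any search engine actually parses pages:

```python
from collections import Counter
import re

def top_phrases(text, n=2, k=3):
    """Most frequent n-word phrases in `text` (a crude frequency check)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    grams = [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]
    return Counter(grams).most_common(k)
```

If the top entry isn't your target phrase, the page's emphasis is off; if the top entry dwarfs everything else, you may be drifting into the repetition the paragraph above warns against.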


Keyword Prominence


Keyword prominence refers to how prominent keywords are within a web page.


The general recommendation is to place important keywords at, or near, the start of a web page, sentence, TITLE or META tag.


Keyword Proximity


Keyword proximity refers to the closeness between two or more keywords. In general, the closer the keywords are, the better.


For example:


How Keyword Density Affects Search Engine Rankings


How Keyword Density Affects Rankings In Search Engine


Using the example above, if someone searched for "search engine rankings," a web page containing the first sentence is more likely to rank higher than the second.


The reason is that the keywords are placed closer together. This assumes everything else is equal, of course.
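Proximity can be made concrete as the smallest window of words that covers every query word at least once; this scoring is my own illustrative sketch, not any search engine's actual formula:

```python
import re

def proximity(text, query):
    """Smallest span of consecutive words covering every query word
    at least once; smaller means the keywords sit closer together."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    need = set(query.lower().split())
    best = None
    for i in range(len(words)):
        seen = set()
        for j in range(i, len(words)):
            if words[j] in need:
                seen.add(words[j])
            if seen == need:
                span = j - i + 1
                best = span if best is None else min(best, span)
                break
    return best
```

Run on the two example headlines with the query "search engine rankings", the first scores 3 (the words are adjacent) and the second scores 4, matching the ranking intuition above.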


You may like to read my article, "How To Find Out What Keywords Your Customers Are Searching With." In this article I show you how to capture your most targeted visitors with the search engines by discovering what keywords they search with.


About the Author:


Michael Wong is an internationally recognized internet marketing expert, and the author of a leading SEO book, numerous marketing tips, and reviews of marketing tools and ecommerce software. Visit his web site at http://www.Mikes-Marketing-Tools.com 
 


 

How To Find Out What Keywords Your Customers Are Searching With

In this article I show you how to capture your most targeted visitors with the search engines by discovering what keywords they search with.


To get high search engine rankings, you must include the keywords that your potential customers are searching with in your web pages.

To find out what your most targeted keywords are, you need a keyword analysis service, such as Wordtracker.

Wordtracker will help you find all keyword combinations that bear any relation to your business or service. It does this by providing analysis of actual searches conducted in metacrawler search engines.

You may wonder why it's important that you need to know what keywords your potential customers are searching with. Let me show you why with some examples.

Here are the most popular searches for the different variations of the search term, "keywords," according to Wordtracker:

Search Term    Searches
keywords            273
Keywords            103
key words            68


Notice the different variations? The second keyword is capitalized. The third keyword is split into two words.

Here are the most popular searches for the different variations of the search term, "marketing strategies," according to Wordtracker:

Search Term             Searches
marketing strategies         319
marketing strategy           280
Marketing Strategy           172
Marketing Strategies          22


Why Is It Important?

Well, if you wanted to target search engine users, you would want to include all the different variations of the keyword you want to target. This will help to persuade the search engines that your web page is relevant for the different keyword searches.
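Generating the case and spacing variants to cover can be sketched simply; which joined or split forms are actually worth targeting is a judgment call best settled with real search data, so treat this helper as illustrative:

```python
def variations(keyword):
    """Spelling variants of a phrase worth covering in page copy:
    lower case, Title Case, and a joined form for split phrases.
    A simple sketch; real variant lists come from search data."""
    base = keyword.lower()
    forms = {base, base.title()}
    if " " in base:
        forms.add(base.replace(" ", ""))  # "key words" -> "keywords"
    return sorted(forms)
```

Running it on "key words" yields the three forms from the Wordtracker table above: "key words", "Key Words" and "keywords".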

Another reason is that some search engines, such as Inktomi, still take case sensitivity into consideration when ranking web pages.

Here's a table of Inktomi-powered search sites that provide case-sensitive search results.

Search       Case-Sensitivity
ICQ          Full
LookSmart    Partial
MSN          Partial
Overture     Full


Knowing exactly what your customers are searching for is very powerful.

I have used this strategy over the years to help me obtain thousands of top 30 rankings for my web sites, and those of my clients.

If you need to improve your search engine rankings, I highly recommend that you reconsider your keyword targeting strategies.

Read my review of Wordtracker to find out just exactly how it can help you grow your search engine traffic.

About the Author:


Michael Wong is an internationally recognized internet marketing expert, and the author of a leading SEO book, numerous marketing articles, and reviews of marketing software and ecommerce software.


 

Pseudo Directories: Do They Really Increase Page Ranking And Keep You Organized

In this article you will learn about pseudo directories, a simple technique that will not only keep you organized but, when used properly, can actually improve your rankings, because your file names will be highly optimized for search engine bots.
 
Conventional wisdom states you should use sub-directories to organize your web pages. Unfortunately, search engines can penalize your rankings when you do this: many of the ranking algorithms in use today deem that pages close to your root are of higher value than pages further away from it. Suppose you have two web pages with the exact same content (web1.html and web2.html): http://yourdomain.com/web1.html and http://yourdomain.com/articles/training/web2.html
 
All things being equal, web1.html will have a higher ranking than web2.html, because web1.html is in the root directory and web2.html is in a sub-directory.
 
As a web designer, or a home-based business owner wearing a web designer hat, it seems you have two choices: do what seems right (and in many ways best) and use a tidy directory structure but lose ranking points, or throw out the directory structure and gain ranking points. Many people choose the directory structure, either because they don't know about the ranking algorithms or because they feel a strong need to be organized and are willing to sacrifice ranking points.
 
Pseudo directories are the best of both worlds. They allow you to keep your files organized without losing ranking points. In fact, by using meaningful names for your pseudo directories you can actually gain ranking points.
 
A quick lesson on file naming: if you have three web pages with identical content, named page1.html, bathroom.html and Home-Repair-Bathroom.html, which file do you think will have the lowest ranking, and which the highest, for the phrase "bathroom repair"? Page1.html will be lowest ranked and Home-Repair-Bathroom.html will be the highest.
 
The pseudo directory takes advantage of this page naming structure to let you organize your web pages and still maximize your page ranking.
As an example, suppose you have a web site promoting products and services to homeowners and would-be homeowners. You want to build some content-rich web pages using articles that you find and reprint. Using conventional methods you would have a directory structure that looks like
 
root/articles/Home-Loan/file.html,
root/articles/Home-Repair/file.html and
root/articles/Home-Selling/file.html etc.
 
With the pseudo directory method you keep all your files in the root directory; you just name them using the optimal naming rules. Now you would simply have pseudo directories named
 
Home-Loan
Home-Repair
Home-Selling
 
So an article on roof repair would be named Home-Repair-Roof.html. Similarly, an article on kitchen repair would be named Home-Repair-Kitchens.html, and one on bathroom repair would be named Home-Repair-Bathrooms.html. An article on FHA loans would be named Home-Loan-FHA.html, and an article on how to sell your home by owner (FSBO) would be called Home-Selling-FSBO.html.
 
This next step is optional, but it has a few advantages and will help increase your page rank even more. Create a table of contents or site map for each pseudo directory. So, continuing with the above example, create files called


Home-Loan-Contents.html - Links to all Home-Loan Files
Home-Repair-Contents.html - Links to all Home-Repair Files
Home-Selling-Contents.html - Links to all Home-Selling Files
 
Next (if you have more than three or four sub-topics) create a page called Home-Contents.html where you have links to Home-Repair-Contents.html, Home-Loan-Contents.html and Home-Selling-Contents.html ...
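The contents-page step can be sketched as a small script that groups the flat file names by their pseudo-directory prefix; the group-by-first-two-words rule is my reading of the article's naming scheme, not a prescribed algorithm:

```python
from collections import defaultdict

def contents_pages(filenames):
    """Group flat 'pseudo-directory' files by their two-word prefix
    (e.g. Home-Repair) and list each group under a -Contents.html page."""
    groups = defaultdict(list)
    for name in filenames:
        parts = name.split("-")
        if len(parts) >= 3:                # Home-Repair-Roof.html etc.
            prefix = "-".join(parts[:2])
            groups[prefix].append(name)
    return {prefix + "-Contents.html": sorted(files)
            for prefix, files in groups.items()}

pages = contents_pages([
    "Home-Repair-Roof.html",
    "Home-Repair-Kitchens.html",
    "Home-Loan-FHA.html",
    "Home-Selling-FSBO.html",
])
```

Each entry in the returned mapping is one contents page and the files it should link to, ready to be written out as HTML.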
 
Copyright © 2005-2006 Mike Makler


About The Author: Mike Makler has been marketing online since 2001 and has built an online organization of over 100,000 members. You can see my web page and subscribe to my newsletter at http://www.ewguru.com/hbiz/home.html


 

Crash Course in Getting a #1 Google Ranking

By Jason DeVelvis (c) 2005


First, here's the rundown of some of the terminology I'm going to use in this article –


Inbound Links – Links coming into your site


Outbound Links – Links leaving your site


Cross Links – Links that you have "traded" with another site (i.e., they've got a link from their site to yours and you've got a link from your site to theirs)


PR (PageRank) – Google's measure of how "important" your site is


 


SEO Is Not Dead


OK, now let's talk about what you really want to hear – how to get those coveted 1-10 ranks for your keywords. Remember this - SEO is not dead. In fact, it is very much alive and important. The first thing to do in order to raise your site's rank is to target specific keywords. I say specific because you need to target "keyphrases," meaning keywords of more than one word.


Some people use the words interchangeably (me included) so just ignore one-word keywords altogether. You will waste your money if you shoot for these, because chances are, there are other, MUCH larger companies who already have you beat, and will continue to have you beat unless you've got a bottomless wallet.


Check Out Your Competition


Take this example, for instance, if you sell computers, you should not try to optimize your site for the keyword "computer" or "computers." First, think about all of the businesses that do ANYTHING with computers. Yeah, that's a lot. They'll all show up if you search for "computer."


Now try to think of who would show up at the top of that list. I'll make it easy, it's Apple, Dell, Computer World, Computer Associates, IEEE, Computer History Museum, Webopedia, ASUSTeK, WhatIs.com, and HP.


I'm going to go out on a limb here and say that I 99.9% guarantee you that you'll never get into that top 10 list. The HP link has almost 5,000 backlinks (discussed later) and a PR of 7/10. Good luck.


Then What Should I Do?


So what should you try to target? Let's revisit your computer store. What types do you sell? PCs, ok, what types of PCs? Custom. Ok, that's a little better, although "Custom Computers" is still a vague keyword. (How many people build custom computers?)


What kind of components do you use? Intel? AMD? SoundBlaster? GeForce? There you go, that's a little better – "Custom GeForce Computers." That returned 476k results instead of our previous 633 million with "Computer." Just a little bit less competition.


I Want More


Want to go further? Forget what types you sell; go for what your customers want. What do they use your computers for? Gaming? Try "Custom Gaming Computers" - There are 672,000 results here, but the #1 spot has a PR of 5/10 and only 41 backlinks. That shouldn't be too hard to beat; we've just got to know who is linking to them, and beat them at the backlink game.


Oh yeah, before we move on to beating the pulp out of your competition, don't forget to SEO optimize your site for your chosen keywords before spending any time on backlinks. Otherwise, this next section won't mean much.


But I Digress...


Ok, now that your site is thoroughly optimized, how do we find out who their backlinks are? Well, you can do it by hand, or you can purchase a VERY helpful tool called SEO Elite that will analyze all of the backlinks to a site (and more). But, since you don't have SEO Elite yet, we'll do it the long way.


First, go to toolbar.google.com and download the Google toolbar, this will save you some time. Ok, now type in your keywords – "Custom Gaming Computers." The first link should be overdrivepc.com (if it's not, then someone may have already read this column and risen above them!) click to go there. When the page loads, go to your Google Toolbar, click on Options > "More" Tab > Make sure the "Page Info" box is checked.


Then, click on the blue circle with the i in it. (This is the aptly named: "Page Info") It should drop down and allow you to select "Backward Links," choose it. Now you should be looking at a Google search page again, but this one is different, it only shows pages that link to overdrivepc.com. (Wow, that's handy!) At the time of this article, there are 41 pages that link to the site, and you can view them all. Some are other pages in the site, others are third parties.


Get Your Site Some Friends!


Follow each third party link and check out the page. Does it have to do with your business? Would their visitors benefit from coming to your site? (The answer is probably yes) If so, email the webmaster - there should be an email address somewhere on the site. Ask him or her if they would link to your site. Be willing to trade links with them, or to pay for a good link with a high PR.


That reminds me – look just to the left of the Page Info icon on your Google toolbar, and you should see a green bar. That is the Page Rank of the page you're currently on. You want to target pages with higher page ranks than your own, because for each of those sites that link to yours, they effectively "give" you a little bit of their PR. Kinda like in high school when the head of the cheerleading squad flirted with the nerd in the hallway, she "gave" him more popularity.


By the way, if you can manage to get a link from the #1 site itself, do it!


Do this for all of these links you can, then move on to the #2 listing for your keywords. Then #3, and so on. Don't get discouraged if some webmasters don't reply to you, it may take an email or two. If they say no, thank them for their time and move on. I try my best not to burn any bridges – you never know when you'll need to contact that webmaster again, and if he remembers you were polite, that will make you look good.


Whew, Finally Done.


This is a very easy way to move up the Google SERPS, no "expertise" required, just good old-fashioned hard work. It will take some time for Google to re-index those pages and realize that they have a link to you now. And it will take even longer for your PR to go up (from what I hear, it's been around 3-4 months since the last PR change [Today is 7/14/2005]) But be patient, get links upon links, and keep adding great content to your site, and you will jump up in the SERPS by leaps and bounds.


To Your Success, and Your #1 Website!


Jason is a long-time web developer and the owner of Premier MicroSolutions, LLC. If you're looking for more articles about getting higher search engine rankings, go to http://www.Content-Articles.com and check out their great directory of articles.


 

Google Is Taking Descriptions From Alexa

By: Martin Lemieux

In a recent study looking at the "descriptions" of website search engine listings, we noticed that the description in your main SERP listing is being taken from Alexa.com.

If you take your top key word search engine placement within Google and look at your website's description, you will notice something similar with other websites: the descriptions match those taken from Alexa.com!

Take a peek for yourself.

1 - Search for your top key word in Google (usually the first key phrase within your title tag).
 
2 - Now copy & paste this description in a note pad or word doc.

3 - Once you have this, go to: www.alexa.com

4 - Type in your "url" in the address bar

5 - Look at the description from Google and the description from Alexa; they should be an exact match.

NOTE: If you do not have a description in Alexa, Google will come up with its own version but in reality, wouldn't you want to be in control of your website's description within the SERP's?

Not having a description in Alexa - Could it directly affect your Google search engine results?

ABSOLUTELY ! ! !

I would suggest looking at your description in Alexa to make sure that it directly targets your top key phrase that you want performing well within Google.

Your Alexa description could very well determine better SERPs within Google! Google feeds results into Alexa, which in turn updates information on website ranking. They are partnered up; why wouldn't they use each other's results?!

Alexa's Description is the "text book" definition:

I don't blame Google for wanting to use these descriptions. Most descriptions in Alexa are the "text book version" of your company's description and are usually well written, and well thought out. This would give Google an advantage so that they can spend less time on delivering quality descriptions and more time delivering better results within their searches.

I really like this idea because if this were true across the board, Google would be giving the individual website owner more freedom to write a proper description for their SERPS without using it to spam or create false information. Again, it would have to be approved by Alexa's team anyways, they probably wouldn't allow key word stuffing for their site as it is.

About The Author:

Martin Lemieux is the owner of the Smartads Advertising Network who helps to increase your business online and offline.

Smartads Internet Marketing
Smartads Canada
Web Designers Directory

Copyright © 2005 Smartads Advertising Network - Reprints accepted as long as the resource box & entire article remains the same.


Tuesday, August 02, 2005

 

Playing in Googlebot's Sandbox with Slurp, Teoma and MSNbot

Spiders Display Distinctly Differing Personalities


Copyright © Mike Banks Valentine


There has been endless webmaster speculation and worry about the so-called "Google Sandbox" - the indexing time delay for new domain names - rumored to last for at least 45 days from the date of first "discovery" by Googlebot. This recognized listing delay came to be called the "Google Sandbox effect."


Ruminations on the algorithmic elements of this sandbox time delay have ranged widely since the indexing delay was first noticed in spring of 2004. Some believe it to be an issue of one single element of good search engine optimization, such as linking campaigns. Link building has been the focus of most discussion, but others have pointed to the size of a new site, its internal linking structure, or simple time delays as the most relevant algorithmic elements.


Rather than contribute to this speculation and further muddy the Sandbox, we'll be looking at a case study of a site on a new domain name, established May 11, 2005 and the specific site structure, submissions activity, external and internal linking. We'll see how this plays out in search engine spider activity vs. indexing dates at the top four search engines.


Ready? We'll give dates and crawler action in daily lists and see how this all plays out on this single new site over time.


* May 11, 2005 - Basic text on large site posted on newly purchased domain name, going live by day's end. Search friendly structure implemented with text linking, making full discovery of all content possible by robots. Home page updated with 10 new text content pages added daily. Submitted site at Google's "Add URL" submission page.


* May 12 - 14 - No visits by Slurp, MSNbot, Teoma or Google. (Slurp is Yahoo's spider and Teoma is from Ask Jeeves) Posted link on WebSite101 to new domain at Publish101.com


* May 15 - Googlebot arrives and eagerly crawls 245 pages on new domain after looking for, but not finding the robots.txt file. Oooops! Gotta add that robots.txt file!


* May 16 - Googlebot returns for 5 more pages and stops. Slurp greedily gobbles 1480 pages and 1892 bad links! Those bad links were caused by our email masking, meant to keep out bad bots. How ironic that Slurp likes these.


* May 17 - Slurp finds 1409 more masking links & only 209 new content pages. MSNbot visits for the first time and asks for robots.txt 75 times during the day, but leaves when it finds that file missing! Finally got around to adding robots.txt by day's end to stop Slurp crawling the email masking links and let MSNbot know it's safe to come in!


* May 23 - Teoma spider shows up for the first time and crawls 93 pages. Site gets slammed by BecomeBot, a spider that hits a page every 5 to 7 seconds and strains our resources with 2409 rapid fire requests for pages. Added BecomeBot to robots.txt exclusion list to keep 'em out.


* May 24 - MSNbot has stopped showing up for a week since finding the robots.txt file missing. Slurp is showing up every few hours looking at robots.txt and leaving again without crawling anything now that it is excluded from the email masking links. BecomeBot appears to be honoring the robots.txt exclusion but asks for that file 109 times during the day. Teoma crawls 139 more pages. Another bad bot called aipbot crawled 2306 pages. Blocked 'em with robots.txt to keep them out.


* May 25 - We realize that we need to re-allocate server resources and rework the database design, and this requires changes to URLs, which means all previously crawled pages are now bad links! Implement subdomains and wonder what now? Slurp shows up and finds thousands of new email masking links, as the robots.txt was not moved to the new directory structure. Spiders are getting error pages on new visits. Scampering to put out fires after wide-ranging changes to the site, we miss this for a week. Spider action is spotty for 10 days until we fix robots.txt.


* June 4 - Teoma returns and crawls 590 pages! No others.


* June 5 - Teoma returns and crawls 1902 pages! No others.


* June 6 - Teoma returns and crawls 290 pages. No others.


* June 7 - Teoma returns and crawls 471 pages. No others.


* June 8-14 Odd spider behavior, looking at robots.txt only.


* June 15 - Slurp gets thirsty, gulps 1396 pages! No others.


* June 16 - Slurp still thirsty, gulps 1379 pages! No others.


So we'll take a break here at the five-week point and note the very different behavior of the top crawlers. Googlebot visits once and looks at a substantial number of pages, but doesn't return for over a month. Slurp finds bad links and seems addicted to them, neglecting good pages until robots.txt slaps it to its senses and tells it to lay off the bad liquor (er, links). MSNbot visits looking for robots.txt and won't crawl any pages until told what NOT to do by that file. Teoma just crawls like crazy, takes breaks, then comes back for more.


This behavior may mirror the differing personalities of the software engineers who designed them. Teoma is tenacious and hard-working. MSNbot is timid, needs instruction and some reassurance that it is doing the right thing, and picks up pages slowly and carefully. Slurp has an addictive personality and performs erratically on a random schedule. Googlebot takes a good long look and leaves; who knows whether it will be back, and when.


Now let's look at indexing by each engine. As of this writing on July 7, each engine shows differing indexing behavior as well. Google shows no pages indexed, although it crawled 250 pages nearly two months ago. Yahoo has three pages indexed in a clear aging routine that doesn't list any of the nearly 8,000 pages it has crawled to date (not all itemized above). MSN has 187 pages indexed while crawling fewer pages than any of the others. Ask Jeeves has crawled more pages to date than any other search engine, yet has not indexed a single page.


Each of the engines will show the number of pages indexed if you use the query operator "site:publish101.com" without the quotes. MSN 187 pages, Ask none, Yahoo 3 pages, Google none.


The daily activity not listed in the three weeks since June 16 above has not varied dramatically, with Teoma crawling a bit more than other engines, Slurp erratically up and down and MSN slowly gathering 30 to 50 pages daily. Google is absent.


The linking campaign has been minimal: posts to discussion lists, a couple of articles and some blog activity. Looking back over this time, it is apparent that a listing delay is actually quite sensible from the view of the search engines. Our site restructuring and bobbled robots.txt implementation seem to have abruptly stalled crawling, but the indexing behavior shows a distinctly different policy at each major player.


The sandbox is apparently not just Google's playground, but it is certainly tiresome after nearly two months. I think I'd like to leave for home, have some lunch and take a nap now.


Back to class before we leave for the day, kiddies. What did we learn today? Watch early crawler activity and be certain to implement robots.txt early and adjust often for bad bots. Oh yes, and the sandbox belongs to all search engines.
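The robots.txt lesson is easy to act on. Here is a hedged sketch: a robots.txt along the lines the diary describes (BecomeBot and aipbot are the bad bots named above; the /email-masking/ path is an assumption, not the site's actual directory), checked with Python's standard urllib.robotparser so you can confirm the rules do what you intend before a spider does.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt in the spirit of the diary: shut out the
# resource-hogging BecomeBot and aipbot entirely, and keep every
# crawler away from the email-masking links. The /email-masking/
# path is illustrative only.
ROBOTS_TXT = """\
User-agent: BecomeBot
Disallow: /

User-agent: aipbot
Disallow: /

User-agent: *
Disallow: /email-masking/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

print(parser.can_fetch("BecomeBot", "/articles/page1.html"))  # blocked site-wide
print(parser.can_fetch("Slurp", "/email-masking/abc123"))     # masked links are off-limits
print(parser.can_fetch("Slurp", "/articles/page1.html"))      # normal content is fair game
```

Running the same parser your visitors' spiders use is a quick sanity check that an exclusion rule doesn't accidentally lock good bots out of good pages.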


Mike Banks Valentine is a search engine optimization specialist who operates http://WebSite101.com and continues this case study chronicling the search indexing of Publish101.com in the Second Sandbox Case Study article.


Friday, July 29, 2005

 

SEO One-way Web Links: 5 Strategies

With so much talk about search engines putting a damper on direct reciprocal links, the hunt for the elusive one-way inbound link is on.


As someone who works with small business website owners, I've heard just about every inbound-linking scheme there is. In the end, I've only seen five strategies that really work consistently for getting hundreds of links.


Less Effective One-Way Link Strategies


Yet there's perennial interest in alternative linking strategies. They range from bad to OK, but none offers as much potential as the five major ways of getting links.


* Link farms never seem to die. The latest variations try to pass themselves off as viral marketing, but are really a sort of endless pyramid scheme: you link to me, so I link to someone else, who links to someone else, and on and on down the line. Link farms can get you delisted from search engine indexes, so don't even try them.


* Affiliates can provide you with one-way inbound links if you use affiliate software that links directly to your site rather than through a redirect. But many, many affiliates are now placing all their affiliate links in redirects of their own invention, to help protect their commissions from pirates who will simply apply to the program themselves to get a discount.


* Posting to web forums and blogs regularly will get you one-way inbound links, but they'll only have search-engine value a small percentage of the time. Many blogs and bulletin boards use search-engine-unfriendly dynamic file formats, automatically encase links in script, or use robot instructions to prevent spiders from following links.


* Many one-way inbound linking strategies fall into the great-if-you-are-lucky-enough-to-get-it category, such as winning a web award or being featured on a high-PageRank website just for being so great.


* Other one-way incoming link strategies are in the this-will-take-forever-to-get-anywhere category, such as offering to provide testimonials to all your vendors in exchange for a link to your site. (Hint: If you can get more than twenty links that way, you probably need to simplify your supply chain.)


Now, on to the five major ways of getting large numbers of one-way inbound links. Some are better than others, but they all have more potential than the more madcap strategies. Of course, none is a good strategy all on its own. You have to understand all five strategies to really gain a distinct advantage in the one-way link hunt.


1. Waiting for Inbound Links


If you have good content you will eventually get one-way inbound links naturally, without asking. Organic, freely given links are an essential part of any SEO strategy. But you cannot rely on them, for two reasons:


* Unfortunately, "eventually" can be a very long time.


* Worse, there is a vicious cycle: you can't get search engine traffic, or other non-paid traffic, without inbound links; yet without inbound links or search engine traffic, how is anyone going to find you to give you inbound links?


2. Triangulating for Inbound Links


Search engines will have a tough time dampening reciprocal links if the reciprocation is not direct. To get links to one website you offer in exchange a link from another website you also control.


This would seem to be a mostly foolproof way of defeating the link-dampening ambitions of Google and the rest. If you have more than one website, you probably are already employing this linking method. There are only a few drawbacks:


* You need to have more than one website in the same general category of interest or the links won't be relevant.


* The work required to set up this kind of arrangement and verify compliance is not insignificant. The process cannot be automated to the same extent as direct one-to-one reciprocal linking.


* As with traditional reciprocal links, a very big drawback is that the links are mostly on "Resources" pages that are just lists of links. There's only a small chance of getting significant traffic from these links. Plus, any "Resource" page may well eventually become an easy target for link dampening, if that hasn't happened already.


3. Submitting to Directories


They are the legendary fairy lands of SEO: PageRank-passing, no-fee-charging, and actually well-run directories of relevant links. Yes, they really do exist. An SEO acquaintance tells me he knows 200 good ones just off the top of his head.


Plus, there are other kinds of directories: directories of affiliate programs, of websites using a certain content management system, of websites whose owners are members of this or that group, of websites accepting PayPal, etc. etc.


Ah, a link in a PageRank-passing link directory: it's a good deal if you can get it. But let's say you do get links from all 200 such directories and a hundred more from the little niche directories--now what?


4. Paying for Inbound Links


Buying and selling text links on high-PageRank web pages has become big business. Buying good traffic-generating "clean" links is a great alternative to pay-per-click advertising, which confers no SEO benefit. But, there are a number of pitfalls of relying primarily on paid links for SEO:


* The cost of the hundreds of links required for substantial search engine traffic can become prohibitive.


* As soon as you stop paying, you lose your link--you are essentially renting rather than owning, with no "link equity" building up.


* Google is actively trying to dampen the impact of paid links on rankings, as revealed in various patent filings. A website can try to mask the fact that the links are paid, but how well it does that is out of your control.


* Given Google's mission to dampen paid links' effectiveness, paid link buyers have an interest in verifying that a potential paid link partner is "passing PageRank." But identifying appropriate PageRank-passing paid link partners is quite a task in itself.


* Google also has a stated mission of dampening the value of any "artificial" links. Having most of your links on PageRank 3 or higher web pages would seem to be a dead give-away that your links are "artificial," since the vast majority of web pages (note: not necessarily websites, but their pages) are PageRank 1 or lower. Meanwhile, buying PageRank 0 or 1 links would have so little impact on a site's PageRank that it would not be worth the expense.


5. Distributing Content


All of the above four inbound-link-generating methods really do work. But it is the fifth method of getting one-way inbound links that is the most promising: distributing content.


The idea is simple: you give other websites content to put on their sites in exchange for a link to your site, usually in an "author's resource box," an "about the author" paragraph at the end of the article.


The beauty of distributing content for links is that the links generally generate more traffic than links on a "resources" page. Plus, your article will pre-sell readers on the value of your site.


The downside, of course, is that it's no small amount of work to create original content and then distribute it to hundreds of website owners. But nothing good ever came easy. And on the internet, one-way inbound links are a very good thing.


About The Author: Joel Walsh is the head content writer for UpMarket Content. Get more information on website content promotion


Thursday, July 28, 2005

 

Monitor Your Visibility in Google and Yahoo with these DIY SEO Tools

Copyright 2005 Tinu AbayomiPaul


This is the second part of an article series in which you'll find many tools that you can use to monitor your site's search engine position and see how your do-it-yourself search engine optimization efforts are coming along.


The following tools are for monitoring your search results in the three major search engines. It isn't an all-inclusive list, but rather a highlight of some of the tools you can use. (I'll point you to one of the master lists when we get into more general tools in part three.)


Using Your Google Site Information Page


I've covered this in an earlier article, but just in case you missed it, we'll go over it again briefly here. (If you need more help following along, you can listen to one of my recent podcasts for a convenient audio walk through.)


Open up your browser and go to Google's home page. Type in info:yoursitenameandsuffix. So if your site was ExactSeek.com you'd type info:exactseek.com. You can also use site:yoursitenameandsuffix to find out which pages have been indexed by Google's search engine spider.


This search will tell you which pages Google considers similar to yours. It will also show sites that it considers linked to you, and sites that carry your full URL, hyperlinked or not. It's not 100% accurate as far as telling you all the sites that link back to yours, but what you can learn from this is which backlinks matter.


From here you can also see the last day Google spidered your home page.


To see this in action, click on the first group of information links, "Show Google's cache of yoursitename.com." If you look next to the word "cached" on the first line, the date is shown there as well.


Sometimes the cached dates for yoursitename.com and www.yoursitename.com differ, so be sure to check both.


Finding Information About Your Site In Yahoo


This document will tell you how to find out what sites are linking to you, give you the results for how many pages of your site are in Yahoo, and more. Once you get to the results page, you'll be able to view your cached pages, etc.


Discovering Your Site's Status on MSN


As the page in the help section states, you can use site:www.yoursitehere.com to find out if a document at your site has been indexed. The results page will also give you the date of last caching.


Google Rankings


You'll need a free Google API key for this one, and the site has the direct link telling you where to get one. You'll have to enter this key in order to query the site for information on Google.


With Google Rankings, you'll be able to see where you rank within the top 40-1000 results in Google for a given keyword. I recently noticed that it also displays results for MSN and Yahoo, with links to each search engine.


They also have some other tools that will track your keywords over time, as well as one they call the "Ultimate SEO Tool" that will measure your site's keyword density.


Google Backlinks Checker 


LilEngine.com's Backlink Checker will measure the number of links you have pointing back to your site against competing sites. Handy if you just want a quick comparison of how many links you have versus others, though how much getting more links back will help varies, depending on other factors.


Yahoo Search Rankings


From the same folks who brought you Google Rankings, Yahoo Search Rankings will show you where you rank within the top 1000 results in Yahoo for a given keyword. If you just want to see your Yahoo rankings, it's quite helpful.


You can find more Yahoo tools that use the Yahoo Web API at their developer's site.


In the next part of the article, we'll take a closer look at other tools that give you more specific information about the links pointing back to your site, keyword research, and more.


About the Author:


Tinu is a website promotion specialist who can teach you many do-it-yourself ways to bring more traffic to your site in addition to DIY SEO. Subscribe to her ezine at http://www.freetraffictip.com/thebook/ for more details.


 

Google's Good Writing Content Filter

The web pages actually at the top of Google have only one thing clearly in common: good writing.


Don't let the usual SEO sacred cows and bugbears, such as PageRank, frames, and JavaScript, distract you from the importance of good content.


I was recently struck by the fact that the top-ranking web pages on Google are consistently much better written than the vast majority of what one reads on the web. Yet traditional SEO wisdom has little to say about good writing.


Does Google, the world's wealthiest media company, really only display web pages that meet arcane technical criteria? Does Google, like so many website owners, really get so caught up in the process of the algorithm that it misses the whole point?


Apparently not.


Most Common On-the-Page Website Content Success Factors


Whatever the technical mechanism, Google is doing a pretty good job of identifying websites with good content and rewarding them with high rankings.


I looked at Google's top five pages for the five most searched-on keywords, as identified by WordTracker on June 27, 2005. Typically, the top five pages receive an overwhelming majority of the traffic delivered by Google.


The web pages that contained written content (a small but significant portion were image galleries) all shared the following features:


Updating: frequent updating of content, at least once every few weeks, and more often, once a week or more.


Spelling and grammar: few or no errors. No page had more than three misspelled words or four grammatical errors. Note: spelling and grammar errors were identified by using Microsoft Word's check feature, and then ruling out words marked as misspellings that are either proper names or new words that are simply not in the dictionary.


Does Google use SpellCheck? I can already hear the scoffing on the other side of this computer screen. Before you dismiss the idea completely, keep in mind that no one really does know what the 100 factors in Google's algorithm are. But whether the mechanism is SpellCheck or a better shot at link popularity thanks to great credibility, or something else entirely, the results remain the same.


Paragraphs: primarily brief (1-4 sentences). Few or no long blocks of text.


Lists: both bulleted and numbered, form a large part of the text.


Sentence length: mostly brief (10 words or fewer). Medium-length and long sentences are sprinkled throughout the text rather than clumped together.


Contextual relevance: text contains numerous terms related to the keyword, as well as stem variations of the keyword. The page may contain the keyword itself only a few times, or not at all.
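Several of the features above (paragraph length, sentence length) are easy to measure on your own pages. A minimal sketch in Python, assuming naive sentence splitting on end punctuation (abbreviations will fool it), counting sentences over the 10-word threshold mentioned above:

```python
import re

def writing_stats(text):
    """Rough stats for the on-page features discussed above.
    Splitting on . ! ? followed by whitespace is naive but serviceable."""
    paragraphs = [p for p in text.split("\n\n") if p.strip()]
    sentences = [s for s in re.split(r"[.!?]+\s+", text.strip()) if s]
    words_per_sentence = [len(s.split()) for s in sentences]
    return {
        "paragraphs": len(paragraphs),
        "sentences": len(sentences),
        "avg_words_per_sentence": sum(words_per_sentence) / len(sentences),
        "long_sentences": sum(1 for n in words_per_sentence if n > 10),
    }

sample = ("Short sentences read well. They keep visitors moving.\n\n"
          "A second paragraph stays brief. It never rambles on and on "
          "without giving the reader a break.")
stats = writing_stats(sample)
```

A page full of long sentences and a single wall-of-text paragraph will show up immediately in numbers like these, whatever Google's actual mechanism turns out to be.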


SEO "Do's" and "Don'ts"


A hard look at the results slaughters a number of SEO bugbears and sacred cows.


PageRank. The median PageRank was 4. One page had a PageRank of 0. Of course, this might simply be yet another demonstration that the little PageRank number you get in your browser window is not what Google's algo is using. But if you're one of those people who attaches an overriding value to that little number, this is food for thought.


Frames. The top two web pages listed for the most searched-on keyword employ frames. Frames may still be a bad web design idea from a usability standpoint, and they may ruin your search engine rankings if your site's linking system depends on them. But there are worse ways you could shoot yourself in the foot.


JavaScript-formatted internal links. Most of the websites use JavaScript for their internal page links. Again, that's not the best web design practice, but there are worse things you could do.


Keyword optimization. Except for two pages, keyword optimization was conspicuous by its absence. In more than half the web pages, the keyword did not appear more than three times, meaning a very low density.


Many of the pages did not contain the keyword at all. That may just demonstrate the power of anchor text in inbound links. It also may demonstrate that Google takes a site's entire content into account when categorizing it and deciding what page to display.


Sub-headings. On most pages, sub-headings were either absent or in the form of images rather than text. That's a very bad design practice, and particularly cruel to blind users. But again, Google is more forgiving.


Links: Most of the web pages contained ten or more links; many contained over 30, in defiance of the SEO bugbears about "link popularity bleeding." Moreover, nearly all the pages contained a significant number of non-relevant links. On many pages, non-relevant links outnumbered relevant ones.


Of course, it's not clear what benefit the website owners hope to get from placing irrelevant links on pages. It has been a proven way of lowering conversion rates and losing visitors. But Google doesn't seem to care if your website makes money.


Originality: a significant number of pages contained content copied from other websites. In all cases, the content was professionally written content apparently distributed on a free-reprint basis.


Note: the reprint content did not consist of content feeds. However, no website consisted solely of free-reprint content. There was always at least a significant portion of original content, usually the majority of the page.


Recommendations


Make sure a professional writer, or at least someone who can tell good writing from bad, is creating your site's content, particularly in the case of a search-engine optimization campaign. If you are an SEO, make sure you get a pro to do the content.


A shocking number of SEOs write incredibly badly. I've even had clients whose websites got fewer conversions or page views after their SEOs got through with them, even when they got a sharp uptick in unique visitors.


Most visitors simply hit the "back" button when confronted with the unpalatable text, so the increased traffic is just wasted bandwidth. If you write your own content, make sure that it passes through the hands of a skilled copyeditor or writer before going online.


Update your content often. It's important both to add new pages and update existing pages. If you can't afford original content, use free-reprint content.


Distribute your content to other websites on a free-reprint basis. This will help your website get links in exchange for the right to publish the content. It will also help spread your message and enhance your visibility. Fears of a "duplicate content penalty" for free-reprint content (as opposed to duplication of content within a single website) are unjustified.


In short, if you have a mature website that is already indexed and getting traffic, you should consider making sure the bulk of your investment in your website is devoted to its content, rather than graphic design, old-school search-engine optimization, or linking campaigns.


About The Author: Joel Walsh is the owner, founder and head-writer of UpMarket Content. To read more about website content best practices, get a consultation with Mr. Walsh, or get a sample page for your site at no charge, go to the SEO website content page


 

10 Minutes to Your Google Sitemap

Copyright 2005 Ron Hutton


Google's calling your name...


Hi, Google here. We want to index your website...


Is anyone there?


This article has a free corresponding online video tutorial that shows you how to summon the magic Googlebot to spider and index every page on your website, and it will only take about 10 minutes of your time.


Go here now to watch exactly what to do, step-by-easy-step...


http://www.gothrive.com/google-sitemap-video.htm


What you have access to with the new Google Sitemaps program is truly a gift from the Google gods. They're offering you a tool that you can use to keep your site constantly indexed and updated in the search engine database. With Google Sitemaps, webmasters can now take charge and make sure that their entire site is crawled and indexed.


One important note: the Google Sitemaps program will not necessarily improve the ranking of your site's pages. It only ensures that Google knows what you've got online for it to look at.


Before reading the following statement, promise yourself that you won't stop reading if you see a term that seems a little scary. OK? Promise? Good. Go ahead and keep reading then.


The format specified by Google for "their" sitemap is XML (extensible markup language). Did I lose you yet? No? Good again.


You do not need to understand how to code XML to participate in the Google Sitemap program. There are plenty of free online tools that will create your XML sitemap for you with no XML knowledge required on your part. More on this in a second.


What information is included in this XML sitemap?


1) The URL for every file on your website.


2) A relative priority rank that you can assign telling Google which pages on your site are most important for them to look at.


3) The date last modified for each page.


4) Anticipated change frequency per URL. This again is a variable that you control.


According to Google, your XML sitemap can include up to 50,000 URLs. If your site is a real monster with more than 50,000 URLs, you'll need to create a hierarchy of sitemaps, with one leading to the next. This way you'll be able to lead Google to all of your pages.
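For a site well under that 50,000-URL cap, the four pieces of information listed above are simple enough to assemble yourself. A minimal sketch; the schema namespace is the one Google documented when the program launched, and the URLs, dates and priorities are illustrative:

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Build a minimal Google Sitemap from (loc, lastmod, changefreq,
    priority) tuples -- the four fields described above."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">']
    for loc, lastmod, changefreq, priority in urls:
        lines.append("  <url>")
        lines.append("    <loc>%s</loc>" % escape(loc))
        lines.append("    <lastmod>%s</lastmod>" % lastmod)
        lines.append("    <changefreq>%s</changefreq>" % changefreq)
        lines.append("    <priority>%.1f</priority>" % priority)
        lines.append("  </url>")
    lines.append("</urlset>")
    return "\n".join(lines)

sitemap = build_sitemap([
    ("http://www.example.com/", "2005-07-28", "weekly", 1.0),
    ("http://www.example.com/articles/seo.html", "2005-07-01", "monthly", 0.5),
])
```

Save the result as sitemap.xml in your root directory and you've done by hand what the online generators do for you.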


The options for generating and maintaining your Google Sitemap range from complex systems that are highly automated to very simple systems using online sitemap generators that require nothing more than clicking a few buttons.


Google now keeps a list of these 'third party suppliers' of generators on their site. Find them here: http://code.google.com/sm_thirdparty.html


The program that's demonstrated in our free video tutorial is found here: http://www.auditmypc.com/free-sitemap-generator.asp


In a nutshell, here are the steps involved with using online generators:


1) Start the program.


2) Enter your site's URL


3) Click the "Start Crawling" button


4) Customize URL priorities and change frequencies


5) Save the site map to your local hard drive


6) Upload your new Google XML sitemap to your website in the root directory (where your home page resides)


7) Validate your new sitemap (can be done here: http://www.smart-it-consulting.com/internet/google/submit-validate-sitemap/)


8) Submit your XML sitemap to Google.


You can access the pages for the Google Sitemap program here: https://www.google.com/webmasters/sitemaps/login


8 steps in about 10 minutes. That's all there is to it.


One question that you might ask is whether or not you still need an HTML sitemap, and the answer is "Yes, you still need your conventional sitemap". XML sitemaps are not intended for human visitors. To see what I mean, take a look at the two following sitemaps:


HTML sitemap: http://www.gothrive.com/sitemap.html


XML sitemap: http://www.gothrive.com/sitemap.xml


Which version do you prefer? Your visitors will like the HTML and Google prefers XML.


When you add pages or new content to your site and you want Google to go back to have a fresh look, just log in to your Google Sitemaps control panel, select the sitemap to revisit, and click "Resubmit". It's never been easier to get Google to spider and index a website. Don't miss your opportunity to use this tool to your advantage.


About the Author:


Ron Hutton is a 20 year sales and marketing veteran with a passion for coaching and training. Subscribe to "GoThrive Online", for big juicy marketing tips in small, easy-to-chew, bite size servings. Free Video Tutorial Archives Here


 

4 Tricks For Lightning Fast Indexing

Copyright 2005 Kurma Group


The biggest problem most webmasters are running into seems to be getting INTO the search engines. Rankings aside, you first need to get the engines to index you. Here are the four main ways to assure yourself fast indexing:


Indexing Tip #1: Never launch a new site with a lot of back links. Build natural links over weeks and months.


Let's face it: there is no guaranteed method of getting indexed by Google fast, and buying links from high-ranking sites does not guarantee anything either.


The most immediate red flag to watch out for is your number of incoming links. According to Google, it takes time to build link popularity, and new sites should not have more than 100 incoming links.


It's okay to launch with a quality link or two. But beyond that, you are pushing it!


Here is the scoop! It takes 30-45 days for Google to deep index new sites. Instead of sitting around during "sandbox" time, use that period to build a strong set of natural back-links with a variety of sites.


Indexing Tip #2: Register your domain name at least four months before you plan to launch the site.


No I am not kidding! Whether you agree with it or not, history shows that Google takes older URLs far more seriously than newer ones. So register your domain name as soon as you plan on developing a site.


Indexing Tip #3: Blog and ping carefully.


Blogging & pinging is one of the fastest ways to get into the Yahoo index - it can literally get you into Yahoo overnight - and it helps with MSN as well.


Will blogging and pinging help get you into Google? Maybe. But over-pinging can set off red flags on the ping servers, and if you're using automated blogging software, overdoing it can cause Blogger.com and other services to shut down your blog.


Blogging and pinging intelligently can get your blog indexed in Yahoo quickly, but ping carefully.
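For the curious, a "ping" is just a tiny XML-RPC call named weblogUpdates.ping sent to a ping server. The sketch below builds the request body with Python's standard xmlrpc module; the blog name and URL are illustrative, and actually sending the request is left out so the example stays offline:

```python
import xmlrpc.client

# The standard weblogUpdates.ping call takes the blog's name and its URL.
# Both values here are placeholders for illustration.
payload = xmlrpc.client.dumps(
    ("My SEO Blog", "http://www.example.com/blog/"),
    methodname="weblogUpdates.ping",
)

# To actually send it, you would POST this payload to a ping server, or
# simply use xmlrpc.client.ServerProxy against the server's RPC endpoint
# and call .weblogUpdates.ping(name, url). Omitted here: that network
# call is exactly the thing you should NOT automate carelessly.
print(payload)
```

Seeing how small the request is also makes it obvious how trivially the ping servers can be flooded, which is why over-pinging sets off red flags.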


Does Yahoo de-index junk blogs? Absolutely. Especially since the creation of all this software, the search engines are watching closely for red flags (use software wisely).


So what can you do about Yahoo Search? Not much. You can be smart about blogging and pinging or, even better, you can create real blogs (not software-generated ones). You still have to be careful with pinging, though.


In the end, blogging and pinging should be part of every beginner's indexing strategy.


Indexing Tip #4: If you build bulk directory/portal sites - keep them in the 200-300 page range.


We know it's such a blast to build those monster 1000-5000 page sites, even with growing evidence of Google bots' tendencies to stall after indexing the first 200 pages or so.


So if you're into blasting out those gigantic directories and sick of waiting months for them to get indexed, experiment with building smaller sites around more targeted niches.


In a nutshell: Divide those mega-keyword lists, spend a little time grouping your sub-lists, and build smaller sites.
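The mechanical half of that advice, dividing one mega list into site-sized chunks, is a few lines of code; the thematic grouping into niches is still hand work. A sketch assuming the 200-300 page range from the tip (250 used as the default):

```python
def split_keywords(keywords, pages_per_site=250):
    """Chunk a mega keyword list into groups sized for the
    200-300 page sites the tip recommends (250 used here).
    Grouping by niche still has to happen by hand."""
    return [keywords[i:i + pages_per_site]
            for i in range(0, len(keywords), pages_per_site)]

mega_list = ["keyword %d" % n for n in range(1, 1101)]  # 1,100 page ideas
sites = split_keywords(mega_list)  # four full chunks of 250 plus a remainder
```

Sort the master list by niche before splitting and each chunk becomes the skeleton of one tightly targeted small site.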


All in all, the best way to get indexed, stay indexed and eventually get ranked is to recruit incoming link partners. See, blogging and pinging could be gone tomorrow. But, linking is an integral part of how the internet works - it will never go away.


Concentrate on building sites and recruiting links - the links get you indexed, ranked and even bring you free traffic from those who click the links!


About the Author:


Anik Singal is a 21 year old successful internet marketer who has developed his own affiliate marketing system which helped him earn over $10,466 in just 60 days. Join his FREE Course at: http://www.AffiliateClassroom.com


 

How To Use Flash In Your Site And Still Be Search Engine Friendly

Do you want to have a feature rich web site with animations, sound, and an opening splash screen? Do you want to use Flash extensively on your site but you are afraid of hurting your search engine positioning? Do you require an opening splash page but were told that the search engines could not index your site if you had one?


Well, there are ways to have the benefits of Flash on your site and still be attractive to search engines. It is all in how the Flash is used. Some of the methods are simple, while others require more programming experience. But we can all use Flash without driving away the search engines.


First let's look at the problem. Search engines index a site by looking at the content. In this case we are talking about text content, not pictures, video, sound, or animation.


A basic rule of thumb on having a very search engine friendly site is to have high quality, targeted content in text form and limiting the use of anything that can get in the way of the search engines analyzing this content.


But what about Flash? Flash converts everything into a Flash file, or SWF file, playable by the Flash Player plug-in. All the text in this Flash file will be converted from text to vector graphics, and since the search engines cannot read text in a graphic, they will be unable to read the text in a Flash file. Therefore they will be unable to index the information in the site.


So how do we get around this seemingly insurmountable problem? It is all in how we use Flash on our site. The trick is either to wrap the Flash inside normal HTML coding, or to use XHTML to have Flash display text from an external source.


I will limit this article to the easier methods for using Flash in a search engine friendly manner. Let's look at the three main ways we may want to use Flash on a website:


1. Splash Page
2. Flash navigation
3. Flash content


Starting with the Flash splash page, there are a couple of ways to handle this. If we just have the Flash splash page as the opening page to our web site, there will be no way, aside from the meta tags, for the search engines to index the page, let alone find any more pages in your site.


The trick here is to give the search engines something to work with. This can easily be done in two ways. You can put a text-only navigation bar just below the splash screen; this way the search engines can at least find the other pages in your site.


But to allow the search engines to index the actual splash screen, here is a neat trick. Place the Flash file in a layer; you can then float that layer over the web page, allowing you to place all the plain HTML text you want underneath the Flash layer.
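As a sketch of that layering trick, the markup below floats the Flash movie in a positioned layer above plain HTML text, with a text-only navigation line underneath. The file names, headings, and links are invented for illustration:

```html
<!-- Hypothetical splash page: the Flash movie floats in an absolutely
     positioned layer, while plain HTML text sits underneath it for the
     search engines (and for visitors without the Flash player).
     File names, headings, and links are placeholders. -->
<div style="position: relative;">
  <div style="position: absolute; top: 0; left: 0; z-index: 2;">
    <object type="application/x-shockwave-flash" data="splash.swf"
            width="600" height="400">
      <param name="movie" value="splash.swf" />
    </object>
  </div>
  <div style="z-index: 1;">
    <h1>Acme Widgets - Quality Widgets Since 1995</h1>
    <p>Plain, indexable text built around the site's main keywords.</p>
  </div>
</div>
<!-- Text-only navigation so the spiders can find the rest of the site -->
<p><a href="products.html">Products</a> | <a href="about.html">About Us</a></p>
```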


This has the added benefit of giving content to visitors who do not have the Flash player installed on their system. With a Flash navigation bar, things are much easier. As long as only the navigation is in Flash, the rest of the page can use search engine friendly text. Plus, you can put a text-only navigation bar across the bottom of the page so that the search engines can find the rest of your pages.


This is a good idea anyway and one that I always follow. Plus when a visitor reaches the bottom of your page they have some links to follow without having to scroll back up the page to your navigation bar.


Using Flash content in your web site follows the basic principles outlined above. As long as you have some text based content for the search engines to index you will be OK.


I have found that having Flash animation on the top of my page where it fills the screen, followed by additional text is a nice hybrid approach. It gives your visitors the content rich experience you want to provide plus gives the search engines all the text based content they need for proper search engine indexing.


If you really want or need a complete Flash site there is one more thing you can do. This requires more work, but it will keep your site in the search engines.


Create two versions of your web site, one in Flash and one using regular HTML. Have a simple home page that allows visitors to select either the Flash or HTML version of your site.


Include on this home page your main keywords, plus navigation links to at least your site map and main pages, so that the search engines can find their way around your site. This gives you the best of both worlds: the fully featured, rich Flash site, and a search engine friendly HTML web site.


One final note: Google is now able to index Flash files, pulling the text content out of the SWF files. This is a great advance and allows us to use Flash more freely on our web sites, but note that they are pulling the TEXT out of the Flash file.


So in order to make your Flash files Google friendly you need to include your search engine optimized text as text inside the Flash file, just like you would in a regular html web page.


So with proper planning and a few tricks we can have a very rich Flash site and still benefit from easy search engine indexing.


About The Author: George Peirson is a professional software trainer and consultant. He is the author of over 30 multimedia based tutorial training titles. To see training sets and other articles by George Peirson visit http://www.howtogurus.com Article copyright 2005 George Peirson


 

Blogs Are Great For Google SEO

by Wayne Hurlbert


Whatever the most recent Google results involve - incoming links, Google PageRank levels, or positions in the search engine results pages (SERPs) - there is one common denominator.


That similarity lies in the value that Google is placing on theme and topic related content.


The watchword of the day, in the land of Google, is relevance.


The Google algorithm, which is the mathematical calculation that determines a site's position in the search results for any given keyword or phrase, has been revised to stress the importance of relevance. That importance not only affects the search results that are found in the SERPs, but in Google's PageRank and link valuation formulas as well.


With the need for more relevant content and linking partners, website owners are looking for a possible solution. The requirements for help include theme and topic relevant content and linking partners who are also theme and topic related. While there are several ways to achieve relevance, one such method is adding a blog component to your website.


Blogs are regularly updated postings of information, usually related to the theme of a website, and include incoming and outgoing links on the same topics. Once thought to be merely online journals and diaries, blogs have moved far beyond the personal realm and into the world of business and information.


Blogs are becoming an important component in many business owners' toolboxes for marketing, public relations, and search engine optimization.


Businesses in almost every industry can benefit from the blog boost in the search engines. There is little doubt that all websites will receive a healthy injection of relevance, simply by posting regularly to a business blog.


Since the posts will be on topics related to the overall website theme, and incoming links will arrive from similarly themed blogs, relevance is an obvious and natural result.


In the early days of the internet, it was thought that related sites would link naturally to one another. Little thought was given to placements in the various search engines at first. It was generally agreed that good content would attract natural incoming links from similarly themed websites. In fact, what was being described is what is currently happening with blogs.


Blog posts are generally written on one, but just as often on two, three, or even more related topics. If the topics are not related, they become so by virtue of there being a large number of posts on those formerly unrelated topics. On occasion, the unrelated topics will even appear in the same post. The important aspect of the blog is the overall development of powerful theme relevance.


Bloggers are also free and generous linkers. By regularly linking to interesting blog posts, bloggers provide value to their own readers, by offering them the best of other bloggers. These added incoming links provide additional Google PageRank, as well as boosts in the search engine rankings from the link.


The clickable links are often rich in the receiving blog's most important keywords, and are contextually surrounded by theme related content as well. This combination of strong link anchor text and theme relevant content gives the blog exactly what the search engines are seeking.


The Google algorithm could have easily been written with blogging in mind.


Blogs and content relevance


The primary benefit of maintaining a blog is the regular infusion of fresh theme related content to a website. Either as a cross linked stand alone blog, or with the postings sent by FTP to the main business website, a blog provides abundant content. Since the posts will be mainly about the business and related industry topics, all of the content will feature that highly sought after relevance factor.


Blog posts can be written about industry news and events, new company products, marketing campaigns, public relations efforts, and general conversations with current and potential customers and clients. The range of blog topic options is very wide, and as the topics are related to the main theme of the website, they are highly relevant.


As the fresh content, in the form of blog posts, is added to the site, the overall number of website pages gradually increases. Google and the other search engines tend to prefer larger sites over smaller ones. This makes sense, as Google is rewarding what it considers to be authority sites and hub sites. Both categories require a large number of pages to achieve.


While some exceptions to that rule do exist, and find their way into the SERPs, the tendency is for larger sites, especially those deemed by Google to be authority or hub sites, to achieve better search engine rankings. This is especially true when the added content is both fresh and theme related. Blog posts fill that bill quite nicely.


Because of the high comparative frequency of blog posts being added to a website, the search engine spiders visit a site more often. That regular and frequent reindexing tends to raise a site's rankings in the search engines. Google almost certainly provides a boost to frequently and freshly updated sites.


Blogs are updated so frequently, often on a daily basis, that the maximum fresh content bonus is usually applied. Sites featuring a blog component definitely have a head start in the SERPs, thanks to frequent updating.


Each blog posting has an individual title. By placing keywords in the post title, a powerful search engine ranking boost can be achieved. The posts are often, but not always, archived by post title making good titles very important. With many blog hosts using the titles as part of individual page URLs, some added benefit is also achieved. Be careful to place the keywords early in the title as the URL will often abbreviate the post title.


The regular blog posts are usually right on the site's main topics and themes. As a result, the posts also tend to be very rich with the site's most important keywords and keyword phrases. Along with Google, heavy keyword rewarding search engines like Yahoo and MSN Search rank sites with blog components very highly.


Because of the powerful and frequently updated themed content, many blogs are moving toward authority site status, for their most important keywords. Some blogs have already achieved that exalted level, demonstrating the power of blogging.


Blogs and link benefits


Bloggers are generous natural linkers. They readily and frequently link, not only from their blog links list, but from within posts themselves.


Note that most, but not all, bloggers link to other blogs and websites from their blog's home page. That has some benefits, and some drawbacks. The upside is that the blog's home page usually has good to very good Google PageRank. The downside is that, like many other links pages, a blog's transferred PageRank is often divided among a large number of blogs and traditional websites.


Being linked from a page with many outgoing links provides more value if the sending page is relevant to the receiving page. In the case of blogs, that is very often the case. Bloggers like to link to other blogs and websites in their special topic area. Because of this relevance, the maximum amount of value from a site-to-site link of that type can be reached.


While theme relevant links pass along more Topic Sensitive PageRank and more link popularity boost, keep in mind that all links pass value as well. The only difference is unrelated links tend to pass along a lower amount of link benefit to the receiving page.


Bloggers are well known for providing interesting and informative links to their readership from within the blog posts themselves. Keep in mind that bloggers are many times more likely to link to other blogs than to static websites. By not having a blog component on your website, that enormous linking potential is almost entirely lost.


These context sensitive blog links are especially valuable in three ways. The first and obvious benefit is what accrues from any link, in the form of link popularity and transferred Google PageRank.


The second benefit is from the potentially keyword rich link anchor text from right in the post itself. Often, the blogger will use an entire sentence about the linked material as the link anchor text. Keywords are often found in those linked passages.


The third benefit is from context. Surrounding the link is usually very theme relevant content, containing similar important keyword phrases. That context ensures that the sending page is very theme relevant. Since bloggers tend to read and link to blogs focusing on their main area of interest, their outgoing links are most likely highly relevant.


Link exchangers will find that bloggers who share similar topics are very receptive link traders. While some people believe that reciprocal links have no value, that is not the case.


All incoming links have value, but some simply provide more net benefit than others. Reciprocal links, between similarly themed blogs and websites, are very good for both parties. Links to and from entirely unrelated sites and blogs offer far less value, but still add some limited benefit.


Fortunately for many bloggers, one way incoming links from similarly themed blogs are very common events. Very few bloggers fail to link to other interesting blog posts. In fact, linking is part of the power of blogging.


Some blogs are well on their way to achieving hub site status, as a result of their generous linking habits. In that sense, linking pays off handsomely for many bloggers.


Conclusion


By adding a blog component to a website, adding relevant content becomes very easy. Simply by writing posts that discuss the main topics of the website, a webmaster can reap the benefits of fresh, keyword rich content. At the same time, the search engine bias towards fresh content also boosts a site higher in the search rankings.


Bloggers like to link out to other sites, and especially, to other theme related blogs. The additional link power, provided through the natural generosity of blog owners, will help to elevate a site in searches for the site's most important keywords. Many searchers will find the main site by entering through the blog.


Because the blog postings often include links, valuable keywords regularly appear as link anchor text. As a bonus, those in-post links will be surrounded by highly relevant content, giving the link the maximum power available from the blog page.


Along with the added link popularity, there will be a corresponding upward hike in a page's Google PageRank. It is not unusual for a site's blog component to have a higher PageRank than the website's home page.


Take the expressway to more relevant links and content.


Test drive a blog today.


About the Author:
Wayne Hurlbert provides insightful information about marketing, promotions, search engine optimization and public relations for websites and business blogs on the popular
Blog Business World.


Thursday, July 07, 2005

 

Increase Your Web Traffic By Using Keyword Articles

Copyright 2005 Timothy Spaulding


If you have an online home based business, you know that routing traffic to your web page is incredibly important - not only to make sales and increase revenues, but to keep your business going.


However, you have probably realized that getting your web page noticed and getting a high ranking from the search engines is difficult. In light of that, here are a couple of tips you can use to help increase the traffic to your web page.


First, you will need to do some research to see where your page ranks in several different search engines. Do this by performing searches on the keywords you think are relevant to your web page.


Once you have this information, you will be able to take action and outdo the competitors whose pages are being returned as more relevant results.


No matter where you rank in the results, if it is less than number one you have some work to do. So, take a look at all the web pages that are ranked higher than yours and see what these pages have that yours does not.


Evaluating your competition will help your Web page become stronger and more competitive, something that is important to your bottom line.


You need to do this kind of research for all the different keywords that pertain most to your Web page and that people search for the most often. You want your web page to rank high in the results for a number of keywords, so be diligent about researching what other web pages are doing and what you can do to get your page returned higher in the results.


One of the best ways to improve your web page ranking for a variety of keywords and phrases is to use keyword rich articles on your web page. This will allow you to provide useful information for web surfers as well as include keywords that will help your page get noticed.


When it comes to keyword rich articles, you can either write them yourself or have them written for you. The first thing you need is a list of the most popular keywords and phrases people search for; then write articles that provide useful information while using the exact keyword phrase multiple times.


When someone searches for that particular term, your page will be returned as a high result as long as you are outdoing your competition. The most important thing to keep in mind is that you need to provide relevant, useful and pertinent information.


When it comes to outdoing your competition on keywords and search engine results, the relevance of your article is essential to a higher web page ranking. If you are creating relevant, keyword rich articles, people will get to your page and will be more likely to stay.


Finally, if the keyword articles get your web page ranked higher in search engine results, as they should, do not expect that it will stay that way. There is more competition online every day, which means you will have to be diligent about continuously tracking your competition, what they are doing, and where your web page ranks in the search engine results.


You will have to make changes in order to keep ahead of your competition, but as a home based business entrepreneur you know this is an essential part of business.


About the Author:


Timothy Spaulding is the owner of the Work At Home Business Resource Center at http://www.workathome-awesomeopportunities.com and Home Made Profits at http://www.homemadeprofits.net which provide valuable tools, articles, affiliate programs and products for the home based entrepreneur.


Sunday, July 03, 2005

 

Free Google Sitemaps Generators

A Simple White Hat Technique To Get Indexed By Google


Everybody knows that getting indexed in Google is getting more and more difficult every day, and everybody is looking for an edge over the competition.


Most "white hat" SEOs frown upon methods like cloaking, blog-and-ping, and other such "black hat" techniques, and have never had any special technique of their own that they could use to help get their pages indexed better.


Well, enter Google Sitemaps, Google's latest offering, which is still in the beta stage, and which won't make the purists frown.


Google sitemaps is a service that allows webmasters to define how often their sites' content is going to change, which is supposed to give Google a better idea of what pages to index.


By placing a specially formatted XML file on your web server, you inform Google whenever your pages change; the Googlebot then crawls the updated pages, making the necessary updates to its database.


Google documents the required format for your XML file at https://www.google.com/webmasters/sitemaps/docs/en/protocol.html


An interesting point is that the XML file has two tags, changefreq and priority, with which you can indicate how frequently each page changes and how important it is.


The valid values for changefreq are "always", "hourly", "daily", "weekly", "monthly", "yearly" and "never" and similarly the priority can vary from 0.0 to 1.0, where 0.0 identifies the lowest priority page(s) on your site and 1.0 identifies the highest priority page(s) on your site.
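To illustrate how those tags fit together, here is a sketch of a minimal sitemap file. The URLs and dates are placeholders, and the schema URL should be verified against Google's protocol documentation before use:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2005-07-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/articles/seo-tips.html</loc>
    <lastmod>2005-06-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

Here the home page is marked as the highest-priority page, updated weekly, while an article page changes monthly and carries the default middle priority.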


Once you have the XML file in place on your server, you need to inform Google about it by opening this URL in your browser:


http://www.google.com/webmasters/sitemaps/ping?sitemap=URL


where the URL part in the above URL should be the URL-encoded location of your Sitemaps XML file.
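The URL-encoding step can be done in any language or with an online encoder; as a small illustrative sketch, here is how it might look in Python, with a placeholder sitemap address:

```python
import urllib.parse

def sitemap_ping_url(sitemap_url):
    """Build the Google ping URL for a given sitemap location.

    The sitemap location must be URL-encoded before it is appended
    as the value of the ``sitemap`` query parameter.
    """
    # safe="" forces every reserved character (":" and "/") to be encoded
    encoded = urllib.parse.quote(sitemap_url, safe="")
    return "http://www.google.com/webmasters/sitemaps/ping?sitemap=" + encoded

# Example with a hypothetical sitemap location:
print(sitemap_ping_url("http://www.example.com/sitemap.xml"))
# -> http://www.google.com/webmasters/sitemaps/ping?sitemap=http%3A%2F%2Fwww.example.com%2Fsitemap.xml
```

Opening the resulting URL in a browser, as described above, tells Google where to find the sitemap.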


Google has also provided an open source script that will automatically generate the XML file for you. The only drawback is that it's written in a scripting language called Python.


There are, however, several free third-party scripts and tools available.


1) Softplus GSiteCrawler: This is Windows software and is extremely easy to use. It is coded in Visual Basic 6.0.


2) SiteMaps Pal: This is an online service that generates the sitemap for you. It has a limit of 1,000 links, so if your web site has more than 1,000 links, this won't work for you.


3) Google Sitemap Generator: This is another free online sitemap creator. This service lets you crawl sites 3 levels deep and limits the number of links to 400.


4) phpSitemapNG from enarion: This is a PHP script that you upload to the root of your web site; the script then generates the sitemap file on the server.


It also lets you submit the sitemap to Google by clicking a link.


The drawback of this script is that you will need to upload it to each of your sites and it also doesn't recognize subdomains.


5) Google Sitemap Generator for Dreamweaver: This Dreamweaver extension by George Petrov lets you quickly create Google Sitemaps for your Dreamweaver sites.


6) Google Sitemap Generator for WordPress: This is a plugin for WordPress users.


7) SecretSpider generator: This is paid software priced at $97. Its advantage is that it also lets you gzip the XML file, thereby making it smaller.


So, go ahead and make your website more Google-friendly.


About The Author:

Satyajeet Hattangadi is the Owner of
Novasoft Inc, creators of Adsense Cloaker, a unique PHP script that hides your AdSense ads from the robots and helps prevent de-indexing by Yahoo.


This article may be reprinted provided the resource box is kept intact.


 


Saturday, July 02, 2005

 

How Google Indexes Content From Your Web Directory

By a fluke, I noticed something about the way Google indexes content from web directories. Excluding your template, the most important line of code is the first title you add to your main body.

Search through Google and see for yourself!

Try searching for "something" in "yourcity", "province/state" and look for a web business directory that you recognize. Once you find a directory, take a good look at the description of that particular listing (not the title). It may be a good idea to write it down. Once complete, click on the "cache" of that page within Google to highlight the content and view the web directory page.

9 out of 10 times, the description of your website listing within Google is partly taken from the first line of code you have within your main body of content (excluding your header, footer, and sidebar).


You will notice that this only applies to a web directory. Any personal or business related website gets indexed differently. If you take a look at the Google directory, you will find the same thing: http://www.google.com/dirhp?hl=en

Browse to any sub-category and look at the first line of text. You will find that the title within the main body of content, before anything else, is within an H1 tag.

H1 tags and H2 tags are nothing new to the development community, but there may still be many directories online that can increase their search engine rankings by changing a few things.

Here's an example we see very often online (I am also guilty of this):

You have just developed an impressive web directory and you are very proud of your creation. In the process of organizing your massive directory you were faced with a problem on how to allow people to browse your website and how to let search engines browse through your categories with ease. So with that in mind, you create the "alphabetical solution".

THE ALPHABETICAL SOLUTION IS THIS:

Search Categories By Alphabetical Order:
A,B,C,D,E,F,G,H,I,J,K,L,M,N,O,P,Q,R,S,T,U,V,W,X,Y,Z

The problem with this alphabetical solution (I am also guilty as charged) is that we tend to add it to the top of our page so that our visitors, and possibly search engines, can find these extra categories easily. This is probably hurting your results in many ways.

1) Your alphabetical solution is probably necessary, but you should add it a little lower, below some more important page-specific content.

2) No matter where you add your ABC's, search engines will find them anyway.

3) You don't want 10,000 pages to be indexed with a description that goes... abcdefg...

The Solution:

If you own a directory and you are faced with this problem, let's put our development hats on and switch a couple of things around. First: try adding the main "topic" description to the top of your main body of content, and create this description within one of these tags: H1, H2, H3, H4, etc.

Second: Once you have your main title description, try adding more related content to that specific page within bold tags BEFORE you add your alphabetical solution. At least this way, when search engines browse through your massive web directory, they do not leave thinking that you like singing the alphabet.
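The reworked page structure might look like the sketch below; the category name, description, and link targets are invented for illustration:

```html
<!-- Hypothetical directory category page: the topic heading and a bold
     description come first; the alphabetical navigation is moved below
     the page-specific content. All names and links are placeholders. -->
<h1>Restaurants in Hamilton, Ontario, Canada</h1>
<p><b>Browse our directory of family restaurants, diners, and fine dining
in Hamilton, Ontario, Canada.</b></p>

<p>Search categories by alphabetical order:<br />
<a href="cat-a.html">A</a>, <a href="cat-b.html">B</a>,
<a href="cat-c.html">C</a> ... <a href="cat-z.html">Z</a></p>
```

With this order, the first line of indexable body content is the keyword-rich H1 title rather than the alphabet links.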

One Last Thing:

If you are seriously targeting specific local markets on the web, try adding the city, province/state, & country! Being in Canada, we are faced with many brick walls when it comes to promoting certain cities.

A perfect example of this is my home town of "Hamilton, Ontario, Canada". If you don't promote this city properly, you might actually be targeting people from "Hamilton, Ontario, California"! Ouch! As you can see, this can easily mislead many visitors coming to your web directory and probably won't help your conversion rate whatsoever.

Making sure that you target the right industries and the right locations could be crucial for the success of your web directory.

I hope this article helps you out!

About The Author:

Martin Lemieux is the president of the Smartads Advertising Network. Smartads is here to help small to large companies grow online and offline. Visit the Smartads Network today!

International:
http://www.smartads.info
Canada: http://www.smartads.ca

Content Management: http://www.thingsdigital.ca
Canadian Search Engine: http://www.smartadsearch.ca


 

7 Search Engine Resources You Should Be Using Now

Ask any business person whose website is at the top of the search engines if his/her site is making money, and the answer is almost always "yes".


An example is Glenn Canady, the author of "Gorilla Marketing", who employed only one of these strategies, and it made him over $1 million.


The fact is, search engines can get you an enormous amount of traffic - and it's traffic to your sites that's free. However, in order to ethically and effectively market in the search engines, you need to use strategies that actually work.


Below are three different ways to effectively and ethically raise your rankings in the search engines. I've included seven different resources that will help you implement these strategies quickly and easily, so that you can begin to see an increase in your traffic almost immediately.


1. Optimize your site. To make sure that you are properly targeting your market, you need to market using the right keywords. This means optimizing your site so that the keywords on your site are the keywords your site actually ranks for.


There are two tools that you can use to help you with search engine optimization:


a. Search Engine Optimization Fast Start Ebook - http://www.seoresearchlabs.com/seo-book.php - will teach you simple and effective techniques for optimizing your site. This ebook is now in its 4th edition, is completely up to date, and is one of the best ebooks I've seen on search engine optimization.


b. Web CEO - http://www.smallbusinesshowto.com/search.html - This is a complete search engine optimization suite that offers 10 different tools to help you optimize your site for the search engines. It offers the most comprehensive, step-by-step set of instructions I've ever seen with any software package. According to the instructions, you can get started in one hour. The free version of this software will work for most, and it also includes a $97 search engine optimization course as part of the package.


2. Develop a linking strategy. One factor that influences how well you are ranked in the search engines is linking. The more inbound links that you have pointing to your site, the higher you will be ranked in the search engines.


For each link that you have pointing back to you, that's another opportunity for your potential customer to find you. With credibility being such a big problem on the internet, to have someone recommend you increases your chance of making the sale.


To help you develop an effective strategy, I recommend that you read "Linking Matters" - http://www.linkingmatters.com/. This ebook shows you how to develop an effective linking strategy for your site, and do it very quickly.


3. Develop a content strategy. The truth is, "Content is King". Most people online are looking for information. The more information that you provide for your customers, and the more valuable it is, the more likely you will make the sale.


Below are three different ways to develop content for your site.


The first and most effective strategy is articles.


Articles actually work for you in several ways.


a. They brand you as an expert so that customers come to you.
b. They provide valuable content to your potential customers.
c. They build a relationship between you and your potential customers.
d. They create a viral marketing strategy for your site.
e. They build a linking strategy for you every time a webmaster publishes one of your articles.


No other strategy that I have employed has brought me more business than this one.


To find sites that accept articles, go here: http://www.jogena.com. This is one of the sites I use because it's one of the oldest and most reputable sites online for finding information on ezines. Unfortunately, there's no search; everything is organized by category, but the information is comprehensive, so you will quickly locate what you are looking for.


However, if the thought of writing your own content gives you nightmares, there's a way around this.


You can use public domain information. Public domain information is free to use because it's in the public arena, or because the copyright has expired.


To help you easily locate this information, I recommend that you download the public domain toolbar. You can get it at the Public Domain Forum - http://www.publicdomainforum.com/forum. You have to register for an account to get the toolbar, but both the forum account and the toolbar are free. The toolbar is a very comprehensive resource of public domain sources.


The third content strategy you need to consider is blogging.


A blog allows you to create search engine friendly content on your topic of interest. Combine this with an RSS feed, which the search engines love, and you have a winning strategy.


Not only will it help you build content for the search engines, but it can also help you raise your traffic and sales.


Here is a comprehensive ecourse that will drastically reduce your learning curve, as well as provide you with the resources you need to implement this strategy.


Marketing With Blogs Course


Finally, to keep abreast of what's happening with search engines, you need to subscribe to Search Engine Watch. This site offers tons of resources, news, and a newsletter on search engine optimization.


Apply one, or all, of these search engine resources to your search engine strategy, and you can expect a major increase in your traffic and sales to your site.


You can get Jinger's best free internet marketing and small business resources, including free software, ebooks, newsletters, and more when you visit her blog at http://www.askjinger.com


More Tips and Resources on Article Marketing


 


 

One Well-Placed Article Nets 616 Mentions in Google

Copyright 2005 Off the Page


Evaluation of a Home-Run Article


I've been writing articles and posting them online for several years. But it took a while before I learned that writing well and developing a long list of places to post them weren't enough. Articles that deliver fresh, specific how-to advice are a solid plus for readers. But writing each one around carefully-defined keywords is a must for the search engines.


Other factors influence how successful your article marketing efforts will be - like the Page Rank of the posting site, whether it provides a LIVE LINK back, and how specific its niche or readership is. Some of that is beyond my control. But as a writer, it's up to me to craft each article to cover as many of those bases as possible.


Take the time to think through your article marketing strategy, rather than sending articles out willy-nilly (http://www.promotewitharticles.com/strategy100.html). There's more pay-off in writing a number of articles, each adding greater depth, around a recurring theme.


My articles raised my name from 100 Google mentions to over 3,000 in a relatively short time. They established my expertise in several niches - article marketing and Yellow Page ads. These abilities come together in this example.


You don't Know the Winner until After the Horse Race


One never knows, when sending out articles, which ones will get the most play. So write each one like your reputation depends on it (because it does). The article described below got widespread attention because it's timely, and there's considerable interest (and pain) around the topic.


It addresses a serious problem that no one is talking about - the declining response rates to Yellow Page ads. Advertisers feel they're paying too much for the amount of business their ads bring, but don't know about their choices. Since this article went out, there have been so many additional changes working against Yellow Page advertisers that an updated article needs to be written. This level of online visibility indicates there's considerable interest.


Keep track of how widely each article you write is received. That's one of the ways to stay on the pulse of your readership - so you deliver more of what they want.


Your Title is the Hook for the Article


Most readers (skimmers, actually) won't get past the title. So make it a grabber. Give them a reason to keep reading. In this case, it's long (which I'm convinced works best). This title has three sections, which track with the body of the article.


Yellow Page Advertisers: Your Calls are Going to Decrease - Here's the Remedy


A. Tells who the information is for - Yellow Page Advertisers. People can tell whether or not it applies to them. I increasingly write audience-specific articles and address the audience (consultants, speakers) in the title, rather than writing for the less-focused "everyone in business."


B. States the problem - Your Calls are Going to Decrease. The article backs up the claim with a bulleted list stating why.


C. Tells there's a remedy for the problem - if they just keep reading. The article provides a 4-step list stating how to get ready and protected.


That's a lot to accomplish in 750 words - read it yourself at http://www.yellowpagesage.com/article253.html. Since the website itself provides visitors helpful free resources, it needn't all be included in the article. Interested readers can explore further at http://www.yellowpagesage.com. Besides, I've sent out a barrage of related articles, which build on each other.


Build Your Professional Reputation


616 mentions in Google is an impressive yield for a single article. Admittedly, not all cites provide live links or appear on high Page Rank sites - but some do. (And there are duplicates in that number.) Still, given the small amount of effort involved on my part, I'm well repaid.


Now, think past any single article to the impact that niche-specific articles can have on your professional standing. Let me prove it. Search Google for "Yellow Page ads" (in quotes). When the query results show 38 million pages, enter "Lynella Grant" in the Search within Results box. The outcome: 5,000 pages related to Yellow Page ads refer to me. That's just a ripple in the Internet universe, but it certainly positions me for other activities in that arena.


Don't you think you or your website can profit from similar online visibility? Writing articles is the way to go.


About the Author:


--Dr. Lynella Grant, Consultant and Author. Promote yourself, your business, website, or book with online articles: http://www.promotewitharticles.com (free how-to). Or let me write and submit your articles online for you - no learning curves. (719) 395-9450


More Article Marketing Resources and Tips


 

Understanding Back Links

There are no hidden secrets to ranking high with the major search engines. All that is needed is a basic understanding of how search engines work and a bit of know-how.


Perhaps the biggest contributing factor to a successful web site is incoming links, or Back Links. Without links, your website will more than likely go unnoticed. So how should you accumulate these links? Below are a few basic methods for accumulating quality back links.


Before you get started


You MUST understand how search engines work. Over 90% of your business will likely come directly from search engine results. Therefore, it is absolutely essential to optimize your site for search engines.


You could have the greatest deals in the entire world, but if no one knows about them, your efforts are wasted. Do a search on Google for Search Engine Optimization (SEO).


You will find tons of great information on how to create a website that is both user friendly and search engine friendly. There are also countless companies and freelancers out there who offer SEO services.


Just be cautious of their offers and do your research first. Remember, NO ONE can guarantee top placement in major search engines, no matter what they say.


Where should back links come from?


1. Articles are a fantastic source of links and additional traffic to your site.


There are countless sites online that want your articles. It's a win-win situation for everyone. When you write articles and submit them online, you are able to leave a link to your own site and sometimes even anchor text.
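The anchor text mentioned above is simply the clickable words inside an HTML link, and it's what search engines associate with your site. As a rough sketch of how a crawler reads it, here is a small parser using Python's standard library; the HTML fragment and URL are hypothetical examples.

```python
# Sketch of how a crawler sees an anchor-text link in an article's
# resource box. The HTML fragment and URL below are made-up examples.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect (href, anchor text) pairs from an HTML fragment."""
    def __init__(self):
        super().__init__()
        self.links = []      # finished (href, anchor text) pairs
        self._href = None    # href of the <a> tag we are inside, if any
        self._text = []      # text collected while inside that tag

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

resource_box = ('Visit <a href="http://www.example.com">'
                'dog training tips</a> for more articles.')
parser = LinkExtractor()
parser.feed(resource_box)
print(parser.links)
```

The phrase "dog training tips" - not the bare URL - is what the link contributes to your rankings for that phrase, which is why anchor text is worth negotiating for.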


I know what you're saying: "I can't write about anything." I used to feel the same way. But the truth of the matter is, anyone can write. You just have to find a topic that interests you.


You can write about anything from dogs, to computers, to personal training, to web hosting - heck, you can even write about writing. The bottom line is, articles are a valuable source of "relevant" back links to your site.


2. Submit to directories.


There are literally hundreds of "free" and "paid" directories online. With 3 hours of painless work, you can have your site submitted to hundreds of great directories.


There are many quality lists of directories that are regularly updated. One of my favorites is http://www.best-web-directories.com/ which is always updated and maintained.


3. Develop link exchanges with relevant sites.


The major search engines such as Google see incoming links from relevant sites and give them more weight compared to a link from an unrelated site. For example, if you have a website about pet care products, your link strategy should target pet-related sites. Again, with a bit of hard work and determination, you can develop a great deal of "relevant" back links.


4. Forums are a great source for additional traffic and links.


Many forums allow their users to display signatures in their posts. These signatures can consist of both text and links. When you post a new message on a forum, your website's link will be displayed for everyone to see. This not only helps build valuable back links, but also brings additional traffic to your website.


5. Join a link co-op.


Co-ops such as the one found at http://www.digitalpoint.com/ are an amazing source of valuable back links and free advertising. What is a co-op network? A co-op network is a network of site owners/webmasters who offer ad space to the network.


In return, the ads they define are displayed across the entire network. The best part of ad networks such as the one at Digital Point is that they are free. Having your site in a co-op can literally mean thousands of back links for your site.


6. Buying ezine ads is a paid method of advertising, but it works because it is the most targeted of these approaches.


Write a concise ad for your business, and link to either a lead-capture page or directly to your autoresponder. Then, begin contacting ezines to place your ad. (Do a search for "ezine directories," and you will find enough lists to keep you busy.)


When you contact an ezine, ask for rates for both classified ads and solo ads. The advantage of a solo ad is that it goes out by itself in a separate email, but solo ads can be more expensive.


Because these ads are targeted (you should NOT advertise in an unrelated ezine), you will definitely see traffic, but you will need to track your results to see which ezines bring you the most and best traffic.


7. Ad swaps can be an excellent source of advertising.


Start by writing some killer copy (to coin a phrase - there are many free articles and ebooks written on the subject). Simply use your favorite search engine and type in "how to write ad copy." I googled this as I was writing and it showed 12,400,000 results. WOW!


Once you have your ad written, search out ezines that are in the same category as your business or service. Again, use your favorite web search. Build yourself a list of ezines, then email them asking if they accept ad swaps. Include your ad in your correspondence with the ezines so they can see what you are offering.


Another place to check for ezines is to type "ezine directories" into your search engine. Look up ezines in your category of business or service. Most listings will tell you whether they accept ad swaps and the parameters of the ad, such as how many lines and how many characters per line. Don't forget to add your website URL to your ad.
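Those line and character limits are easy to check before you send an ad out. Here is a minimal sketch; the ad copy, URL, and limits are hypothetical examples, since every ezine publishes its own parameters.

```python
# Sketch: check that a classified ad fits an ezine's stated limits
# (number of lines and characters per line). The limits and ad copy
# below are made-up examples.
def fits_ezine_limits(ad_text, max_lines, max_chars_per_line):
    """Return True if every line of the ad fits the ezine's parameters."""
    lines = ad_text.splitlines()
    return (len(lines) <= max_lines
            and all(len(line) <= max_chars_per_line for line in lines))

ad = ("Grow your pet-care business online.\n"
      "Free tips: http://www.example.com")
print(fits_ezine_limits(ad, max_lines=5, max_chars_per_line=65))
```

Running your ad through a check like this for each ezine's limits saves a round of correspondence when a publisher would otherwise bounce it back.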


If you want to have a successful site, whether it be for a hobby or for business, you have to understand how search engines work. Part of that understanding is knowing about link development.


Maybe in the early 90's the idea of "if you build it, they will come" was true. But in 2005 the internet is a far more competitive and complex place. To be successful today, you have to work at it. Without a quality source of back links, the search engines will all but ignore your site.


Richard D. Moore is the President and Founder of IncomeNow! Marketing Masters. We offer Free support to every subscriber to IncomeNow! A No Hype, No Bull, Honest and Real source to help you start or grow your business and/or Ezine


http://www.incomenow.9k.com


 

Professional SEO: Hand Off to Bob or Outsource the Job?

By Scott Buresh


We are often asked if professional SEO (search engine optimization) can be done effectively utilizing in-house talent. Despite our obvious self-interest on the subject, our answer is always a qualified "yes" - you can achieve professional SEO results using existing talent.


However, for every company we have known that has met with great in-house SEO success, we know of many more that have seen their in-house efforts fail. We have also discovered the companies that have succeeded share some common traits.


If your company is considering doing SEO in-house, there are some critical questions that you should address before you proceed.


1. Do I have the proper resources at my disposal to achieve professional SEO results?


Search engine optimization takes time, and your internal SEO expert will need to have a great deal of it at his or her disposal - especially at the project's outset when target audiences, keyphrases, and optimization schemes are first being established.


Even after the initial optimization effort, the nature of SEO will require this person to spend ample time keeping up with industry trends, monitoring campaign progress, performing A/B testing, and expanding the campaign as new product and service areas are added.


Perhaps even more important than time, achieving professional SEO results requires a unique set of aptitudes. The person responsible for your internal SEO initiative must possess the ability to learn quickly and to look at your website from a macro-perspective, marrying together the needs of sales, marketing, and IT.


He or she can not be an aggressive risk taker, as this is often a surefire way to get your website penalized and potentially removed from the major search engines. These gifted people exist in many companies, but given the unique attributes that these individuals possess, their time is often already spent in other crucial areas of the business.


Without enough time to invest in the project or the right type of person to execute it, an internal SEO initiative is likely doomed to fail.


2. Do I know which departments of my company should be involved, and will they work with an insider?


As mentioned above, professional SEO, by necessity, involves marketing, sales, and IT. The SEO expert must work with marketing to find out what types of offers and initiatives are working offline to help translate them effectively online. He or she must work with sales to identify the types of leads that are most valuable so that you can target the right people in the keyphrase selection process.


And, finally, your SEO expert will need to work with IT to determine any technical limitations to the SEO recommendations, learn of any past initiatives based on a technical approach, and get the final optimization schemes implemented on the website.


Sadly, in many businesses, these departments have a somewhat adversarial relationship. However, it is the duty of the SEO expert to act as a project manager and coordinate the efforts of all three departments if you are going to get the most out of your campaign.


No professional SEO project can be completed in a vacuum. For whatever reason, it is often easier for an outsider to get adversarial departments on the same page, in the same way that a marriage counselor might convince a woman of her undying love for her husband while the husband is still grimacing from a well-placed knee in the parking lot.


3. Will someone be held accountable for the results?


This may seem like a small consideration, but it can have a tremendous impact on the success of the campaign. If you have added this responsibility to some poor soul's job description with the direction that he or she should "do the best you can," you'll be lucky to make any headway at all (especially if the person is not enthusiastic about SEO).


Whether SEO is done in-house or outsourced, someone will have to take responsibility for showing progress, explaining setbacks, and continually improving results. Without this accountability, it is very common to see an initiative fade as the buck is passed.


4. Can I afford delayed results based on a learning curve?


It's a reality - professional SEO expertise has a steep learning curve. While information on how to perform the basics of optimization is freely available on the web, much of it is contradictory, and some of it is actually dangerous.


It takes time for someone unfamiliar with the discipline to sort the SEO wheat from the SEO chaff (on a side note, a "quoted" search of Google reveals that this may actually mark the first occasion in human history that the phrase "SEO chaff" has been used - we're betting it's also the last).


Simply put, if the person you are putting on the job has no experience, it will take longer to get results. This may not be a consideration if you aren't counting on new business from SEO any time soon. However, if you are losing business to your competition due to their professional SEO initiatives, time might be a larger factor.


5. Will it cost me less to do it in house than it would to choose a professional SEO firm?


Often, companies will attempt this specialized discipline in-house in order to save money, and sometimes this works out as intended. However, to make an accurate comparison, you should calculate the cost of the in-house labor that would be involved versus the price of the firm you would otherwise hire.


When making this calculation, also factor in the opportunity cost of the resource - the tasks that your in-house people are not able to perform because they are involved in SEO.


In addition, if worse comes to worst and your in-house SEO expert is led astray by some of the more dangerous "how-to" guides available, it can cost even more to repair the damage than it would have to hire a professional SEO firm to perform the optimization from the outset.


And an internal SEO campaign gone wrong can cost even more than the stated fee - websites that violate the terms of service of the major search engines (whether intentional or not) can be severely penalized or even removed, costing you a lot of lost revenue when potential customers can not find your website for a period of time.


6. Do I believe that the end result I'll get in-house will be equal to or greater than the results I would have gotten from a professional SEO firm?


Search engine optimization can create huge sales opportunities, and slight increases in overall exposure can have not-so-slight increases in your bottom-line revenue.


If you believe that your talented in-house resource will, given enough time, achieve results equal to or greater than those that could have been achieved by the professional SEO firm you might have chosen, it may make sense to do it internally.


However, in addition to a better knowledge of industry trends, one clear advantage that search engine optimization firms have is the benefit of the experience and macro-perspective that comes from managing many different websites over time.


Professional SEO firms can watch a wide range of sites on a continual basis to see what trends are working, what trends aren't, and what formerly recommended tactics are now actually hurting results.


This macro-perspective allows professional SEO firms to test new tactics as they appear on a case-by-case basis and apply those results across a wide range of clients to determine what the benefit is. It is harder for an individual with access to only one site to perform enough testing and research to achieve optimum results all the time, something that should also factor into the equation.


7. Do I have at least a slight tolerance for risk?


Neophytes to SEO can make mistakes that can lead to search engine penalization or removal. This happens most commonly when they have an IT background and treat SEO as a strictly technical exercise.


We are often called in to assist companies who have had an internal initiative backfire, leaving them in a worse position than the one they were in before they started.


The simple truth is that you cannot perform effective SEO without marrying your efforts to the visitor experience, but this is not something that is intuitively understood when people approach SEO for the first time.


However, professional SEO firms are not perfect either. Some firms use those same optimization methods that violate the search engines' terms of service and can get your site penalized.


So, if you do decide to outsource, educate yourself on SEO and do some research on the firm. Know the basics of the business, find out who the firm's clients are and how long they've been in business, and ask for professional references - just like you would do with any major business purchase.


If you have considered all of the above questions, and your answers to all seven are "yes," your company may be uniquely equipped to achieve professional SEO results in-house.


If you answered "no" to any of the first three questions but "yes" to the rest, it does not necessarily mean that you can't perform SEO in-house - just that you may not be in a position to do so at this time.


Taking the actions required to get you in the right position to answer in the affirmative might be worth your while. However, if you answered "no" to any of the last four questions, you may want to consider outsourcing the project to a professional SEO firm.


A professional SEO firm has the resources, the time, the expertise, and, most importantly, the experience, to launch an SEO initiative for your website that will have a positive effect on your bottom line.


Whichever option you choose, it is important that you fully embrace the channel. A half-hearted initiative, whether done internally or outsourced, can be as ineffective as taking no action at all.


About the Author


Scott Buresh is the CEO of Medium Blue Search Engine Marketing. He has contributed content to many publications including Building Your Business with Google For Dummies (Wiley, 2004), MarketingProfs, ZDNet, SEO Today, WebProNews, DarwinMag, SiteProNews, ISEDB.com, and Search Engine Guide. Medium Blue, an Atlanta search engine optimization company, serves local and national clients, including Boston Scientific, DuPont, and Georgia-Pacific. To receive internet marketing articles and search engine news in your email box each month, register for Medium Blue's newsletter, Out of the Blue, at www.mediumblue.com.


 

Google Patent Application - SEO Highlights

The recent patent application filed by Google details numerous items the search engine uses to rank web pages. The specific application is summarized as:


"A method for scoring a document, comprising: identifying a document; obtaining one or more types of history data associated with the document; and generating a score for the document based on the one or more types of history data."


The patent application sheds significant light for those pursuing search engine optimization with Google. Patent applications can be difficult to understand, so following are highlights that you should consider for your SEO efforts.


Update Your Site


Updating your site is important when it comes to maximizing your rankings on Google. In addition to the manipulation of keyword density and meta tags, the patent application reveals that Google places significant value on how often your content is updated. The more often you update, the more timely and relevant your site will appear to Google. In turn, this leads to higher rankings.


To appease mighty Google, consider the following plan of action:


1. Update pages frequently,


2. Add new pages to your site,


3. Interlink the new pages with others on your site, and


4. Add new pages on a weekly basis instead of all at once.


When Google returns to the site, you want to make sure that there is new content. The high rankings of blog sites are evidence of this approach.
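The keyword density mentioned above is usually computed as a simple ratio: how many words of the page belong to occurrences of the target phrase, out of the total word count. Google's actual scoring is not public, so this is only a rough sketch with a made-up sample page.

```python
# Rough sketch of keyword density: the share of a page's words taken up
# by occurrences of a target phrase. This is NOT Google's formula, which
# is not public - just the common rule-of-thumb calculation.
import re

def keyword_density(text, phrase):
    """Percentage of the page's words accounted for by the phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    if not words:
        return 0.0
    hits = sum(
        words[i:i + len(phrase_words)] == phrase_words
        for i in range(len(words) - len(phrase_words) + 1)
    )
    return 100.0 * hits * len(phrase_words) / len(words)

sample = ("Dog training made simple. Our dog training guide covers "
          "house training, leash manners, and more.")
print(round(keyword_density(sample, "dog training"), 1))
```

A check like this is useful mainly as a sanity check against stuffing: if a phrase accounts for a large share of a short page, the copy probably reads unnaturally to visitors as well as to spiders.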


Google's Looking at Your Domain


In a new twist, Google claims that it analyzes the number of years of domain registration as part of the ranking process. The application suggests that domains that are registered for longer periods of time are given more value because such a commitment shows the site is not a fly-by-night jump page.


It is recommended that you extend all domain registrations for as long as possible as part of your search engine optimization efforts. It is difficult to tell how much registration length impacts the ranking process, but every little bit helps.
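If you want to audit your own domains against this factor, the expiry date in a domain's WHOIS record is the number to look at. Here is a minimal sketch of turning that date into years of registration remaining; the dates are made-up examples, and fetching the WHOIS record itself is left aside.

```python
# Sketch: given the expiry date from a domain's WHOIS record, compute
# how many whole years of registration remain. The dates below are
# made-up examples.
from datetime import date

def years_remaining(expiry, today):
    """Whole years between today and the registration expiry date."""
    years = expiry.year - today.year
    if (expiry.month, expiry.day) < (today.month, today.day):
        years -= 1      # the anniversary hasn't arrived yet this year
    return max(years, 0)

# A domain registered through 2015, checked in late 2005:
print(years_remaining(date(2015, 10, 24), date(2005, 10, 24)))
```

Per the patent application's reasoning, a registration with many years remaining signals a longer-term commitment than one expiring in a few months.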


Google claims that it also digs deeper into domain names to evaluate the legitimacy of the site. Factors in the evaluation include the web host and the WHOIS information. According to the patent application, Google maintains a database of hosts that facilitate spamming of the Google search engine.


While such hosts are not detailed in the application, pray to God that you are not using one. You should evaluate your host if your optimization efforts are not producing results.


If your search engine optimization efforts for Google are failing, the patent application may provide answers. Talk about a perfect E-book!


Halstatt Pires is with Marketing Titan, an Internet marketing and advertising company comprised of a search engine optimization specialist providing meta tag optimization services and Internet marketing consultant providing internet marketing solutions through integrated design and programming services.


 

Let's Kill All the SEO's

by Stephan Miller


Yes, it does seem to be open season. And it does seem that anyone who tries to make their website rank higher in search engine results is lumped together with search engine spammers.


Now, I would understand this a little more if search engine technology were at a level where I could type a phrase in and find exactly what I am looking for. But people's minds work differently, and no search engine will ever be able to account for this factor.


A newbie to a certain technology, no matter which, doesn't use the same terminology as an expert. He will use terms that relate to his own areas of expertise, and as he progresses, this terminology will evolve until it is more precise. Therefore, there are many levels of vocabulary that he will use in this progression. As SEO's, we recognize this.


A person may type in "change belt in my car", not knowing that there are multiple belts in a vehicle and each vehicle may have different procedures for changing this belt.


The more "authoritative" website may never use this exact term. It will use "change a timing belt in a 94 Toyota Camry" or "changing the drive belt in a 95 Chevrolet Corsica." The SEO, however, who has researched the search terms internet users actually type, will know the vague terminology that newbies use.


This is the job of an SEO: to bring people the information they are looking for. Maybe make a sale while they are at it, but still, an SEO knows the pulse of what people are looking for more accurately than the run-of-the-mill textbook webmaster.


People looking for something on the web are looking for something quick. They do not care that the language they may use is inaccurate. They don't want to read a manual before they type words into Google. They just want the info. As SEO's we bring it to them and reap our rewards.


To do this, we learn a little about the search engines we submit to. It's a necessity. And I think this makes the internet a better place to find things. To suggest otherwise would be like saying that people born with bad eyesight should be forced to live without glasses.


Of course, I am only speaking for those on the white-hat side of SEO. Presenting a false front to search engines is not good practice. Just give people what they are looking for and add value to your site.


Let the blind continue to lynch SEO's. Let search engines change their ranking algorithms. Let spammers continue their games and get banned. Let the non-SEO webmasters continue on to the website equivalent of Siberia. Then sit down, know your audience, know your competition, know your search engines, and write.


-------------------------------------------------------
Stephan Miller
http://www.profit-ware.com
-------------------------------------------------------


 

The Search Engine Soap Opera

By Kalena Jordan


The history of search engines is a bit like the plot of a soap opera. You know - Bo finds Hope, Bo loses Hope, Bo finds Hope again only to discover it's actually Hope's long lost evil twin Princess Gina and so on.


Just like the TV soaps, the search industry has a strange and illogical history. We started with a particular cast of search engines, new ones soon rose up and tried to usurp market share from the originals, some engines jumped into bed with each other, some of the well-known characters died or were killed off by newcomers, "good" engines decided to turn "evil" in the grab for market share, new industry darlings were born, and so on.


Those of us who have been watching this particular soap opera for the past few years are quite addicted to all the plot twists and turns. The thing is, search engines seem to have finally come full circle. Most started up originally with a simple premise: to provide a useful service to persons surfing the Internet; a way to search the millions of web sites and find specific, relevant information, 24 hours a day.


However once a few key players became heavily trafficked, search engines became viable advertising vehicles, attracting mega bucks from companies willing to pay them for the privilege of displaying banner ads to the significant number of eyeballs viewing their sites on a daily basis.


Soon everyone wanted in on the act. New search engines developed overnight, driven mainly by profiteers, hungry for their piece of the Dot Com boom. The "Who's Got the Biggest Index" game began and the searching public began to demand more relevancy and fresher results.


Under pressure from over-inflated company valuations, the Dot Com bubble soon burst and everyone was left covered with the sticky mess of financial accountability.


Meanwhile, savvy webmasters had begun to study how search engines worked in order to understand how to structure their web site code to improve their ranking for target search queries. A whole new industry developed from this activity: search engine optimization.


Webmasters who didn't have the time or inclination to learn search engine optimization techniques simply paid others who did. Popular directories such as Yahoo! and LookSmart took advantage of consumer demand for listings by introducing the first paid submission services.


Industry players took note of the developments and introduced commercial search engines where web site owners could simply pay their way to the top of the rankings rather than rely on ranking algorithms - voila! - the first pay per click search engines were born.


It wasn't long before smaller search engines and directories followed the lead set by the larger directories and introduced services to assist webmasters to ensure a place for their sites in the search listings - either via a third party partnership with pay per click search engines, or by introducing a new guaranteed indexing service which became widely referred to as Paid Inclusion.


Soon it seemed everyone was partnering with everyone else in order to get their cut of the deals being done. Some search engines were cannibalized by others or bought out by inexperienced companies and sacrificed at the altar of mis-management. Search veterans left cash poor by the dot com bust, or unable to cope with the competition, fell by the wayside.


At this point, you could say that the search industry was almost exclusively driven by profit and share price. At many of the majors, the needs of the searcher were temporarily replaced by (or mistaken for) the needs of the shareholders.


But in the background a relatively small search engine had been slowly building their database and gaining market share since 1998. Increasing numbers of searchers, disappointed with the irrelevant or outdated results they were receiving at other sites, began to flock to this newcomer with the curious name: Google.


Despite still being in beta, the search engine began to get a reputation for the quality of sites in its database, the lightning-fast results it produced and its simple, no-nonsense site design. Media attention arrived, as did more market share.


The major search engines and directories now had no choice but to sit up and take notice. Almost too late, they realized what they had been doing wrong for the past few years and why they were losing market share so easily to this young upstart firm.


Searchers have always wanted fast, relevant, up to date results from their chosen search engine. The fact was that very few directories and engines were offering this any longer. Their sites had become portals packed with advertising and third party information sources; the original search function seemingly lost in the forest of information.


But Google had made a point of ALWAYS offering searchers what they wanted, hence their success. The penny dropped and the majors scrambled to get back to basics. Yahoo! took things one step further and embraced Google as their new third party results provider, taking a small investment stake in the company and dumping industry veteran Inktomi in the process.


So where are we now in the plot of "Days of our Search Engines"? Over the past 12 months, some search engines and directories lost their way completely, yielding to the pressure in the boardroom to become more profitable and in so doing, losing forever the trust of their market.


Others simply slipped a long way off the radar and are desperately trying to claw their way back up the mountain. But with the KISS example set by Google and the glaring evidence that you CAN be profitable by listening to users rather than cash registers, the search industry storyline is finally getting back on track.


Yahoo! has recently combined Google results with their main search listings, new technically advanced engines such as FAST and Teoma are making an impact on the market and AltaVista appears to be making solid efforts to improve their index refresh rate and quality of results.


The end result is a richer user experience for searchers and a more promising future for the search industry where content is, once again, king.


Copyright © Kalena Jordan 2002


About the Author:


Article by Kalena Jordan, CEO of Web Rank. Kalena was one of the first search engine optimization experts in Australasia and is well known and respected in the industry worldwide. For more of her articles on search engine ranking and online marketing, please visit http://www.high-search-engine-ranking.com


Tuesday, June 28, 2005

 

A Simple Secret To Seducing The Search Engines

By Jason Potash © 2005


In the Internet Marketing community, "traffic" and search engine mastery are hot topics these days...


People will do almost ANYTHING just to get their site indexed by Google faster... to boost their Page Rank quicker... to drive more traffic immediately... and to get a ton of incoming links to their website right away...


Yes, even if it involves handing over $137 for the latest push-button traffic software or search engine trick.


Why all the fuss?


In this article, I'm going to reveal to you a simple, no cost way to make the search engines crawl all over your website like ants on a melted popsicle.


PLUS... you will start to get laser-targeted traffic and boost your search engine rankings almost instantly. Not to mention, you'll have (at least) 100 quality links back to your website within a week.


And this doesn't involve any SEO hocus-pocus, buying into some new "killer" traffic booster software, or trying out the latest scheme to fool Google, Yahoo, or MSN.


Nope. This stuff is 100% legit, above-board, and has already worked for years.


Let me give you a quick example. Recently I created a little "test" website.


The search engines didn't know about it. It had no incoming links. No traffic. No SE rankings...ZIP!


Fast forward 7 days... My website was indexed by Google, Yahoo and MSN, it got well over 200 incoming links and started to generate traffic... shortly thereafter, Google granted my site a Page Rank 3.


Again, this did not cost a penny, just a few hours of my time. I created a play-by-play video that shows the website and how I did it (see the weblink below).


Here's another example: an "average guy" decided to try this same strategy. He now has #1-#8 rankings on all the major search engines. You can listen to him explain how he did it during a recent audio interview.


(Note: After you click above, look under the "June 20th" post and click the MP3 audio button and video button on the site)


So, how can you get these same kind of results?


I wish I could make this sound more complicated, but all you need to do is submit one article to a handful of Article Directories.


Writing and submitting articles has long been a proven, time-tested strategy for increasing traffic, subscribers and sales.


But now in 2005, articles have re-emerged in a BIG way, thanks to recent information leaked by Google within their US Patent Application 20050071741.


We now know that quality content and incoming links are essential (now, more than ever) to the survival of your websites within the search engines.


Bottom line. You need to start creating and submitting articles to get all the benefits I just outlined above.


In fact, Google is practically telling you to do so!


But, where do you start? Where should you submit your articles?


Here's a blueprint that outlines the 7 keys to writing successful articles.


Follow these 7 keys and you are guaranteed to get more traffic, more incoming links and higher search engine rankings and page rank, by simply distributing your articles across the Internet.


Let's get started.


Key #1: Choose a HOT Topic


You'll need to do a little research on this one. If you are already familiar with the target market for your article, this should come easy. Research, spy, observe. Do whatever it takes to understand your target market. What issues are hot? What topics currently appear within top ezines, messageboards, ebooks? What keywords are they using to search online?


It's also a good idea to frequently visit article directories and content sites. These sites contain current articles on a variety of popular topics. Often, you can view the most popular article topics (or clicks) as well.


Here's a short list of article directories:


http://www.ezinearticles.com
http://www.certificate.net/wwio
http://www.ideamarketers.com
http://www.goarticles.com
http://www.netterweb.com
http://www.jogena.com


Key #2: Choose a "Magnetic" Title


Magnetic title? That's right. Your article title is your headline. If it doesn't pull the reader into the article, nothing will. Just think ... what if I called this article:


"Get More Traffic In The Search Engines", "Easy Search Engine Tips", or "Free Traffic Generation Strategies".


BLAH! See what I mean? These don't pack much "punch", do they?


Your title is just like a classified ad. Look at it this way. If your article title is crowded on a webpage with 100 others, what will make it jump off the page?


Key #3: Use The AIDA Principle


Attention
Interest
Desire
Action


This universal formula applies to your articles as well. Once you've enticed your reader to read your article (via a great title), you need to keep them reading.


The first paragraph of your article is critical. If it's dull, boring and lifeless, your reader will surely bail out. You must keep their interest right from the start. Try using short paragraphs, sentences and words. This will keep the tempo upbeat and make your article much easier to read (or skim, as most do online).


Key #4: Create Several Sub-Headings


To better organize your thoughts, divide your article into sub-headings. At times, writing a 500-700 word article can seem insurmountable. Sub-headings make things easier. Instead of tackling the entire article at once, try writing one paragraph at a time.


Key #5: "Close The Sale" With A Resource Box


One of the worst things you can do is to leave your reader hanging. They've just read your great article, now what? A resource box appears at the very end of your article. It's the last line that your reader sees. It's your chance to set the next step.


Make sure that your resource box is compelling. Make the reader jump over to your website, sign up for your ezine, download your latest ebook, claim their free gift -- get creative!


Also, the resource box allows you to include a direct link back to your website. Sit back and watch your link popularity soar!


Key #6: Poofread Your Work


Did you catch that? It should have read, "Proofread". Nothing screams amateur, newbie, or just plain unprofessional like an article filled with typos and bad grammar. If you don't have an English major in the family, hire a proofreader. Their rates are reasonable, and they can save you from damaging your reputation online.


Key #7 - Promote, promote, promote


Let's make one thing clear. Without key #7, keys 1-6 are a waste of time. Bottom line, no one will read your article if you keep it stored on a floppy disk in your desk.


You've got to promote it!


How can you promote your article? There are close to 90 article directories, article banks, free content sites, article announcement lists, and article syndication services out there. And the best part is ... 95% of them are FREE!


Plus, there are thousands of ezines, websites and blogs that are looking for article submissions. Develop a list of contacts in your target market. Next, e-mail each ezine publisher, blog owner, or webmaster a copy of your latest article. There are countless stories of individuals who have succeeded using this exact same approach.


Wouldn't you like to get your next article picked up by an ezine with 32,000 subscribers or have 8,000 website visitors view your article tomorrow?


Now go on, and get writing!


About the Author:


Jason Potash makes it easy to create and blast out your articles all over the Internet using his ingenious new ArticleAnnouncer Article Marketing System.


Friday, June 24, 2005

 

Get Listed In Google By Making An XML SiteMap and Without Spending A Dime

Copyright © 2005 Richard D. Bailey
Client By Design


If you have been unsuccessfully trying to get listed in Google or just hitting roadblocks when trying to get more of your pages listed in Google, then you need to read this short article. I am about to reveal a simple SEO secret that can save you a lot of time, money and effort.


Google has a preferred search submission format that it actually asks webmasters to use: it's called a Google SiteMap.


Admittedly, creating and using XML is no easy task for anyone who is non-technical or inexperienced with web coding. However, there are a couple of sites on the web that can help you create an XML sitemap and then submit it to Google, so that this venerable search engine can crawl your previously unknown web site and get you listed.


Of course, there are no guarantees that your site will get high ranks or that it will meet Google's guidelines for inclusion, so be sure that your site is properly optimized and meets their guidelines before using these tools.


Before I reveal these tools and show you where to go to find out how to use them, let's take a look at the basics.


XML (Extensible Markup Language) is a special document format created to allow communication between applications and also between organizations. XML is a practical system that structurally defines the format and composition of intricate documents and data such as invoices, news feeds, inventory reports, catalog listings and other complex documents. A seasoned programmer who understands XML can easily create applications that pull data from XML sources and then format it for presentation to end users.


In the case of Google, this same XML data format can be used to define your site's pages and their position in relation to each other. So for example, your "about_us.html" page is usually connected only one click away from your "index.html" page. When used in this manner to define pages and their positions we are creating what is commonly known as a sitemap.


Google says, in its own words, "Google Sitemaps is an easy way for you to help improve your coverage in the Google index. It's a collaborative crawling system that enables you to communicate directly with Google to keep us informed of all your web pages, and when you make changes to these pages."


So in essence, Google is asking us to help them index the web by using this simple technique that will no doubt become a major help to struggling webmasters everywhere.


Google, by the way, will accept simple text file based sitemaps. Please consult their site for more information.
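For illustration, a minimal sitemap of this kind can be assembled with Python's standard library alone. This is only a sketch: the page URLs are hypothetical, and the namespace is the one Google documented for the Sitemaps beta, so confirm the current value against Google's own documentation before relying on it.

```python
# Sketch: generate a minimal Google sitemap using only the standard library.
# The namespace below is the one documented for the 2005 Sitemaps beta;
# check Google's documentation for the current value.
import xml.etree.ElementTree as ET

NS = "http://www.google.com/schemas/sitemap/0.84"

def build_sitemap(pages):
    """Return a sitemap XML string listing each page URL in a <loc> tag."""
    urlset = ET.Element("urlset", xmlns=NS)
    for page in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "http://www.mywebsiteabc.com/index.html",     # hypothetical site
    "http://www.mywebsiteabc.com/about_us.html",
])
```

Save the output as sitemap.xml in your site's main directory and it is ready to submit.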


How to get your sitemap indexed.


Once your sitemap has been created and uploaded to the main directory of your web site, simply use this URL to submit it:


www.google.com/webmasters/sitemaps/ping?sitemap=sitemap_url


Just replace the parameter, "sitemap_url" with the actual URL of your sitemap. Example:


www.google.com/webmasters/sitemaps/ping?sitemap=http://www.mywebsiteabc.com/sitemap.xml
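One caveat: the sitemap address should be URL-encoded before it is appended as the parameter, or characters like "://" can break the request. A minimal sketch of assembling the ping URL with Python's standard library, using the hypothetical site from the example:

```python
# Sketch: build the Google Sitemaps ping URL shown above.
# urlencode() takes care of URL-encoding the sitemap address.
from urllib.parse import urlencode

def ping_url(sitemap_url):
    """Return the Google Sitemaps ping URL for a given sitemap location."""
    return ("http://www.google.com/webmasters/sitemaps/ping?"
            + urlencode({"sitemap": sitemap_url}))

url = ping_url("http://www.mywebsiteabc.com/sitemap.xml")
# Fetching this URL with any HTTP client (e.g. urllib.request.urlopen)
# performs the actual submission; only the URL construction is shown here.
```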


You can also open a Google account before submitting to make sure that you can actually track your submission to check your sitemap status.


https://www.google.com/accounts/NewAccount


I promised to reveal the tools used to facilitate the creation of XML sitemaps and here they are...


The Tools Revealed:


SiteMapspal:


Use this Google recommended online tool to generate a Google-friendly XML sitemap that you can simply cut and paste and then upload to your site. Simply provide your site URL and select a few optional settings, and with one-click ease you will have a sitemap ready to go.


http://www.sitemapspal.com/


Google SiteMap Generator:


Provided by Google themselves, this is not for the faint of heart: it requires some knowledge of working with Python scripts (a scripting language) and requires installation on your site.


https://www.google.com/webmasters/sitemaps/docs/en/sitemap-generator.html


SiteMap Validator:


Use this Google recommended tool to validate your sitemap for accuracy.


http://www.smart-it-consulting.com/internet/google/submit-validate-sitemap/


Certified Guerilla Marketing Coach and speaker, Richard Bailey is an Internet Marketing consultant who develops methods and technology to attract customers. Serving large and small business from Real Estate Marketing to the Chemical Industry. Contact Richard by visiting http://www.ClientByDesign.com or call 914-206-9625


Thursday, June 23, 2005

 

Of Spam and Sandboxes

About a month ago I had the privilege of giving a demo of the next version of Sonic Page Blaster to the attendees at Yanik Silver's "Underground Online Marketing Seminar". I fielded a couple questions afterwards that bear a better treatment than I could manage on the spur of the moment and in less than 60 seconds.


Q: If we create "feeder" sites that point to our main sales page or "money site", won't they be adversely affected by Google's sandbox?


A: Let's first define what we mean by the "Google sandbox". Over the last seven months or so it has become apparent that new web sites do not tend to show as high a page rank as older established sites. The reason for this is not a raw prejudice against new sites.


According to my sources, it is instead an attempt by the search engine giant to discount the effect of reciprocal linking, especially paid linking. If links cost you money and they have no immediate effect, chances are most people will abandon the practice.


And that's exactly what Google is hoping for. Frankly, I understand and support this move. The reason is that Google's motives and mine coincide. Google is trying to make sure they return the most relevant and highest quality results available for a given search term.


If I have the most relevant and authoritative web site for a given subject that encompasses those same search terms, I want Google to return my results at the top of the heap. I don't want spammy link farms to change this.


The key is quality. Really, over time, the best search engine marketing strategy is to create a killer web site. Wouldn't it be nice if all the energy we direct towards search engine optimization could instead be focused on the quality of our web site?


Google feels the same way, believe me, and the refinements they make to their algorithms are designed to move in that direction. For that reason alone, the quality and focus of your web site is your best long-term SEO play.


Q. Could automatically generated web pages be penalized as "spam" by search engine spiders?


A. I call Sonic Page Blaster "SPB" a lot, and I definitely don't think of the "S" as standing for "spam". On the contrary, Sonic Page Blaster simply saves you time in creating search engine-friendly web pages that contain really good articles that pertain exactly to the content of your web site.


No "automatic" content system can find the content that best fits your niche. You need to either write or find the articles that will help your web site visitors or subscribers the most.


I know that a few of the seminar attendees I talked to had spam-filled stars in their eyes when they saw SPB churn out a bunch of pages at the push of a button.


Trust me, you don't want to go there. Google will eventually punish you in a big way.


Here are some rules that I believe will not only help your search rankings, but also drive the right kind of traffic to your primary web site (at the seminar Jeff Johnson called these "money pages").


1. Do not post duplicate content at multiple web sites, especially if you own them all, if they are on the same server, and if they link to each other. SPB makes it so easy to generate article mini-sites, why would you want to duplicate content, anyway?


With SPB you have a huge advantage over those who have to manually create web pages. Use your advantage. Create many web sites that focus on narrow subject matters, each having their own set of articles.


Worried about duplicate content and potential search engine punishment? Good. You should be. Don't do it.


Ah, but what about duplicate content on other people's web sites? If they don't link to you, you don't have anything to worry about. I'll save a further explanation about that for later, but I don't believe it makes sense for Google to punish you for something that is not giving you any advantage. Besides, they understand content syndication. Google's developers and designers are anything but stupid.


2. Your money site does not necessarily need to be extremely narrowly focused on a few key words, but your feeder sites should be. For example, I will soon be starting a web site for those folks trying to develop an online business in their spare time. That is, they hold down a regular job and do this stuff at night.


The site is called MidnightMarketer.com and it is not live yet (but the sign-up page works). Anyway, that will be one of my "money sites". It will cover a plethora of topics related to internet marketing, time management, technology, and even health.


In order to "feed" it potential customers, I am also developing "feeder" sites that will focus on each of those more focused topics. The feeder sites will contain as many highly focused articles on their subject matter as I can find. My goal is that the search engines will (rightly) see them as quite valuable and relevant results for some important search terms. Then visitors will see the links and ads for MidnightMarketer and head on over. I can even make a little money off those that don't click through to MidnightMarketer.com, thanks to Adsense ads mixed into each page by Sonic Page Blaster.


3. Don't use reciprocal links, especially between your feeder sites and your money page. Yes, I know that flies in the face of conventional wisdom. But try to understand Google's motivations--that is the key to predicting what they will eventually do. They understand that one-way links are usually more meaningful than reciprocal links, which are often just trades between webmasters. A one-way link usually points to something useful.


OK, I'll back off on this just a little: When you can, get one-way links. When there is no other choice, reciprocate. And yes, you can be sure Google keeps track of all links into and out of a web site.


4. Do use a blog, hopefully even more than one. Blogs don't have to be on your server(s), they're not owned by you, and it is going to drive Google's software gurus nuts trying to sort the wheat from the chaff in the blogging world. Even though I support Google in most things, it is kinda fun to do something that makes them a little crazy. [I mean that in a good way, Sergey.]


== Rossaroni, no baloney ==


The MidnightMarketer


Copyright 2005 Ross Lambert


About the Author:


Ross Lambert is a senior software engineer for a fast-growing telecommunications firm in Kirkland, WA. He is also the founder of MidnightMarketer.com and TheVentureForge.com.


 

Your Search Engine Optimization Strategy: Make Love, Not War

When it comes to search engine optimization strategy, there are basically two camps - those who view search engines as adversaries to be conquered at any cost and those who regard search engines as partners in their online marketing efforts.


Long-time readers of my articles probably already have a good idea of which camp I fall into; however, I believe both approaches can be effective optimization methods.


Adversarial Optimization Methods


Service providers who have this "adversarial" philosophy will tell their prospects that the formulation of a search engine optimization strategy is much like a high-stakes game of chess. It's an "us vs. them," "winner-take-all," and "every man for himself" mentality.


It's also rooted largely in technology - under this philosophy, success is defined as unraveling the latest search engine algorithm to find new optimization methods and exploiting its technical aspects for immediate benefit.


The underlying premise of this search engine optimization strategy is that you must use optimization methods that trick the search engines into showing a website predominantly in the results since the site isn't currently offering attributes that the search engines consider valuable.


The primary benefits of this approach are that it doesn't require much work on the part of the client and that results can be realized more rapidly. These qualities both stem from the fact that there isn't a large amount of additional content needed, nor are there many wholesale changes to make to the website when using such optimization methods.


While this is not the methodology that I recommend, it is a valid - albeit potentially volatile - search engine optimization strategy.


Partnership Optimization Methods


Those who view search engines as partners have a very different search engine optimization strategy. These service providers embrace the idea that the attributes and optimization methods that give a website high rankings in search engines are, by and large, the same ones that make the site more valuable to website visitors and potential customers.


This theory makes sense. Every search engine needs to return results that their users find to be the most relevant and useful. If search engine R&D people operated in a vacuum, they would probably find their market share rapidly diminished while they lamented about how "people are stupid".


This means that each of the major search engines spends endless research dollars to determine exactly what it is that search engine users find valuable, and each has a high stake in the results of the research.


No search engine marketing or web design firm has the resources or motivation to conduct studies of this magnitude. It is, therefore, highly advantageous to use the findings of these studies, deduced from common algorithm traits of multiple search engines, to improve your search engine optimization strategy and website.


I consistently hear from companies who are puzzled as to why their expensive, cutting-edge website is perpetually outranked by a site of perceived inferior quality - "our website is better than theirs" or "we are a much bigger company" are common remarks.


Beauty is, as always, in the eye of the beholder. The sites that consistently rank highly are almost always using optimization methods that offer something of value to people who entered the search query.


Search engines care about the size of a company or how much it spent on its website about as much as they care about what you had for breakfast this morning (I had blueberry muffins, but Google hasn't called to ask).


The advantages to the "partnership" search engine optimization strategy are numerous. Rather than chase the ever-changing technical attributes that can get you short-term results, you instead use optimization methods that leverage your company's knowledge of your industry to create something useful for the searcher.


You can improve your website and offer the information and products that prospects are seeking, even if those prospects are in the earliest stages of the buying cycle. In general, you will not have to watch your rankings swing wildly based upon new spam filters and algorithm shifts, and thus will enjoy a higher level of predictability when it comes to your website (although with search engines, there are never any guarantees).


Since you aren't constantly forced to re-address your site's search engine optimization methods, you'll have more time to focus on other online marketing areas that need attention, such as the website's conversion rate, an e-newsletter, or online PR.


Conclusion


It's a fact that websites rise and fall in the rankings all the time. The only real constant is that the sites of TRUE value, the ones that offer something relevant and important to the searcher, are almost always near the top - even after the latest algorithm shift has sent the "adversarial" crowd into a frenzy of activity as they attempt to reformulate their search engine optimization strategy.


While it may take a little extra effort, I like to think of the relationship with search engines as a "partnership" in a real sense. We use optimization methods that apply the attributes search engines have deemed to be valuable to a website, which improves both the website and the website's search engine rankings.


The search engines, in turn, send highly-targeted visitors who have shown an interest in your industry, products, or services. Sure, it may seem that we get more out of the deal, but the engines don't complain. They haven't even acknowledged our partnership.


About the Author


Scott Buresh is managing partner of Medium Blue Search Engine Marketing. His articles have appeared in numerous publications, including ZDNet, WebProNews, MarketingProfs, DarwinMag, SiteProNews, SEO Today, ISEDB.com, and Search Engine Guide. He was also a contributor to the recently released Building Your Business with Google For Dummies (Wiley, 2004). Medium Blue is an Atlanta search engine optimization company with local and national clients, including Georgia-Pacific, DuPont, and Boston Scientific. To receive Scott's monthly articles, sign up for Medium Blue's e-newsletter, Out of the Blue, at www.mediumblue.com.


 

Google Does RSS or How You Can Benefit From Google's New Sitemaps

By Titus Hoskins (c) 2005


Has Google finally embraced RSS with their new XML powered Sitemaps program? Well, sort of, but it seems more like a hug than a strong impassioned embrace!


It does use XML technology, which allows for the crawling and updating of your site's web pages. You can even include your entire web site (all URLs) in this indexing program. For anyone targeting the search engines, especially Google, this program (still in beta) is a MUST HAVE.


If you require timely updating of your most popular pages Google's new Sitemaps may prove indispensable. It's a little premature to assess the importance or impact of Google's new program but anyone wanting to give their site a competitive edge should be gearing up.


How it works:


There are several ways to set up an XML Sitemap; perhaps the easiest is to use the open-source Generator that you can download from Google. This is a Python file that you upload to your webserver, and the generator will create a sitemap from your 'URL lists, webserver directories, or your access logs'.


It would probably be wise to check with your hosting provider to see if they can accommodate this Generator on your webserver. If you have a small site there should be no problem, but if your site runs into the thousands of URLs or pages, check to see how much bandwidth such a system will take up. It's better to be safe than sorry!


Once done, you then have to submit your newly generated XML sitemap to Google - the search engine will use this XML sitemap to update and index your site whenever you make changes. You will need to have a Google account.


You may also submit text files containing URLs from your web site to be included in Google Sitemaps but these text files will have or will be given low priority for the time being.


To get started on your own Google Sitemaps Account you can click here: https://www.google.com/webmasters/sitemaps/login


What's great about it:


Besides seeing Google finally grab the RSS wildcard, the program gives you better control of how and when the search engines update your web site pages. Perhaps most important for Internet Marketers, you can now assign the importance given to any of your particular pages.


As most marketers know, certain pages on your web site are more important than others; these pages earn money, build your contact list, or direct your site's visitors in the right direction. In other words, you can now place more emphasis on your web site's 'bread and butter' pages. A BIG Plus!


With Google Sitemaps you can decide the importance placed on these pages by using the priority XML tag. This rating system is relative - it only relates to the pages on your own site.


Likewise, you can also indicate how frequently your pages change by using the changefreq XML tag - more or less instructing Google when your page will be updated or changed.
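As an illustration, both tags can be attached to a single url entry with Python's standard library. The page URL and values below are hypothetical, chosen for a 'bread and butter' page; only the tag names (loc, priority, changefreq) come from the Sitemaps format itself.

```python
# Sketch: a single sitemap <url> entry carrying the two tags described
# above. The page and values are illustrative, not a recommendation.
import xml.etree.ElementTree as ET

url = ET.Element("url")
ET.SubElement(url, "loc").text = "http://www.mywebsiteabc.com/order.html"
ET.SubElement(url, "priority").text = "1.0"      # most important page on *this* site
ET.SubElement(url, "changefreq").text = "daily"  # the page changes daily

entry = ET.tostring(url, encoding="unicode")
```

Remember that priority is relative: "1.0" only marks the page as the most important one on your own site, not on the web at large.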


This is a win-win situation for everyone - Google gets the freshest content for its users and you gain more control of the frequency of the updates done with your site or web pages. This may have a direct influence on the profitability of your web site.


For those who are actively marketing through the search engines and keywords, Santa may have come a little early this year. Of course, the jury will be out for a while, but Google Sitemaps will probably have a positive impact on your bottom line.


What it means for Google:


For those of us who have been following and watching the RSS wildcard for the past couple of years - it takes away some of the frustration and a little of the puzzlement from Google's seemingly total disregard of RSS.


RSS is not a fad, it is not a trend and it's not going away. Instead, its importance is growing. It is fast becoming 'the' way data is moved on the web.


One could even speculate that in the very near future all web pages will have an RSS component - perhaps a hybrid of 'XML/HTML' or an embedded XML code that will work with all browsers, search engines and servers.


For Google to ignore the growing importance of RSS, blogging, podcasting, broadcatching, the RSS featured Firefox browser, MyYahoo, not to mention all those orange XML logos popping up on most of the major sites on the web - is beyond comprehension.


Why Google does not have an RSS search on its main search engine page still seems baffling. Bringing out a homepage and not including an RSS feature is just foolhardy (they may introduce this feature later).


For those firmly in the RSS corner, Google's continued disregard for RSS became more than a little frustrating to observe. It was downright rude! Perhaps Google was waiting to incorporate RSS in a program like this new XML sitemaps?


Can this mean that Google has finally accepted the importance of RSS and they're starting to make amends? More importantly, could there still be a few more RSS goodies in the Google Jar left to be announced?


One can only speculate but when it comes to RSS and Google, let's just hope this is the start of a beautiful friendship.


To add RSS to your site within minutes, download this simple RSS Report and Guide. Copyright © 2005 Titus Hoskins of BWMagic's Free Marketing Tools & Guides. This article may be freely distributed if this resource box stays attached.


 

Great Ways To Obtain Link Popularity

I hate reciprocal linking.


It is such a hassle. Even specialized reciprocal linking sites don't really help all that much. Many of the link requests that I do get are from poker sites that are totally irrelevant to my site's content.


Most of the sites that I would like to link to don't respond. Some sites supposedly do respond, but, when I search their sites, I can't find my link anywhere. It is just so totally and completely frustrating.


Link popularity is a major factor in search engine rankings. How does someone like me, who despises reciprocal linking, go about improving my site's link popularity?


Fortunately for me, I have discovered two ways to do exactly that. They are blogging and submitting articles.


I added the Work At Home Ideas Blog to my website two weeks ago. Search engines love content and blogs are about the easiest way to add fresh content to your site.


But, how can a blog improve your site's link popularity? Most blogs feature their own RSS or XML feed. If a visitor to your blog likes your content, then he just might decide to syndicate your content on his website.
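For reference, most blog platforms advertise their feed with an autodiscovery tag in the page head, something like this (the title and URL here are placeholders):

```html
<link rel="alternate" type="application/rss+xml"
      title="Work At Home Ideas Blog" href="http://www.example.com/blog/rss.xml">
```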


In the past two weeks, my blog's XML feed has been hit more than 150 times. I expect to get at least some new links out of that.


Writing articles and submitting them to article directories is another great way to get more links pointing back to your site. You can place your own "resource box" (similar to the one at the bottom of this article) at the bottom of your articles.


Your resource box will contain a link back to your website. The article directory will publish your article and, BAM, you have an instant link back to your website. Submitting articles is a great way to get your site's popularity going like gangbusters.


Here are some outstanding article directories that you should be submitting to:
IdeaMarketers
EzineArticles
GoArticles
ISnare
Internet Home Business Articles


Here is a little tip for your article submissions. Many article directories allow you to enter HTML code in your articles. This gives you the ability to control the anchor text of your site's link in the resource box of your article. The anchor text of your link serves as keywords for the search engines.
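For example, a resource-box link with keyword-rich anchor text might look like this (the domain and phrase are placeholders):

```html
<a href="http://www.example.com/">work at home ideas</a>
```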


Use a keyword research tool, such as Good Keywords, to vary your anchor text from week to week. By doing this, you will rapidly build up a number of keywords for your site.


By using your blog's RSS or XML feed and submitting articles, you can increase the link popularity of your website without all of the hassles of reciprocal linking.


About the Author:


Ronald Gibson is a Web Designer and Web Marketer. He is the Webmaster of AffiliateUtopia.com, which offers information about some of the best money making opportunities on the Web. For more information, visit: http://www.affiliateutopia.com/


 

How to Improve Your Website's Search Engine Rankings

One of your main goals when building a site is to receive a lot of traffic so that you can make some reasonable money out of your website. One of the best ways of getting more traffic to your site is to do some SEO (search engine optimization).


There are many things involved when doing SEO for your website. Some of these include the following:


Designing your website


This is a very important thing to consider when doing SEO. The better your site is optimised for the major search engines, the better your chance of getting an improved ranking for your chosen keywords. To optimise your site you will need to use meta tags, which contain your chosen keywords. You will also need to design your site so that it is not just full of images.


You will need to add some good quality content to your web pages in the form of text. When writing text/information for your web pages you must consider using your chosen keywords throughout the content so that it has a good keyword density for your chosen keywords.


It would probably be best to aim for a keyword density of around 7% - 9%. If your keyword density is any higher than this, the search engines may penalize you for keyword spamming.


The navigation of your site is also very important, as when people arrive at your site you will want them to be able to navigate easily through your site's content. If they can't do this, they may leave your site and go to another one, which could even be one of your competitors'.


Link Popularity


Link popularity is very important in improving your website's rankings. The more sites that link to your site, the more important your site looks to the major search engines. But when getting sites to link to yours, it is best to make sure that your link is on a page about the same topic as your site. Doing it this way will benefit your site much more than having your links on pages whose topics have nothing to do with your site's.


So you are probably now wondering how you can get sites linking back to your site. Well there are many ways of doing this. Some of these include the following:


1. Doing link exchanges


Doing link exchanges with other sites is a good way to increase your site's link popularity.


2. Writing reprint articles and submitting them to article directories


Writing reprint articles that are on the same topic as your site and then submitting them to free reprint article submission sites is one of the best ways for you to increase your link popularity and your visitor numbers.


This is because you are writing an article about your site's topic and then placing your site's link in the author's resource box, which means that your link will be on the same page as information about the same topic as your site.


Placing your site's link in your articles also means that it is a one-way link, which is an advantage over link exchanges.


Some article directories have so many articles that they also include an article archive, so that your article can stay in view of the major search engines for some time and so that readers can navigate the articles more quickly.


For example, the article directory at http://www.simplysearch4it.com/article/articledir.php has an article archive at http://articles.simplysearch4it.com , which lists all of the articles in the SimplySearch4it! database in an easy to navigate and read format.


3. Using forums and including your site's link in your signature


Using some forums will increase your site's link popularity, but this is not as effective as some other ways of getting links to your site. This is because some forums use a redirect so that the search engines can't see your link in the forum.


4. Adding your site's link to free-to-submit general and specialty web directories


This is a slow but good way of increasing your site's link popularity and visitor numbers. Many web directories have good rankings within the major search engines, which means that your site will also benefit from being listed in them.


5. Writing press releases


This is a very good way to increase your site's traffic and link popularity. As long as you can write a good press release, you should see a good increase in your website's traffic and link popularity.


 About the Author: Jonathan White has been involved in online marketing for over three years now and is the Webmaster of http://www.simplysearch4it.com where he also operates a large free to play online games directory at http://games.simplysearch4it.com


 

Writing Web Page Titles to Enhance Your Site Exposure

Pay attention to the titles of your web pages. They are more important than most web authors realize. If you understand why the title is so important, you can write web page titles that will enhance your site exposure and will attract more visitors to your web site.


1. Web Page Titles


In your web page HTML code, the page title is the text that is enclosed by the opening and closing TITLE tags.


The title should be placed between the HEAD tags, ideally just after the beginning HEAD tag and before the first META statement. It should be one of the first things that a search engine spider sees when crawling your site.


2. Why are Titles Important?


The title is important for a variety of reasons.


- Most browsers will display your page with the title at the top of the browser window.


- If someone bookmarks your page in their browser, their bookmark list will show your page using the title. So if the title of your web page is "Home Page", as many are, your visitors' bookmark lists will contain a lot of "Home Page" listings. How will they determine which "Home Page" is yours two months from now?


- Google and other search engines present the results of a search by displaying page titles as links in the first line of each query result. Search engines don't like to display "Home Page" as the best they can do for a user searching for "purple people eaters," for example.


- Most search engines will order the results of a search engine query based on the relevancy of your page to the keywords used for the query. One of the factors in determining this relevancy is how closely your title matches these keywords. If your small startup company makes purple people eaters, don't give your home page the title "Unknown Business, Inc." It's not relevant to the search. Of course, many people will consider "CNN" or "Time, Inc." as relevant for the keyword "news." When you get to be well known, you can use your name as your title.


You can see that the title of your web page is highly visible to others, and it can impact the search engine ranking of your web page. It is therefore worthwhile to spend some time carefully writing each page title.


3. Practical Title Writing Tips


Here are some practical tips you can use for crafting an effective web page title.


- Start by thinking hard about how your potential visitors will search for your site. What keywords or keyword phrases will they use in a search engine query? Use one or two of the most important keyword phrases for your title. In our example, the home page title could be "Purple People Eaters."


- Don't just use the same title for all your web site pages. Your About page title could be "About Unknown Business, Inc., your Source for Purple People Eaters." Your Order page title could be "How to Order Purple People Eaters."


- Don't include your company name in the title unless it is a commonly recognized name or the page is about your company. Use the limited real estate in a title for relevant keywords. You can include your company name in the description META tag of your web page.


- Make sure the title does not exceed 66 characters. Google will not display more than 66 characters of a title in the search results page. Truncated titles irritate search engine users.


- Don't use more than 7-10 words in your title.


- Be careful when using some web page generators or editors. Many will either ignore the title or make up an ineffective title like "your title goes here." You may need to dig into the underlying HTML code to actually see your page title.
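Putting the tips above together, a head section might pair a keyword-focused title with a description tag that carries the company name (all names and wording here are illustrative, following the article's own example):

```html
<head>
  <title>Purple People Eaters</title>
  <meta name="description"
        content="Unknown Business, Inc. makes purple people eaters for every home and office.">
</head>
```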


If you understand what the title element of a web page is and why it is important, and follow these practical tips for writing your web page titles, your internet visibility will improve, your search engine ranking will rise, and you will get more visitors to your web site. Remember: it's all about your keywords.


About the author:


Kempton Smith is a web marketing specialist who helps internet businesses increase their site exposure and gain more visitors. He operates the Ad Buddies banner exchange network at http://adbuddies.com


Visit http://kemptonsmith.com for web versions of this and other articles by Kempton Smith.


 

Search Engine Optimized (SEO) Copy: The Down and Dirty Details

Today, being on the first page for your most popular keyword phrase is like having the most memorable prime time television commercial in 1973.


Essentially, that's where the power of advertising is going. It's all about Search. And Search is only going to become more important over the next ten years. If you can get on that coveted first page organically, well then, more power to you!


I know you have probably read other articles about writing SEO copy and how it relates to achieving high search rankings; there are plenty of them to go around. But some of the articles are complicated; some are too long; others are boring.


Still others don't explain that great SEO copy is rarely effective on its own - in order to get those high rankings (which is the goal, after all), you have to do other things too.


So, we decided that what most people really needed was a "down and dirty", easy to understand, ten step method. You ask, and we deliver. Here are the official ten steps in order:


1) Check your Competitors


Who are your competitors? Do you know? If you don't, you may want to go online and do a search for your product or service. Who is on the first two pages? That is your target. Those are the companies that you want to compete with. Because right now, they are getting your customers.


Take a look at their website. Notice the copy. Analyze their business. Are they successful? What are they not doing right? Look for the holes. You're going to meet the needs of their customers (that they are not fulfilling) so they become YOUR customers.


2) Research your Keyword Phrases


Remember, Keyword Phrase research is critically important. It can also be a little tricky. Do you know what keywords or keyword phrases your customers search for when they look for you? Are you sure? See what your competitors are using. You can do this by right clicking on their home page, selecting "View Source" and then checking their keyword meta tag.
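If you try this, the tag you are looking for in the page source is the keyword meta tag, which typically looks like this (the terms here are placeholders):

```html
<meta name="keywords" content="purple people eaters, custom monsters, novelty gifts">
```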


Next, go to Overture.com (now Yahoo!), click on the "Visit the Advertiser" section, and use their Keyword Selector Tool. It's great - and it's free. You can also use Wordtracker.com, but it costs $7.50/day.


3) Write Good Copy


Now it's time to start writing. Or re-writing. And if you can't write, you can hire a website copywriter to do it for you. The bottom line is to write about benefits, not features.


Don't tell them how great your company is. They will discover that for themselves when you overwhelm them with your service and deliver the perfect product that meets their every need.


Make the copy snappy and to the point. Make sure it has impact and asks the potential customer for their business.


4) Integrate Your Keyword Phrases


After you figure out which keyword phrases you want to use on your site, you need to integrate them into the copy of your site. Think Home Page and Services page as the most important pages to use them on.


Essentially, you want them to make up about 5% of the total words on the page. It's not that hard, actually. Just don't use them all over the place the way some careless copywriters do - you could get de-listed from the search engines.


Definitely use them in your headlines and sub headlines. That will get you extra points, so to speak.


5) Check your Links, Build if necessary


Do you have any inbound links pointing to your site? If not, no amount of awesome copy is going to get your site high rankings. Links are very important. And even more so for Google. Go to linkpopularity.com and see what you have. Also check your competitors. If they have more than you, you need to get some high quality links.


You can either do that yourself or hire someone to do it for you. It can be time consuming and expensive. But you gotta do it. It makes you look important to the search engines. A good way to do it is to make sure you are listed on directories, including DMOZ and industry related sites.


Go to incominglinks.com to see which ones you should be listed on. Then write articles and submit them to article submission sites. That could potentially give you hundreds of links for free.


6) Use Go Rank's Analytical Tools


GoRank.com is a fantastic free resource for SEO. They have a Keyword Density Analyzer, Link Popularity Analyzer, Top 10 Keyword Analyzer, Research, News and lots of other great stuff. It's a great tool and something I use every day. It will help you in your SEO copy efforts.


7) Submit to the Search Engines


After you've ensured that your website copy and all the other important SEO considerations have been completed, it's time to submit to the Search Engines. Do it manually.


Go to the search engine and do it yourself. Definitely don't pay someone to do it for you. You may have to resubmit a few times, but eventually, your site will get noticed.


8) Tweak the Copy as necessary, Add New Content


After your site has been up for a while, go back and take a second look. Ask friends and customers what they think of it. Can anything be improved? You'll be surprised what you may hear. If something isn't working, fix it. Make it sound better.


The other thing that you should also be doing is adding content regularly. Build free resources into your website. This will make your customers and potential customers happy, and it will make you appear to be more important in the eyes of the search engines.


Adding articles to your site is a great way to give your customers new, free content. Write about things that they would find interesting.


9) Partner with a Great SEO Firm


You may want to also consider partnering with a great Search Engine Optimization firm. There are probably a hundred very good ones out there. There are also 1000 bad ones.


For the good ones, check out topseos.com and marketingsherpa.com. They both list reputable firms. Whoever you go with, make sure they have a list of client success stories complete with stats to back it up.


10) Measure Your Success!


Finally, measure your success (or progress). Get a software program that will provide you with web metrics for your site. WebTrends and Statcounter are great ones. There are quite a few others as well.


As you can see, Search Engine Optimized copy is only part of the overall SEO story. There is a lot to it, so make sure you cover every area. After all, we want to get your web site to the very top!


© 2005 Jon Wuebben. Do you need Search Engine Optimized (SEO) Web site copy that moves customers to buy? Are you looking to create an effective newsletter/e-zine article or ad for your business? We provide world class copy that helps you to be found on the web. 10 years experience providing superior copy to businesses nationwide. Contact us for a complimentary Website Copy analysis. Subscribe to our Better Business Writing (BBW) Newsletter and receive 2 free reports. http://www.customcopywriting.com


Wednesday, June 22, 2005

 

Fresh Content Improves Search Engine Optimization

Copyright 2005 ArteWorks Business Class


Many search engine optimization companies will sell you a search engine optimization package that addresses many of the major aspects of search engine optimization.


These aspects include, but are not limited to, the use of file names, alt tags, h1 tags, keyphrase density, meta tag optimization, link analysis and the like. These are all key aspects of good search engine optimization.


However, the major search engines (especially Google) rank pages not only on relevant content (which is determined by the factors listed above, and more), but on fresh content as well.


What this means to you is that, even after your site has been "optimized to the max", your rankings will increase to a certain level and then not go much higher.


 To get to the top and stay there, your site should deliver fresh, relevant content on a regular basis. Depending upon the nature of your business, your competition, and targeted keyphrases, the rate at which you should add content to your site can vary from monthly to daily.


The delivery of fresh content to your site, in a form that is readable by search engines (i.e. not through the use of javascript, iframes, or the like) requires a dynamic, database driven content management system.


The most cost effective way to achieve this is through the use of a weblog that sits on your server and resides under your domain name. Updating the weblog with rich articles or commentary, broadcasting this information to the internet, and allowing users to post comments, achieves the following:


1) Increases the number of inbound links to your website


2) Increases the frequency at which major search engines will spider or crawl your site


3) Increases interactivity for the web user


4) Improves your search engine ranking


About the Author:


Matt Foster is the President and CEO of ArteWorks Business Class. Mr. Foster has been providing search optimization solutions since 1995. Please visit http://www.arteworks.biz for further information. You may also call toll free 877-336-8266


 

Valuable Content Equals Links

by Stephan Miller


Every day you want to wake up to a thousand more hits than the day before. It doesn't matter if you are selling something or are just writing a blog. You want traffic. Why have a website if you can't get visitors?


But it doesn't matter how valuable your content is. It doesn't matter if you created the greatest thing since sliced bread and you're selling it exclusively from your website. No one will find you if you don't reach out. And how do you reach out? With links.


In the brick and mortar world, a retail business depends on demand and location. If you put a store selling something that somebody wants in a high-traffic location, you will be successful. And sometimes it doesn't matter if your price is a little bit higher than everyone else's. Look at convenience stores that set their prices twice as high as any grocery store. Do they make a profit? You bet they do.


Now if we use this model for a website, location can be compared to your search engine rankings. The higher you rank for your chosen keywords, the more people come to your store. If you provide something that your visitors want - valuable content - they may buy something, or they may just remember your site and come back for more.


Getting people to link to your site is one way to increase your ranking with the search engines. But a lot of people go about this the wrong way: posting to free-for-all links pages, spamming blog comments, or worse. I had a blog that got about 10 comments a day from people who were just trying to get a link to their site. The site was a family photo blog. It was an easy target, but do you think that anyone looking at a picture of my family wanted to click on the Cialis website link in the comments?


Well, search engines don't think so either. Links like this are being downgraded in many search engines, including Google. It's a good thing. And this brings us back to valuable content.


Valuable content will convince people to link to you. Did you know that most blogging software can put a "blog this" link on Internet Explorer's toolbar? This button puts a link to your site right on their blog. And once your site gets into the blogosphere, there's no telling where it will stop. Maybe with top rankings for keywords you never optimized for, but that fit your site perfectly.


Did you know that there are site rating groups like stumbleupon.com that can send your site to thousands of people within a week? If people like your content they will pass it on to other users. You could have 500 people linking to you by the end of the week.


What about the "Link to Us" page? I know. I thought it was "old school" too. Until I tried it. It works. Just give your visitors the code they need to copy and paste to their site. Make it easier for them and they'll link to you.


Linking membership sites are great also, especially when you get hundreds of requests for link exchanges because of the caliber of your content.


Yes, things are changing with search engines. But if you have been providing the type of content that people look for, you have nothing to worry about.


Stephan Miller http://www.stephanmiller.com


 

Title Tags - How to Make Them More Effective

By Kempton Smith


Pay attention to the title tags of your web pages. They are more important than most web authors realize. Once you understand why the title is so important, you can easily write more effective title tags.


What is the title tag?


In your HTML code, the page title should be placed between the beginning and closing HEAD tags, ideally just after the beginning HEAD tag and before the first META tag.
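Following that placement advice, a minimal head section might look like this (the title and description text are illustrative):

```html
<head>
  <title>Purple People Eaters</title>
  <meta name="description" content="Custom purple people eaters for every budget.">
</head>
```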


Why is the title tag important?


The title tag is important for a variety of purposes.


- Most browsers display your page with the title at the top of the browser window.


- If someone bookmarks your page in their browser, their bookmark list will show your page using your page title. So if the title of your web page is "Home Page", as many are, your visitors' bookmark lists will contain a lot of "Home Page" listings. How will they determine which "Home Page" is yours two months from now?


- Google and other search engines present the results of a search by displaying page titles as links in the first line of each query result. Search engines don't like to display "Home Page" as the best they can do for a user searching for "purple people eaters," for example.


- Most search engines will order the results of a search engine query based on the relevancy of your page to the keywords used for the query. One of the factors in determining this relevancy is how closely your title matches these keywords. If your small startup company makes purple people eaters, don't give your home page the title "Unknown Business, Inc." It's not relevant to the search. Of course, many people will consider "CNN" or "Time, Inc." as relevant for the keyword "news." When you get to be well known, you can use your name as your title.


You can see that the title of your web page is highly visible to others, and it can impact the search engine ranking of your web page. It is therefore worthwhile to spend some time carefully writing each page title.


Practical tips for writing title tags


Here are some practical tips you can use for crafting an effective web page title.


- Start by thinking hard about how your potential visitors will search for your site. What keywords or keyword phrases will they use for a search engine query? Use one or two of the most important keyword phrases for your title. In our example, the home page title could be "Purple People Eaters."


- Don't just use the same title for all your web site pages. Your About page title could be "About Unknown Business, Inc., your Source for Purple People Eaters." Your Order page title could be "How to Order Purple People Eaters."


- Don't include your company name in the title unless it is a commonly recognized name or the page is about your company. Use the limited real estate in a title for relevant keywords. You can include your company name in the description META tag of your web page.


- Make sure the title does not exceed 66 characters. Google will not display more than 66 characters of a title in the search results page. Truncated titles irritate search engine users.


- Don't use more than 7-10 words in your title.


If you understand what the title tag is, why it is important, and follow these practical tips for writing your web page titles, your internet visibility will be improved, you will improve your search engine ranking, and you will get more visitors to your web site.


Copyright © 2005 by Kempton Smith, All rights reserved.


About the author:


Kempton Smith is a web specialist who helps internet businesses increase their site exposure and gain more visitors. He operates the Ad Buddies banner exchange network. Visit http://adbuddies.com


 

The Google Patent and SEO

by Stephan Miller


Google's patent application contains a lot to read, and getting through it may take some time, but if you own any type of website, this is information you need to know. It also brings up some interesting points. While I go over some of the important ones, know that no one knows which of these factors is given more weight than the others.


Domain Name Registration - Google is now going to track, among other things, when a domain was registered. An older domain will get a higher ranking. No more throwaway domain names. No more jumping to the top of Google results in thirty days.


They will also be tracking the length of renewal on the theory that a person that renews for ten years will be more likely to build a worthwhile site than someone who only holds their domain for a year.


Google will also be keeping a blacklist of known spammers and will use this list when checking the DNS records of websites. So spammers who register their new throwaway domains with different nameservers in order to throw Google off may have to try something new.


Google Spyware? - They are using "user behavior" to rank sites. In my book, if spyware removers try to remove Alexa every time I run it, then this function of the Google toolbar can only be called spyware. Yes, you may check the box on the terms of service for the toolbar, but it still tracks your internet browsing.


But, I think the theory will make search engine results much better.


Google will be tracking the number of times a document is selected from the search engine results. This is a great idea. It means you now have to write the titles of your pages to grab the searcher's attention. And since the search terms are highlighted in the results, placing them at the beginning of sentences on your page may make them stand out due to capitalization. But I also see a way that this can be spammed by a network of "search and click" spammers.


They will also be tracking the amount of time a person spends on the page that they find. I don't know about you, but I have been around long enough to notice a spam page and I am gone in two seconds. This may help drop them out of legitimate results.


Content Changes - I think this comes down to just updating your information the way it should be updated. If you have a forum that hasn't been active in a week, the one that is very active with new posts every minute will definitely rank higher.


But the document also mentions that some stale sites may not be ranked lower if not updated that much. For example, a site on the Civil War will not be expected to change as much as a news headlines site and an older, more stable site may get the rank boost.


Query Analysis - A search for "American Idol Winner" will produce different results than it did last year, even if a page on last year's winner has more links pointing to it.


Google will be following trends by the increase or decrease in the usage of certain search terms or phrases. I am not sure how this will be implemented. Will there be a quicker ranking algorithm for new trends? Or will sites that have a tendency to break new topics get top billing for such terms?


The search engine will also be sensitive to terms that could be used for different subjects. When you search for "Deep Throat" are you looking for Mark Felt or a Linda Lovelace movie? Google will track what searchers are actually looking for and changes in searching trends.


A Google Browser? - Google also says that they will attempt to track bookmarks and favorites files along with cache files to help determine the ranking of sites. The only way I see this happening is through their own browser and again, this brings up the question of spyware.


Topics - Pages will now be tracked for the topics they cover. Maybe this is what Site Flavored Search is all about. Google says that changes in topic will be tracked for scoring. So a drastic change in a site may drop it down in the search results. I think this must already be in effect, judging by some of the things I have seen with my own sites.


Anchor Text - Google says that links to pages from other sites tend to have differing anchor text if they are obtained naturally. Artificial linking campaigns tend to produce anchor text that is the same.


Anchor text that changes when the page the link is on changes will be counted as being more relevant.


Anchor text that changes with time may indicate a change in topic on the site.


Anchor text that is no longer relevant to the site linked to may be discounted.


Traffic - Google will track traffic to a page to determine if the content is stale or not. This is a cue that sites will no longer be create-and-forget. Google will also factor in advertising traffic.


Linking - Google says that legitimate sites attract links back slowly. Whether this is true or not depends on the definition of "slowly". I know of sites like stumbleupon.com, where users comment and rate sites constantly and one site sent into the mix can get hundreds of links to it within a day just from comments posted about it.


Google also says that exchanging links, purchasing links, or gaining links from documents where there is no editorial discretion are all forms of link spam. Does this mean that if you link to someone and they link to you, that is spam? Then a lot of bloggers out there who aren't really trying to spam may get accused of doing so.


They will also be measuring the authority of the page that the links are on, mentioning government documents specifically. This smacks of information control. Who assigns this authority, and what makes one person more of an authority than another? If a political issue is searched for, will a Democrat's or a Republican's page come up first?


The freshness of the page that the link is on will also help determine the freshness of the linked-to page. This is a good argument for using a blog and pinging after your entries.


A page that is updated while the link on that page remains the same is a good indicator of a relevant link.


Ranking History - Ranking change is another feature that Google will use to detect spam. Not that all sites will be flagged as spam sites if they see a huge jump in ranking. Some of these sites could be topical. The authors of the site may have caught onto a new trend just as it was rising.


But Google will also measure the change in a site's ranking to determine if the content is becoming stale, i.e. a drop in links to the site.


Now this must mean some sort of balance, and I hope they have leeway for traditional SEO. For example, if you have written new software and have created a PAD file for it, you can literally get hundreds of new links in a week. It only takes a second to submit.


What if you started your own affiliate program? You can get a lot of links quickly that way. Will Google see this as spam? We will have to wait and see.


Finally Hope - Competition always inspires a better product and more options for internet users. Despite the focus on Google in search engine forums and its name being used to mean "search for something on the internet" (i.e. "I Googled him"), Google's hold on the market has actually dropped.


Where once you could optimize for Google and leave it at that, now the combined use of MSN and Yahoo is greater than Google alone, with Yahoo nipping at Google's heels.


This leaves options for us as search engine marketers and internet searchers. If one search engine doesn't suit us, at least we know that it isn't the only one we have to choose from.


Stephan Miller http://www.stephanmiller.com


Monday, June 13, 2005

 

Monitor and Increase Your Search Engine Visibility with the DIY SEO Tools

Copyright 2005 Tinu AbayomiPaul


In this three-part article, you'll find many tools that any webmaster can use to monitor a site's search engine position and increase its visibility in major search engines like Google, Yahoo and MSN.


URL Trends


Most of the coverage I've seen focuses on the ability of UrlTrends to allow you to "View Any URLs Google PageRank, Alexa Rank, Popular Search Terms and Incoming Links".


And that's a great thing, to be able to see all of that from one place. But one great thing missed about this tool is that you can subscribe to changes in the results via RSS, giving you hands-off monitoring of your site.


Fagan Finder's URL Info


This online gadget is like the Swiss Army knife of site information, giving you one-page access to dozens of pertinent check-ups. But monitoring relevant search engine information like your backlinks, or the cached pages in a search engine, is just the tip of the iceberg.


You can use URL Info to check that your HTML code is validated, translate your page, and if you're a blogger, discover where your site is mentioned in the blogosphere.


Spannerworks' Spider Simulator


Ever wondered what your site looks like to the search engine spiders that crawl the web, looking for information to include in their databases? Go to this page to see what information is seen by the spider and what it skips over.


Spannerworks.com can also help you figure out how to troubleshoot content that seems like it should show up to a spider but doesn't, with its HTTP viewer. They also have a tool that will analyze your keyword density.
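To illustrate what a keyword density tool measures, here is a minimal Python sketch. The function name and sample page are my own inventions for illustration, not Spannerworks' actual implementation, which also weighs factors this toy version ignores:

```python
import re

def keyword_density(text, keyword):
    """Return the percentage of words in `text` that equal `keyword`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    matches = sum(1 for w in words if w == keyword.lower())
    return 100.0 * matches / len(words)

page = "Motorcycle parts and motorcycle accessories for every motorcycle."
print(keyword_density(page, "motorcycle"))  # 37.5 (3 of 8 words)
```

A real page would of course show a far lower figure; densities that high usually read as spam to both visitors and engines.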


GoRank.com's Top Ten Comparison


If you've been banging your head against the wall in an attempt to figure out why you haven't hit the top ten results in Google, GoRank.com has a page that can give you important clues to help you figure it out. One of my favorites, the Top Ten comparison report, will scrub the raw data of the top ranking results for a given keyword.


In studying the results, you may find it easier to understand where your own optimization efforts are going wrong. Don't forget to stop by Google for your API key, as you'll need it to create your free account.


Search Guild's Keyword Difficulty Checker


This one's an old favorite of mine. When you find what you may think is an ideal keyword, before you start tweaking your pages, it's a good idea to run it through this tester. Using the Google API, it analyzes whether or not a given phrase will be worth your efforts.


You'll already have to be well-versed in how to find good keywords to plug into the tool, but once you have that nailed, it's pretty reliable in telling you whether it's worth your time to target that phrase. If you use flash on your site, check out the flash viewer on their utilities page as well.


In the next article in this series, you can read more about tools specific to Yahoo and Google that will help you track your rankings and study your site.


About the Author:


You can find hundreds of pages of tips and news on traffic generation, search engines, blogs and RSS every day at www.freetraffictip.com


 

The Ultimate Free Google Ranking Tool

Copyright 2005 Torgeir Sunnarvik


For the first months my website was online, I was constantly checking the search engines to see if my site was listed under the keywords I was targeting, always with the same negative results.


The truth is that for the first months, the keywords you are targeting often won't show your site anywhere in the first 200 search listings. OK, if you try to get your site listed for hoooohjgaagga or something like that, you could get first place in no time. But who wants to target that keyword?


In fact, I get a lot more traffic from keywords that I haven't thought of using as a keyword in the first place. One tool that I have found online can easily show you the keywords that your site is ranking well for.


You can find it at: http://www.googlerankings.com/ultimate_seo_tool.php


Here you simply enter your domain name and hit the "Analyze Keyword" button. If you want to, you can change the different settings on this page before you hit the button, but I usually leave them at the defaults.


Then on the next page you can see a detailed report on the keywords found on your site. You can see their counts and density, broken down into two-word and three-word phrases.


Step 2: Create a Position Report


Now to the cool part of this tool. When you hit this button, the tool will search Google for the keywords and keyword phrases that it has gathered from your site. After a few seconds it shows your position for each of them. So now you don't have to wonder whether your site is listed in Google.
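The position check itself is simple once you have an ordered list of result URLs. Here is a hypothetical Python sketch of just that step (fetching the actual results from Google, which at the time meant the Google Web APIs, is left out; the function and domain names are illustrative):

```python
from urllib.parse import urlparse

def ranking_position(result_urls, your_domain):
    """Return the 1-based rank of the first result on your domain, or None."""
    for pos, url in enumerate(result_urls, start=1):
        host = urlparse(url).netloc.lower()
        if host == your_domain or host.endswith("." + your_domain):
            return pos
    return None

results = [
    "http://example.org/ebooks",
    "http://www.everypleasures.com/free-ebooks",
    "http://example.net/reviews",
]
print(ranking_position(results, "everypleasures.com"))  # 2
```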


But as I said, in the first months you shouldn't worry if you don't find your site under any keyword at all. Google takes a long time to index new websites. I have found that MSN is much faster at showing my pages under the keywords I want.


I hope you'll find this tool as useful as I have.


About the Author:


Torgeir Sunnarvik, Norway mailto:webmaster@everypleasures.com Torgeir Sunnarvik is the owner and webmaster of http://www.everypleasures.com/ His site offers free ebooks, ebooks with reprint rights, and reviews of business ebooks.


 

Why Your Website's Search Engine Rankings Are Important

Copyright © 2005 Matthew Rotterman
Keyword Text


I recently spoke to someone who operates a website that sells motorcycle parts and asked her how good her search engine ranking is. She told me that search engine ranking is not a concern for her because she operates a very niche market and that those who are actively seeking the parts that she sells "know where to look". She seems quite content with the number of visitors that she is getting to her site and the amount of sales that she is making each month.


I attempted to explain to her that she could dramatically improve her website ranking and thus her sales if she were to employ some keyword optimized articles, like those we create for our clients, to improve her search engine ranking results. She did not seem to care about this and did not want to spend any money on getting keyword articles to improve her search engine ranking because again, those that are seeking what she sells "know where to look."


If the above true story sounds like a conversation that you might have with a person like myself, then there are a number of things that you need to reconsider. Let's review my conversation mentioned above to show the importance of search engine rankings.


First of all, she is convinced that the people seeking her product can find her website. This may be true if the customer spends a great deal of time surfing through multiple sites and has a monumental amount of patience. It is true that word of mouth probably does send her the majority of her customer base and this is fine if she does not want any more customers. Her search engine ranking is so far down on the list that for all practical purposes, her website does not exist to those who type "motorcycle parts" into Google or Yahoo.


I encouraged her to think about the lack of patience that she has when searching for things on the Internet and how that transfers to those searching for her products. I am just as guilty as anyone in this: if I am searching for something on the Internet, I will likely stop reading the 400,000 results after about the first 20-30 websites. If the item I am looking for is not in those first 20-30 results, I will most often type in a different search phrase such as "motorcycle accessories" vs. "motorcycle parts". This concept was lost on her as well.


This is another area where keyword articles come into play to improve search engine rankings. What EXACTLY does a person type into the search engines when looking for her motorcycle parts? One person will type in "motorcycle parts" while another types in "motorcycle accessories" and a third person may type in "motorbike parts" and a fourth types in "motorbike accessories". A fifth person might even type in "motorcycle carburetors", with or without the "s" on the end of "carburetor." Basically, there needs to be an article for each of these five terms to help direct search engine traffic to her site so that no matter what the potential customer types into the search engine, they are directed to her website. This concept was lost on her too.


The third idea that she did not consider was the basic financial concept of return on investment (ROI). Return on investment simply means that if you spend $100 on marketing and as a result of this investment, you increase profits by $1,000, then you would have a 900% ROI. This percentage is calculated by subtracting the initial $100 spent on the marketing campaign, then dividing the remainder by the $100 invested, and finally multiplying your result by 100 to get your ROI value.


 ROI = [(Payback - Investment) / Investment] * 100
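In code, the calculation above is a one-liner. A minimal sketch using the article's $100/$1,000 example:

```python
def roi_percent(payback, investment):
    """ROI = ((payback - investment) / investment) * 100"""
    return (payback - investment) / investment * 100

# $100 of marketing spend that produces $1,000 in extra profit:
print(roi_percent(1000, 100))  # 900.0
```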


So, would you like to make $900 by spending $100? I would. The woman that I spoke to failed to realize the potential return on her investment, because she did not want to invest her cash on hand into the future of her business. Most successful website operators have realized the importance of marketing and search engine rankings. They know that they need to be listed in the top 20-30 results if they want to get the clicks, and they know that keyword optimized articles help them to achieve these high rankings.


Decide where you want your website to be located in the search engine rankings. Do you want to make "some" money like the woman in my true story? Or, do you want to dramatically exceed your current profit margins by increasing your website traffic? Just as it was with her, it is up to you to decide if you want to make ends meet with your online business or if you want to really make some money by investing in your marketing.


Matthew Rotterman is a writer and editor at: KeywordText.com. "Keyword Text" managed to bring together several writers and editors to provide a few low-cost writing services for those who are working hard to become more profitable. They offer Content Creation Services which include: Exclusive WebPage Content and Reprint Articles. Keyword Text also offers very attractive volume discounts. Compare us to our competition; you will be surprised.


 

Cosmetic Changes at Google Precede Larger Overhaul

By Jim Hedger, StepForth News Editor, StepForth Placement Inc. (c) 2005


Google is undergoing some of the most sweeping changes in its short, seven year history. As of next week, Google will have finished sorting what might be its largest algorithm shift ever as the final points of the 3.5 part Bourbon Update were installed last Monday. This update has been staggered into three and a half sections in order to avoid a massive amount of dislocation in established rankings as was seen in previous major updates. While changes stemming from the Bourbon Update have not actually manifested into a full reordering of Google's search engine results pages (SERPs), many individual webmasters have reported fairly significant losses or gains in ranking over the past few days.


There are dozens of factors behind changes at Google but the greatest is the enormous valuation of the company itself. With share prices nearing the $300 mark and current market capitalization topping $80 billion, Google is considered the most valuable media company in the world, surpassing the $78 billion value of Time-Warner and rising far above Yahoo's estimated value of $56 billion. Most of Google's riches are newly found, having been generated after their August 2004 IPO. In their race to outlast, outperform and outsmart their competitors, Google has changed its PR strategy and its appearance to suit the legions of suits swirling in and out of their Mountain View offices.


While money may move mountains, it takes a community to change an institution. The search environment has changed substantially over the past three years and in that time, every major player in the search sector has changed as well. Today, Google has become a lot more complicated, so much so that it has stopped trying to look simple. This change in corporate attitude is best reflected in two places, the homepage and the About Google section.


Google's homepage used to be quite simple. Recently, Google created a personalized portal interface google.com/ig offering users instant access to several of these new features. For folks with Google accounts such as Gmail users, subscribers to Google Groups, Google desktop users and other account holders, personalized versions of the once sparse homepage now present instant entry points to the various applications the individual uses. Many industry observers have suggested Google's adoption of so many new features and an all-in-one interface shows they are moving towards presenting themselves as more of a portal like Yahoo or MSN. Google has always been a bit different than its competition. Even when borrowing and innovating on competitors' ideas, Google has, until now at least, managed to keep itself at an arm's length from the mainstream in appearance and operation. The maintenance of that image gave Internet users an alternative view of Google, one that propelled Google to a position of almost total dominance of the search engine sector. While that dominance might have slipped over the past year, Google is still the most popular search appliance in the world.


One of the ways Google has acted differently than others is in the appearance of not taking itself too seriously. Its corporate ethics policy was limited to the three-word phrase, "Don't be evil". Its front page interface retains the playful "I'm Feeling Lucky" button, even though the button is rarely used. The prospectus issued during their August 2004 IPO was specifically written to appear idealistically anti-corporate. Since its introduction, Google has practiced projecting a simple, youthful image that required very little in the way of explanation, so long as their search engine lived up to users' expectations.


Google strives to live up to user expectations and, for the most part, has met and exceeded them time and time again. There is one long-held expectation that Google may not be able to live up to any longer though. Many of us assume Google's relatively informal public attitude will continue to carry over into the later part of the decade. It won't. By comparison, Google will almost certainly continue to be perceived as the search engine driven by youthful energy. Whenever competitors such as MSN or Yahoo try to appear as down-to-earth as Google does, their efforts seem obvious and forced. Does anyone remember that poor fellow in the butterfly suit wandering aimlessly around New York last year? Google's communication style is maturing and the best place to view these changes is on the About Google section of their site.


Google has published information about itself on pages found behind the "About Google" link for several years. While documents found in the About section have never been totally static, a facelift over the past few weeks has radically altered the look and feel of the section. Along with the traditional organic search engine results and highly targeted paid ads, Google is actually a series of thirty-odd search-based applications ranging from alerts and answers to wireless search and weather information. Driven in part by an inventive entrepreneurial spirit and in part by a desire to keep up with products offered by competitors, Google has been rapidly adding new features and tools to their core search service for the past three years.


Google's About Google page was once much smaller than it is today. It has grown slightly larger every time Google adds another offering to it. The biggest changes are found behind the increasing number of links on the About page. Today's version of the About page has five boxes added to the left hand side of the page advertising Google Desktop, Blogger, Google Code, Google Mobile, and My Search History. In the center column, Google continues to show four main site sections labeled, Our Search, For Site Owners, Our Company, and More Google. Collectively, those sections contain a larger number of links than they did previously and the number of documents found behind those links has grown as well. Serious Google users should take an hour or two to tour these changes and learn more about the staggering range of features, services and search-enhancements Google now offers.


For webmasters and SEOs, an examination of the new Google Webmaster Guidelines is a definite must. Google has recently updated its webmaster guidelines, which are also considered a primer on "ethical SEO" practices in relation to Google placements, to include information on "supplemental listings", crawling frequencies and prefetching. Google has also posted information on its new Google Sitemaps experiment.


Google Sitemaps is perhaps the most important new feature for SEOs offered by Google in a long time. Said to be an experiment in spidering, Google Sitemaps invites webmasters to feed site data directly to Google through an XML sitemap page. Webmasters and SEOs can now tell Google exactly which sections of their sites to crawl, and providing they are keeping their XML sitemap current, when and where to look for changes to their sites. This experimental initiative will especially help webmasters working with database driven sites or large Ecommerce sites where documents are subject to frequent change and are often found behind long-string URLs. Google has been kind enough to provide detailed information on establishing an XML feed and setting priorities for Googlebot.
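A sitemap feed is just a short XML document listing URLs with optional metadata. The Python sketch below builds a minimal one with the standard library; the URLs are placeholders, and the schema namespace shown should be verified against Google's own Sitemaps documentation before submitting a real feed:

```python
from xml.etree.ElementTree import Element, SubElement, tostring

# Namespace as used by the original Google Sitemaps program; verify against
# Google's current documentation before submitting a real feed.
NS = "http://www.google.com/schemas/sitemap/0.84"

def build_sitemap(pages):
    """pages: iterable of (url, lastmod, changefreq, priority) tuples."""
    urlset = Element("urlset", xmlns=NS)
    for loc, lastmod, changefreq, priority in pages:
        url = SubElement(urlset, "url")
        SubElement(url, "loc").text = loc
        SubElement(url, "lastmod").text = lastmod
        SubElement(url, "changefreq").text = changefreq
        SubElement(url, "priority").text = priority
    return tostring(urlset, encoding="unicode")

feed = build_sitemap([("http://www.example.com/", "2005-06-13", "daily", "0.8")])
print(feed)
```

For a database-driven site, the page list would naturally be generated from the database itself, which is exactly why the feature helps such sites most.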


As it grows, Google appears to be running into the same problem other webmasters with numerous sites or services encounter, the rapid dilution of a domain's unique topic focus. In order to keep themselves accessible, understandable and relevant, Google's teams of engineers, programmers and public relations specialists are involved in what appears to be a massive overhaul of the interface, public documents and the basic sorting algorithm that produces organic results. As in previous years, how this all plays out in the end is entirely up to the searching public. From the SEO/SEM perspective, it is a good thing Google is in the midst of this update. Web workers have been demanding a greater degree of transparency from Google for some time now and perhaps these updates are the beginning of a new commitment to communication from the Googleplex.


Jim Hedger is a writer, speaker and search engine marketing expert based in Victoria BC. Jim writes and edits full-time for StepForth. He has worked as an SEO for over 5 years and welcomes the opportunity to share his experience through interviews, articles and speaking engagements. He can be reached at: jimhedger@stepforth.com


Tuesday, May 31, 2005

 

Content Sites are Great Marketing Tools

By S. Housley


Highly targeted, focused sites that are related to specific market segments are highly advantageous and can often be created using existing web content. The key is to provide value.


Think of the time spent surfing the web gathering resources and information. By creating a topic-centric resource that compiles information, webmasters provide a real service. In many cases that value is simply having topic-specific information gathered in a single place.


These highly focused content sites can be great supplemental portals that are invaluable as a marketing tool for niche products. Niche portals help define expertise in a specific market segment, not to mention the added benefit of providing valuable topic-specific links.


The topic-centric portals also tend to achieve high search placement and will often provide advertisers high quality exposure, allowing webmasters the opportunity to capitalize on their efforts.


What to put in a topical portal?


ARTICLES


Many article writers allow webmasters to republish their articles.


Search the large article directories for quality topic-specific articles using keyword searches. The articles contained in these directories often allow publishers to freely reproduce the article's contents as long as the hyperlinks in the article and article resource box remain intact.


Resource for finding articles for publication: GoArticles - http://www.goarticles.com


RSS FEEDS


By nature, RSS feeds are designed for syndication. Most RSS feeds can be freely reproduced. Locate topic-specific feeds using keyword or category searches. The contents of the feeds can be used to populate web pages. There are a number of free scripts available that allow webmasters to display the contents of an RSS feed (see http://www.rss-specifications.com/displaying-rss-feeds.htm for instructions on displaying feeds).
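As a rough illustration of what those display scripts do, the following Python sketch turns the items of a well-formed RSS 2.0 feed into an HTML link list. This is a toy parser with invented names; the scripts linked above handle the many real-world feed variations:

```python
from xml.etree.ElementTree import fromstring

def feed_to_html(rss_xml):
    """Render each <item> of an RSS 2.0 feed as an HTML list item."""
    channel = fromstring(rss_xml).find("channel")
    items = []
    for item in channel.findall("item"):
        title = item.findtext("title", "untitled")
        link = item.findtext("link", "#")
        items.append('<li><a href="%s">%s</a></li>' % (link, title))
    return "<ul>\n%s\n</ul>" % "\n".join(items)

sample = """<rss version="2.0"><channel><title>Demo</title>
<item><title>First post</title><link>http://example.com/1</link></item>
</channel></rss>"""
print(feed_to_html(sample))
```

Dropping the generated markup into a page template is then all it takes to keep a portal page populated with fresh, topic-specific headlines.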


Resources for finding RSS feeds for syndication: RSS Network - http://www.rss-network.com RSS Locator - http://www.rss-locator.com


LINK DIRECTORY


While this takes a little more time, compiling a collection of niche websites on related topics can significantly enhance the value of a portal. Topic-specific directories and search engines can achieve high search engine rankings with the larger engines like Google and MSN, and can easily be optimized for a collection of search terms. The process can even be automated if you have programming experience.


Sample Link Directory - http://www.investing-partners.com


TOPIC-SPECIFIC FORUM


The most successful forums are those that are highly focused and niche-oriented. Establishing a community of individuals with common interests will result in return visitors. Managing a forum is not overly complex, and free forum scripts are available that provide the forum structure. Many forum packages also have scripts available that allow search engines to spider the contents and forum posts. As the content flourishes, the site will increase in value.


Free Forum Scripts: http://www.phpbb.com


More on topic-specific portals or information radars can be found in Robin Good's book 'Newsmastering'. http://www.masternewmedia.org/reports/newsmasterstoolkit/


A site that is focused on a relatively narrow range of goods and services will find that there is less competition. Topic-centric websites that provide a gateway to niche information related to a particular industry, sector, topic or market segment are becoming increasingly valuable and popular.


Compiling the resource using free materials will minimize the capital investment. Regardless of whether you are marketing a product, service or advertising, narrowing the topic focus will attract a targeted audience who are genuinely interested in the website's topic, allowing you to monetize the portal while minimizing the expense.


About the Author: Sharon Housley manages marketing for FeedForAll http://www.feedforall.com software for creating, editing, publishing RSS feeds and podcasts. In addition Sharon manages marketing for NotePage http://www.notepage.net a wireless text messaging software company.


 

The Business Case for SEO

Author: Herb Osher, ExclusiveConcepts.com


It's interesting how potential clients have preconceived notions about which aspects of search engine marketing have the most value. In fact, they tend to fall into two camps that are 180° apart.


The first camp believes completely in the value of pay-per-click marketing (PPC). It's easy to understand why. PPC provides immediate and measurable benefits. The ROI of PPC marketing is obvious. This group doesn't understand why it's necessary "to bother" doing SEO.


The second camp believes the only way to go is SEO. Clicks are free and the branding benefits of high rankings have been well documented.


The right answer is that they are both valuable. Each has its benefits and when you can afford to, you should implement both.


Pay-per-Click


PPC makes sense if you want immediate benefits and like the idea of paying for performance. SEO provides branding benefits and, over the longer term, will provide a compelling ROI. But unlike PPC, SEO revenue results aren't as directly measurable and manageable.


Pay per click (PPC) gives you the ability to have complete control over your search traffic. With PPC programs you select the keywords and write the listings. You control where you're listed and what the listing says. You decide what your budget is and can adjust your spend rate based on results or events (e.g. announcements, promotions).


By tracking results from a PPC campaign, you can build up a knowledge base with respect to your business, including which messages perform the best, which search terms have the best conversion rates, and what destination URL is best for specific users to land on. Over time, this knowledge can help you to improve and define your business.


One of the greatest attractions of PPC is the ability to easily track clicks and costs allowing you to understand your ROI from a specific marketing initiative. This gives you confidence to spend money and drive volume. You may have thought that spending $5,000 a month on a PPC campaign is way outside your budget, but once you measure the ROI, you may realize that it's well worth the investment.


Search Engine Optimization


So, if PPC is so great why bother with SEO? Basically, because you will be missing out on a large number of potential clicks. How large? A number of recent studies have demonstrated that there are still a lot of users that do not click on the "paid" listings but rather will search through the regular editorial search results.


The accompanying chart shows that 60% of the search users prefer (some exclusively) organic over paid listings. The only way to get optimized (high) rankings in these regular editorial results is through an effective SEO program. In most cases, once you have good positioning in the regular search results, you will continue to receive "free" traffic.


Again, based on data from a number of marketers, the increase in traffic due to SEO averaged 73%. Consider search engine optimization the same as you would word-of-mouth advertising or public relations. It's exposure that comes with a very high degree of credibility and trust. Traffic coming from traditional search listings tends to have high conversion rates.


There's another advantage to traditional search listings. They are considered unbiased and non-commercial. Traditional search performs very well at certain points in the buying process. When consumers are gathering information about a purchase, they show a marked preference for traditional search listings. When they are ready to buy online, they seem to have less bias against paid placement listings and their likelihood to click on one of these listings increases.


The Dollars and Cents of SEO


Perhaps the most compelling reason not to exclude SEO from your online marketing strategy comes down to dollars and cents. In an attempt to quantify the business case for SEO I have gone back and done some analysis on three recent SEO engagements and the results they achieved. 


I chose ecommerce clients that we had optimized and reviewed their average sales before and after SEO was implemented. In two of the situations the only change made was the optimization of the site. In another the optimization occurred at the same time we implemented a PPC campaign. In the first two cases the store sales rose 64% and 75% after the SEO was implemented.


In the third case the store revenue actually went up a staggering 169%, but if you back out the sales that resulted from the PPC campaign, the store revenue attributable to SEO improved by 49%. In other words, the average improvement in store revenue that was apparently due to SEO was roughly 63%.
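As a quick sanity check, the average of the three uplift figures can be computed in a few lines. This is a minimal sketch using only the three percentages quoted above:

```python
# Revenue uplift attributed to SEO in the three engagements (percent).
uplifts = [64, 75, 49]

# Simple average across the three cases.
average_uplift = sum(uplifts) / len(uplifts)

print(round(average_uplift, 1))  # prints 62.7
```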


Can we be sure that all of this was a result of SEO? No. There could have been product, seasonal and other effects that contributed. But I think it's safe to say that there was a significant increase that resulted directly from the SEO.


The bottom line: search optimization has a real and measurable impact on traffic, conversions and revenue (or lead generation). Given that these clicks begin to approach "free" once the cost of SEO is amortized over time, the ROI for SEO is compelling. Add to that the branding benefits, and no marketer or business owner should doubt the value of search engine optimization.


About the Author


Herb Osher is the Chief Operating Officer for Exclusive Concepts (http://www.exclusiveconcepts.com). Herb has over 20 years of operating and marketing experience and has developed a number of business cases for new ventures and technologies. Exclusive Concepts has over 8 years of experience in designing and implementing Internet marketing sites and strategies.


 

Optimizing Flash: Can it be Done?

Since its inception, Flash has been the design medium of choice for many companies. Most professionals recognize the superior visual effects that Flash has to offer. Unfortunately, Flash is also very hard to optimize.


Many SEO firms would rather tell you Flash can't be optimized than to try and optimize it. No, optimizing a Flash site is not simple by any means, but it is entirely possible.


The biggest problem with any Flash presentation is the absence of indexable text content. You can add META keyword and description tags, but many search engines, including Google, give these tags little or no weight.


Large companies shell out big bucks for Flash sites. They don't want to hear that their site can't be optimized because of the format. They love the effects of Flash, but they also need their content optimized for the search engines. Here are some things you can do to increase a site's ranking when dealing with a Flash format:


Ideally, you want to get in on the ground floor while a Flash site is being developed. Try to persuade the client to use Flash headers and keep the rest of the site HTML-based; this is the most cost-effective option. The other option is to create a duplicate HTML version of the site. If neither is possible, move on to the next step.


Add your META keyword and description tags to the opening page so they are available while the Flash loads. While some may feel this clutters a page, the results are hard to argue with. Follow standard SEO protocol: use keywords in your TITLE tag, and build your link popularity to boost your client in the search engines. Try to exchange links with sites that are PR 4 or higher, and never link to sites that carry more than 100 links on their link pages. Also, favor sites that allow you to use a description, and use keywords in that description. Stay away from sites that allow only banners or no description at all; those types of links do not help. Remember, links pointing to your site matter more than links going out, so concentrate on those first and foremost.


The next and final tip is controversial: the infamous page redirect. Some redirects work better than others. My understanding is that Google allows certain redirects.


Google states on its site that using a "301" (moved permanently) code in the HTTP headers is the recommended way to indicate that a site has moved. So, using this to redirect an HTML page to a Flash site should not be a problem.


In other words, create an HTML page and fill it with relevant content. Do not stuff it full of keywords. You will then need to create an .htaccess file. To learn more, visit http://www.freewebmasterhelp.com/tutorials/htaccess/ and follow the directions. You will need to upload the .htaccess file to the root directory where all your web pages are stored.


redirect 301 /old/old.htm http://www.you.com/new.htm


That is all you need to do. Save the file and upload it. Typically you would type in the old URL (old domain name) and it would take you to the new URL (new domain name). But in this case we will rename the old index file, create a new optimized HTML page, and give it the old index page's name. Google will index the new page, which is redirected to the Flash page. The directive breaks down like this:


redirect 301 (the instruction that the page has moved)


/old/old.htm (the original folder path and file name)


http://www.you.com/new.htm (new path and file name)


Basically, we are just forwarding one page directly to another, not to another domain name. This sounds more complicated than it actually is, and it is usually the best way to utilize a redirect. You could also use a JavaScript redirect.


<script type="text/javascript">location.replace("url");</script>


Never use the META refresh redirect. That technique has long been recognized as a spam tactic and can get a site blacklisted.


I cannot guarantee that these techniques will not get you banned, but from my research, the 301 redirect is the best bet. If your site is 100% Flash and you need to rank higher, this will get you results. These are your SEO options for clients with Flash sites.


Visit Joe's blog at http://mr-seo.blogspot.com/ to read more articles on SEO. You can also visit his other web sites www.jnb-design.com and www.mr-seo.com for more information on SEO and to try out his services.


 

Get the Most Leverage out of Your Articles - Compliments of Yahoo!

by Robin Nobles


I've long been an advocate of a form of online marketing that I personally call "article marketing." Yahoo! has recently added a layer to article marketing which is extremely exciting, and anyone who uses the power of articles should take notice.


Introducing Yahoo!'s Creative Commons Search in Beta http://search.yahoo.com/cc


Here's the explanation given at the site:


"This Yahoo! Search service finds content across the Web that has a Creative Commons (http://creativecommons.org/) license. While most stuff you find on the web has a full copyright, this search helps you find content published by authors who want you to share or reuse it, under certain conditions."


Obviously, if you're not a writer and are in need of content on your site, this is a great place to go. You can find content through Yahoo!'s Creative Commons search that you can use for commercial purposes, and you can also find content that you can modify, adapt, and build upon.


What forward thinking on Yahoo!'s part!


Now, let's talk about forward thinking on your part.


Why is this Important to Article Writers?


Let's think about it for a minute. The links pointing back to your Web site from your articles and the relevant link text in the bio are extremely important to a savvy article writer. By allowing other Web sites, e-zines, online publications, and print publications to publish your articles, you're widening the scope of your visibility.


And in walks a powerhouse like Yahoo! with their new Creative Commons Search.


Wouldn't you like your articles to be available in a select search on Yahoo!?


Do you have to think twice? (Or even once?)


How About an Example?


Disclaimer: It's rather dangerous to give an example in print. As soon as you do, your example could slip in rankings. Forgive me if that happens here.


Please go to:


http://search.yahoo.com/cc


Click on "Find content I can use for commercial purposes."


Type "seo articles" (without quotes) in the search box.


Click Search CC.


The #1 page at the time of this writing is: http://www.searchengineworkshops.com/articles.html


Click on the link, then scroll to the bottom of the page. You'll see the Creative Commons License that says "Some Rights Reserved."


It says, "This work is licensed under a Creative Commons License."


Click on the link. You'll see the actual license and what rights are available under the license as well as what conditions have to be met.


So, if you want your articles to be available through a Yahoo! Creative Commons Search, you simply allow it to be licensed out through Creative Commons.


How Can Article Writers Take Advantage of Yahoo!'s Creative Commons Search?


Follow these easy steps:


1. Go to Creative Commons: http://creativecommons.org/


2. Click on the Publish graphic at the top of the page on the right.


3. Answer the questions: Allow commercial uses of your work? Allow modifications of your work? Jurisdiction of your license? Format of your work (such as text)?


4. You can also click to add more information about your work. If you're only going to create a license for one article, you can get very specific about your article.


5. Click on Select a License. You'll see how the license will look on your page. You can then copy and paste the text to your Web site.


You can even add a Creative Commons License to your blog!


How Long Does it Take for Yahoo!'s Crawler to Find the CC License?


After I put the licenses up on our Web pages, our pages were found in a Yahoo! Creative Commons Search within two weeks.


What about All of the Typical SEO Ramifications?


Does the same Yahoo! crawler crawl Creative Commons licenses? (To my knowledge, there isn't a different crawler.)


Will having a Creative Commons license get your pages into Yahoo! faster? (I haven't done any testing on this yet to see if new pages will get into the regular index as well as Creative Commons, but it's an interesting concept.)


What about a brand new Web site with an articles directory? (You need to have a link from another Web site, because if your site isn't indexed at all, you can't expect Yahoo! to find and spider those article pages quickly.)


Does having a Creative Commons license on your articles affect your regular Yahoo! rankings? (I've seen no evidence of this to date.)


What about Relevancy of Search Results?


That's an interesting question. Let's look at another example.


Again, go to:


http://search.yahoo.com/cc


Choose "Find content I can use for commercial purposes."


Type in "wordtracker." Click on Search CC.


The #1 result is our articles page again:


Marketing on Internet Search Engines - articles by Robin Nobles and John Alexander www.searchengineworkshops.com/articles.html


"Wordtracker" appears 11 times on the page, since we've written several articles about Wordtracker. It's not used in the title, description, etc., and the page is not focused on Wordtracker at all. However, this page definitely has higher link popularity than our other pages.


We have several Wordtracker articles in those same results, yet our articles page is #1. I'll let you study the rest of the results yourself.
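Counting keyword occurrences on a page like this is easy to script. Here is a minimal sketch (the sample text is made up for illustration, not taken from the actual page):

```python
import re

def keyword_count(text, keyword):
    """Count case-insensitive, whole-word occurrences of a keyword."""
    pattern = r"\b" + re.escape(keyword) + r"\b"
    return len(re.findall(pattern, text, re.IGNORECASE))

sample = "Wordtracker tips: using Wordtracker for keyword research."
print(keyword_count(sample, "wordtracker"))  # prints 2
```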


Some of the Search Results Aren't Exactly "High Quality"


We have seen some SERPs that aren't exactly high quality. Will your results float to the top? We'd like to think so.


Will Yahoo! or Creative Commons find a way to police the results that are less than quality? After all, this fabulous tool definitely has potential for abuse if not policed in some manner.


In Conclusion . . .


Article writers, if you don't mind others using your content on their sites, be sure to visit Creative Commons and add the CC licenses to your articles. How easy can increasing your online visibility get?!


But don't abuse the system. If the beta tool gets abused, it may never make it to the full version, which will be a shame for us all.


For those of you who are looking for valuable content to add to your sites, be sure to visit Yahoo!'s Creative Commons Search. This is an ideal spot for finding relevant content that's available to be used on your site.


Just remember that the #1 ranked result may not be the best article for you, so do your research.


Robin Nobles teaches 2-, 3-, and 5-day hands-on search engine marketing (http://www.searchengineworkshops.com) workshops in locations across the globe, as well as online SEO training (http://www.onlinewebtraining.com) courses. She and John Alexander have recently launched localized SEO training (http://www.searchengineacademy.com) centers through Search Engine Academy.


 

How To Measure Search Engine Marketing ROI

Copyright © 2005 Charles Preston
Click Response


According to the Search Engine Marketing Professional Organization (SEMPO), advertisers spent $4 billion in 2004 on search marketing programs and are expected to spend 39% more than that this year.


Search engine marketing appears to be a great way to advertise, but is it right for you and your business? If you are not already employing search engine marketing (SEM) for your business, is there a way to forecast the return should you decide to invest in it? And is there a way to measure the results if you have already invested in SEM?


The answer is mostly yes. By utilizing data from recently released research surveys, along with a few free online tools, you can begin to take some of the guesswork out of search engine marketing ROI.


By using Overture's free keyword suggestion tool (inventory.overture.com) you can get an idea of how many times a keyword is getting searched each month. Another free tool to use is called Good Keywords and can be downloaded from www.goodkeywords.com.


Let's say, for instance, that you are a mortgage broker in the Denver, Colorado area and you are interested in getting more leads for your business. You have a website and are considering search engine marketing to bring in new leads. You get a quote from a search engine marketing provider who can guarantee top 10 positions among the major search engines for your keywords for 6 months for $1,500.00.


The question now becomes: is it worth it to you to spend the $1,500.00? To figure this out, we need to look at some numbers.


Berrier & Associates estimate that 65% of all traffic generated by a search in a search engine will go to the sites listed within the first 10 results (first page) returned for that search. By using Overture's keyword suggestion tool you discover that the term "Denver mortgage broker" gets approximately 540 searches a month.


Using these figures, a first-page position for "Denver mortgage broker" would bring you approximately 65% of 540 searches a month, or about 350 visitors to your site each month. Having a compelling title tag on your website's pages might boost this number even further, since the title tag is what appears as the clickable link in the search results.


The formula we just used would then be applied to all the other keywords you are targeting, such as "mortgage Denver," which gets approximately 2,600 searches a month, or "mortgage company Denver," with 466 searches a month. A first-page placement for any of these would be estimated the same way.


So let's just use the "Denver mortgage broker" key phrase as our example, with its estimated 350 visitors a month for a first-page position. You now need to know your website conversion rate: the ratio of leads or sales to the number of visitors. According to Shop.org, the average website conversion rate is about 1-2%, or 1-2 leads or sales for every 100 visitors.


If your website conversion rate is average, you could expect roughly 3 to 7 good leads from your site each month from that one first-page listing. Then, using your sales conversion rate (the number of sales per lead, on average) multiplied by your average sale value, you can begin to calculate what your return might be.


So let's say that as a mortgage broker you make roughly $2k on each deal you broker, and your sales conversion rate is 1 sale for every 3 quality leads. In any given month, then, you could estimate 1 sale at $2k out of the 3 quality leads generated from your website, which came as a result of the 350 visitors you got from being on the first page of Google, Yahoo! or MSN for the term "Denver mortgage broker".


You paid the search engine marketing company $1,500.00 for 6 months of first-page listings. From just one of those first-page listings you stand to gain $2k x 6 months = $12,000.00. That sounds like a very good return on the money invested.
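The whole estimate above can be sketched as a short calculation. This is only an illustration using the article's own assumptions (540 searches, a 65% first-page share, a 1% site conversion rate, 1 sale per 3 leads, $2k per deal); the unrounded arithmetic comes out slightly higher than the $12,000 figure, which assumes exactly one sale per month:

```python
# Rough SEM ROI estimate using the article's assumptions.
monthly_searches = 540      # Overture estimate for "Denver mortgage broker"
first_page_share = 0.65     # share of searchers clicking first-page results
site_conversion = 0.01      # ~1% of visitors become leads (low end of average)
leads_per_sale = 3          # 1 sale for every 3 quality leads
revenue_per_sale = 2000     # ~$2k commission per brokered deal
months = 6                  # length of the $1,500 engagement
cost = 1500.0

visitors_per_month = monthly_searches * first_page_share   # ~351
leads_per_month = visitors_per_month * site_conversion     # ~3.5
sales_per_month = leads_per_month / leads_per_sale         # ~1.2
revenue = sales_per_month * revenue_per_sale * months      # ~14,040
roi = (revenue - cost) / cost                              # return per dollar spent

print(round(visitors_per_month), round(leads_per_month, 1), round(revenue))
```

Plug in your own conversion rates and deal sizes to get a feel for your situation.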


In closing, the methodology outlined in this article for calculating search engine marketing ROI is by no means 100% accurate, due to several factors, but it is a good way to get a "feel" for what you might get back for your marketing dollars. It's also a way to get business owners to start thinking about how to better track their e-business.


Charles Preston is an Austin-based SEO with over 7 years of industry experience. Charles is also the President of Click Response, an internet marketing company focused on teaching small businesses how to get the most out of their internet marketing. For a free consultation or more information, please visit http://www.clickresponse.net


Saturday, May 14, 2005

 

The Number One In Your Niche Blog has moved

I've started a new blog where I will be combining all my marketing blog posts (it was pretty stupid posting to three separate blogs and then referring to them on each blog)


Check out the first post on my brand new Marketing Slave blog.


Rise Of The Evil Blog-Spam Empire: Blog Power, RSS2Blog, BlogBurner


Please update this blog address in your RSS reader to 
http://marketingslave.com/feed/



Update: I've decided to continue posting articles on SEO to this blog, so stay subscribed for new posts.

Thursday, May 12, 2005

 

Has Your SEO Stopped Working?

Is this the line your SEO firm is giving you nowadays? That their SEO has stopped working?


Yes, it's true. Google's new algorithm, link filters and sandbox are proving to be a formidable obstacle for most SEOs who are still stuck on their old ideas of building reciprocal link directories.


But if you and your SEO have been following good SEO practices from day one, it's unlikely that the changes have hit you hard.


In fact, my sites have only benefited from the changes. I still manage to get brand new domains listed in Google – for less competitive terms at the beginning, and then for more competitive terms as the site grows and gets more inbound links.


If you aren’t using one-way link building and advanced linking strategies, you’re going to be at a disadvantage with Google.


Get a head start on your search engine rankings with a copy of my search engine optimization guide – now updated with tips and advice on tackling the new Google algorithm changes.


 


Tuesday, May 10, 2005

 

New SEO and Blog Posts

One of the reasons I haven’t been posting much to this blog of late is because I’ve been posting a few blog and SEO-related posts to my Blog Brandz blog.


You can read the posts here


Tagging: Why Am I Not Excited?


Spam-Blogging Can Hurt Searchers and SEOs


Many Niche Blogs Or One Collective One?


Maybe it's time to combine some of my blogs. After all, I don't want to be posting duplicate content to each.


Friday, April 08, 2005

 

Bypassing Google's New Link Filter: Tips From Wayne Hurlbert

Now that we know Google has changed the way it views incoming links, how do we proceed with Google's new link filter? 


Wayne Hurlbert of Blog Business World has lots of tips that will help you plan your SEO and linking campaign to match Google's new standards.


On the existence of a dampening filter on new incoming links:



The thought that Google may be employing a dampening filter on new incoming links is not new. The idea has been given serious consideration, especially as part of the "sandbox theory" discussions. Advocates of the new link filter theory believe that Google does not give immediate full credit for an incoming link.


The theory says that Google provides a partial immediate credit, by running new links through a dampening filter. Only as the link ages, and remains linked to the site for a given period of time, does the full value of the Google PageRank and the link popularity receive its complete credit level.


That total link value and PageRank credit, is also measured for link theme relevance, making the process of link building much more difficult than in the past.


What the theory contends, in short, is new links don't provide immediate benefit to the receiving website. The link popularity and Google PageRank benefit is not passed in its entirety, from the date of discovery and indexing of a new link. In effect, the theory postulates the existence of a Sandbox for new links.


Much like the Google Sandbox theory itself, there is evidence in support of this dampening effect theory. Also like the sandbox theory, there is evidence that the phenomenon doesn't exist, or is simply one of mistaken identity.


How this applies to your linking campaign:



As with all potential filters, their possible existence must be taken seriously. If there is indeed a filter in place to dampen the value of new links, steps must be taken to reduce or eliminate its effect. If there is no such dampening filter, the same sound practices will provide additional benefits as part of a well designed link building program.


I mentioned in a previous post that Google was in the process of implementing filters to clamp down on purchased links. Wayne explains how the new link filter works to do this.



While Google's algorithm is not made public, it's generally thought that Google intends to clamp down on link sales for PageRank and for ranking in the SERPs. Also on Google's hit list are multiple interlinked sites, existing on the same ip c block, entirely for the purposes of link popularity and PageRank enhancement.


Purchased links tend to be added to a website in medium to large quantities, and often all at one time. Large quantities of incoming links, appearing all at once, might indeed trip a filter. Google could suspect a high volume of links added at one time to be purchased, and therefore suspect.


By dampening the value of new incoming links, Google probably hopes to discourage link sales in particular. By lessening their value, and removing any immediate link boost, Google might reason that website owners will be less inclined to buy incoming links.


The problem lies with the possibility that all incoming links, including natural and relevant ones, are being filtered along with the purchased and non-theme related links.


On devaluing interlinked sites:



The IP C block is the third group of numbers in an IP address. For example, in 123.123.xxx.12 the C block is the xxx portion. Google is able to readily identify links between sites hosted on the same C block.
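Checking whether two sites share a C block is trivial to script. A minimal sketch (the addresses are made up; two IPv4 addresses are on the same C block when their first three octets match):

```python
def c_block(ip):
    """Return the first three octets, i.e. the class C network portion."""
    return ".".join(ip.split(".")[:3])

def same_c_block(ip_a, ip_b):
    """True if two IPv4 addresses share the same C block."""
    return c_block(ip_a) == c_block(ip_b)

print(same_c_block("123.123.45.12", "123.123.45.200"))  # prints True
print(same_c_block("123.123.45.12", "123.124.45.12"))   # prints False
```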


A dampening filter is not only used on such linking schemes, but a penalty filter as well. They are not the type of links that are part of the possible link dampening filter. The alleged link dampening filter is supposedly placing new incoming links in a version of the sandbox.


Google doesn't consider purchased links, or interlinked sites to be natural, and has provided some indication that they are devaluing them. In the case of interlinked sites, Google is even penalizing sites in much the same way that link farms are given penalties.


On what loss of link value means to SEOs:



A loss of link value kicks out one of the most important legs of the optimization stool. Should such a link dampening filter exist, a radical rethinking of SEO strategy would have to take place. There is definitely much at stake.


Many website owners have added new incoming links to their sites, but have not received a corresponding boost in the search engine rankings as a result. Conventional SEO wisdom holds that additional incoming links will enhance any site's placement in the search engine results pages (SERPs) for the targeted keyword phrase.


Some webmasters and SEO experts no longer believe that link boost to be the normal course of events. In fact, some experts believe almost the opposite, that the new links are dampened by a filter, and could even cause a temporary drop or hold in the SERPs.


On how to design a linking campaign to bypass Google’s link filters:



Instead of worrying about new link filters, develop a sound linking policy, and any potential problems shouldn't affect the vast majority of websites. A good linking program will bypass most, if not all, possible filters, real or imagined.


A linking strategy should concentrate on developing natural incoming theme relevant links as its ultimate objective. While that goal is a bit idealistic for many website owners, it certainly has the potential to avoid any filters.


By providing precisely the type of link Google prefers, it is far less likely to trigger any dampeners, if at all. Because they are added gradually over time, relevant natural links are highly unlikely to be sandboxed.


To receive this type of natural incoming link, strong theme relevant content must be developed for the website. Good informative content for website visitors attracts links. The problem is that natural linking is a slow process, and the real world SERPs need faster attention.


Add one way directory links. Google's spider crawls the major, and even minor directories, on a very frequent basis. Categorized directory links, especially from human edited directories, are very relevant and theme oriented.


As incoming links, they are far less likely to be filtered than links from other websites. It's widely thought that a link from the Open Directory Project (DMOZ) provides an almost immediate boost to the indexed website.


Keep link exchange programs confined to theme relevant sites. Avoid exchanges with websites that have little to no topic relation to your site. Entirely non-relevant links are much more likely to be viewed with suspicion by Google, and possibly filtered. We already are quite certain, that Google passes along more PageRank and link popularity boost from theme relevant sites, than from topically unrelated sites.


When making link exchanges, space them out over a period of time. Instead of doing all of the link trades in one week, use a two to three month time frame instead. A longer time lag will give each link a full opportunity to be integrated into the Google system, and avoid being dampened.


If a link is going to be dampened, it may as well be delayed.


In a previous post, I mentioned why blogs are excellent tools for linking. Wayne’s thoughts on why bloggers will benefit from the new link filters:



Bloggers are less affected because the links are from similar, theme relevant blogs. Because the topics discussed are similar, the inbound links are given more weight faster by Google. The fact that links often come from within posts themselves helps, as do permanent links from blogrolls.

Google is also rewarding sites that link out to other sites. Talk about another win for bloggers!

Bloggers freely link to other blogs and traditional websites. This generous linking policy, shared by most bloggers, is rewarded by Google. Higher search rankings for the helpful blogger are the benefit. The reason for this benefit, resulting from linking out, is to encourage links to other people who provide useful and interesting content.


I talked about how one of my sites had suffered in Google because of the lack of fresh content, coupled with not-so-good linking practices.


Wayne stresses the importance of fresh content:



Fresh content is rewarded by Google. It's doubly rewarded as your blog becomes better established over time in the search engine rankings. Older sites, while still strong, slip lower in the search results over time, when no fresh content or information is added.


On the Sandbox:



In effect, new sites are placed on probation to see if they last, or if they are only disposable get rich quick spam sites. Those spam sites break every Google guideline in the book, but rise to page one very quickly.

The idea of spam websites is to glean as much revenue as possible, prior to a Google banning from their index. Google does ban them too. Don't worry about that one.

To prevent this sort of mischief, Google has instituted the Sandbox to keep the new sites lower in the rankings until they prove their worthiness. While the system might be unfair to new sites, it's a fact of life. There are also some ways of minimizing the damage caused by the Sandbox filter.


As always, if you are following good SEO practices, you won't be affected as much by the new filter.


Yes, it will take much longer to actually get good rankings for your site. And because Google now also considers higher clickthrough rates from its SERPs an indication of a website's value to users, sites that already rank well will do even better.


So it's going to become harder to displace sites that already rank well in the Google index.


Good SEO and linking practices, coupled with relevant, fresh, quality content added over time are now even more important to achieving high rankings in Google.


 


Sunday, April 03, 2005

 

Google Ranking Algorithm Patent Creates Buzz

An article by Mike Banks Valentine outlines interesting points in Google's patent on its ranking algorithm, posted on March 31st at the US Patent Office.

He notes that:

“it seems to be PageRank redefined with a few variations to limit link spamming and reduce stale results, along with multiple innovative elements not previously considered.”

Some of the (stranger) points worth noting:

  • Older pages that get more inbound links will benefit: The algo would appear to limit the perceived value of a page unless it becomes wildly popular over time.

  • Do you have advertisers on your site?: Google may rank a site partly on the fact that advertisers choose to advertise on it, which might be an indication of how valuable the site is to users and visitors.

  • Press releases, forum postings, citations matter: As a further measure to differentiate a document related to a topical phenomenon from a spam document, they may consider mentions of the document in news articles, discussion groups, etc. on the theory that spam documents will not be mentioned, for example, in the news.

  • Is your site bookmarked or in your users' lists of favorites?: They also plan to determine the value of pages based on "user maintained/generated data" - read that as the "bookmarks" and "favorites lists" built into your browser.

  • How often do users access your site?: Further, they reference users' browser cache files as a method of determining the value of a site. For example, the "temp" or cache files associated with users could be monitored by the search engine to identify whether a document is being accessed more or less often over time.

  • More time in the sandbox: It appears to apply further penalties to new sites by keeping them poorly ranked for even longer periods

  • Longer domain registrations are better: The algorithm adds an apparently new factor: long-term purchase of domain names and historical data related to IP address and hosting company. The date a domain expires in the future can be used as a factor in predicting the legitimacy of a domain and, thus, of the documents associated with it.

  • Variations in anchor text are preferred: Unique Words, Bigrams, Phrases in Anchor Text are significant in determining rank, because if natural links develop, they would vary when webmasters link to a document differently.

  • Clickthroughs in Google’s SERPs matter: Sites that get significant clickthrough rates from the Google SERPs will rank higher.

All in all, it seems that Google is trying to make its algorithm simulate the way a human would assess a relevant document.

Bookmarks, user cache, domain registration period, higher clickthroughs… are all steps in the direction of weeding out spammy pages and giving higher rankings to sites that users view as having better quality content.

Read the complete article here: Google Search Algorithm Patent Application Creates Spring Buzz!


Friday, March 25, 2005

 

Quality Content Equals High Rankings

An excellent article by Fredrick Marckini in the Search Engine Watch newsletter highlights the fact that quality content still gets the best rankings.

Some tips from the article on why and how to leverage your content:

  • Higher rankings in search engines go to websites with higher quality content that earn more links.
  • Make no mistake, this law of search engine marketing is clear: less content, lower rankings.
  • By providing valuable textual content, a site can not only increase its search engine visibility, but also improve its conversion rate.
  • Keyword research is important, and content needs to be created in a way that does not hurt the user experience.
  • You can achieve the Number 1 position without an optimized site if the content is good.
  • Don't focus entirely on "big money" keywords while ignoring secondary keywords.
  • Identify important keywords through a keyword research process.
  • Include seasonal topics (events or holidays).
  • Create catchy titles to increase the clickthrough rate.
  • Keep the length of pages between 250 and 300 words.
  • Generate content ideas from customer correspondence.
  • Use message boards for additional content inspiration.
  • Search engine spiders don't have credit cards, but people do. Pay more attention to your users than to the spiders.
  • No matter how good the content copy is, if you can't find a way to meet customers' needs, they won't buy.
  • Content should not attract unqualified traffic.
  • Content should inform and persuade visitors.
  • Placing trigger words in links can increase user clickthrough rates.
  • Wireframe your content to give it an interconnected flow and personality.
  • Create different personas based on different archetypal visitors.
  • Define three-dimensional personas: topographic, psychographic, and demographic.
  • Build the site's internal linking around those personas.

For more tips on how to leverage your content for high search engine rankings, get a copy of my search engine optimization guide, Number One In Your Niche.


Thursday, March 24, 2005

 

CSS Spam: Time For Another Google Update?

Just read this rather good article by Ken Webster on yet another “black-hat” technique that uses CSS (instead of HTML) to hide text and links on web pages.


Here’s how he explains it.



There are many ways to hide text and links using CSS, and the technique seems to be going completely unchecked by the major search engines at this time.


Up to now the search engines haven’t been able to parse CSS files and combine that information with the page code to determine if spamming techniques were being used.


Most methods use a separate attached CSS file with “hide” classes, such as:


<H1 class="hide">keyword keyword</H1>
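If an engine did parse the stylesheet alongside the page code, catching this would be straightforward. Here is a minimal, purely illustrative sketch in Python; the `.hide` rule and the sample page are hypothetical, mirroring the example above, and this is not how any engine actually works.

```python
# A minimal sketch of CSS-aware spam detection: parse a stylesheet for
# class selectors that make elements invisible, then flag any tags on
# the page that use those classes. The ".hide" rule is a hypothetical
# example matching the snippet in the article.
import re
from html.parser import HTMLParser

CSS = ".hide { display: none; }"

def hidden_classes(css: str) -> set:
    """Collect class selectors whose rule bodies make elements invisible."""
    hidden = set()
    for selector, body in re.findall(r"\.([\w-]+)\s*\{([^}]*)\}", css):
        if re.search(r"display\s*:\s*none|visibility\s*:\s*hidden", body):
            hidden.add(selector)
    return hidden

class HiddenTextDetector(HTMLParser):
    """Flag (tag, class) pairs that are styled invisible by the CSS."""
    def __init__(self, hidden):
        super().__init__()
        self.hidden = hidden
        self.flagged = []
    def handle_starttag(self, tag, attrs):
        for cls in dict(attrs).get("class", "").split():
            if cls in self.hidden:
                self.flagged.append((tag, cls))

page = '<h1 class="hide">keyword keyword</h1><p>Visible copy.</p>'
detector = HiddenTextDetector(hidden_classes(CSS))
detector.feed(page)
print(detector.flagged)  # [('h1', 'hide')]
```

Real CSS is far messier than this toy regex handles (inline styles, cascades, off-screen positioning), which is presumably why the engines have been slow to tackle it.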


Ken also predicts that the abuse of CSS will force Google to update its algorithm to correct this problem in the near future. He even suggests that the coming update be named the “Bernard Update.”


Tuesday, March 15, 2005

 

Spam-Blogging: Blogs Are More Than SEO Tools

Update: I've just compiled this post into an article, posted here: Will Spam-Blogging Be The Death Of Blogging?

Technorati reports that 30,000 to 40,000 new weblogs are being created each day.

Part of the growth in new weblogs created each day is due to an increase in spam blogs - fake blogs created by robots in order to foster link farms, attempt search engine optimization, or drive traffic through to advertising or affiliate sites.

Those in the SEO world are well aware of this. There are even services, like Blogburner, that encourage the creation of spammy blogs.

I have blogged about blogs and spam-pinging before, and I recommend blogging as an SEO tactic. But I always emphasize that you should use your blog for more than just SEO.

According to Technorati's David Sifry:

Prior to January, spam wasn't much of an issue. I'd estimate that we currently catch about 90% of spam and remove it from the index, and notify the blog hosting operators. Most of this fake blog spam comes from hosted services or from specific IP addresses.

One of the results of the extremely productive Spam Squashing Summit of a few weeks ago is the increased collaboration between services in order to report and combat this spam. Right now, about 20% of the aggregate pings Technorati receives are from spam blogs.

Steve Rubel sums up this dilemma rather well (emphasis added):

As soon as people figure out there's ways to exploit new technologies, they do it. It's human nature. It's really up to the search engines to help put a stop to these by undercutting the economics of blogspam, much like they did with nofollow and comment spam.

Of course, such a move would also reduce any impact that blogs have on search results. That's the trade-off.

More information on comment spam and the nofollow tag.

It would be easy for me to launch into a sermon here about how a blog can be a great tool for personal branding and building relationships with your website visitors and customers.

Instead I’ll just say that the more you abuse a technology, the less effective it becomes. And so blogging will become less effective as an SEO tactic over time. Then the spammers will just have to find new avenues and means to spam the engines.

Focus instead on building content-rich sites, earning real, high-value links to them, and don't restrict yourself to just the SEO benefits of blogging.

Blogs are more than just tools for search engine optimization.

 
 

Search Engine Strategies: B.L. Ochman's New Report

B.L. Ochman (whom I consider required reading for bloggers) reports highlights from this year’s Search Engine Strategies Conference in New York.


She sums up her points in a 12-page report that costs only $15. Definitely get a copy for yourself. I did!


Essential Tips, Tactics and Resources from Search Engine Strategies New York 2005 will teach you:


  • What works even better than paid search?
  • Why Page Rank matters and how to find yours
  • What's the next big thing?
  • Who really rules search?
  • Who are the biggest influencers online today?
  • How blogs can increase your search engine success
  • How press releases can help your SEO results
  • How to optimize your press releases for search engines
  • How your online press room can help search status
  • Is there a better search engine than Google?
  • Why you need lots of anchor text
  • Can links hurt your search engine placement?
  • Online sources for top SEO knowledge
  • 23 Must-Know SEO Resources

Also worth picking up (especially for corporate houses) is a copy of B.L.’s blogging book, What could your company do with a Blog?