
SEO Content Distribution Linking For Newbies

The new buzz on the internet is all about getting one-way links by distributing content to other sites in exchange for back links. As with every other SEO or website promotion technique ever devised, there are plenty of newbie myths about it that can ruin your chance for success before you even start.

Newbie Myth 1: The “Duplicate content penalty.”

Some webmasters worry that if the content on their sites is suddenly on hundreds of other sites, search engines will inflict a “duplicate content penalty.” Why is this concern unjustified?

* If this were true, every major newspaper and news portal website would now be de-indexed from the search engines, since they all carry “duplicate content” from news wires such as Reuters and the Associated Press.

* Thousands of self-promoting internet gurus have proven that distributing content is an effective method of improving search engine rank.

* Even more thousands of content websites have proven that republishing this content does not carry any search engine penalty.

True, the first website to publish an article often seems to be favored by search engines, ranking higher in searches than higher-PageRank pages carrying the same content. But the “duplicate” pages do show up in the search engine results, even if lower than the original site. Meanwhile, the reprint content has no effect on the ranking of a site’s other pages.

The only duplicate content penalty is for duplication of content across pages of a single website. Meanwhile, there is a sort of “copyright theft” penalty, whereby someone who copies content without permission can be manually removed from search engine indexes out of respect for the Digital Millennium Copyright Act. But that penalty is only for flagrant theft, not minor mistakes in attributing reprint content.

Newbie Myth 2: The goal is to get in article clearinghouse websites.

There are over 100 popular, high-traffic websites that act as clearinghouses for content made available for redistribution. These websites include isnare.com, ezines.com, and goarticles.com.

Many novice content-distributors are upset when the article clearinghouse websites, with tens of thousands of articles each carrying a back link, pass negligible PageRank. But the point of distributing content to those websites is for other website owners to find your content and put it on their websites–not to get a back link directly from the clearinghouse website (though this is sometimes an unexpected bonus).

Plus, to maximize PageRank-passing links, you also have to submit articles to website owners individually. It’s not a small amount of work. But there’s no substitute for a polite, individually crafted email recommending a website owner complement his or her existing articles with one you’ve written.

Myth 3: Any content will do.

Reality: It should be obvious that many website owners, jealous of their link popularity, will only republish exceptionally high-quality content. For articles, this means a unique point of view and solid information that cannot be found just anywhere, ideally presented in compelling language in a web-optimized format by a professional published writer. You can conduct a content distribution campaign with bad content, but you’ll be handicapping yourself from the start.

Myth 4: Distributing content is easy. Just hit “send.”

Reality: A content distribution campaign requires skillful planning to target publisher websites effectively.

This is essentially a four-step process.

1. You must identify the categories of websites most likely to republish your articles. These categories range from the very broad, such as internet, business, and family, down to something as narrow as family-friendly internet businesses.

It’s a careful balance: you need to make your target category narrowly relevant to maximize the value of the link and your chances of getting your article accepted for publication. But if you target too narrow a category, you’ll lower the maximum number of links you can hope to get.

For instance, a website on web content writing has to target its content distribution to more than just sites focusing on web content. There are only so many websites devoted to web content as a topic of interest, and besides, many such websites would be competitors. Distribution should target broadly relevant categories, such as web design, webmaster issues, writing, marketing, business, website promotion, and SEO. Yet some broadly related categories, such as internet or publishing, are not relevant enough to yield good results.

2. To maximize success, you must have articles custom-created for each major category you want to submit to. “Incorporating Content in Web Design” and “Marketing with Content” would be possible titles for a web content-writing website owner targeting web design and marketing websites, respectively. An article about web design won’t appeal as strongly to marketers, or vice versa, so simply submitting to websites having to do with “the web” would not be as effective.

3. For maximum success, articles custom-written for a category then often have to be refined for sub-categories. For instance, “Incorporating Content in Web Design” becomes “Incorporating Content into Flash Web Design,” or “Incorporating Content into Accessible Web Design.” Sometimes the refinement is just a “find and replace” of one keyword for another, sometimes just in the title. Sometimes, entire paragraphs have to be reworded or removed.

4. Once you’ve identified sub-categories of websites, you still have to be able to meet the requirements of individual websites. Some sites only publish articles up to 500 words; some only do how-to articles. Owners of high-ranking websites can afford to be choosy. To really maximize results within a sub-category, you need at least three different articles of varying lengths and focus specifically geared toward that sub-category.
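The “find and replace” refinement in step 3 above can be as mechanical as it sounds. Here is a minimal Python sketch using the example titles (the keyword list is invented for illustration):

```python
# Refine a category-level article title for several sub-categories
# by swapping the broad keyword for a narrower one (step 3 above).

base_title = "Incorporating Content in Web Design"

# Hypothetical narrower keywords to substitute for "Web Design":
sub_keywords = ["Flash Web Design", "Accessible Web Design"]

refined_titles = [base_title.replace("Web Design", kw) for kw in sub_keywords]

for title in refined_titles:
    print(title)
```

Real refinement usually goes further than a keyword swap, as the step notes, but a pass like this gives you a starting draft for each sub-category.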

In the end, distributing content for website promotion and inbound links is a marvelously effective way of promoting a website. But it’s not magic beans. Like anything else having to do with achieving success on the web, it takes hard work and knowledge to be successful.

Why Google Indexing Requires A Complex Blend Of Skills

If it were easy, everybody would be doing it. Getting a company’s name and products or services onto the first page of a genuine Google search isn’t a trivial piece of work. In fact, there are four distinct skills that a search engine optimizer needs to possess. Most people possess one or maybe two of these skills; very rarely do people possess all four. In truth, to get to all four, people who are good at two of these need to actively develop the other skills. Now, if you are running your own business, do you really have the time to do this? Is this the best use of your time?

Specifically the four skills needed for SEO work are:
Web Design – producing a visually attractive page
HTML coding – developing Search Engine friendly coding that sits behind the web design
Copy writing – producing the actual readable text on the page
Marketing – understanding which searches are actually being used, and which key words actually get more business for your company

Many website designers produce more and more eye-catching designs with animations and clever rollover buttons, hoping to entice people onto their sites. This is the first big mistake: using designs like these will actually decrease your chances of a high Google rating. Yes, that’s right; all that money you have paid for the website design could be wasted because no-one will ever find your site.

The reason for this is that before you get people to your site you need to get the spider bots to like your site. Spider bots are pieces of software used by the search engine companies to trawl the Internet looking at all the websites, and then having reviewed the sites, they use complex algorithms to rank the sites. Some of the complex techniques used by web designers cannot be trawled by spider bots. They come to your site, look at the HTML code and exit stage right, without even bothering to rank your site. So, you will not be found on any meaningful search.

I am amazed how many times I look at websites and I immediately know they are a waste of money. The trouble is that both the web designers and the company that paid the money really do not want to know this. In fact, I have stopped playing the messenger of bad news (too many shootings!); I now work round the problem. So, optimizing a website to be Google friendly is often a compromise between a visually attractive site and an easy to find site.

The second skill is that of optimizing the actual HTML code to be spider bot friendly. I treat this as distinct from the web design because you really do need to be “down and dirty” in the code rather than using an editor like Dreamweaver, which is OK for website design. This skill takes lots of time and experience to develop, and just when you think you have cracked it, the search engine companies change the algorithms used to calculate how high your site will appear in the search results.

This is no place for even the most enthusiastic amateur. Results need to be constantly monitored, pieces of code added or removed, and a check kept on what the competition are doing. Many people who design their own website feel they will get searched because it looks good, and totally miss out this step. Without a strong technical understanding of how spider bots work, you will always struggle to get your company on the first results page in Google.

Thirdly, I suggested that copy writing is a skill in its own right. This is the writing of the actual text that people coming to your site will read. The Google bot, and other spider bots such as Inktomi’s, love text – but only when it is well written in proper English. Some people try to stuff their site with keywords, while others put white writing on white space (so spider bots can see it but humans cannot).

Spider bots are very sophisticated: not only will they not fall for these tricks, they may actively penalize your site – in Google terms, this is sandboxing. Google takes new sites and “naughty” sites and effectively sin-bins them for 3-6 months; you can still be found, but not until results page 14 – really useful! As well as good English, the spider bots are also reading the HTML code, so the copy writer also needs an appreciation of the interplay between the two. My recommendation for anyone copy writing their own site is to write normal, well-constructed English sentences that can be read by machine and human alike.

The final skill is marketing; after all, this is what we are doing – marketing your site, and hence your company and products/services, on the Web. The key here is to set the site up to be accessible to the searches that will provide the most business to you. I have seen many sites that can be found as you key in the company name. Others can be found by keying in “Accountant Manchester North-West England”, which is great, except no-one ever actually does that search. So the marketing skill requires knowledge of a company’s business, what they are really trying to sell, and an understanding of which actual searches may provide dividends.

I hope you will see that professional Search Engine Optimization companies need more than a bit of web design to improve your business. Make sure anyone you choose for SEO work can cover all the bases.

Foreign Language SEO

Bob is from the United States and speaks English, along with some Hebrew and Spanish. Alice is from Romania and speaks Romanian, English, and some French. Why does this matter? There are concerns, both from a language angle and from some interesting technical caveats, when one decides to target foreign users with search engine marketing. Here we will discuss the most pertinent factors in foreign search engine optimization.

The internet is a globalized economy. Web sites can be hosted anywhere and contain anything that the author would like to publish. Users are free to peruse pages or order items from any country. There are some exceptions, but in general, to enhance user experience, a search engine may treat web sites from the same region and in the same language as the user preferentially.

Foreign Language Optimization Tips

Needless to say, Internet marketing presents many opportunities, and nothing stops a search engine marketer from targeting customers from other countries and/or languages. However, he or she should be aware of a few things, and use all applicable cues to indicate properly to the search engine which language and region a site is focused on. First of all, if you aim at a foreign market, it is essential to employ a competent copywriting service to author or translate your content into the particular foreign language. He or she should know how to translate for the specific market you are targeting. American Spanish, for example, is somewhat different from Argentine Spanish. Even proper translation may be riddled with problems. Foreign language search behavior often differs by dialect, and using the common terminology is key.

Indicating Language and Region

A webmaster should use the lang attribute in a meta tag, or inside an enclosing span or div tag in HTML. Search engines may be able to detect language reasonably accurately, but this tag also provides additional geographical information. The language codes es-mx, es-us, and es-es represent Spanish from Mexico, the United States, and Spain, respectively. This is helpful, because a language dialect and region cannot be detected easily, if at all, just by examining the actual copy. Here’s an example:

Use '<span lang="es-us">CONTENT</span>' to indicate the language of a particular text region.


Use '<meta lang="es-us">' in the header ("<head>") section of the page to indicate the language of the entire page.

A few examples of language and region modifiers:

English: en-AU (Australia), en-CA (Canada), en-GB (UK), en-US (United States), en-HK (Hong Kong)
German: de-AT (Austria), de-BE (Belgium), de-CH (Switzerland), de-DE (Germany)
French: fr-CA (Canada), fr-CH (Switzerland), fr-FR (France), fr-MC (Monaco)
Spanish: es-AR (Argentina), es-CU (Cuba), es-ES (Spain), es-MX (Mexico), es-US (United States)
Japanese: ja (Japan)
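Programmatically, a code like those above is just a language subtag plus an optional region subtag. A minimal Python sketch of splitting one apart (the helper name is our own):

```python
def split_lang_tag(tag):
    """Split a code such as 'es-MX' into (language, region).

    Returns region as None for a bare language code such as 'ja'.
    """
    parts = tag.split("-", 1)
    language = parts[0].lower()   # language subtags are conventionally lowercase
    region = parts[1].upper() if len(parts) > 1 else None  # regions uppercase
    return language, region

print(split_lang_tag("es-MX"))  # ('es', 'MX')
print(split_lang_tag("ja"))     # ('ja', None)
```

Normalizing case this way also makes it easy to compare tags written inconsistently, such as "en-gb" versus "en-GB".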

5 Free Ways to Increase Your Website Traffic

I don’t know about you, but when I first entered the world of internet marketing I thought I could just submit my newly finished website to a few search engines, then sit back as the visitors flocked to my site. I imagined that people would arrive on my website, as if by magic, purchase goods, and perhaps come back again for more.

A week or so later I came down to earth with a big bump.

I realized that it would take a bit of time and effort to see the results that I was dreaming about!

Since my reality check, I have learned all about the weird and wonderful ways of internet marketing.

In this article I will tell you about my top 5 free ways of increasing website traffic.

All these methods are completely free and if you spend some time on them you will find that they work consistently.

(1)Writing Articles

Writing an article on a subject related to your website and getting that article published has two major benefits:

– People who are interested in your article will read it and often click on the URL in your resource box to find out more. This gets you another free targeted visitor. Targeted, because that reader wants to find out more on the subject of your article, which is hopefully related to the subject of your website.

– Every publisher of your article must also publish your “resource box”. Adding a resource box with your URL to all of your articles will increase the number of links leading back to your website, which in turn helps to increase your search engine position.

(2)Forum Networking

There are many discussion forums on the internet, on every topic you could possibly imagine.

Most discussion boards allow posters to attach a “Signature” with their post containing additional information about themselves, such as their name, URL and sometimes even an advertisement.

By visiting a few forums regularly and participating in the discussions, asking and answering questions, you can build up trust with other forum members, whilst at the same time getting free exposure for your website.

Just try to make a useful contribution to the forum – be sure to read the forum rules and don’t spam!

(3)Reciprocal Linking

Reciprocal linking has two main benefits for you.

Firstly, the more links that you exchange, the more chance there is that someone will follow a link from another site and land on your website.

Secondly, your website will be perceived as more important by search engines.

The more links you have from other sites, the greater your chance of being ranked more highly by all the search engines.

Here are some Dos and Don’ts to help you get more out of your link exchanges.

– Do link with sites that will be of interest to your visitors.

– Don’t link with pages that have unorganised link directories with hundreds of links on each page. This won’t benefit you with increased traffic or search engine rankings.

– Do link with sites that have a clearly labelled “Link Directory” from their main page. You aren’t likely to get much traffic from a hidden or hard to find link directory.

– Don’t use link farms or FFA pages. You are unlikely to get extra traffic using these methods and the search engines may penalise you.

– Do use link directories to help you find link partners.

– Do stay organised. Use link exchange software, or a spreadsheet to keep track of the link exchanges you have requested and the contact details of the webmasters.
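If you go the spreadsheet route, even a few lines of code can keep the log. A minimal Python sketch (the file name, field order, and partner sites are all invented for illustration):

```python
import csv

def log_exchange(path, partner_site, contact_email, status):
    """Append one link-exchange record (site, contact, status) to a CSV log."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([partner_site, contact_email, status])

# Hypothetical usage: record a requested exchange and a live one.
log_exchange("link_exchanges.csv", "example-widgets.com",
             "webmaster@example-widgets.com", "requested")
log_exchange("link_exchanges.csv", "example-crafts.com",
             "links@example-crafts.com", "live")
```

The resulting CSV file opens directly in any spreadsheet program, so you can sort by status when following up on pending requests.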

(4) Email Signatures

This is a very simple, but often forgotten way of increasing your website visitors.

Most of us send lots of emails a day, but many of us sign them with just our name, or perhaps nothing at all.

Instead, why not end your emails with a short signature containing your name, a bit about your website along with the URL?

Keep the signature short (4 lines), to the point and avoid hype or SHOUTING.

You may be surprised at the extra visitors you receive through doing this. It’s amazing how curiosity will lead the recipients of your email to click on your link!

(5)Using Traffic Exchanges

Finally, this is one easy way to guarantee instant hits to your website. You can build up credits for free by surfing or building up a downline to surf for you.

The downside of this method is that the visitors you receive from traffic exchanges are not as targeted as the visitors that you will receive via the other methods that I have described.

The reason for this is that people surf traffic exchanges for one reason – to earn as many credits as possible in as short a space of time as possible!

This gives you a challenge – how do you attract the attention of someone who is looking at your web page for 20 seconds or less?

Here are two tips –

– Know your market – spend some time surfing on the exchange that you are advertising on and pay attention to the types of web pages being advertised. Make sure that the pages you advertise are going to interest your target market.

– Use a short, simple attention grabbing page that can be read in a few seconds – there is no point advertising a huge page of text that takes 5 minutes to read. The chances are, your visitor will get bored and click on the “next” button without giving your page a fair chance.

Regularly invest some time in each of these 5 traffic generating methods and you will see your web statistics moving in the upwards direction before you know it.

4 Ways In Which You Can Achieve Critical Mass

Critical Mass Website Promotion is the elite goal rarely attained by website marketers in any industry. Reaching critical mass and getting targeted website traffic on autopilot (meaning you don’t have to promote your site for 6 months and the traffic NEVER declines) is the Holy Grail of internet marketing.

When you hit critical mass in your market, things change drastically for you and your business. Your marketing efforts go down in direct proportion to your customer support and sales going up. It sounds awesome. It IS awesome. But only a tiny fraction of websites on the net ever achieve critical mass. Here’s a nutshell version of what it takes.

1. Work Hard

Internet marketing is not a game to be won with software and quick fixes. The problem they claim to solve – bringing long-term, steady traffic to your site – is a very large one.

There is no magic pill, silver bullet, or guru secret that will help you to achieve critical mass in your market. Hard work and intense study of the internet and how people surf, what they want, and how you get in front of them while they are surfing is the only answer.

The internet marketing game is for professionals and you MUST make yourself a professional marketer of your “shop” just as you would in the real world.

2. Understand Your Market

Understanding exactly who you are selling your products or services to is the crucial first step in achieving critical mass.

The worst thing you could ever do is put the hard work in and achieve critical mass in the wrong market for your products and services!

Know your market. Know your best, ideal customer inside and out. Get your links in front of them, ideally, everywhere they surf for related information to your site.

3. Be Vigilant

Over the years I have watched people listen to what I have to say about website promotion and then take one of two paths:

1) They work their tails off and do what I tell them, with great results and rewards for their efforts (more sales, more traffic, more branding and recognition).

2) They buy my book, take a half-hearted shot at the easiest things in the book, get bored, tired, or impatient, and then they go back to chasing down quick fixes.

I have watched people who had the answers they needed in their hands drop them for more glamorous-sounding, fast solutions – which we all know don’t work.


You must work like anyone in business who is in it to succeed works. If you do not know everything about your business and how to market it, your competition will eat you alive because they WILL know everything, I guarantee it.

Just like in the real world, the one who works harder, smarter, and is vigilant about his or her business is going to come out on top every time.

You owe it to yourself and your family to learn your craft and do it better than most people in your market if you want to succeed.

4. Achieve Critical Mass

In order to hit critical mass, you must be everywhere, or nearly everywhere your best customers surf.

This means linking back to your site from reciprocal links, articles, press releases, joint ventures, forum participation, and good search engine positioning for your best keywords.

In order to have literally thousands of links pointing to your site in highly profitable areas where human beings actually go and read/surf, it takes pure time and effort.

You must submit your articles to the best free content directories. Over and over until you have published everywhere you can that is relevant to your market.

You must secure deals with highly profitable sites in your market to swap QUALITY links. I am talking about links that people will click on, not just links that only search engines will see.

You must be everywhere that is a good place to be on the net in your market. This takes time. This takes hard work. This is what it takes to have a viable, long term website traffic solution for your business.

Reaching critical mass, or even sitting down and deciding a plan of action to pursue critical mass in your market takes maturity and a final realization that you can have anything you want in the world if you want to work for it. The day you stop chasing quick fixes to drive spurts of unqualified, un-targeted traffic to your site is the first step in achieving your goals as a professional internet marketer.

After all the debate over website design, shopping carts and credit card processors, every website owner eventually comes to the startling realization that they need one more thing to survive – website traffic!

Without website traffic, it’s the same as building an expensive billboard and, instead of placing it alongside a busy highway, hiding it in your basement where nobody can see it.

Upon realizing they need traffic, most website owners run out and start blowing chunks of money and time trying to get “hits” to their sites, but they fail to realize that all “hits” are not created equal.

In their quest to get eyeballs to their websites, most online operators don’t realize there’s a big difference between driving “general” traffic to your website and driving “targeted” traffic.

Just getting any traffic is the same technique TV advertisers use. They flash ads on the screen in front of people who can’t afford or don’t need the advertised product.

Since general advertising can’t hit specific targets, advertisers hit everyone and hope that someone in their target audience is actually watching at that moment. Spam, banner ads, “safe-lists” and similar traffic techniques fall into this “general” category.

“Targeted” traffic is made up of people who are genuinely interested in what you have to say or sell online. These people either share the same interests or have an immediate need or problem they are trying to solve.

“Targeted” traffic is best because the people hitting your website have a much higher likelihood of actually making a purchase.

Targeted traffic comes from people following recommended links on other sites, typing relevant keywords into the search engines, or even reading articles you’ve written on a particular subject and then clicking over to your site for more information.

If you don’t already know where to find the best sources of targeted traffic for your website, you will need to experiment with lots of different sources to find the ones that bring visitors who give you the most “bang for your buck.”

The fastest way to determine which avenues provide the most targeted traffic is by using an “ad tracker”. An “ad tracker” is a simple program, residing on your web server, that tracks how many visitors your site gets from a particular source and how many of them purchased.

Though it sounds simple, most businesses don’t do this! Most businesses can’t tell you their visitor-to-buyer conversion percentage and, therefore, don’t know exactly how much they can invest in traffic generation and remain profitable.
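The bookkeeping an ad tracker does is simple. Here is a minimal Python sketch of the idea (the data layout and the visit/sale numbers are invented, not taken from any particular product):

```python
# Minimal ad-tracker bookkeeping: count visits and purchases per
# traffic source, then report each source's conversion rate.

stats = {}  # source -> {"visits": int, "sales": int}

def record_visit(source):
    stats.setdefault(source, {"visits": 0, "sales": 0})["visits"] += 1

def record_sale(source):
    stats.setdefault(source, {"visits": 0, "sales": 0})["sales"] += 1

def conversion_rate(source):
    s = stats[source]
    return s["sales"] / s["visits"] if s["visits"] else 0.0

# Hypothetical traffic: 200 visits from article reprints, 6 buy;
# 500 visits from a banner ad, 2 buy.
for _ in range(200):
    record_visit("article")
for _ in range(6):
    record_sale("article")
for _ in range(500):
    record_visit("banner")
for _ in range(2):
    record_sale("banner")

print(conversion_rate("article"))  # 3% of article readers bought
print(conversion_rate("banner"))   # only 0.4% of banner clickers did
```

With numbers like these in hand, you know which source deserves more of your time or money, which is exactly the visitor-to-buyer percentage most businesses can’t quote.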

Whether you pay for your website traffic with cash (pay-per-click search engines, ezine ads), or you pay for it with the sweat of your brow (article distribution, free search engines), you must identify your best and most profitable sources of targeted traffic that convert into buyers, subscribers, or leads.

Failure to identify and track where your buyers come from, and then calculate how much they really cost you, ultimately translates into failure for your online business.

eCommerce Web Site Building: Where Do I Start?

Building a web site isn’t something that is really cut and dried. There’s a huge variety of products and services that can either help you get your web site where you want it or simply confuse you. It’s also important that you make the right choices upfront so that you don’t end up having to restructure your whole web site because of some problem in your design layout. The time investment necessary to master a lot of these software packages can range from little to a VERY significant amount. Because of this, I feel it’s important to be led in the “right” direction to make sure you don’t spend time in the wrong areas, or learning some software that might not be all that useful for you later on (*coughs* Frontpage *coughs*).

Where you should start greatly depends on what you plan on attempting to do, and how deep you’re going to dive in. For a moderately professional, clean looking web site without a lot of automation or intensive animated graphics, you can probably get by with some basic knowledge of HTML, ability with a good WYSIWYG editor, and an image editing program. On the other hand, if you’re someone that’s looking to build something that will really wow your audience, then you might consider spending some major time developing animation skills with a program like Macromedia Flash. I personally always spring for what I believe will bring me the greatest amount of profit with a minimal amount of effort, and because of this I usually end up spending all of my time diving in deeper with ONLY my WYSIWYG HTML editor and my image editor.

## What is a “WYSIWYG” editor? ##
A WYSIWYG (What You See Is What You Get) HTML editor is what allows you to get by with minimal knowledge of HTML. Yes, that means you don’t have to know EVERYTHING about HTML to have a decent looking website. When you use a WYSIWYG editor, it interprets what you’re doing (inserting an image, for example) as a certain series of HTML tags with attributes, and writes them for you… Thus, what you see on your screen is what you get. Instead of seeing a bunch of HTML code in text format, you’ll mostly see what will actually show up in your browser once your web site is up WHILE you’re making it. I highly recommend using the latest version of Dreamweaver – by general consensus it is one of the best HTML editors around. Dreamweaver’s interface is very friendly, it has a built-in FTP client, and it is specifically built to be flexible enough to suit both the coder and the everyday amateur webmaster.

## Image editing? What do I need that for? ##
Okay, let’s be realistic here: if you’re going to make a professional *appearing* web site, it’s important that you can make some basic, decent looking graphics. There are a lot of graphics programs that can truly get the job done, but as far as power and flexibility go, I recommend Adobe Photoshop. Adobe Photoshop definitely takes some time getting used to, but in the end it’s VERY rewarding. I’ve ended up using my knowledge of Photoshop to make not only graphics for multiple web sites, but also touched-up portraits, business cards, flyers, and other online advertisements such as banners. In fact, I’ve used it for everything except animation… But it also comes with Adobe ImageReady, which is very good with animation. This software is amazing, and if you’re going to learn ANY image editing software, I recommend you start with Photoshop because of its wide range of overall… usefulness!

## Let’s get me some sales! ##
So your knowledge of web mastering and image editing has gotten your new web site up – how about kicking it off with a few sales? Sounds like a plan? Well, a great way to do that quickly is with pay-per-click advertising. BUT WAIT! Doesn’t that cost money? Well… Yes. But with the tools brought to us by some of the biggest pay-per-click advertisers out there, we should be able to make a good evaluation of how much profit we’re going to make without much investment upfront.

The big question behind pay-per-click advertising is whether or not it’s worth the cash when you can simply get traffic from regular search engine ranking (otherwise known as organic traffic). After all, there are plenty of companies out there that promise to help get you all the traffic you need by optimizing your web site for organic ranking. The answer to this question is quite simple: profit is profit. Through conversion tracking tools such as those offered by Yahoo! Search Marketing and Google AdWords, anyone can calculate exactly what their profit is after the cost of PPC advertising is taken out. In my opinion, Google AdWords has the most user-friendly interface among the PPC advertisers. The AdWords interface makes it very easy to see which keywords are pulling in the most sales, and which ones aren’t even worth your advertising money.
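The arithmetic behind “profit is profit” is straightforward. A minimal Python sketch with made-up numbers (all figures are hypothetical, not drawn from any real campaign):

```python
# Given click cost, conversion rate, and profit per sale, work out
# whether a PPC keyword pays for itself.

def ppc_profit(clicks, cost_per_click, conversion_rate, profit_per_sale):
    ad_spend = clicks * cost_per_click          # what the clicks cost you
    sales = clicks * conversion_rate            # expected number of buyers
    return sales * profit_per_sale - ad_spend   # net profit (or loss)

# Hypothetical keyword: 1,000 clicks at $0.25 each, 2% of visitors
# convert, and each sale nets $20.
print(ppc_profit(1000, 0.25, 0.02, 20.0))
```

Run the same numbers per keyword and the losers show up immediately: any keyword where the result is negative is costing more in clicks than it returns in sales.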

Let us not forget, however, that in order to make those conversions we’re going to need a shopping cart! There are a lot of diverse software packages out there you can use, but I’ve been using Mal’s E-Commerce free shopping cart for a number of years with great success. The cart is hosted on their servers, which means you not only get away with not paying for the software itself, but you also get out of having to buy an SSL certificate! Nothing’s a better bargain than free, eh?

## Getting those sales leads you’ve been building up to BUY! ##

Once you’ve scored a few sales it would probably be a good idea to start using some kind of lead management service. I highly recommend the use of autoresponders for this purpose. Autoresponders are, essentially, a newsletter sign-up that allows you to strategically determine what you want to send each lead after a set amount of time. For example, let’s say someone visits your web site and you offer them a free newsletter. If you were selling an ebook on some very complicated topic, you might consider sending them only information on the most basic concepts at first to get them interested. Slowly but surely, you can turn those visitors who might have left your web site never to return into some serious revenue!

Top 10 Search Engine Positioning Mistakes!

Search Engine Positioning is the art of optimizing your web site so that it gets into a high position on the search engine results page whenever someone searches for keywords that relate to your products and services.

However, some people make basic mistakes while designing their web sites and, as a result, never make it to the top, even if they work hard at it. Or they may waste a lot of money on useless tools and services.

Do you make these mistakes too?

1. Designing a Frames-based web site
This one is the biggest loser of them all. Frames may make the job of maintaining a very big and complicated web site easier, but search engines absolutely hate them. Most of the search engines cannot find their way through them and end up indexing only the home page.

Now imagine this: one of your internal pages has been returned by the search engines and the user has clicked on it. What a mess! The page looks orphaned without the outer frame and the navigation.

Lose your frames right away. You will start getting positive improvements the moment you redesign your site without frames.

2. Having an all-Flash or graphic-only home page

This is another classic mistake. Many designers design web site home pages like brochures: a beautiful cover which has to be opened to be read. But on the Internet every click takes away some prospects. Did they click your ENTER button or the Back button?

You see, search engines need content to index. If you don’t have content on the home page but only a Flash movie or a big animated graphic, how will the search engine know what you deal in? And why would it give you a high ranking?

3. Not having a good title

What’s your title, Sir?

A good title is an absolute must for getting a good search engine position and the most vital thing — the click-through. With the title, you are always walking a tightrope. You need a title with your most important keyword near the beginning but it should still appeal to the human reading the results.

Don’t, don’t stuff it with the keywords. How does this look to you —

Search engine position, search engine positioning, search engine ranking

If you saw this in the search engine results, would you click on it, or would you prefer:

Top 10 Search engine positioning mistakes!

4. Hosting your site with a FREE host

It takes away all your credibility. You want to do business from your web site, right? And you can’t even afford a decent web hosting package. How do you expect your prospects to trust you?

Most of the search engines do not spider web sites hosted on free hosts. Even if they do, they rank them quite low. How many GeoCities web sites have you seen in the top 10?

Also, will you be comfortable buying your merchandise from someone who can’t even afford a small shop? And web site hosting is much cheaper!

Do you want your visitor to look at your message or look at the pop-up that your free web host popped over your site?

Go get a good web hosting package right away.

5. Putting all links in JavaScript

Google and many other search engines don’t read or process JavaScript. So if all your links are in JavaScript only, Google is blind to them.

You must have at least one text-based link to all the pages that you want to link to. And the anchor text (the visible text on the site) should contain your important keywords, not “Click here”.
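To see why this matters, here’s a rough sketch of what a simple crawler does: it parses the HTML and follows only real `<a href>` links, so a JavaScript-only “link” is invisible to it. (The page snippet below is invented for illustration.)

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects URLs the way a simple crawler would: only from <a href=...> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

page = """
<a href="/products.html">Blue widgets</a>
<span onclick="window.location='/hidden.html'">Click here</span>
"""
parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # only the text link is found: ['/products.html']
```

The JavaScript `onclick` navigation never shows up, which is exactly why every important page needs at least one plain text link, ideally with keyword-rich anchor text.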

6. Stuffing lots of keywords in the keywords tag

Do you have a keywords tag that lists every word related to your product in one big long series? This is a certain recipe for negative points.

While many search engines have already started ignoring the keywords tag precisely because of this misuse, you should still have one for the search engines that use it. It also serves as a reminder of the keywords you are optimizing for.

However, put only the 2-3 most important keywords in there. Here’s a quick test: don’t put any term in the keywords tag if it does not appear at least once in the body copy.
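That quick test is easy to automate. A minimal sketch (the sample body copy and keywords are made up):

```python
def missing_from_body(keywords, body_copy):
    """Return the keywords that never appear in the page's body copy.

    Anything this returns should be dropped from the keywords tag.
    """
    body = body_copy.lower()
    return [kw for kw in keywords if kw.lower() not in body]

body = "Avoid these search engine positioning mistakes when you optimize."
print(missing_from_body(["search engine positioning", "link farm"], body))
# ['link farm'] -- that one fails the test, so leave it out of the tag
```

A simple substring check like this is crude (it won’t catch word-order variations), but it’s enough to enforce the rule above.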

7. Not having any outgoing links

Do you know why the Internet is called the Web? Because web sites link to each other. If you only have incoming links but no outbound links, search engines don’t appreciate it, as it violates the web-like structure of the Web.

Because some people try to conserve PageRank (a proprietary index used by Google to measure link popularity), they avoid having any outbound links. This is one big myth. You can earn very good points by having some outbound links with keyword-rich anchor text and, preferably, keyword-rich target URLs as well.

Of course, you should not turn your web page into a link-farm. There should be a few good links amidst some good content.

8. Insisting on session variables and cookies to show information

Session variables are used extensively by ecommerce-enabled sites to trace the path taken by the visitor. Shopping carts and various other applications also benefit from session variables. However, it should be possible to visit the various information and sales pages without needing session variables.

Since you can’t set cookies on search engine spiders, they can’t index your pages properly if the navigation requires cookies and session variables.

9. Regularly submitting your site to the search engines

“We will submit your site to the top 250,000 search engines every month for only $29.95.” Who has not seen these ads or received Spam with similar messages?

And which are those 250,000 search engines? There are only about 8-10 top search engines worth bothering about. And a handful of directories.

With most of the search engines, you only need to submit once to get spidered and then they will keep your listing fresh by crawling your site at regular intervals. All you need to do is to keep adding fresh content to your site and the search engines will absolutely love you. In fact, Google prefers to locate you through a link and not through the URL submission page.

For some directories like DMOZ, if you resubmit while you are waiting to be indexed, your entry is pushed to the end of the queue. So you can resubmit regularly and never get listed 🙁

10. Optimizing for more than 2 or 3 search terms

It is virtually impossible to optimize a page for more than 2-3 keywords without diluting everything. Don’t try to work on more than 3 phrases on one page. Split them up.

Group similar phrases together and work on those on one page. Take 2 or 3 of the other phrases and develop a new page with entirely new copy. Remember, you cannot just copy the same page and squeeze the new phrases in there; it will look very odd to the visitor.

Streamline Your Website Pages

Squeezing the most efficient performance from your web pages is important. The benefits are universal, whether the site is personal or large and professional. Reducing page weight can speed up the browsing experience, especially if your visitors are using dial-up internet access. Though broadband access is the future, the present still contains a great deal of dial-up users. Many sites, ecommerce sites especially, cannot afford to ignore this large section of the market. Sites with a large amount of unique traffic may also save on their total monthly traffic by slimming down their web pages. This article will cover the basics of on-page optimization in both text/code and graphics.


Graphics are the usual suspect on heavy pages. Whether as a result of a highly graphic design or a few poorly optimized images, graphics can significantly extend the load time of a web page. The first step in graphics optimization is very basic: decide if the graphics are absolutely necessary and simply eliminate or move the ones that aren’t. Moving large graphics from the homepage to a separate gallery will likely increase the number of visitors who “hang around” to let the homepage load. Separating larger photos or art into a gallery also provides the opportunity to give fair warning to users clicking on the gallery that it may take longer to load. In the case of graphical buttons, consider the use of text-based, CSS-styled buttons instead. Sites that use a highly graphic design, a common theme in website “templates”, need to optimize their graphics as best as possible.

Graphics optimization first involves selecting the appropriate file type for your image. Though this topic alone is fodder for far more in-depth analysis, I will touch on it briefly. Images come in 2 basic varieties: those that are photographic in nature, and those that are graphic in nature. Photographs have a large array of colors all jumbled together in what’s referred to as continuous tone. Graphics, such as business logos, are generally smooth, crisp and have large areas of the same color. Photographs are best compressed into “JPEGs”. The “Joint Photographic Experts Group” format can successfully compress large photos down to very manageable sizes. It is usually applied on a sliding “quality” scale between 1-100, 1 being the most compressed and lowest quality, 100 the least compressed and highest quality. JPEG is a “lossy” compression algorithm, meaning it “destroys” image information when applied, so always keep a copy of the original file. Graphics and logos generally work best in the “GIF”, or more recently, the “PNG” format. These formats are more efficient than JPEG at reducing the size of images with large areas of similar color, such as logos or graphical text.
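The decision rule above can be sketched as a small helper. This is only a rough rule of thumb under the assumptions just described (continuous tone vs. flat color, GIF’s 256-color palette); in practice you should export in both formats and compare file sizes:

```python
def suggest_format(is_photographic, distinct_colors):
    """Rough image-format heuristic; not a substitute for testing both exports."""
    if is_photographic:
        return "JPEG"      # continuous-tone images compress best as lossy JPEG
    if distinct_colors <= 256:
        return "GIF"       # flat-color art fits in GIF's 256-color palette
    return "PNG"           # lossless and handles more colors than GIF

print(suggest_format(True, 100000))   # JPEG: a photo, regardless of color count
print(suggest_format(False, 12))      # GIF: a simple 12-color logo
```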

A few general notes on other media are appropriate. Other types of media, such as Flash or sound files, also slow down a page. The first rule is always the same: consider whether they are absolutely necessary. If you choose to build the site entirely in Flash, make sure the individual sections and elements are as well compressed as possible. In the case of music, I will admit to personal bias here and paraphrase a brilliant old saying: “Websites should be seen and not heard.” Simply put, music playing in the background will not “enhance” any browsing experience.

Text and Code

The most weight to be trimmed on a page will come from graphical and media elements, but it is possible to shed a few extra bytes by looking at the text and code of a web page. In terms of actual text content, there may not be much to do here. A page’s content is key not only to the user’s understanding but also to search engine ranking. Removing or better organizing content is only necessary in extreme situations, where more than page weight is an issue. An example might be a long, text-heavy web page requiring lengthy vertical scrolling to finish. Such a page is common on “infomercial” sites, and violates basic design tenets beyond those related to page weight.

Code is a different story. A website’s code can be made more efficient in a variety of ways. First, via the use of CSS, all style elements of a web page can now be kept in an external file. This same file can be called by all of a site’s pages, providing a uniform look and feel. Not only is this more efficient; it is also the official recommendation of the W3C. The same may be said of XHTML and the abandonment of “table”-based layout. Tables, though effective for layout, produce more code than equivalent XHTML layouts using “div” tags. Where a minimum of 3 tags is required to create a “box” with content in a table, only 1 is needed using divisions. Using XHTML and CSS in combination can significantly reduce the amount of “on page” code required by a web page. A final, relatively insignificant trick is the removal of all “white space” from your code. Browsers don’t require it; it is there primarily so authors can readily read and interpret the code. The savings are minimal at best, but for sites that receive an extreme amount of traffic, even a few saved bytes will add up over time.


Target images and media files first when seeking to reduce the weight of a page. They are the largest components of overall page weight and simply removing them can significantly reduce total weight. The images that remain should be optimally compressed into a format appropriate for their type, photos or graphics. Avoid huge blocks of text that cause unnecessary vertical scrolling. Organize the site more efficiently to spread the information across multiple pages. Adopt XHTML and CSS to reduce the size of the on-page code, and call the CSS externally. These tips should help reduce the size of your pages and speed their delivery to your viewers.

Effective SEO Comes Cheap

Search engine optimization or SEO is the hottest way to drive targeted traffic to your website. Maximizing the benefits of a well optimized website will yield lots of earnings for the marketer. However, optimizing your site might cost you thousands of dollars if you are not skilled in this area.

But to tell you the truth, you can find information on low-cost SEO almost anywhere on the Internet. Only a few sources, however, really show you how to work out an affordable search engine optimization campaign. This article is one of those few.

1. Link exchanges

One cheap SEO method that can get you good results is link exchanges, that is, linking to and from other web sites. Depending on the websites you exchange links with, this could even cost you nothing at all. Contact the author or owner of the web site you want to exchange links with. You will be surprised by the eventual spike in your page ranking from this means of getting your website optimized.

2. Write or acquire key word rich articles

Writing truly informative and keyword-rich articles is one surefire way to make your Internet business more visible than ever. Either you write your own articles or you get them from article directories that allow you to post them on your website as long as you keep the resource box or author’s byline intact. Just don’t stuff your articles with so many keywords that they become a bore to read. The readability and freshness of your articles will determine whether your readers keep coming back to your website or not.

3. Catchy Domain Name

What better way to make your target visitors remember your website than a very easy-to-recall domain name? Something short and sweet will prove invaluable. Registering your domain name is not free. But creativity is.

4. Organize your site navigation

Providing easy navigation around your site is one way to put your visitors at ease with it. This, in turn, will improve the flow of traffic to your website.

Low-cost SEO is always evolving, like any other approach in information technology. There are many methods that can land you in the top ten rankings of Google or any other search engine. Some may cost a lot, but there are methods that can give you the same results at a low price, or that you can even do on your own, such as those mentioned above.

Search Engine Optimization History

Webmasters today spend quite some time optimizing their websites for search engines. Books have been written about search engine optimization, and something of an industry has developed to offer search engine optimization services to clients. But where did this all start? How did we end up with the SEO world we live in today (seen from a webmaster’s standpoint)?

A guy named Alan Emtage, a student at McGill University, developed the first search engine for the Internet in 1990. This search engine was called “Archie” and was designed to archive documents available on the Internet at that time. About a year later, Gopher, an alternative to Archie, was developed at the University of Minnesota. These two early search engines triggered the birth of what we use as search engines today.

In 1993, Matthew Gray developed the very first search engine robot, the World Wide Web Wanderer. However, it took until 1994 for search engines as we know them today to be born. Lycos, Yahoo! and Galaxy were started, and as you probably know, two of those are still around today (2005).

In 1994 some companies started experimenting with the concept of search engine optimization. The emphasis was put solely on the submission process at that time. Within 12 months, the first automated submission software packages were released. Of course it did not take long until the concept of spamming search engines was ‘invented’. Some webmasters quickly realized that they could swamp and manipulate search results pages by over-submitting their sites. However, the search engines soon fought back and changed things to prevent this from happening.

Soon, search engine optimizers and the search engines started playing a sort of “cat and mouse” game. Once a way to manipulate a search engine was discovered, the SE-optimizers took advantage of it. The search engines subsequently revised and enhanced their ranking algorithms to respond to these strategies. It soon became clear that mainly a small group of webmasters was abusing the search engine algorithms to gain advantage over the competition. Black Hat search engine optimization was born. This unethical way of manipulating search engines resulted in ever faster responses from the search engines, which try to keep their results clean of spam to provide the best service to their users.

The search engine industry quickly realized that SEO (Search Engine Optimization) as an industry would not go away, and that in order to maintain useful indexes, they would need to at least accept the industry. Search engines now partially work with the SEO industry but are still very eager to weed out SPAMMERS who try to manipulate the results.

When Google.com started to be the search engine of choice for more than 50% of Internet users, it was obvious to anyone in the industry that search engine spamming had reached a new dimension. Google.com was so important to the success of a website that many webmasters concentrated on optimizing their sites for Google only, as the payoff was worth the effort. Again, Black Hat SEO took place, pushing the honest webmasters and their sites down in the search results. Google started fighting back. Several major updates to Google’s algorithms forced all webmasters to adapt to new strategies. But Black Hat SE-optimizers suddenly saw something different happening: instead of just being pushed down in the search results, their websites were suddenly completely removed from the search index.

And then something called the “Google Sandbox” started to show up in discussions. Existing websites disappeared into the sandbox, or new websites never made it into the index and were considered to be in the Google Sandbox. The sandbox seemed to be the place where Google would ‘park’ websites considered either spammy or not in conformance with Google’s policies (duplicate websites under different domain names, etc.). The Google Sandbox has so far neither been confirmed nor denied by Google, and many webmasters consider it a myth.

In late 2004 Google announced that it had 8 billion pages/sites in its search index. The gap between Google and its next two competitors (MSN and Yahoo!) seemed to grow. However, in 2005 MSN as well as Yahoo! started fighting back, putting life back into the search engine war. MSN and Yahoo! seemed to gain ground in delivering better and cleaner results compared to Google. In July of 2005 Yahoo! announced that it had over 20 billion pages/sites in its search index, leaving Google far behind. No one search engine has won the war yet. The three major search engines, however, are eagerly fighting for market share, and one mistake could change the fortune of any of them. It will be a rocky ride, but worth watching from the sidelines.