
Top 10 Search Engine Positioning Mistakes!

Search Engine Positioning is the art of optimizing your web site so that it gets into a high position on the search engine results page whenever someone searches for keywords that relate to your products and services.

However, some people make basic mistakes while designing their web sites and, as a result, never make it to the top. Even if they work hard at it! Or maybe they waste a lot of money on useless tools and services.

Do you make these mistakes too?

1. Designing a Frames-based web site
This one is the biggest loser of them all. Frames may make the job of maintaining a very big and complicated web site easier, but search engines absolutely hate them. Most search engines cannot find their way through frames easily and end up indexing only the home page.

Now imagine this. One of your internal pages has been returned by the search engines and the user has clicked on it. What a mess! The page looks orphaned without the outer frame and the navigation.

Lose your frames right away. You will start seeing improvements the moment you redesign your site without frames.
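To make the problem concrete, here is a rough sketch of a typical frames-based home page (the file names are just placeholders). Notice that the page itself contains almost no text for a spider to index:

```html
<!-- index.html: a typical frames-based home page.
     menu.html and content.html are hypothetical file names;
     the only thing a spider sees here is the frameset itself. -->
<html>
<head>
  <title>Acme Widgets</title>
</head>
<frameset cols="200,*">
  <frame src="menu.html" name="nav">
  <frame src="content.html" name="main">
  <noframes>
    <body>Please use a frames-capable browser.</body>
  </noframes>
</frameset>
</html>
```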

2. Having an all-Flash or graphic-only home page

This is another classic mistake. Many designers design web site home pages like brochures: a beautiful cover that has to be opened before you can read anything. But on the Internet, every click takes away some prospects. Did they click your ENTER button or the Back button?

You see, search engines need content to index. If you don’t have content on the home page but only a Flash movie or a big animated graphic, how will the search engine know what you deal in? And why would it give you a high ranking?

3. Not having a good title

What’s your title, Sir?

A good title is an absolute must for getting a good search engine position and, even more important, the click-through. With the title, you are always walking a tightrope: you need your most important keyword near the beginning, but the title should still appeal to the human reading the results.

Don’t, don’t stuff it with keywords. How does this look to you:

Search engine position, search engine positioning, search engine ranking

If you saw this in the search engine results, would you click on it, or would you prefer:

Top 10 Search engine positioning mistakes!
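In HTML terms the difference is just what goes between the title tags; the wording below is only an example:

```html
<!-- Keyword-stuffed title: unlikely to earn the click -->
<title>Search engine position, search engine positioning, search engine ranking</title>

<!-- Keyword near the front, but still written for the human reading the results -->
<title>Search engine positioning: the top 10 mistakes to avoid</title>
```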

4. Hosting your site with a FREE host

It takes away all your credibility. You want to do business from your web site. Right? And you can’t even afford a decent web hosting package. How do you expect your prospect to trust you?

Most search engines do not spider web sites hosted on free hosts. Even if they do, they rank them quite low. How many GeoCities web sites have you seen in the top 10?

Also, will you be comfortable buying your merchandise from someone who can’t even afford a small shop? And web site hosting is much cheaper!

Do you want your visitor to look at your message or look at the pop-up that your free web host popped over your site?

Go get a good web hosting package right away.

5. Putting all links on Javascript

Google and many other search engines don’t read or process JavaScript. So if all your links are in JavaScript only, Google is blind to them.

You must have at least one text-based link to all the pages that you want to link to. And the anchor text (the visible text on the site) should contain your important keywords, not “Click here”.
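Here is a rough sketch of the difference (the page name and anchor text are only examples):

```html
<!-- A JavaScript-only link: many spiders cannot follow this -->
<a href="javascript:void(0)" onclick="window.location='widgets.html'">Click here</a>

<!-- A plain text link with keyword-rich anchor text that any spider can follow -->
<a href="widgets.html">Discount blue widgets</a>
```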

6. Stuffing lots of keywords in the keywords tag

Do you have a keywords tag that lists every word related to your product in one big long series? This is a sure recipe for attracting negative points.

While many search engines have already started to ignore the keywords tag precisely because of this misuse, you should still have one for the search engines that use it. It also serves as a reminder of the keywords you are optimizing for.

However, put only the 2-3 most important keywords in there. Here’s a quick test – don’t put any term in the keywords tag if it does not appear at least once in the body copy.
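A lean keywords tag might look something like this; the terms are only examples, and each of them should also appear in the body copy:

```html
<!-- Keep the keywords tag short: only the 2-3 terms the page is actually about -->
<meta name="keywords" content="search engine positioning, search engine ranking">
```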

7. Not having any outgoing links

Do you know why the Internet is called the Web? Because web sites link to each other. If you only have incoming links but no outbound links, search engines do not appreciate it, as it violates the web-like structure of the Web.

Some people avoid having any outbound links because they try to conserve PageRank (a proprietary index used by Google to measure link popularity). This is one big myth. You can score very good points if you have some outbound links with keyword-rich anchor text and, preferably, a keyword-rich target URL as well.

Of course, you should not turn your web page into a link-farm. There should be a few good links amidst some good content.
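For illustration, an outbound link along those lines might look like this (example.com, the path and the anchor text are placeholders, not a recommendation of any particular site):

```html
<!-- An outbound link with keyword-rich anchor text pointing to a keyword-rich URL -->
<p>For background reading, see this
  <a href="http://www.example.com/search-engine-positioning-guide.html">search engine positioning guide</a>.
</p>
```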

8. Insisting on session variables and cookies to show information

Session variables are used extensively by ecommerce-enabled sites, mainly to trace the path a visitor takes. Shopping carts and various other applications also benefit from session variables. However, it should be possible to visit your information and sales pages without needing session variables at all.

Since you can’t set cookies on the search engine spiders, they can’t index your pages properly if the navigation requires cookies or session variables.
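As a rough sketch (the page and parameter names are made up), the same product page might be linked in two ways; a spider can only make proper use of the second:

```html
<!-- Reachable only with a live session: the spider never gets a session ID -->
<a href="product.asp?id=42&amp;sessionid=8F3A21C7">Blue widget</a>

<!-- Reachable without any session state: the spider can crawl and index this -->
<a href="product.asp?id=42">Blue widget</a>
```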

9. Regularly submitting your site to the search engines

“We will submit your site to the top 250,000 search engines every month for only $29.95.” Who has not seen these ads or received spam with similar messages?

And which are those 250,000 search engines? There are only about 8-10 top search engines worth bothering about. And a handful of directories.

With most of the search engines, you only need to submit once to get spidered and then they will keep your listing fresh by crawling your site at regular intervals. All you need to do is to keep adding fresh content to your site and the search engines will absolutely love you. In fact, Google prefers to locate you through a link and not through the URL submission page.

For some directories like DMOZ, if you resubmit while you are waiting to be indexed, your entry is pushed to the end of the queue. So you can resubmit regularly and never get indexed 🙁

10. Optimizing for more than 2 or 3 search terms

It is virtually impossible to optimize a page for more than 2-3 keywords without diluting everything. Don’t try to work on more than 3 phrases on one page. Split.

Group similar phrases together and work on those on one page. Take 2 or 3 of the remaining phrases and develop a new page with entirely new copy. Remember, you cannot just copy the same page and squeeze the new phrases in there; it will look very odd to the visitor.


The Good and the Bad of SEO – From Google’s Mouth!

I recently had the opportunity to ask questions of some Google staffers. There were some questions I felt I needed to get verification on, so when I had the opportunity via a conference call I took it.

In this article I highlight some of the points made during the call so you know what Google thinks.

You know it’s bad when you take time from your holidays to come into work to attend a conference call. But that’s what I did a few weeks ago. You see, I had to, because I was going to have the opportunity to ask some Google employees specific questions on things I’d been pretty sure about, but wanted to hear right from the horse’s mouth.

The call lasted less than an hour, but in that time I found that many things I had figured were indeed true. So let’s start with the most obvious:

Is PageRank still important?

The short answer is yes: PageRank has always been important to Google. Naturally they couldn’t go into details, but it is as I suspected. Google still uses the algorithm to help determine rankings. Where it falls in the algo mix, though, is up for speculation. My feeling, however, is that they’ve simply moved where the PageRank value is applied in the grand scheme of things. If you want to know what I think, be sure to read this article.

Are dynamic URLs bad?

Google says that a dynamic URL with 2 parameters should get indexed. When we pressed a bit on the issue we also found that URLs themselves don’t contribute too much to the overall ranking algorithms. In other words, a page named Page1.asp will likely perform as well as Keyword.asp.

The whole variable thing shouldn’t come as a surprise. It is true that Google will indeed index dynamic URLs, and I’ve seen sites with as many as 4 variables get indexed. The difference, however, is that in almost all cases I’ve seen, the static URLs outrank the dynamic URLs, especially in highly competitive or even moderately competitive keyword spaces.
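To illustrate what I mean (the domain, parameters and file names are made up):

```html
<!-- A dynamic URL with two parameters: Google says this should still get indexed -->
<a href="http://www.example.com/product.asp?cat=5&amp;item=42">Large blue widget</a>

<!-- The same page behind a rewritten, static-looking URL: in my experience these
     tend to outrank their dynamic equivalents in competitive keyword spaces -->
<a href="http://www.example.com/products/large-blue-widget.html">Large blue widget</a>
```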

Is URL rewriting OK in Google’s eyes?

Again, the answer is yes, provided the URLs aren’t too long. While the length of the URL isn’t necessarily an issue, if they get extremely long they can cause problems.

In my experience, long rewritten URLs perform just fine. The important thing is the content on the page.

That was a common theme throughout the call: content is king. Sure, optimized meta tags, effective interlinking and externalized JavaScript all help, but in the end, if the content isn’t there, the site won’t do well.

Do you need to use the Google Sitemap tool?

If your site is already getting crawled effectively by Google you do not need to use the Google sitemap submission tool.

The sitemap submission tool was created by Google to provide a way for sites which normally do not get crawled effectively to now become indexed by Google.

My feeling here is that if you MUST use the Google sitemap to get your site indexed then you have some serious architectural issues to solve.

In other words, just because your pages get indexed via the sitemap doesn’t mean they will rank. In fact I’d bet you that they won’t rank because of those technical issues I mentioned above.

Here I’d recommend getting a free tool like Xenu and spidering your site yourself. If Xenu has problems, then you can be almost assured that Googlebot will have crawling problems too. The nice thing about Xenu is that it can help you find those problems, such as broken links, so that you can fix them.

Once your site becomes fully crawlable by Xenu I can almost guarantee you that it will be crawlable and indexable by the major search engine spiders. Download it from http://home.snafu.de/tilman/xenulink.html

Does clean code make that much of a difference?

Again, the answer is yes. By externalizing any code you can and cleaning up things like tables you can greatly improve your site.

First, externalizing JavaScript and CSS helps reduce code bloat, which makes the visible text more important. Your keyword density goes up, which makes the page more authoritative.

Similarly, minimizing the use of tables also helps reduce the HTML to text ratio, making the text that much more important.

Also, as a tip, your visible text should appear as close to the top of your HTML code as possible. Sometimes this is difficult, however, as elements like top and left navigation appear first in the HTML. If this is the case, consider using CSS to reposition the text and those elements appropriately.
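Here is a minimal sketch of what that can look like; the file names and class names are my own examples, not anything Google specified:

```html
<head>
  <title>Blue widgets from Acme</title>
  <!-- Styles and scripts moved out of the page to cut code bloat -->
  <link rel="stylesheet" href="styles.css">
  <script src="menu.js"></script>
</head>
<body>
  <!-- The keyword-rich copy comes first in the HTML source... -->
  <div class="content">
    <h1>Blue widgets</h1>
    <p>Our blue widgets are...</p>
  </div>
  <!-- ...while the navigation appears later in the source and is
       repositioned to the top or left purely with CSS -->
  <div class="nav">...</div>
</body>
```

In styles.css you can then float or absolutely position the .nav block so it still renders at the top or left of the page, even though it comes later in the source.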

Do Keywords in the domain name harm or help you?

The short answer is neither. However, too many keywords in a domain can set off flags for review. In other words, blue-widgets.com won’t hurt you, but discount-and-cheap-blue-and-red-widgets.com will likely raise flags and trigger a review.

Page naming follows similar rules: while you can use keywords as page names, it doesn’t necessarily help (as I mentioned above). Further, long names can cause reviews, which will delay indexing.

How many links should you have on your sitemap?

Google recommends no more than 100 links per page.

While I’ve seen pages with more links get indexed, it appears that it takes much longer. In other words, the first 100 links will get indexed right away; however, it can take a few more months for Google to identify and follow anything beyond the first 100 links.

If your site is larger than 100 pages (as many are today) consider splitting up your sitemap into multiple pages which interlink with each other, or create a directory structure within your sitemap. This way you can have multiple sitemaps that are logically organized and will allow for complete indexing of your site.
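One way to organize it (the file names are only illustrative) is a short index page that links to a handful of sub-sitemaps, each staying under the 100-link mark:

```html
<!-- sitemap.html: keep it under ~100 links by pointing to sub-sitemaps -->
<ul>
  <li><a href="sitemap-products.html">Product pages</a></li>
  <li><a href="sitemap-articles.html">Articles</a></li>
  <li><a href="sitemap-support.html">Support pages</a></li>
</ul>

<!-- sitemap-products.html then lists the individual product pages,
     again keeping the count under roughly 100 links -->
```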

Can Googlebot follow links in Flash or JavaScript?

While Googlebot can identify links in JavaScript, it cannot follow those links. Nor can it follow links in Flash.

Therefore I recommend having your links elsewhere on the page. It is OK to have links in Flash or JavaScript, but you need to account for the crawlers not finding them. The use of a sitemap can help get those links found and crawled.

As an alternative, there are menus which use CSS and JavaScript to produce a navigation system that looks very similar to common JavaScript navigation yet uses static hyperlinks which crawlers can follow. Do a little research and you should be able to find a spiderable alternative to whatever type of navigation your site currently has.
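Here is a minimal sketch of that kind of menu; the class name and page names are my own. The links are ordinary anchors that crawlers can follow, and any dropdown behaviour is layered on with CSS or JavaScript purely for presentation:

```html
<!-- The links are plain <a> tags, so spiders can follow them even
     if the menu is styled or animated with CSS/JavaScript -->
<ul class="nav">
  <li><a href="widgets.html">Widgets</a>
    <ul>
      <li><a href="blue-widgets.html">Blue widgets</a></li>
      <li><a href="red-widgets.html">Red widgets</a></li>
    </ul>
  </li>
  <li><a href="about.html">About us</a></li>
</ul>
```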

Overall, while I didn’t learn anything earth-shattering, it was good to get validation “from the horse’s mouth”, so to speak.

I guess it just goes to show that there is plenty of information out there on the forums and blogs. The question becomes determining which of that information is valid and which isn’t. But that, I’m afraid, usually comes with time and experience.