Tag Archives: Pagerank

What Are SEO Back Links?

Back links are an important part of most successful websites and are almost essential for getting reasonable listings on search engines. Back links are, quite simply, links from other websites to yours. Most search engines' algorithms rely heavily on the number of quality incoming links your website has. I would like to stress quality: there are many unscrupulous link farms and link schemes that, if you participate in them, will get you banned from the major search engines. Search engines see links from other, usually established, websites as a mark of respect and a sign that you probably have something to offer.

If your website is just starting out, or you are new to the world of SEO, then you should know that back links are as crucial a part of a website as almost any content on it. With just one link from another indexed site, you can usually get a few search engine bots visiting your site within hours, instead of the days or even weeks manual submission takes. With a good-quality site, back links usually mean you can rank for some reasonable keywords in some search engines within weeks.

Armed with the above information, it is easy to see how vital back links are, so now you are wondering where to get them. You are advised to stay away from the disreputable sources I mentioned earlier (they rarely work anyway); instead, visit places like the Sitepoint Forums & WebHosting Talk, where many professional webmasters can be found. It is recommended that you aim to link-swap with websites that have a better Google PageRank than your own.

SEO Success: Step Three is Creating Long-Term Popularity

Finally, after the hard-core efforts directly related to generating traffic to your website, your next step is to develop a strategy that creates follow-on, long-term traffic. Several methods exist for this Tier III strategy:

– Taking the large list of ancillary keywords that relate directly or indirectly to your website, begin purchasing keyword-related domain names that will be used to create traffic-driving websites pointing to your money site.

– Set up satellite pages on the keyword domain names with the idea of capturing traffic geared toward those specific keywords, developing niche-market traffic. As you build niche markets, these sites inform and attract new customers and link back to your money site, so you will be able to capitalize on those customers for the other products and/or services your company offers.

– Create links, back links and cross-links between the satellite pages and your money site, improving your link popularity for all websites, especially your money site.

– Display articles or e-books related to your industry on your websites, placing them in the best locations based on keyword dominance.

– For long-term, viral marketing results, create articles or e-books that you either sell on your website or offer to public-domain websites on the condition that your link information is retained by anyone who reuses them; these articles will create links back to your money site or your niche sites.

– Follow the same steps in Tiers I and II for all the niche sites as you did for your original money site; that is, design an SEO-optimized website and follow the Tier II strategy to create the necessary link popularity to drive traffic to these sites.

Monitoring
In order to determine whether your SEO efforts are successful, you must monitor the number and quality of back links, PR and web trend statistics. There are many sources for generating this data, as well as programs that will assist in the analysis. Whatever method you choose, whether specific programs or a spreadsheet for data entry, be consistent in tracking the data. There will be day-to-day fluctuations that you should expect. You are looking for general, overall upward trends, not short-term blips. Upward progress followed by maintenance of a strong position with slow, steady growth indicates a successful SEO campaign.
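As a rough illustration of consistent tracking, here is a minimal Python sketch (the daily back-link counts are hypothetical) that smooths day-to-day fluctuations with a moving average, making the overall trend visible rather than the short-term blips:

```python
from statistics import mean

def moving_average(values, window=7):
    """Smooth day-to-day fluctuations so the underlying trend is visible."""
    if len(values) < window:
        return []
    return [mean(values[i - window + 1:i + 1]) for i in range(window - 1, len(values))]

# Hypothetical daily back-link counts, with short-term blips.
daily_backlinks = [100, 104, 98, 110, 107, 112, 109, 118, 115, 121]
trend = moving_average(daily_backlinks, window=3)

# A rising smoothed series suggests the campaign is working.
is_growing = trend[-1] > trend[0]
print(trend[0], trend[-1], is_growing)
```

The same smoothing can be done in a spreadsheet; the point is to compare smoothed values over weeks, not raw values over days.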

This 3-Tiered approach to SEO strategy is very effective when implemented with the patience long-term results require. Short-term "fixes" and "tricks" may have an effect in the short run, but as the search engines change and adapt their algorithms (as they do almost monthly), what worked today for quick results may actually get you banned tomorrow. This strategy is based on the search engines' own directives: well-designed websites, free of bad code, offering information, services or products of interest to the internet community will create their own base of popularity, for which you will be rewarded with strong, growing traffic. When you are in it for the long haul, your strategy must rely on long-term efforts.

Keyword Targeting Strategy In Your Site

Once the keywords for the site have been decided, one has to come up with a strategy to target them across the site. Here is a primer on that.

Keyword Targeting Strategy for Single-Word Keywords:
----------------------------------------------------

A single-word keyword is useful for attracting a general audience and can help a new website gain rank. The strategy for single-word keywords is to use them mostly on root-level or top-level pages. These are the pages that attract general traffic and usually do not deal in specifics, so theme-based single-word keywords can be targeted there.

Keyword Targeting Strategy for Multi-Word Keywords:
---------------------------------------------------

Multi-word keywords are useful for attracting a targeted audience and should therefore be used topic-wise, with each page's keywords matching the subject of that page. A relevant set of keywords should be used in the Title tag, header tags, meta tags, body text, Alt attributes, anchor text, comment tags and in the URL (uniform resource locator) of that specific page. Use underscores or hyphens to separate keywords. These keywords are normally targeted on deeper, sub-directory-level pages.
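To illustrate the hyphen/underscore advice, here is a small Python sketch (the `slugify` helper is hypothetical, not part of any SEO tool) that turns a multi-word keyphrase into a URL-friendly page name:

```python
import re

def slugify(phrase, separator="-"):
    """Turn a multi-word keyphrase into a URL-friendly page name.

    Hyphens (or underscores) let search engines read each word separately.
    """
    words = re.findall(r"[a-z0-9]+", phrase.lower())
    return separator.join(words)

print(slugify("Cheap Blue Widgets"))       # cheap-blue-widgets
print(slugify("Cheap Blue Widgets", "_"))  # cheap_blue_widgets
```

A page named this way carries the keyphrase in its URL, which is one of the placements listed above.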

Keyword Targeting Strategy for Theme-Based Keywords:
----------------------------------------------------

Theme-based keywords are useful for attracting a targeted audience, so their use is strongly recommended. Although it is not a hard and fast rule, in a theme-based approach we typically use general keywords at the root level and specific keywords at the directory level.

Overall, one can safely say that keyword targeting follows a dartboard strategy: the smallest circle attracts your core audience and should carry theme-oriented multi-word keywords. As we move outward on the dartboard, the circle (traffic) gets larger and the keywords get simpler (more general), even though they remain theme-based.

KEYWORDS and KEYPHRASES Usage in Domain Names:
----------------------------------------------

The use of keywords and keyphrases in the domain name and in directory-page URLs is also a factor in Search Engine Optimization (SEO), and it helps inform your targeted audience about the site's content. Therefore, special care should be exercised in choosing a domain name. Every search engine starts reading a web page at its domain name.

Having said that, let me add that there is a raging debate in the SEO community about the exact importance of keywords in the domain name. One camp believes it is hugely important to have keywords in the domain, whereas the other maintains that a brand-building domain name matters more than a keyword-incorporating one.

Whether the keyword itself is a ranking factor is debatable; however, it has been observed that domain name extensions definitely play a part in rankings. Search engines (especially Google) have a predilection for sites with .gov, .edu or .mil extensions. Search engines believe that information at these domains has a greater likelihood of being authentic, since no commercial interest is served there.

Domain names can be classified into two categories:

1. Keyword Specific

2. Brand Name Specific

Keyword Specific Domain Name
—————————-

When keywords or keyphrases are used in a domain name, it is called a keyword-specific domain name.

A keyword in the domain name has, to my mind, two advantages.

First, it is worth remembering that most people link to you using your URL. If your URL contains the keyword or keyphrase, then you are automatically getting the keyword into the all-important anchor text.

Secondly, though marginally, in a ranking scenario where every other parameter is equal, the keyword-specific domain name will enjoy a slight edge over the other website. Remember, however, that a keyword used in the domain name should be primary and generic; specific keywords can be used in the subdomains. Read about the use of subdomains.

Keywords and keyphrases separated by a hyphen or underscore in the domain name are said to be preferred, as search engines read them as separate words, so the domain name can have a whole search phrase incorporated into it. The flip side is that domain names with hyphens or underscores are inconvenient to remember and pass along.

Brand Name Specific Domain Name
——————————-

When a company's or organization's name is used in the domain name to brand it, it is called a brand-name-specific domain name.

A brand-name-specific domain name does not help online searches at all. However, it is a very powerful tool for fixing the company's identity in users' minds. In the anonymous online world, a brand spells loyalty, trust and value. So if you choose to build your brand rather than pursue deep-rooted optimization, one way of incorporating keywords is through your directory and page names.

KEYWORD DENSITY
—————

Keyword density is the proportion of the searched term (keyword or keyphrase) to the total number of words on a given page. The ideal keyword density is 6%-8%, though various search engines have different tolerance levels before their spam filters are activated. Higher keyword density does help boost a page's ranking.

Keyword density can be increased by using target keywords repeatedly in the Title tag, header tags, body text, comment tags, Alt attributes, anchor text, paragraph tags, the domain name and in directory/page names.

However, one disadvantage of trying to hike the keyword density is that the visible text starts to look spammy if it is not carefully crafted. That makes for bad copy.
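As a rough sketch of the density calculation described above (counting exact single-word matches only; real tools may count phrases or word stems differently), in Python:

```python
import re

def keyword_density(text, keyword):
    """Occurrences of the keyword as a percentage of total words on the page."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

page = ("Blue widgets are our specialty. We stock blue widgets in every "
        "size, and every widget ships free.")
print(round(keyword_density(page, "widgets"), 1))  # 2 of 17 words -> 11.8
```

Running a check like this on your copy before publishing is an easy way to spot pages drifting past the tolerance levels mentioned above.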

Are You Getting Nuked By Google Lately?

Since the last Google update, there have been many instances and examples of the Google Nuke Bot! That is what I call it, anyway. Have you visited a favorite website lately, only to realize it has been nuked by Google?

More and more we are seeing internet marketing / SEO companies getting nuked, with Google completely removing them from its data banks. I am not going to mention any names, because I'm sure the owners of the once-popular websites already know and are embarrassed by this development.

Since word went out about WebPositionGold being banned from Google for the automatic queries it sends to Google, we are noticing other related websites going down for the count as well.

For the info on WebPositionGold, go here:
http://www.socialpatterns.com/search-engine-marketing/webposition-banned/

The thing is, we already know about WebPositionGold; what about other sites that are getting hit hard? Has your site been nuked?

It seems that some sites that had thousands of links listed in Google are getting hit the hardest. It looks like Google is cracking down on "spam tactics", "submission tactics" and anything related to unethical SEO practices.

Is Google Making An Effort To Uphold Their Webmaster Guidelines?

Will it come to a point where, if we don't uphold the Google guidelines, we cannot be successful online? The thought is ridiculous but almost scary to contemplate! What about websites that still hide text by matching it to the background colour? Hidden div layers? Mirror pages? Why hasn't Google attacked those issues first?

You could almost assume that by nuking websites that send automatic ranking and link-popularity queries to its database, Google is making a huge effort to relieve the strain on its query servers and free up some capacity.

How Does Getting Your Website Nuked From Google Affect Your Credibility?

An event like this could ultimately ruin your reputation online. People who have come to trust your knowledge and judgment on Google rankings may never look at your company the same way again. They might think to themselves, "I don't want to get nuked like they did!"

How Can You Tell You’ve Been Nuked?

* Your Google PageRank is now 0-2/10 when it should be at least 5/10
* You no longer have any backlinks listed in Google
* You have zero internal pages listed in Google when searching (site:www.yoursite.com)
* Google's cache of your website can no longer be found

For newer websites, don't confuse this nuking process with your own evolution online. Getting and maintaining a high PR level takes a lot of work.

Once Nuked, Does Google Still Come Back?

The question I have for websites that have been nuked: can you still see Google in your site's stat log files? If so, I wonder whether Google is still keeping an eye on you and watching your every move.
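One way to answer that question is to look for Googlebot's user-agent string in your access logs. A minimal Python sketch, using a hypothetical excerpt of an Apache combined-format log:

```python
import re

def googlebot_hits(log_lines):
    """Count requests whose user-agent identifies Googlebot."""
    return sum(1 for line in log_lines if re.search(r"Googlebot", line))

# Hypothetical combined-format access log excerpt.
sample = [
    '66.249.66.1 - - [10/Oct/2006:13:55:36 +0000] "GET / HTTP/1.1" 200 2326 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '192.0.2.7 - - [10/Oct/2006:13:56:01 +0000] "GET /about HTTP/1.1" 200 1204 "-" '
    '"Mozilla/4.0 (compatible; MSIE 6.0)"',
]
print(googlebot_hits(sample))  # 1
```

A steady count of Googlebot visits after a ban would suggest the site is still being watched.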

In Conclusion:

Stay away from programs that generate automatic queries to Google. Don't check your link popularity three times a week, and especially don't check your search engine rankings twice a day. Simply promote your website and measure your success through your internal website stats and monthly profits. Google doesn't appreciate websites that constantly draw on its resources to measure their success, so simply take that out of your daily routine.

Cheers to your online success!

Search Engine Optimization

The Good and the Bad of SEO, From Google's Mouth!

I recently had the opportunity to ask questions of some Google staffers. There were some questions I felt I needed to get verification on, so when I had the opportunity via a conference call I took it.

In this article I highlight some of the points made during the call so you know what Google thinks.

You know it's bad when you take time from your holidays to come into work to attend a conference call. But that's what I did a few weeks ago. You see, I had to, because I was going to have the opportunity to ask some Google employees specific questions on things I'd been pretty sure about, but wanted to hear right from the horse's mouth.

The call lasted less than an hour, but in that time I found that many things I had figured were indeed true. So let's start with the most obvious:

Is PageRank still important?

The short answer is yes: PageRank has always been important to Google. Naturally they couldn't go into details, but it is as I suspected: Google still uses the algorithm to help determine rankings. Where it falls in the algo mix, though, is open to speculation. My feeling, however, is that they've simply moved where the PageRank value is applied in the grand scheme of things. If you want to know what I think, be sure to read this article.

Are dynamic URLs bad?

Google says that a dynamic URL with 2 parameters should get indexed. When we pressed a bit on the issue we also found that URLs themselves don’t contribute too much to the overall ranking algorithms. In other words, a page named Page1.asp will likely perform as well as Keyword.asp.

The whole variable thing shouldn't come as a surprise. It is true that Google will index dynamic URLs, and I've seen sites with as many as 4 variables get indexed. The difference, however, is that in almost all cases I've seen, the static URLs outrank the dynamic URLs, especially in highly competitive or even moderately competitive keyword spaces.

Is URL rewriting OK in Google’s eyes?

Again, the answer is yes, provided the URLs aren't too long. While URL length isn't normally an issue, extremely long URLs can cause problems.

In my experience, long rewritten URLs perform just fine. The important thing is the content on the page.

That was a common theme throughout the call: content is king. Sure, optimized meta tags, effective interlinking and externalized JavaScript all help, but in the end, if the content isn't there, the site won't do well.

Do you need to use the Google Sitemap tool?

If your site is already getting crawled effectively by Google you do not need to use the Google sitemap submission tool.

The sitemap submission tool was created by Google to provide a way for sites which normally do not get crawled effectively to become indexed by Google.

My feeling here is that if you MUST use the Google sitemap to get your site indexed then you have some serious architectural issues to solve.

In other words, just because your pages get indexed via the sitemap doesn’t mean they will rank. In fact I’d bet you that they won’t rank because of those technical issues I mentioned above.

Here I'd recommend getting a free tool like Xenu and spidering your site yourself. If Xenu has problems, you can almost be assured Googlebot will have crawling problems too. The nice thing about Xenu is that it can help you find those problems, such as broken links, so that you can fix them.

Once your site becomes fully crawlable by Xenu I can almost guarantee you that it will be crawlable and indexable by the major search engine spiders. Download it from http://home.snafu.de/tilman/xenulink.html
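The first step of any Xenu-style check is extracting the links a crawler would see on a page. Here is a minimal Python sketch of that step using only the standard library (a full checker would then request each URL and report the ones that return errors):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href targets from anchor tags, the same links a spider would follow."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical page fragment.
page = '<html><body><a href="/about.html">About</a> <a href="page1.asp">One</a></body></html>'
parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # ['/about.html', 'page1.asp']
```

If a link never shows up in a pass like this (for example, because it is generated by JavaScript), a crawler is unlikely to find it either.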

Does clean code make that much of a difference?

Again, the answer is yes. By externalizing whatever code you can and cleaning up things like tables, you can greatly improve your site.

First, externalizing JavaScript and CSS reduces code bloat, which makes the visible text more important. Your keyword density goes up, which makes the page appear more authoritative.

Similarly, minimizing the use of tables improves the ratio of text to HTML, making the text that much more important.
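The text-to-markup ratio mentioned above can be estimated crudely with a few regular expressions. A Python sketch (a naive approximation for self-auditing, not how any search engine actually measures it):

```python
import re

def html_to_text_ratio(html):
    """Visible-text length divided by total markup length (rough proxy for code bloat)."""
    text = re.sub(r"<script.*?</script>", "", html, flags=re.S | re.I)  # drop inline scripts
    text = re.sub(r"<[^>]+>", "", text)                                 # strip remaining tags
    text = re.sub(r"\s+", " ", text).strip()
    return len(text) / len(html) if html else 0.0

bloated = "<script>var x=1;var y=2;var z=3;</script><table><tr><td>Blue widgets</td></tr></table>"
lean = "<p>Blue widgets</p>"
print(html_to_text_ratio(bloated) < html_to_text_ratio(lean))  # True
```

Moving the script to an external file and dropping the table wrapper raises the ratio, which is exactly the effect the advice above is after.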

Also, as a tip, your visible text should appear as close to the top of your HTML code as possible. Sometimes this is difficult, as elements like the top and left navigation appear first in the HTML. If this is the case, consider using CSS to reposition the text and those elements appropriately.

Do Keywords in the domain name harm or help you?

The short answer is neither. However too many keywords in a domain can set off flags for review. In other words blue-widgets.com won’t hurt you but discount-and-cheap-blue-and-red-widgets.com will likely raise flags and trigger a review.

Page naming follows similar rules: while you can use keywords as page names, it doesn't necessarily help (as I mentioned above); further, long names can trigger reviews, which will delay indexing.

How many links should you have on your sitemap?

Google recommends no more than 100 links per page.

While I've seen pages with more links get indexed, it appears to take much longer. In other words, the first 100 links will get indexed right away, but it can take a few more months for Google to identify and follow any links beyond 100.

If your site is larger than 100 pages (as many are today) consider splitting up your sitemap into multiple pages which interlink with each other, or create a directory structure within your sitemap. This way you can have multiple sitemaps that are logically organized and will allow for complete indexing of your site.
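Splitting a sitemap at the 100-link limit is easy to script. A minimal Python sketch (the example.com URLs are hypothetical), where each chunk would become its own sitemap page and the pages would interlink as described above:

```python
def split_sitemap(urls, per_page=100):
    """Chunk a flat URL list into sitemap pages of at most `per_page` links each."""
    return [urls[i:i + per_page] for i in range(0, len(urls), per_page)]

# Hypothetical 250-page site.
urls = [f"http://www.example.com/page{n}.html" for n in range(250)]
pages = split_sitemap(urls)
print(len(pages), [len(p) for p in pages])  # 3 [100, 100, 50]
```

Keeping each generated page under the limit means every link stands a chance of being crawled promptly.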

Can Googlebot follow links in Flash or JavaScript?

While Googlebot can identify links in JavaScript, it cannot follow those links. Nor can it follow links in Flash.

Therefore I recommend having your links elsewhere on the page. It is OK to have links in Flash or JavaScript, but you need to account for the crawlers not finding them; a sitemap can help get those links found and crawled.

As an alternative, there are menus that use CSS with static hyperlinks to produce navigation that looks very similar to common JavaScript navigation, yet remains followable by crawlers. Do a little research and you should be able to find a spiderable alternative to whatever type of navigation your site currently has.

Overall, while I didn't learn anything earth-shattering, it was good to get validation from the horse's mouth, so to speak.

I guess it just goes to show that there is enough information out there on the forums and blogs. The question becomes determining which of that information is valid and which isn't. But that, I'm afraid, usually comes with time and experience.

How Important Are Back Links?

When setting up your website for SEO (Search Engine Optimization) on Google, there are several factors you need to look at in order to obtain a high rank on its search engine. Of course your content and meta tags must be in line with sensible keyword densities and reciprocal links. Google then applies a mathematical formula to your website and assigns it a numeric value that depends heavily on one of the most important factors: reciprocal or back links.

A back link and a reciprocal link say the same thing to the Google engine: your site should be ranked higher because other people find value in what your website has to offer, and so they provide a link to your site. In turn, you keep the loop closed by reciprocating the favor and extending the same courtesy of a back link to the other website, creating a solid network connection. Google likes to see interconnectivity and will reward your website well for planning it this way.

There are drawbacks to the equation. As things change, a website you are affiliated with may drop a hyperlink, or a page may get accidentally deleted. When the Google robot goes through your website and finds a dead link, it notes that you aren't taking good care of your website and punishes your rank by reducing its point value. If you wish to know your site's current point value, download the Google Toolbar, enter your website (e.g. www.canigetinfo.com) in the box and perform a Google web search. Upon reading the full URL, Google will go directly to your site, pulling up your home page. There on the toolbar will be a page rank for your website between 1 and 10, with 1 being a less visited and noted website and 10 a site that screams traffic 24/7.

Some of the individuals you share reciprocal links with may scan all their links for validity; should they receive a bounce-back from a broken link on your website, you can be assured you will receive an email from them. Keeping your website in balance with the other sites you share links with will keep the Google engine happy. If you add a link to a company that is not Google-friendly, meaning it has no back links, you may also lose points.