Saturday, April 21, 2007

Link Rot and Its SEO Effects

A dead link (or broken link) is a link on the World Wide Web pointing to a website, web page, or web server that is either permanently unavailable or temporarily inaccessible. The most common result of a dead link is a 404 error, which indicates that the web server responded to the request but the specific page could not be found and loaded in your browser.
The browser may also return a DNS error, indicating that no web server could be found at the intended domain name, or report a temporary unavailability. A link may also appear dead because of some form of blocking, such as content filters, anti-spyware tools, or firewalls, all of which depend on their configured settings. Many webmasters make the mistake here of treating such links as permanently dead when the real problem lies in the settings of one of these three factors.
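The difference between a 404 and a DNS failure is easy to probe programmatically. Here is a minimal sketch using only the Python standard library; the category names and helper functions are my own labels for illustration, not any standard taxonomy:

```python
import urllib.request
import urllib.error

def classify_status(code):
    """Map an HTTP status code to a rough link-health category."""
    if 200 <= code < 300:
        return "alive"
    if code in (301, 302, 307, 308):
        return "redirected"
    if code == 404:
        return "dead"
    return "problem"

def check_link(url, timeout=10):
    """Fetch a URL and report its health.

    Note: urlopen follows redirects automatically, so a redirected
    link is usually reported as 'alive' unless the final target fails.
    """
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return classify_status(resp.status)
    except urllib.error.HTTPError as e:
        # The server answered but with an error status (e.g. 404).
        return classify_status(e.code)
    except urllib.error.URLError:
        # No server answered at all: DNS failure, refused connection, etc.
        return "unreachable"
```

A 404 thus means "the server spoke but had no such page", while `"unreachable"` corresponds to the DNS-error case where no server could be found at the domain at all.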

There is actually another type of dead link, which I would personally call an "irrelevant link": a URL that points to a site unrelated to the content sought, making it virtually useless to the visitor and to the original content itself. Links of this type are the ones most visitors tag as useless on a webpage or website. This is also where many webmasters go wrong, taking their SEO efforts for granted for the sake of one-way outbound links that inflate or overpopulate their website's link popularity campaign sheets. It can also happen when a domain name is allowed to lapse and is subsequently re-registered by another party, who then puts entirely different content on it; buying established domains with numerous existing inbound links is a shortcut that reduces the effort needed for SEO and link building. Domain names acquired this way are attractive to those who want to tap into the existing stream of unsuspecting surfers, inflating hit counters and PageRank (PR).

Link rot, on the other hand, is the process by which links on a website or webpage gradually become irrelevant, broken, or dead over a period of time, as the sites they link to disappear, change content, develop DNS problems, or redirect to new locations for whatever purpose suits the owner.

Link rot, therefore, is what becomes of any link on the World Wide Web today after a certain period of being a dead link, broken link, or irrelevant link. Link rot is also one of the most disregarded terms in SEO these days, even though many webmasters are familiar with it and encounter it somewhere along the way in their SEO lives.
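Auditing a page for link rot starts with collecting every outbound link so that each one can be re-checked periodically. A minimal sketch using only the Python standard library follows; the sample page and URLs are placeholders, not real sites:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect every href value from <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Placeholder page standing in for a real fetched document.
page = """<html><body>
<a href="http://example.com/live">good link</a>
<a href="http://example.com/old-page">possibly rotted link</a>
</body></html>"""

collector = LinkCollector()
collector.feed(page)
print(collector.links)
```

Running each collected URL through a status check on a schedule, and comparing results over time, is essentially what a link-rot audit amounts to.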

There is another phenomenon involving links, originally named by the Dutch: "zangelding". According to Wikipedia, "links specially crafted to not resolve, as a type of meme, are known as Zangelding, which roughly translated from Dutch means tangle thing. A zangelding is basically a list of self referencing broken links."
Zangelding is actually a very popular technique yet a very unpopular SEO term at the same time; I am sure many of us webmasters and SEO personalities have tried it at one point or another while optimizing the webpages and websites we handle personally.

Admittedly, this is very unbecoming, even unethical, for a webmaster to use or resort to, but I have tried it myself and am still continually doing it at this point in my SEO and blogging life, for my own reasons and because I find it easy to do.

Also according to Wikipedia, "dead links commonplace on the Internet can also occur on the authoring side, when website content is assembled, copied, or deployed without properly verifying the targets, or simply not kept up to date. Because broken links are to some very annoying, generally disruptive to the user experience, and can live on for many years, sites containing them are regarded as unprofessional."

So, assessing the overall usability of these dead links, I would say it falls to zero because of their inherent characteristics and bad reputation, both as a term and as a tactic. Although most of us webmasters never even bother with this terminology, we cannot turn our attention away from these few things that some of us don't even know exist, yet are as well known as the best SEO personalities under the SEO umbrella.

Monday, April 09, 2007

The Google Delay Filter and Sandbox

Delay Filter - Sometimes called the aging delay, this happens to any site during its first six months of being uploaded and available for search engines to crawl and list in the SERPs. What most webmasters' experience suggests is that when you first upload a site, it will be suppressed by Google for the first six months of its age; that is the standard period over which this aging delay or delay filter is applied.

Although some sites are even eight or nine months old and still suffering under Google's delay filter scheme, that does not necessarily mean you have to take drastic measures to get indexed and appear in the major search engine results pages.

Because of this delay filter effect, some webmasters think they can still do something with their site, such as optimizing further to gain an advantage in the SERPs. Some even resort to link building through link farms and automated link exchange software or, even worse, re-assembling the webpages of their site.

What to Do With Your Website Subjected Under Delay Filter:

For those of you webmasters out there who still do not know or understand anything about this phenomenon: I suggest you give your websites a little space and the time to recover naturally, because it is a serious mistake to tweak, modify, and over-optimize your pages chasing results you will not get while you are being closely watched under Google's delay filter scheme.

Persisting with these wrong practices will only hurt your websites further instead of earning the importance and trust of search engines. Remember that you cannot ask or force Google to simply tell you whatever is wrong with your website or websites, because up to this point in time, nobody can say specifically what factor triggers the removal of this delay filter or aging delay, or what a webmaster should do about it.

The best thing to do in this situation is simply to continue building links to your website and nothing more. This may be the most basic approach, yet it has proven to be the most effective of all when used properly and with the right timing and pace.

Actually, I believe this delay filter exists to minimize spam, and it is said to be greatly effective at blocking unwanted e-mail, spam, phishing, and viruses. It reportedly blocks up to 73% of all unwanted e-mail, according to a website dedicated to removing spam and unwanted e-mail in server-side applications.
Most webmasters believe the sandbox is to blame when their websites get lost in the SERPs after ranking so well in other search engines like MSN and Yahoo. The sandbox theory holds that obtaining too many links in too short a period of time results in a penalty from Google for what looks like spamming the search engines.
The sandbox theory actually sounds like a valid and logical possibility, and this is the reason why many webmasters used to blame engines like Google for sandboxing their websites.
But what about suffering the same ranking and indexing delay even when you do not apply automated linking through automated directory submissions, link farms, or networks, and do not spam through keyword stuffing? How does that happen? This has been explained again and again in the many forums I have visited while hunting for SEO knowledge.
The sandbox theory is very different from the aging delay or delay filter. The delay filter is about how hard it is to get search engines to index a website because of natural causes and delays, such as having only a few relevant links, which happens while a site is in its early age of six to nine months at most. The sandbox theory, by contrast, is about not getting indexed after ranking high in some search engines for a particular keyword phrase, because of suspected spamming through obtaining a large volume of links in the shortest possible time, as judged by the search engines themselves.
There is a buzzing theory that "the age factor for new sites is Google's answer to mini-networks and other multi-site strategies intended to artificially inflate link popularity", according to a relevant article I read as a reference from a famous forum thread.
This is done by dividing a single site into multiple websites and then exchanging links among them for mutual benefit, or interlinking the pages themselves to send more links to the main site. Another popular tactic is to build multiple small sites to lend extra link popularity to the main site.
This method leaves us a very useful tip: if we have a project from a prospective client, the better move is to have the supporting sites already up and ready, because the point here is to let Google know in advance that the supporting sites' domains are recognized as related to the main site, with the main site's new domain on the way.
AdWords or PPC or Overture Campaigns:
AdWords, PPC, or Overture campaigns can easily drive traffic to your site; the purpose of this scheme is to add extra exposure and traffic while you wait for Google to lift its delay filter or aging delay scheme from your main site.
Also try optimizing your pages for MSN and Yahoo instead of just waiting for Google's authority and respect as a prestigious engine, and build up backlinks your own way. Avoid automated submission software and link farms as much as possible, because they will not help in any way in your quest for top SERP results.
