Google will penalize a domain for too many 404s. Opinion-Myth
While it might seem to make some sense on the surface, 404s alone couldn't be used to penalize an entire domain. Anyone -- your competition, for instance -- can create an unlimited number of 404 links to your domain. The only reason I didn't declare this idea a pure, unambiguous myth is that I'm not sure about the case where a mountain of previously resolving URLs all start returning 404 or 410.
However, I have a redevelopment case, now three weeks live, where thousands of URLs have gone 404, and we only used 301 redirects for maybe 60 URLs that got the heaviest search engine traffic. The site as a whole is doing BETTER than before the redevelopment, so far at least. So if there is any truth hiding in this idea at all, I still see no evidence, even though some feel they do.
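For reference, a 301 is just a server-level permanent redirect. A minimal sketch of how those heaviest-traffic URLs might be redirected on an Apache server via .htaccess (the paths below are hypothetical examples, not the actual URLs from this redevelopment):

```apache
# .htaccess -- 301 (permanent) redirects for the old URLs worth keeping.
# These paths are made-up examples, not the site's real URLs.
Redirect 301 /old-portfolio.html /portfolio/
Redirect 301 /old-services.html /services/

# Retired pages can explicitly return 410 Gone instead of 404:
Redirect gone /discontinued-page.html
```

The `Redirect gone` form is mod_alias's way of sending a 410, which tells crawlers the page was removed deliberately rather than broken.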
The 3 Click Rule of website design has never been more important than today in Google.
Time and time again I've seen this recently in "portfolio" sections on websites: a full site design (i.e. all navigation links on every main page) leading into a linear portfolio section of 10 items.
What I've witnessed:
# Level 1 - Home Page (Root) - PR5
# Level 2 - Portfolio Index, linked from every page - PR5
# Level 3 - Portfolio Page 1 (linear) - PR4
# Level 4 - Portfolio Page 2 (linear) - PR3
# Level 5 - PR0 (Supplemental), and so on....
This is what I've seen, although said linear pages may also suffer from similar meta tags and a lack of page content.
So if you want to ensure your site gets spidered, make sure you're linking to most of your pages from most pages. This has big benefits for PR distribution on a site.
I use PR as a measure of the sort of trust and quality Google assigns a page, not as a magic bullet for traffic levels or SERPs. If you've got a PageRank 5 site, you should be adding content to increase traffic, as well as keeping an eye on and modifying your internal linking structure.
Great thread by the way. I've not seen one as good as this for some time.
If you want to avoid supplementals, make sure your linking structure is coherent, your page titles and meta descriptions are individual, and you have about 250-500 words on the page -- and of course that the content is not similar to other pages on your site or elsewhere.
Try to generate external links to interior pages to fend off supplementals.
|Anyone -- your competition, for instance -- can create an unlimited number of 404 links to your domain |
Be careful not to mix apples and oranges. Other people creating 404 links to your domain (accidentally or on purpose) is quite a different matter from having 404's or other broken links within your own site.
Too many 404's within your own site will harm your rankings -- HIGHLY PROBABLE
|Its not about bots saving bandwidth, its about the page getting crawled and indexed. Adsense pages get crawled and indexed a lot quicker |
Yes, but if that's happening, it's a pleasant and understandable side effect, not Google's stated reason for caching crawl data.
from post 648
|Neither pages had any links to them. |
Maybe I'm not reading this right, but if there really were no links to the new pages, then Google could only have found them through AdSense.
A good test would be to build the two pages and link them from the same page, say a level 2. Then put AdSense on one and see if it gets listed first.
It does seem like my new pages are getting found and put in the SERPs a lot faster than even six months ago. Maybe it is AdSense. I've had no obvious changes in PR.
I dug up a similar list of logical fallacies a couple of weeks ago after arguing against basing decisions on popular consensus (aka argumentum ad numerum).
|Too many 404's within your own site will harm your rankings |
Ah yes, buckworks, a very good clarification -- so let's correct the language of this assertion altogether.
Too many 404 server responses will harm your rankings - Myth
Too many broken links within your own site will harm your rankings - Highly Probable
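Since the distinction here is about broken links inside your own site, a quick way to audit them is a small script that flags any internal link returning a 4xx/5xx status. A minimal sketch in Python (the function names are my own invention, and real use would want politeness delays and recursion across pages):

```python
# Minimal internal broken-link checker (sketch; helper names are hypothetical).
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

class LinkCollector(HTMLParser):
    """Collect href values from <a> tags on one page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def is_broken(status):
    """Treat any 4xx/5xx response (404, 410, 500, ...) as broken."""
    return status >= 400

def check_internal_links(page_url, timeout=10):
    """Return (url, status) pairs for same-host links that look broken."""
    host = urlparse(page_url).netloc
    with urlopen(Request(page_url), timeout=timeout) as resp:
        parser = LinkCollector()
        parser.feed(resp.read().decode("utf-8", errors="replace"))
    broken = []
    for href in parser.links:
        url = urljoin(page_url, href)
        if urlparse(url).netloc != host:
            continue  # only audit links within our own site
        try:
            with urlopen(Request(url, method="HEAD"), timeout=timeout) as r:
                status = r.status
        except HTTPError as e:
            status = e.code
        except URLError:
            status = 599  # treat unreachable as broken
        if is_broken(status):
            broken.append((url, status))
    return broken
```

Anything in the returned list -- 404s, 410s, server errors -- is a candidate for fixing or 301-redirecting.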
There's significant evidence that Google uses humans to check the results -- simply go to their hiring page and you'll see positions that describe evaluating search engine quality. If a good employee sees blatant blackhat, don't you think that in the "evaluation" they'd throw up a red flag on that particular site? Any logical, business-minded person would see that if you have hired people to evaluate quality, there's a good chance that particular instances where the quality is deemed bad will be addressed through further human intervention.
|The 3 Click Rule of website design has never been more important than today in Google. |
The three click rule of web design is a myth itself. :)
People will keep on clicking as long as you provide useful information, and as long as the links they have to follow are clear, easy to understand, and enable them to find the information they're looking for and complete the tasks they set out to complete.
If Google only assigned pagerank to the URL at the home page of your site, and it flowed throughout a site from that page, diminishing somewhat at each deeper directory level, there might be something to drawing that analogy.
But people can and will link to deeper pages on a site if given a reason, and internal pages of a site can have a higher PageRank than the home page of that site. (Example: Matt Cutts's blog.)
edit to correct typo
[edited by: slawski at 6:06 am (utc) on Oct. 31, 2006]
|If a good employee sees blatant blackhat, don't you think that in the "evaluation" they'd throw up a red flag on that particular site? Any logical, business-minded person would see that if you have hired people to evaluate quality, there's a good chance that particular instances where the quality is deemed bad will be addressed through further human intervention. |
Or not, if it makes better sense to leave the offending page where it is for testing of algorithms and filters.
Your following statements appear conflicting:
non-compliant html causes you to rank badly - generally FALSE
following W3C guidelines helps you to rank better - TRUE
Or is it just me?
Naming all your images with the same keywords doesn't help rank - True
Writing great content with a great title will help ranking - True
The domain name is very important - True
Matt Cutts is a robot - Myth or maybe probable.... :P
All you said is true, I want to say, but...
sorry - should qualify.
I meant following guidelines regarding content/element structure, using correctly nested and appropriate headings, meta information, etc.
High traffic pages will be ranked well without adequate linking or on page SEO - Probable
I base this on the 'pull effect' I've noticed a few times. If a page gets vast amounts of traffic, surely Google must take notice, recognise that users are finding the content on that page relevant, and give it good ranking?
"Google is using human editorial input to affect the SERP"
Tedster, this is one of the things I was trying to get at when I sticky'd you a few days ago. I've noticed that in particular SERPs, none of the following factors seems to account for the sites that sit in positions 1-6 on page one: PageRank, number of IBLs, anchor text. Human input made the difference? BTW, the sites that sit on top of these SERPs are
1. old and
2. are .gov or .edu sites
I would think the trust factor for them is high.
"adding outbound links to relevant sites makes a BIG difference in SERP results - TRUE "
This is something that Brett has written about but, being a pagerank hoarder, I didn't want to believe it.
Can we please get a show of hands on who believes this is true and who does not?
I'm starting to think that it is true myself.
|Human input made the difference? |
Ifgoal, I'd say that case is probably because of trust and history algorithm factors, and not due to human intervention. When we talk about human editorial input, I'd say there are at least three different things people are looking at:
1. Editorial input sets a parameter for a domain that then shifts the "raw" algo score when the SERP is calculated. But the actual effect is calculated automatically. As in this Google Patent [webmasterworld.com] True
2. Hand tweaking the SERP - Google wants to avoid this, but my Opinion is that, in some cases of urgent necessity, it does happen from time to time and from market niche to market niche.
3. Hand setting a penalty or a ban on a given domain. We all know this can happen. True
I would say hand editing of SERPs has some major effects. Look at the shifts after the 1-billion-page spammer was discounted...
|adding outbound links to relevant sites makes a BIG difference in SERP results - TRUE |
I've seen this effect in action. One of the most effective small sites I ever created had a set of 2-4 outbound links on every page to helpful authority sites. Rankings quickly were strong and stayed very stable.
One factor in play here is that anchor text matters as an on-page factor, not just as an influence on the target url.
|being a pagerank hoarder, I didn't want to believe it. |
Time to get off that sauce -- it will kill you!
|Can we please get a show of hands on who believes this is true and who does not? |
If you could see my hand, it would be raised. This is an item that way too many people put down as myth, mainly because they just can't stand the thought of giving a visitor the slightest way out other than closing the browser.
*This* is one of the best threads I've seen in a long time.
Let's hope it debunks a few myths and strengthens a few truths.
Let's have more, lots more, especially anecdotal evidence, as almost no one has done serious research on some of the topics already put up here.
I'm not a mythbuster like most of y'all but here's one I've been wondering about for a LONG time:
Submitting a Sitemap to Google can hurt you in the SERP's -?
I would think that it would either help or have no effect but I've heard too many horror stories that it hurts.
We have a large site that, even though it is data-driven and still uses URL parameters (we will be changing that soon), is well crawled by Google and has many top-ten pages (some #1s and #2s, even).
I would HATE to submit a Sitemap and have that change (i.e. learn the hard way).
props to Tedster for the great post topic.
ok here goes:
1. Putting too much focus on a web site's on-page optimization (i.e. h1, keyword density, links, links, links, etc.) is pointless, as Google wants you to focus on your site's content and value to the USERS and not to G -- natural growth of links and content is the predominant factor. - OPINION
2. Making your site easily crawlable and compliant with W3C standards WILL help your site in its rankings, as Google considers accessibility an important issue for its users and will be going in that direction more and more - OPINION
3. Link age and stability count for more than link acquisition - TRUE - I've seen sites that have been on first SERPs with a handful of links, outscoring bigger "badder" sites... thanks to old and relevant links.
|No PR and no deep IBLs to a page may mean "soon to become supplemental no matter how legit, unique and on topic it is, sorry we don't need this many results" - If this is to be true... ain't it just plain funny? Yeah, i'm laughing too >:D |
I am experimenting with outbound links this week.
|Submitting a Sitemap to Google can hurt you in the SERP's -? |
There were a handful of such reports in the earlier days of Sitemaps. Could it have been a bug? Could it have been that submitting a Sitemap helped Google spot troubles that already existed but weren't seen before? I haven't heard any recent reports of such troubles, and I have clients who regularly submit 6-7 figures' worth of URLs in their Google XML Sitemaps.
In the opening post, I mentioned logical fallacies that can corrupt your SEO process, and here's a big one that might be in play on the Sitemaps issue. The fallacy is called post hoc ergo propter hoc, or translated from the Latin, "after this, therefore because of this".
Just because two events happen in sequence does not mean that the earlier event caused the later event. No cause-and-effect can be assumed just because of the order that events follow. The statistical probability of cause-and-effect goes up as the sequence gets observed frequently, over and over, but even then it's just a probability - not a truth.
Here's my Opinion. If an xml Sitemap for Google really CAUSED trouble, we'd be hearing a lot more yelling than we do. A whole lot more. And I'd be telling my clients to pull their Sitemaps as fast as I could, because it's my reputation on the line.
Can there be bugs? Of course there "can be". But I've seen nothing to convince me that significant damage has been caused by an xml sitemap, and I see plenty of counter-examples where both small and large sites are thriving AND using the Google program.
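For anyone who hasn't seen one, a Google XML Sitemap is just a plain list of URLs in the sitemaps.org format -- there's nothing in the file itself that looks risky. A minimal example (example.com and all the values are placeholders, not from any site discussed here):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-10-31</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.example.com/portfolio/</loc>
  </url>
</urlset>
```

Only `loc` is required per URL; `lastmod`, `changefreq`, and `priority` are optional hints that Google is free to ignore.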
So I say, look at anecdotal reports with a skeptical eye -- especially when they are rare, and most especially when they are anonymous. If it's a report on a web forum, I particularly note the clarity of language and any evidence of a somewhat logical thought process. Amazing how often those factors are missing.
|Ifgoal, I'd say that case is probably because of trust and history algorithm factors, and not due to human intervention. |
I guess CSE groups are now doing the hand editing in a way.
|adding outbound links to relevant sites makes a BIG difference in SERP results - TRUE |
I don't know if it's true, but it stands to reason that Google would regard outbound links as a "signal of quality" when the page or site meets Google's other quality standards.
After all, Google's stated mission is "to organize the world's information and make it universally accessible," not just to help you or me or the guy next door to make a buck, and hypertext linking (a.k.a. citing other resources) is the most fundamental principle of the Web. It's also worth noting that information or quasi-information megasites like Wikipedia and TripAdvisor (which do very well in Google) have outbound links up the wazoo.
> Sorry I don't have a link to whoever did a worthwhile analysis, but as I recall the traffic starts coming strong when you hit the #3 spot, though nothing compares to being #1.
I did some analysis on this and found that at #1, roughly 80% will go to your site. So yes, click-throughs drop off rapidly as you move down the rankings. I guess if you are #1, some people either never click on any results or go straight for the sponsored links.
|I am experimenting with outbound links this week. |
Me too. This part of the discussion has definitely got me thinking, as one of my sites (an old site with changed content) does very well with just one non-reciprocal outbound link to an 'authority' site on each page, compared to similar sites where I'm hoarding PR.