
Google News Archive Forum

On Page Content = Little or NO Weight in SERPs
I proved it the hard way
iJeep




msg:203103
 10:44 pm on Jan 19, 2005 (gmt 0)

Yesterday, a table in my database crashed, which caused the site to freeze until the table was dropped and reloaded. During that short time Google came by.

I am now ranked for the same competitive 2-word phrase, in the same spot, with the heading "Warning" and the body text:
Warning: mysql_connect(): User ****** has already more than 'max_user_connections' active connections in *****************/database.php on line 20
Unable to connect to database server!

It taught me not to concentrate on the page so much and, more importantly, to make a soft landing page for database errors!
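
A rough sketch of that kind of soft landing, assuming the PHP 4/5-era mysql_connect() setup from the error above (the credentials and the maintenance file are placeholders, not anyone's actual code): check the connection yourself and, if it fails, send a 503 with a Retry-After header so crawlers treat the outage as temporary, instead of letting die() print the raw error into a 200 response.

<?php
// Hypothetical "soft landing" for database outages (placeholder credentials
// and filenames). A failed connection returns a 503 rather than a 200 page
// full of error text that could end up in a search engine's cache.
$db = @mysql_connect('localhost', 'db_user', 'db_pass');
if (!$db) {
    header('HTTP/1.1 503 Service Unavailable');
    header('Retry-After: 3600');     // suggest crawlers come back in an hour
    include 'maintenance.html';      // friendly message for human visitors
    exit;                            // skip the normal page template entirely
}
// ...normal page generation continues here...
?>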

 

Dugger




msg:203104
 8:45 pm on Jan 20, 2005 (gmt 0)

There are some sites in the top 10 with the page title and body text - "This Site Is Suspended" - and they still rank for the keywords they used to target.

On page content means zero now by the looks of things.

diamondgrl




msg:203105
 8:46 pm on Jan 20, 2005 (gmt 0)

Actually, I think you took the wrong lesson from this. It sounds like the competitive two-word phrase is NOT on your page, so I presume Google bases the order of the SERPs on an index that is more out of date than what the SERPs display. And if Google continues to see this, you will no longer rank as highly for this phrase.

Although your soft landing point is a good one.

pageoneresults




msg:203106
 8:47 pm on Jan 20, 2005 (gmt 0)

On page content means zero now by the looks of things.

That's a pretty bold statement if you ask me. :)

Did you check the cached page?

Dugger




msg:203107
 9:45 pm on Jan 20, 2005 (gmt 0)

Cache is "suspended" page even though the site is in the serps just as if it was still the old unsuspended page. As long as there are still links pointing at it with the keywords in the link text that sems to be all that is required to rank in Google.

iJeep




msg:203108
 10:02 pm on Jan 20, 2005 (gmt 0)

Yes, my cache was of the error message created by the die() function in PHP.

The freshbot came by and G has an updated cache now (and I am in the same spot in the rankings).

-------------------

Even if the on-page content means something, it seems to mean very little. It used to be that I could raise and lower my ranking by one or two spots by playing around with key phrases in the content, and the freshbot listings would move me around from day to day.

The thing I don't understand is this: if content means so little, and others say links don't mean that much anymore... what are the pages actually ranked on? Perhaps on Google toolbar visits to a site?

pageoneresults




msg:203109
 10:15 pm on Jan 20, 2005 (gmt 0)

Others say links don't mean that much anymore.

Not sure where that advice was given, but links are the core factor in how sites rank in Google and other search engines. The quantity may not matter as much as the quality does, though.

diamondgrl




msg:203110
 10:38 pm on Jan 20, 2005 (gmt 0)

I think this discussion has gotten a little beyond the actual evidence.

I proposed what I think is a much more likely possibility in this case: that the SERPs held steady simply because the algorithm that ranks the site may be somewhat forgiving of temporary outages. It does not penalize the site immediately.

Now if you kept that error on the page for a few weeks, I would be hard-pressed to believe you would continue to rank on those keywords.

steveb




msg:203111
 11:36 pm on Jan 20, 2005 (gmt 0)

"Cache is 'suspended' page"

The fresh cache or the master cache? From what you describe, the page is pretty clearly being ranked on its master cache.

More to the point, just because links are more important doesn't mean on-page content means nothing.

glitterball




msg:203112
 12:23 am on Jan 21, 2005 (gmt 0)

Just to add another story to this thread...

Recently the official government website for a popular holiday destination rose dramatically in the SERPs (the first time I ever saw it in the top 10).
Why? Because someone had forgotten to renew the domain name and the Network Solutions error page was being displayed.

This error message (despite not containing the destination name) actually ranked better than the actual (Flash) website.

The domain name has since been renewed and the page is now back in its former position in the SERPs (about 10 places lower!)

paybacksa




msg:203113
 12:27 am on Jan 21, 2005 (gmt 0)

The domain name has since been renewed and the page is now back in its former position in the SERPs (about 10 places lower!)

thanks! A new SEO tactic guaranteed to get a client ranked!

stace




msg:203114
 2:20 am on Jan 21, 2005 (gmt 0)

Yeah, sure, that's a great tactic: let your domains expire to achieve higher rankings... more like a sure way to lose your rankings for good.

Sites don't disappear from the rankings just because they are old, outdated, or even producing error pages. Nor do most webmasters bother deleting outdated links more than once in a blue moon. There are tons of sites with a ridiculously high number of backlinks that are either gone or haven't been updated in 4 years - I myself am probably guilty of leaving some of these links up.

So I wouldn't assume that after 2 or 3 months a site will begin losing all of its backlinks; that's just not the case. Most of us are very busy managing content, trying to keep our heads above water with SEO trends, and developing our sites, and we don't have time to monitor and delete links all the time. Those backlinks will help many of these dead and decaying sites keep rankings that are higher than they should be.

jaffstar




msg:203115
 9:05 am on Jan 21, 2005 (gmt 0)

Don't discount on-page content; you would be amazed how much content can help an established domain.

If you are targeting a certain keyword such as "widgets" with a heavy link campaign, then you don't need any on-page content, but you can get new rankings from different keywords derived from on-page content.

McMohan




msg:203116
 9:22 am on Jan 21, 2005 (gmt 0)

And another story.

For one of my sites that went obsolete, I put up a robots.txt to prevent all robots from crawling the site. It has been 2 months now, and it is still ranking for the phrase (2,125 Overture searches monthly) at the same spot on the first page, albeit without a cache. Strange?

Mc

JuniorOptimizer




msg:203117
 10:45 am on Jan 21, 2005 (gmt 0)

This is great news. Now I can just make blank pages and save myself some time :)

petehall




msg:203118
 10:57 am on Jan 21, 2005 (gmt 0)

As a test, I created a site consisting almost entirely of links on a five-year-old domain (with old backlinks), and it outranked everything I targeted.

I agree pages do not need content to rank highly in Google.

Whether or not content would be the icing on the cake in a close battle is unknown to me.

It's still there by the way - outranking everything with content beneath it.

But then it would, because it is a domain with history and therefore it must be better than everything else... isn't that right, Google? :-)

trillianjedi




msg:203119
 11:03 am on Jan 21, 2005 (gmt 0)

Jaffstar nailed point #1

As long as there are still links pointing at it with the keywords in the link text, that seems to be all that is required to rank in Google.

No. Amend to: "... all that is required to rank in Google for that specific phrase".

Good on page content will catch a lot of the topics surrounding that phrase. You miss a lot of free traffic without it.

Point #2 is that your users will love you for your content, not your anchor text. Good content will mean that they all email their mates saying "check out this great site I found!" They'll come back. Ranking #1 on a specific term is a waste of time and bandwidth if all they do is hit the back button.

Good content will also organically attract more inbound links, some of which will have anchor text. That brings traffic and searchers.

Your original title statement:

On Page Content = Little or NO Weight in SERPs

is wholly incorrect. Whilst you're right in thinking that good inbound links with anchor text are critical in Google at the moment (and mega-important for your main phrase), don't get so caught up in it that you miss the bigger picture.

TJ

Oliver Henniges




msg:203120
 11:50 am on Jan 21, 2005 (gmt 0)

> For one of my sites that went obsolete, I put a robots.txt to prevent all robots from crawling the site...

McMohan, that doesn't work as far as I know. You should either actively use Google's URL removal form or put the "noindex" tag on the pages, open them up again in robots.txt, and then wait until the site vanishes from the index.
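
To illustrate the difference (hypothetical snippets, not McMohan's actual setup): a Disallow rule only stops crawling, so a URL can stay in the index on the strength of its links alone, whereas a noindex meta tag asks for the page to be dropped, but it only works if crawlers are still allowed to fetch the page and see the tag.

# robots.txt - blocks crawling, but the URL can remain indexed from its links
User-agent: *
Disallow: /

<!-- in the <head> of each page - requests removal from the index;
     robots.txt must permit crawling so the tag can actually be read -->
<meta name="robots" content="noindex">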

As to the main topic: the OP's conclusion goes too far and wasn't really meant seriously, I assume. In the past months I have got the impression that Google somewhat rotates the way it turns the knobs: sometimes backlinks are more important, sometimes anchor text, sometimes lexical content, and so on. Make sure you don't concentrate on any single one of these techniques too much; good page content is still the best in the long run: for you, for your visitors, and for your position in the SERPs.

On top of all this, we're faced with the assumption that Google is no longer able to cope with the sheer number of backlinks produced by automated blog spamming. As Brett Tabke pointed out elsewhere, the rel="nofollow" tag issue is the first time Google has gone public to solve a technical issue.

A historical caesura, which will surely lead to a shift back towards linguistic analysis of pages.

Jon_King




msg:203121
 12:33 pm on Jan 21, 2005 (gmt 0)

>>On Page Content = Little or NO Weight in SERPs

One example != Algo

mikec




msg:203122
 2:59 pm on Jan 21, 2005 (gmt 0)

There is a site in one of my fields that has been down for over a month... the cache is one of those domain-for-sale pages, and it has also been that way for over a month now. It has still held its ranking.

irock




msg:203123
 3:35 pm on Jan 21, 2005 (gmt 0)

I'm no expert, but my guess is that Google engineers have taken the view that ranking shouldn't be affected for a short while if a website goes down for whatever reasons.

Maybe a day of outage (a MySQL error, a 404, or a 503) will not affect ranking at all. When my site went down for a week last year, the Google cache was missing, but my ranking remained. Though I doubt my rankings would stay should my server remain out of operation.

I confirmed that when my competition had the same problem some time ago. The competing website went down for 2 weeks, yet its rank remained.

walkman




msg:203124
 4:34 pm on Jan 21, 2005 (gmt 0)

"I'm no expert, but my guess is that Google engineers have given the thought that ranking shouldn't be affected for a short while if a website goes down for whatever reasons."

that makes sense. Databases go down, and also, if your site is not responding they are not going to remove it right away. Unless it's the Adobe Acrobat reader page with a zillion links with "Acrobat Reader" as anchor, on page content matters. Whether it matters as much as it once did, its a different story.

Green2K




msg:203125
 5:02 pm on Jan 21, 2005 (gmt 0)

There is definitely a lag between on-page changes and position in the SERPs. I experienced this recently on a site I acquired, after substantially tweaking the page and the title tags.

The cached page changed in a day or two, but it took longer (I wasn't watching closely, so I can't say exactly how long) for the site to move position (to #1).

Green2K

elklabone




msg:203126
 6:21 pm on Jan 21, 2005 (gmt 0)

One of the sites a client is competing against is a framed site with a blank cache (the top frame is just used to make the URL look the same). The keyword appears in the title tag ONE time, and this site ranks #3 for a very competitive keyword with absolutely NO content. #4 is an identical site... if you add &filter=0 you'll see dozens more of the same type of site.

These sites have been there for months, if not years. Can I post the keyword? It gets 55,283 searches a month on Overture... so it's a fairly competitive keyword.

--Mark
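
For anyone unfamiliar with the pattern being described, a framed page of this kind typically has a keyword-bearing title but essentially no indexable body text, which is why the cache looks blank (hypothetical markup, with example.com standing in for the real site):

<html>
<head><title>Blue Widgets</title></head>
<frameset rows="0,*">
  <frame src="topframe.html">            <!-- keeps the URL looking the same -->
  <frame src="http://www.example.com/">  <!-- the actual content lives here -->
</frameset>
<noframes><body></body></noframes>
</html>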

McMohan




msg:203127
 6:32 pm on Jan 21, 2005 (gmt 0)

ranking shouldn't be affected for a short while if a website goes down for whatever reasons

"Short while" can't be defined as a month or two. If Google still thinks a site deserves a rank even after being down for nearly a month, then they are doing a great disservice to user community, by giving them results that is meaningless. On the other hand, Google can drop the site, if its found down for more than a couple of days, and then do a periodical visit(the visit interval may depend on the PR or the traffic according to the toolbar) to check if the site is up and then show them in its index.

Ah, its easier said than done, huh? :)

Mc

irock




msg:203128
 7:09 pm on Jan 21, 2005 (gmt 0)

Well, for my oldest site, which was pulled down by an irresponsible host, Google took about 20 days to remove it from the index completely.

elklabone




msg:203129
 3:14 am on Jan 23, 2005 (gmt 0)

... stumbled across another good example:

<widget>

The #1 site is a frame with no content.

--Mark

[edited by: ciml at 4:38 pm (utc) on Jan. 24, 2005]
[edit reason] No keyphrases please. [/edit]

martingale




msg:203130
 3:56 am on Jan 23, 2005 (gmt 0)


I bet there's a simpler answer to this. Thinking it through as a software architect, I'll gamble that this is what is happening:

(1) Google only re-runs the main PageRank algorithm once every few weeks or months.

(2) Google freshens the cache on an ongoing basis, with only minor tweaks to a page's rank at this time.

So, the last time the main PageRank algo ran, your site had real content in Google's cache, plus the backlinks, and you ranked pretty high. After that your site went down, and the freshbot freshened the data in the cache, but with no re-run of PageRank your site retained the PageRank it had previously.

If that's what is happening, then it's only a matter of time before you lose your PageRank once your content disappears. "A matter of time" might be a few weeks or a few months, but it'll happen.

I don't know this to be the case. I am just imagining how the software might work to produce this effect.

McMohan




msg:203131
 5:40 am on Jan 23, 2005 (gmt 0)

martingale,

You have made a sincere effort in trying to analyse this, but you start with a wrong premise, which is that PR is based on page content. A site can contain absolutely zero content and still have PageRank, since PR is based only on links.

Mc
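
For reference, the PageRank formula from Brin and Page's original paper bears this out; it is defined purely over the link graph, and no term involves the text of the page itself:

PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

where T1...Tn are the pages linking to A, C(Ti) is the number of links going out of Ti, and d is a damping factor (typically 0.85). Content would only enter the picture as a relevancy signal layered on top of this link-based score.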

martingale




msg:203132
 10:19 am on Jan 23, 2005 (gmt 0)

OK, perhaps I was a bit sloppy with terminology. My understanding is that SERP placement depends on both PageRank and relevancy, where the content on your site would be an important relevancy factor.

Think about what Google has to do to produce a SERP based on this data: they need this massive, enormous index of the entire web that relates practically every page to every other page in some way, taking into account (a) links, (b) text near links, and (c) text on the page itself. That's a massive computing task, and I bet Google doesn't have the computing resources to keep this index totally up to date. I bet it takes them weeks or months to rebuild these massive indexes. These rebuilds are called "updates".

If Google were only updated once every few months, people would complain that their SERPs are stale, so they run this "freshbot" around, which makes it seem like the index rebuilds more frequently than it really does. The freshbot updates the page cache, and it may even tweak the SERP rankings a bit, but it doesn't do the big recalculation. The results are still mostly from the old index based on the old data (for all pages, not just yours) until Google crunches through another overall update.

Now there could be variations on this: Google could incrementally update SERPs on an ongoing basis but take months to update them all, and so on, but the idea is still the same. I have no idea if this is how it works. It's just a bet based on how much data they have to crunch, and the behavior described in this thread.
