I am now ranked for the same competitive two-word phrase, in the same spot, with the heading "Warning" and the body text:
Warning: mysql_connect(): User ****** has already more than 'max_user_connections' active connections in *****************/database.php on line 20
Unable to connect to database server!
It taught me not to concentrate on the page so much and, more importantly, to make a soft landing page for database errors!
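For what it's worth, here is a minimal sketch of such a soft landing, using the same old mysql_* API as the error above (the credentials and the maintenance.html page are placeholders):

    <?php
    // database.php: fail soft instead of spewing a raw warning
    $link = @mysql_connect('localhost', 'db_user', 'db_pass'); // @ silences the PHP warning
    if (!$link) {
        header('HTTP/1.1 503 Service Unavailable'); // tell spiders the outage is temporary
        header('Retry-After: 3600');                // suggest they come back in an hour
        include 'maintenance.html';                 // friendly "back soon" page for humans
        exit;
    }

The 503 status is the important part: serving the error text with a 200 is exactly how it ends up as your cached page and your SERP snippet.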
Your soft landing point is a good one, though.
The freshbot came by and G has an updated cache now. (and I am in the same spot on the rankings)
Even if the on-page content means something, it seems to mean very little. It used to be that I could raise and lower my ranking by one or two spots by playing around with key phrases in the content, and the freshbot listings would move me around from day to day.
The thing I don't understand is that if content means so little and others say links don't mean that much anymore...what are the pages actually ranked on? Perhaps on Google toolbar visits to a site?
I proposed what I think is a much more likely possibility in this case: that the SERPs held steady simply because the ranking algorithm may be somewhat forgiving of temporary outages. It does not penalize the site immediately.
Now if you kept that error on the page for a few weeks, I would be hard-pressed to believe you would continue to rank on those keywords.
Recently the official government website for a popular holiday destination rose dramatically in the SERPs (the first time I ever saw it in the top 10).
Why? Because someone had forgotten to renew the domain name and the Network Solutions error page was being displayed.
This error message (despite not containing the destination name) actually ranked better than the actual (Flash) website.
The domain name has since been renewed and the page is now back in its former position in the SERPs (about 10 places lower!)
Sites don't disappear from rankings just because they are old, outdated, and even producing error pages. Nor do most webmasters bother deleting outdated links more than once in a blue moon. There are tons of sites w/ a ridiculously high number of backlinks that are either gone or haven't updated in 4 years - I myself am probably guilty of leaving some of these links up.
So I wouldn't assume that after 2 or 3 months a site will begin losing all of its backlinks; that's just not the case. Most of us are very busy managing content, trying to keep our heads above water with SEO trends, and developing our sites, and don't have time to monitor and delete links all the time. And those backlinks will help many of these dead and decaying sites keep rankings that are higher than they should be.
If you are targeting a certain keyword such as "widgets" with a heavy link campaign, then you don't need any on-page content, but you can get new rankings for different keywords derived from on-page content.
For one of my sites that went obsolete, I put up a robots.txt file to prevent all robots from crawling the site. It has been 2 months now, and the site is still ranking for the phrase (2,125 monthly searches on Overture) at the same spot on the first page, albeit without a cache. Strange?
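For reference, the blanket block is just two lines in a robots.txt file at the domain root:

    User-agent: *
    Disallow: /

Note that Disallow only stops crawling; it does not by itself remove a page that is already in the index, which would explain why the listing hangs around without a cache.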
I agree pages do not need content to rank highly in Google.
Whether or not content would be the icing on the cake in a close battle is unknown to me.
It's still there by the way - outranking everything with content beneath it.
But then it would because it is a domain with history so therefore it must be better than everything else... isn't that right Google :-)
As long as there are still links pointing at it with the keywords in the link text, that seems to be all that is required to rank in Google.
No. Amend to: "...all that is required to rank in Google for that specific phrase".
Good on-page content will catch a lot of the topics surrounding that phrase. You miss a lot of free traffic without it.
Point #2: your users will love you for your content, not your anchor text. Good content will mean that they all email their mates saying "check out this great site I found!". They'll come back. Ranking #1 on a specific term is a waste of time and bandwidth if all they do is hit the back button.
Good content will also organically attract more inbound links, some of which will have anchor text. That brings traffic and searchers.
Your original title statement:
On Page Content = Little or NO Weight in SERPs
Is wholly incorrect. Whilst you're right in thinking that good inbound links with anchor text are critical in Google at the moment (and mega important for your main phrase), don't get so caught up in it that you miss the bigger picture.
McMohan, that doesn't work as far as I know. You should either actively use Google's URL removal form, or put the "noindex" tag on the pages, open them up again in robots.txt, and then wait until they vanish from the index.
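Concretely, that tag goes in the <head> of every page you want dropped, and the crawler has to be able to fetch the page to see it, which is why you reopen it in robots.txt first:

    <meta name="robots" content="noindex">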
As to the main topic: the OP's conclusion goes too far and wasn't really meant seriously, I assume. In the past months I got the impression that Google somewhat rotates the way it turns the knobs: sometimes backlinks are more important, sometimes anchor text, sometimes lexical content, and so on. Make sure you don't concentrate too much on any single one of these techniques; good page content is still the best in the long run: for you, for your visitors, and for your position in the SERPs.
On top of all this, we're faced with the assumption that Google can no longer cope with the sheer number of backlinks produced by automated blog spamming. As Brett Tabke pointed out elsewhere, the rel="nofollow" issue is the first time Google has gone public to solve a technical issue.
A historic caesura, one which will surely lead to a shift back towards linguistic analysis of pages.
Maybe a day of outage (a MySQL error, a 404, or a 503) will not affect ranking at all. When my site went down for a week last year, the Google cache was missing, but my ranking remained. Though I doubt my ranks would stay should my server remain out of operation.
I confirmed that when my competition had the same problems some time ago. The competing website went down for 2 weeks, yet its rank remained.
That makes sense. Databases go down, and if your site is not responding they are not going to remove it right away. Unless it's the Adobe Acrobat Reader page with a zillion links using "Acrobat Reader" as anchor text, on-page content matters. Whether it matters as much as it once did is a different story.
The cached page changed in a day or two but it took longer (I looked away so can't say exactly) for the site to move position (to #1).
These sites have been there for months, if not years. Can I post the keyword? It gets 55,283 searches a month on Overture... so it's a fairly competitive keyword.
Ranking shouldn't be affected if a website goes down for a short while, whatever the reason.
"Short while" can't be defined as a month or two. If Google still thinks a site deserves a rank even after being down for nearly a month, then they are doing a great disservice to user community, by giving them results that is meaningless. On the other hand, Google can drop the site, if its found down for more than a couple of days, and then do a periodical visit(the visit interval may depend on the PR or the traffic according to the toolbar) to check if the site is up and then show them in its index.
Ah, its easier said than done, huh? :)
(1) Google only re-runs the main PageRank algorithm once every few weeks or months.
(2) Google freshens the cache on an ongoing basis, with only minor tweaks to a page's rank at this time.
So, the last time the main PageRank algo ran your site had real content in Google's cache, plus the backlinks, and you ranked pretty high. After that your site went down, and the freshbot freshened the data in the cache, but no re-run of PageRank so your site retained the PageRank it had previously.
If that's what is happening then it's only a matter of time before you lose your PageRank once your content disappears. "Matter of time" might be a few weeks or a few months, but it'll happen.
I don't know this to be the case. I am just imagining how the software might work to produce this effect.
Think about what Google has to do to produce a SERP based on this data: they need an enormous index of the entire web that relates practically every page to every other page in some way, taking into account (a) links, (b) text near links, and (c) text on the page itself. That's a massive computing task, and I bet Google doesn't have the computing resources to keep this index totally up to date. I bet it takes them weeks or months to rebuild these massive indexes. These rebuilds are called "updates".
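To put some numbers behind that, here is a toy power-iteration sketch of the published PageRank formula in PHP, over a made-up three-page graph; Google's real graph spans billions of pages, which is exactly the point:

    <?php
    // Toy PageRank: $links maps each page to the pages it links out to.
    $links = array(
        'a' => array('b', 'c'),
        'b' => array('c'),
        'c' => array('a'),
    );
    $d  = 0.85;                                     // damping factor from the PageRank paper
    $n  = count($links);
    $pr = array_fill_keys(array_keys($links), 1 / $n);

    for ($i = 0; $i < 50; $i++) {                   // iterate until the scores settle
        $next = array_fill_keys(array_keys($links), (1 - $d) / $n);
        foreach ($links as $page => $outs) {
            foreach ($outs as $out) {               // share this page's score among its out-links
                $next[$out] += $d * $pr[$page] / count($outs);
            }
        }
        $pr = $next;
    }
    print_r($pr);                                   // 'c' ends up highest: two pages link to it

Even this trivial loop touches every link on every pass, so it's easy to believe that running it over the whole web is something Google can only afford to do every few weeks.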
If Google were only updated once every few months, people would complain that their SERPs are stale, so they run this "freshbot" around, which makes it seem like the index is rebuilt more frequently than it really is. The freshbot updates the page cache, and it may even tweak the SERP rankings a bit, but it doesn't do the big recalculation. The results are still mostly from the old index based on the old data (for all pages, not just yours) until Google crunches through another overall update.
Now there could be variations on this: Google could incrementally update SERPs on an ongoing basis, but take months to update them all, etc., etc., but the idea is still the same. I have no idea if this is how it works. It's just a bet based on how much data they have to crunch, and the behavior described on this thread.