Forum Moderators: open
Curiouser and curiouser...
Whether Google is devaluing index pages or not, I do see some benefit in doing it. When I do searches I always see more index pages than I really want - perhaps index pages with links to the info I want, but not enough inner pages with the exact info itself. I've always thought I would like to see more "inner" pages turning up in comparison with index pages.
It all comes down to whether google should be returning SITES or PAGES for search queries. I would think that the latter is better for Google, leaving ODP, directories and maybe even Y! to concentrate on the former.
(filthy thought) And maybe Google wants you to find the pages with AdSense on them - which seems to work better for highly targeted inner pages than for broader-focused index pages. Wouldn't blame them really; the user would probably benefit from getting more directly to the info they want.
[edited by: chiyo at 6:13 am (utc) on June 23, 2003]
Was nicely surprised to see client site at #6 for THE key phrase. Checked G's DCs out of curiosity... good ol' FI is ranking the index page at #9. Was #32 pre-Dominic.
First time FI has been "kind" to my client site since update news broke.
That's the only DC to rank the index page though as of 2hrs ago. :(
16 out of first 20 results on Y! are index pages for this SERP.
9 out of the 10 on Page 1 of Google are index pages for this SERP.
Little to no content on most of the index pages that are in Top 10.
So, I wonder how G decides what index pages to rank if it is not content based? (old link w/ anchor text scheme?)
So, with Index page- a Page 1 Google Result.
With sub page- a page 15 Google Result.
All or nothin'. :)
Doing a search is like gambling...
Maybe 1 or 2 DC's will have more results with Index page and the others with sub pages. Google waits to collect data from the few who fill out the forms...
AW
No... Brett's dead wrong. If it were as simple as that, the switch would be flicked and that would be that - we would see a semblance of consistency. What we are seeing is fluctuation and instability.
What we are seeing suggests that Google is struggling to keep all the link data and anchor text in play, and the new link data is losing out. By the way, what we are also seeing is a perceptible degradation in the quality of results in many areas as a result.
I will be posting a lengthy analysis of this later today, based upon the data I was sent by various members last week.... assuming I am allowed to start the thread of course!
Google's main goal is to model the web in its natural state. That means no link inflation, no SEO'd homepages, etc. If Google were to drop index pages completely and say only internal pages counted, it would devastate all SEO'd 5-10 page domains, but sites full of content and useful information would get all the traffic.
Is this the same issue as what was going on during the beginning of the update? I'm not questioning it if that's what you're implying, I'm just curious what people are thinking.
"Dropped" seems different and more permanent than "buried under internal pages," or are we just using different terms for the same phenomenon?
> If Google were to drop index pages completely and say only internal pages counted, it would devastate all SEO'd 5-10 page domains, but sites full of content and useful information would get all the traffic.
Sorry, but the negative effects of this would far outweigh any benefit seen from removing ultra SEO spam index pages.
Think about it...google can't do anything that leads to such huge false positive errors.
If you think you see spam now, think about what would happen if index pages were no longer meaningful "landing" pages for sites that rely on search engine traffic. I.e., if people aren't likely to ever SEE an index page, why make it user-friendly? Might as well use it to lend the most weight to your internal pages if they're the only ones Google's going to display. I'm picturing pages-long lists of links with juicy anchor text leading to internal pages and crosslinked sites. There are plenty of other ways to facilitate site navigation between internal pages. You could even set up a completely separate "user index" type page that would be easily navigable.
No...this is something google wouldn't do intentionally. It would just break too much of the natural order of the web.
Here is the way I see it: Google is always working with a mathematical model, and they have a testing environment where they apply their new "filters". Once they apply their filters, they figure out what the delta is from the existing index. Let's say that by killing index pages you reduce spam by 30% across the board, but you also kill 30% of 5-10 page domains while larger, more established sites gain a greater portion of the SERPs. Would you apply a filter like that?
These are the kinds of decisions that probably happen every day at Google. When you're working with 3 billion pages, no matter what you do there will be huge side effects, some very good and others not so good.
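The "delta" idea above can be pictured as a toy computation - all the site names and numbers below are invented, just to show the tradeoff you'd be weighing:

```python
# Toy sketch of the "filter delta" idea: compare a results set before and
# after a hypothetical filter, and measure what changed. Data is made up.

def serp_delta(before, after, spam, small_sites):
    """Return (fraction of spam removed, fraction of small sites lost)."""
    removed = set(before) - set(after)
    spam_removed = len(removed & spam) / len(spam)
    small_lost = len(removed & small_sites) / len(small_sites)
    return spam_removed, small_lost

before = {"spam1", "spam2", "spam3", "tiny1", "tiny2", "big1", "big2"}
after = {"spam1", "tiny2", "big1", "big2"}   # index after the filter runs
spam = {"spam1", "spam2", "spam3"}
small_sites = {"tiny1", "tiny2"}

spam_cut, small_cut = serp_delta(before, after, spam, small_sites)
print(spam_cut, small_cut)  # ~0.67 of spam removed, 0.5 of small sites lost
```

Whether a filter ships then becomes a judgment call on numbers like these: is cutting two-thirds of the spam worth losing half the small legitimate sites?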
It may be an intentional attack on pages which have incoming links that don't match content. This is a common occurrence on index pages whose backlinks may have anchor text which is more appropriate for sub pages. Index pages aren't magic, they must be optimised for one or two key phrases only and should have incoming anchor that aligns with title and content, just as any other page.
I believe you are right. In my case, I'd been hovering around position #10 in the SERPS for my primary keyword phrase in the last couple of months. Recently my SEO work has been focussed on getting inbound links to my index page with link anchors containing this keyword phrase. The index page is optimized for this phrase and a couple others.
As of the new update I am position #1 for my main keyword phrases; unchanged for the other keywords.
I suspect Google is applying a 'quality' weighting to each inbound link; at least part of the equation involves the extent of common keywords between the anchor text and the target page. If they had this before, perhaps they've just increased the weighting. Either way, it's a good thing.
Of my 3 major sites (I have other small, mini websites which are subject specific helpers and excellent traffic generators and are saving me right now) only 1 has remained its top ranking and has actually climbed. What is different about that site?
1. It is on a static IP.
2. It has a lot of non-reciprocal backlinks.
3. It has a lot of inside pages which were optimized for relevant keywords. In the last 3 months, when link requests came in, I tweaked them for the inside pages and the relevant keywords they were optimized for WITH the same relevant anchor text. All of it relevant, no spam tactics involved.
Just about every page of that static-IP site has been getting great SERPs ever since the May update.
Okay, but the other two sites have taken a knockout punch. (shared IP) About 4 days ago, one of them popped back in for top 3 ranking for about 48 hours then disappeared again. Before Google started playing around with their algo (for whatever reason) the site appeared in the top 5. It was just a non-commercial, solid well built and no spam site with lots of info. When it appeared in the top 3 again, that told me that Google was using all the ingredients in their algo recipe.
But when it dropped out of sight again, I assumed Google was back to half-baked-ingredient algo methods. OK, cross fingers and hope they get around to putting all the ingredients back in their half-baked cake so my non-commercial site can return to its proper place.
At the moment, with their half-baked algo, in my kw category I am seeing the oldest sites - as in, when the domain name was registered - showing up in the highest spots, even though they might have crappy PR. Despite this, my static-IP domain is still strong in the SERPs, and the Google directory seems to me an accurate reflection of how all the sites will fall in line once they complete whatever "the hell" they are trying to accomplish. Once they throw in all the ingredients, I hope my sites return to normal. But this Alta Vista Black Day (x60) repeat has taught me to diversify, diversify, diversify your traffic sources.
Diversify - Diversify - Diversify your traffic!
GG takes too many of the posts here with a grain of salt. I think he underestimates the Google Fatigue many here are feeling. He underestimates the Google Fatigue as I expect any "Google+Director+of+Communications" person with a six figure income in California would. (Cough-hint-cough) Sorry, my throat was bothering me.
Public Relations = Public Image.
One more thing, the GG bootlickers here are making me sick. Please, for your own dignity, stop it.
Maybe, but it isn't just the index page missing that is at issue, it is the rankings with most of those remaining. I think it is pretty clear there was an algo tweak of some sort that has pushed index pages down.
>That's a massive generalisation!
And your point? Google can do anything they want. When they find an issue they fix it.
>general vs specific
There are few searches that aren't specific.
Devaluing the homepage (a 90-95% content-less page) increases the value of the index. After all, the only thing of value on most homepages is the fresh links to follow down into the real content.
Take CNN for example: zero content worth returning that sites homepage for anything other than the domain name. That's the same story with the other top sites (even Google.com itself).
> I suspect our esteemed Administrator is winding us up
> ...we would see a sembance of consistency
I was a bit wound up myself when I wrote that. Everyone thinks google has personally deleted their index page and three guys with black suits and glasses are hanging around their front door.
It is just becoming clear that G did an algo tweak of some sort with relation to the home page, and all the whining and crying in your coffee isn't going to change that.
just fyi: we heard from dozens of people about this issue. It's more than just dropped index pages here and there.
But if someone searches on 'CNN' they expect to enter through the front door.... not the contact page, or link page, or copyright page. That's a bad search experience.
Comparing several websites targeting similar keywords, I note that there seems to be a filter on the inbound links.
Doing a search on Google with link:www.thesite.com, some sites (listed on the same high-PR pages) show their inbound links, and other sites do NOT. Maybe a filter applies when the link's anchor text contains the same keywords as the title of your page; in that case, that otherwise excellent link is not counted by the algo.
My suspicion is that Google is penalizing SEOs who ran a link popularity campaign suggesting the preferred anchor text description.
The only thing that is clearly happening is Google poorly ranked some cluster pages on the same domain, and Google Guy has sought out feedback from members here on this topic. It's an error, not a choice to display a content-less contact page rather than a content-rich(er) page.
In reading all these interminable threads, there are virtually no comments from people saying their PR4 page on widgeting in Nebraska is outranking their PR6 widgeting site with a "widgeting in nebraska" link on it. If that were actually happening, who would care?
But of course that is not what is happening. It's not that better topical pages are being found. Objectively worse pages are being ranked while objectively better ones are being ignored.
And it almost certainly doesn't matter anyway, because the knob turning that we can observe the past few days has shown times when the pages have been ranked more sensibly.
Postulating that google is doing something foolish for no apparent reason is not going to get anyone anywhere. Sure they may be attempting to shift their algo in this direction, but just as sure, the comments people have been making have had nothing to do with that, but everything to do with errors.
Lol, I've been trying to tweak things so my interior pages rank above my index page for several terms, but my index page just powers over them.
[edited by: steveb at 11:37 am (utc) on June 23, 2003]
OK.... 'British Broadcasting Corporation'. Again you'd expect the entry page not the copyright page... and it isn't in the domain name (which is BBC.COM).
Hypothetical in this case of course, but that sort of situation is widespread and frankly is ridiculous.
Most people design sites with the expectation that visitors will enter through the front door. Good designers try to ensure the best and richest experience possible through that route.
Any attempt to create entry pages (almost at random) elsewhere defies logic, common sense and common practice. I don't think for a minute Google would intentionally do anything so daft.
Much more likely (and of course I would say this) is the Google Twilight Zone Theory [webmasterworld.com]
Since Dominic, my home page, which had been on page one for our main keyphrase, has effectively disappeared from those SERPS.
I am not getting stressed about this for two reasons.
1) Lots of other pages indexed and providing very specific, useful referrals for users.
2) It is a predominantly UK site, and for the same key phrase on google.co.uk we are #1
Needless to say, there are many UK sites below us on .co.uk that are above us on .com for the same phrase.
But I don't really understand it. Not even in top 150 for "UK [keyphrase]" on google.com but #1 on .co.uk for "[keyphrase]" (UK sites only).
(As an additional clue, despite being in the Google directory, the category button on my toolbar is greyed out on the site...)
DISCLAIMER: not whinging here as we are getting more traffic from Google than ever before. Just perplexed...
What I'm actually looking for is Blue Widget related information. Not as specific as you (and maybe Google) suggest.
I just took a peek at a few SERPs for industrial products that took me to pages other than home pages... it's downright a poor way to research.
Getting to the general information first, and letting the human processor kick in to make the refined next click at the index-page level, seems the way to go.
Google shouldn't be the Table of Contents for a site. That's what the index page is for.
[edited by: Jon_King at 12:53 pm (utc) on June 23, 2003]
Ahhh... now THAT'S common sense.
Maybe that's what's actually missing at the Plex at the moment. In addition to the missing link data of course ;)
Now for example, take lots of amateur sites that are small and focused. If, hypothetically, someone out there has a website devoted to a specific 1960s bubblegum rock band, odds are anyone who links to it will do so to the home page, and the home page is the one most users would want to land on. For a lot of sites, the index page will be the only one that gets inbound links, period. The sites aren't large enough that deep-linking to content makes sense.
I'm not sure what the exact filter is, however, there definitely seems to be a filter on inbound links/anchor.
For those that are getting all their links through swaps, and are using the EXACT same anchor, this could explain the drop in the index pages.
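If such a filter exists - and this is pure speculation, with invented anchor lists below - it might look for a suspiciously uniform anchor-text profile, something like:

```python
# Speculative sketch: flag a link profile where one anchor text dominates,
# as it might after link swaps that all used the EXACT same anchor.
from collections import Counter

def anchor_uniformity(anchors):
    """Fraction of inbound links sharing the single most common anchor."""
    counts = Counter(a.strip().lower() for a in anchors)
    return counts.most_common(1)[0][1] / len(anchors)

swapped = ["Blue Widgets"] * 9 + ["example.com"]
natural = ["Blue Widgets", "widgets", "great widget site",
           "example.com", "click here", "Blue Widgets"]

print(anchor_uniformity(swapped))   # 0.9  - looks engineered
print(anchor_uniformity(natural))   # ~0.33 - looks organic
```

A profile scoring near 1.0 would be easy to distrust, while naturally acquired links tend to have varied anchors, which would fit the pattern people are reporting.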