
Google SEO News and Discussion Forum

    
Google supplemental index
member22

Msg#: 4637114 posted 5:35 pm on Jan 13, 2014 (gmt 0)

I heard that Google has multiple indexes.

Apparently there is the one you see by typing site:example.com, but there is also a second one, and maybe even more.

Apparently, even when you remove pages, or have pages that return 404 or 410, Google keeps those in a second index.

Is that true? And can your ranking be penalized because Google has pages from your site in a secondary index, even though Webmaster Tools says everything is fine and all your pages (duplicates etc.) were removed?

Thank you,

[edited by: brotherhood_of_LAN at 5:40 pm (utc) on Jan 13, 2014]
[edit reason] example.com [/edit]

 

hannamyluv

Msg#: 4637114 posted 7:27 pm on Jan 13, 2014 (gmt 0)

Google will not really confirm or deny the existence of multiple indexes. Long, long ago, there was an official supplemental index, but they supposedly got rid of it.

When I was still consulting, I saw plenty of clues that there were secondary indexes. And to be honest, the idea of secondary indexes just makes sense from a computing standpoint. But I don't think there is any official Google position on this anymore. Then again, it has been a few years since I consulted, so that may have changed.

bumpski

Msg#: 4637114 posted 9:08 pm on Jan 13, 2014 (gmt 0)

There are pages that Google never shows in its indexed results.
There are pages Google will never show in a search with a date range (I call these "pages without dates").
There are pages Google will never show when a reading level is selected (I call these "pages without reading levels").
You can mark a page "noindex" and Google clearly honors this, but Google still crawls these pages and certainly keeps track of their content. These pages may also contain links that Google will follow, even though the pages themselves are not in Google's primary index. So there probably is a "noindex" index! That is an index I really wonder about!

Google clearly knows about all the pages it excludes from conditional search results, so it is likely that Google does have multiple "supplemental" indexes. But the old "supplemental" results are now perhaps hidden or restricted, as noted above.
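
If you want to check which of your own pages carry that "noindex" signal, here is a minimal sketch in Python (standard library only; the URL at the bottom is a placeholder) that looks at both the robots meta tag and the X-Robots-Tag response header:

from html.parser import HTMLParser
from urllib.request import urlopen


class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tag."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if (attrs.get("name") or "").lower() == "robots":
                self.directives.append((attrs.get("content") or "").lower())


def is_noindex(url):
    """True if the page is marked noindex via header or meta tag."""
    with urlopen(url) as response:
        # The X-Robots-Tag header can carry noindex for non-HTML files too.
        if "noindex" in response.headers.get("X-Robots-Tag", "").lower():
            return True
        parser = RobotsMetaParser()
        parser.feed(response.read().decode("utf-8", errors="replace"))
        return any("noindex" in d for d in parser.directives)


print(is_noindex("https://example.com/some-page"))  # placeholder URL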

3zero

Msg#: 4637114 posted 1:58 am on Jan 14, 2014 (gmt 0)

I think Google is designed to crawl everything it gets fed a link to. The supplemental index, I think, consists of rarely-searched pages and possibly censored results. It cannot be accessed through a normal search anymore, but yes, I believe it exists.

Bones

Msg#: 4637114 posted 5:28 pm on Jan 14, 2014 (gmt 0)

You may find this recent video from Google's John Mueller (recorded 13th Jan 2014) of interest; see from about 35m 30s onwards.

[youtube.com ]

The question was:

"We believe that one of our sites has had 90% of its pages fall into the supplemental index. Is there a way of confirming this and what pages are there? How do we recover from that?"

The first part of John's answer was:

"We don't have a supplemental index any more in the sense that these pages will be treated differently in the search results. So that's not something you'd need to worry about.

We do have different index tiers, depending on how we categorise your pages, how we need to crawl them, but it's not something you'd see specific changes in the search results."

hannamyluv

Msg#: 4637114 posted 8:48 pm on Jan 14, 2014 (gmt 0)

Well, that was a timely answer from Google, now wasn't it...

Anyhoo, in regard to the OP's question about whether it would affect their ranking: I think that if you are worried about pages being in a secondary index, your pages have bigger problems than a secondary index.

In today's Panda world, regardless of how or where Google is putting a page, if the page is weak/thin/duplicative/rubbish, it will not compete (at least not without some short-lived tricks that anyone with a legitimate business would not touch with a ten-foot pole).

Panda is kind of like the next generation of what was once the Main and Supplemental indexes.

bumpski

Msg#: 4637114 posted 10:47 pm on Jan 14, 2014 (gmt 0)

Sorry, I have to disagree.

If it were Thanksgiving:
Google's efforts in the last couple of years are equivalent to using a
chainsaw on a dry turkey.
...
There'd be stuffing everywhere and no meat to be found.

FranticFish

Msg#: 4637114 posted 8:17 am on Jan 15, 2014 (gmt 0)

If you run a 'site:' query, sometimes you'll find that something like this happens...

1) Let's say Google says it has 200 urls for your site which, at 20 results per page, would normally mean 10 pages of results. But if you look at the pagination at the bottom of the SERP, you might see only 6 pages, which would mean there are 120 urls tops.

2) As you click through to the last paginated page, the number of results Google says it has suddenly drops from 200 to 115.

3) Google says that there are more urls to see if you want, and provides a link to show all of them. Sometimes this will then reveal the 200 it originally said it had; sometimes the number will be higher than 115 but still lower than 200.

This has been a feature of the 'site:' query since there was a Supplemental index.
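
Putting rough numbers on the example above, here is a toy calculation in Python (the 20-results-per-page figure is a display-settings assumption; the counts are the ones from this post):

# Toy arithmetic for the site: pagination gap described above.
claimed = 200        # count Google reports on page one
per_page = 20        # results per SERP page (an assumed setting)
pages_shown = 6      # pagination links actually visible

expected_pages = -(-claimed // per_page)      # ceiling division: 10 pages
max_urls_reachable = pages_shown * per_page   # 120 urls tops
revised_count = 115                           # count reported on the last page

print(expected_pages, max_urls_reachable, revised_count <= max_urls_reachable)
# -> 10 120 True: the revised count fits within the pages actually shown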

Now John Mu says they have "different index tiers". You say tomato...

He also says that all pages, no matter the tier, are treated equally. I cannot reconcile my own recent observations on one site with that statement.

bumpski

Msg#: 4637114 posted 1:02 pm on Jan 15, 2014 (gmt 0)

Quoting FranticFish: "Google says that there are more urls to see if you want, and provides a link to show all of them. Sometimes this will then reveal the 200 it originally said it had; sometimes the number will be higher than 115 but still lower than 200."

I think this is the prompt:

"In order to show you the most relevant results, we have omitted some entries very similar to the 54 already displayed. If you like, you can repeat the search with the omitted results included."

I think that usually, when you get this prompt from a Google "site:" query, Google believes some of the content on the site is duplicate. If you canonicalize a site very well, and truly don't have duplicate content, this message should go away.

In some cases you can use the site: command on subdirectories or subdomains to localize the duplicate content (Webmaster Tools can help here too).

And I agree: I think in this case this content was attributed to the old "supplemental" index status.
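
For anyone who wants to localize duplicate-content candidates this way on their own site, here is a minimal sketch in Python (standard library only; the url list is a placeholder) that flags pages whose rel=canonical points at a different url:

import re
from urllib.request import urlopen

# Naive pattern: assumes rel="canonical" appears before href in the tag.
CANONICAL_RE = re.compile(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']',
    re.IGNORECASE)

urls = [  # placeholder urls; substitute pages from your own site
    "https://example.com/",
    "https://example.com/page?sessionid=123",
]

for url in urls:
    html = urlopen(url).read().decode("utf-8", errors="replace")
    match = CANONICAL_RE.search(html)
    if match is None:
        print(url, "-> no canonical tag")
    elif match.group(1) != url:  # exact comparison; normalise slashes in real use
        print(url, "-> canonicalises to", match.group(1))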

member22

Msg#: 4637114 posted 1:39 pm on Jan 15, 2014 (gmt 0)

Can you explain how to use the site: command to find the pages Google considers duplicate? I am not sure how to do that. I guess that would be the reason why my ranking is still not back to where it was...

I have tried a few things with the site: command and it doesn't work.

FranticFish

Msg#: 4637114 posted 4:01 pm on Jan 15, 2014 (gmt 0)

Quoting bumpski: "Google thinks that some of the content on the site is duplicate"

That's not always what I've seen. Sometimes it IS duplicate content, but it's also been:
- PDFs, Word documents and other attachments/downloads
- Flash and other 'include'-type files

And most recently it was a load of pages that had all been returned in the first instance on a different domain, but were then stuck into this 'second indexing tier' when the domain was rebranded.

The fact that the same content appeared to have a lesser index status based on a domain switch is what makes me doubt what John Mu says about these pages not being ranked differently. They were, drastically.

FranticFish

Msg#: 4637114 posted 4:04 pm on Jan 15, 2014 (gmt 0)

@member22

You can use 'site:domain.xtn'
or 'site:domain.xtn/folder/'

or even partial urls for a wildcard match: 'site:domain.xtn/dynamic.xtn?variable='

Try 'inurl:' too.

The important thing is that there is no space after the 'site:' operator.
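
If you'd rather build those queries in a script, here is a minimal sketch in Python (the domains are placeholders) that percent-encodes a query into a Google search url you can open in a browser:

from urllib.parse import quote_plus

def google_query_url(query):
    """Return a Google search url for the given query string."""
    return "https://www.google.com/search?q=" + quote_plus(query)

print(google_query_url("site:example.com"))
print(google_query_url("site:example.com/folder/"))
print(google_query_url("site:example.com/page.php?variable="))  # partial-url match
print(google_query_url("inurl:example"))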

member22

Msg#: 4637114 posted 6:58 pm on Jan 15, 2014 (gmt 0)

Let me make sure I got that right

site:www.mywebsite.com.xtn ?

member22

Msg#: 4637114 posted 7:07 pm on Jan 15, 2014 (gmt 0)

When John Mueller says "all pages, no matter the tier, are treated equally",

does that mean Google still sees the duplicate ones and "penalizes" my website?

FranticFish

Msg#: 4637114 posted 8:01 pm on Jan 15, 2014 (gmt 0)

If it were this site, your query would be 'site:webmasterworld.com'

I usually query the domain without the 'www' because it'll show both 'www' results and 'non-www' results that way.

Re: John Mueller, as I said above, I'm not sure I believe him. If I understand your question correctly, you're asking if Google could be penalising your site based on the content of pages that aren't part of its public index.

I'm really not sure: it could depend on the type of penalty that was issued based on what was on those pages when they WERE in the index. What sort of penalty are you worried about?

If duplicate pages are what you're worried about, it's widely accepted that there is NO penalty for duplicate content. It can cause problems because it can waste crawl resources and your site suffers, but that's not a 'penalty' in the sense that hidden text or sneaky redirects can cause penalties. Certainly I've never heard of or seen any issues caused by duplicate pages that didn't vanish as soon as those pages were cleaned out of the index.
