Trying to find the proper indexation ratio for a large website after a Panda hit
JVerstry (msg:4608624)
7:20 am on Sep 10, 2013 (gmt 0)

Hi,

I am developing a website offering time-related features for various locations around the world. The site was hit by what seems to be a Panda update on Sept 4th: no more traffic or search queries since then.

The site has about 8 core pages (index, online services, etc.), 215 category pages and 22,700 content pages for locations around the world. It is new and a work in progress, and did not have much traffic yet.

I have set noindex on the 22,700 content pages to try to mitigate thin-content issues, so only 0.96% of my website will be indexed. I know I still have work to do to beef up the content on those pages.
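For clarity, by "set noindex" I mean a robots meta tag in the head of each content page, something like:

    <!-- "noindex" drops the page from the index; "follow" (the default) still lets crawlers follow its links -->
    <meta name="robots" content="noindex, follow">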

a) My first question is: should I noindex my 215 category pages too (they have no thin content)? Would it help recovery from this hit?

b) I have reckoned that, for some important keywords, 260 content pages could cover 90-95% of the traffic I am aiming for. Since each of these pages is for a specific location in the world, I am wondering whether they may be seen as lacking original content. Should I re-index them at some point in the future, or never? Doing so would bring the indexing ratio to 2.09%.

c) Does Panda ignore noindexed pages in its filter? Or does it take them into account anyway when Google can still reach those pages?

d) If one implements proper corrections after a Panda update, how long does it take for Google to analyze the website again? Does anyone have minimum/maximum estimates?

 

aakk9999 (msg:4609067)
6:34 pm on Sep 11, 2013 (gmt 0)

I would first wait to see the effect of noindexing the content pages before making further decisions.

How long ago did you noindex the 22,700 content pages? Have they dropped out of Google's index, or is Google still processing them?
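A quick way to spot-check is a site: query restricted to the noindexed section, e.g. (domain and path here are placeholders):

    site:example.com inurl:/locations/

or the Index Status report in Webmaster Tools.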

JVerstry (msg:4609201)
8:22 am on Sep 12, 2013 (gmt 0)

I did it on Sept 8th (4 days ago) and resubmitted a sitemap.
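For reference, the sitemap is the standard XML format; a minimal version with a single URL looks like this (domain and path are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/services/</loc>
        <lastmod>2013-09-08</lastmod>
      </url>
    </urlset>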

Webmaster Tools' Index Status is starting to show a downward trend in the number of indexed pages, but it is still high.

This morning I am also seeing a tiny recovery in search queries in the Sept 9th data; I hope it is the beginning of a bigger trend.

Yesterday (Sept 11th), I beefed up all pages with user instructions, meaning a minimum of 250-300 words (prose) on all pages. Before, I had at most 100 words, because I wanted to keep the layout sober and clean. These instructions can be displayed/hidden with a click on new help links. I have also removed small keyword lists I was maintaining in the footer to compensate for the lack of word content.
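The help links are a plain show/hide toggle, roughly like this (the id and text are placeholders; note the instructions start out hidden with display:none):

    <a href="#" onclick="var h = document.getElementById('help-1');
        h.style.display = (h.style.display == 'none') ? 'block' : 'none';
        return false;">Help</a>
    <div id="help-1" style="display: none;">
      <p>Step-by-step instructions for using this page go here...</p>
    </div>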

I have also decided to index again the 260 pages that would cover 90-95% of my traffic. This morning, Webmaster Tools says 661 pages have been submitted and 553 are indexed, so about 2.4% of the total number of pages on my website.

Robert Charlton (msg:4609369)
7:29 pm on Sep 12, 2013 (gmt 0)

JVerstry - The effects of noindexing can take a while... a lot longer than 4 days.

I have also decided to index again the 260 pages that would cover 90-95% of my traffic.

This in fact sounds like a good idea, much better than noindexing all your content pages in the first place. But be aware that trying to reverse some of the noindexing before Google has processed the first set of instructions is not only going to take even longer... changing these directives back and forth can also leave Google confused. I would leave these things alone for a while and let Google sort it out.

In the future, I suggest that you carefully think through such large-scale measures before you initiate any changes.

Yesterday (Sept 11th), I beefed up all pages with user instructions, meaning a minimum of 250-300 words (prose) on all pages.

This, on the surface, does not sound like a good idea. From your description, I'm guessing that this is going to be adding the same boilerplate content to all pages. This creates duplicate content throughout the site, which... on each page... will dilute your page-relevant, unique content, and overall gives Google a sense that your pages are all pretty much the same. The last thing you should do on a thin-content site is to enlarge the "footprint" of your template.

The goal should never be simply to increase word count.

What you want to create is a large enough amount of unique, useful, and relevant content on each of your pages to engage your visitors. You want them to have a good enough experience on your site that they recommend your site, and that they will come back.

JD_Toims (msg:4609375)
7:56 pm on Sep 12, 2013 (gmt 0)

I'm guessing that this is going to be adding the same boilerplate content to all pages. This creates duplicate content throughout the site, which... on each page... will dilute your page-relevant, unique content, and overall gives Google a sense that your pages are all pretty much the same. The last thing you should do on a thin-content site is to enlarge the "footprint" of your template.

This is definitely a good point.

These instructions can be displayed/hidden with a click on new help links.

This, if the instructions are set to display:none; on page load, *may* [note: also meaning may not] actually keep the problem Robert_Charlton described from happening, depending on how Google has the filters for non-displayed content set on any given day.

I have also removed small keyword lists I was maintaining in the footer to compensate for the lack of word content.

Good idea.

What you want to create is a large enough amount of unique, useful, and relevant content on each of your pages to engage your visitors. You want them to have a good enough experience on your site that they recommend your site, and that they will come back.

Absolutely this.

Also, I would think about how to add "depth" that other sites don't offer, so rather than "overviewing", try to "detail" things where possible.

londrum (msg:4609376)
8:03 pm on Sep 12, 2013 (gmt 0)

maybe you could try adding some google maps. if each page is for a different location then you could have a different map on each page, showing its location in the world.
at least you will know that google is aware of the content, even if they don't index it -- because their map servers are called every time the page loads.
it probably won't help, but it can't hurt. and it will beef the pages up a little bit.
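something like this on each location page would do it -- the api key and coordinates below are placeholders, and each page would plug in its own latitude/longitude:

    <div id="map" style="width: 100%; height: 300px;"></div>
    <script src="https://maps.googleapis.com/maps/api/js?key=YOUR_API_KEY"></script>
    <script>
      // center the map on this page's location (coordinates for new york, as an example)
      var map = new google.maps.Map(document.getElementById('map'), {
        center: new google.maps.LatLng(40.7128, -74.0060),
        zoom: 10,
        mapTypeId: google.maps.MapTypeId.ROADMAP
      });
    </script>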

JVerstry (msg:4609495)
10:52 am on Sep 13, 2013 (gmt 0)

enlarging the footprint of my template


Robert: yes, I did not think about that. However, most of those pages will remain noindexed, and I can live with that. Only the 260 pages will be indexed with similar instructions. Let's see whether Google is happy enough with that. Worst case scenario, I noindex more pages.

My plan is to let the dust settle in the coming days and check the trend before intervening again.

I have decided to go for 'massive' changes because my site is still new (not a lot of traffic) and I wanted to make sure some of my pages would rank again. I was losing all traffic and did not have much to lose. I read somewhere that making only small changes after Panda may keep one in some grey area where nothing happens. Electroshock therapy...

JD_Toims: I am working a lot on good UI experience and service, because I don't believe one can build sustainable traffic by gaming SEO strategies alone. I already have a high-performance website (says webpagetest.org) and I am currently working on responsive design.

londrum: yes, I was thinking about some maps and other unique features too. It is on my to-do list. I may reindex more pages once they have more specific content.

JVerstry (msg:4610020)
8:35 am on Sep 16, 2013 (gmt 0)

For the record:
- Google did not like the re-indexation of the 260 pages (the ones covering 90-95% of traffic). Traffic vanished again.
- I noindexed these pages again, and traffic came back.

In my case, I don't think it is a thin-content issue, but rather a near-duplicate issue. Noindexing these near-duplicate content pages is a fix.

Nichita (msg:4610191)
10:34 pm on Sep 16, 2013 (gmt 0)

I suppose the pages with locations are autogenerated, something like "Accommodation new york" / "Accommodation <location>", or the text is produced in a similarly templated way. If I am right, you should delete / noindex all of these pages.

nomis5 (msg:4610268)
8:28 am on Sep 17, 2013 (gmt 0)

For the record:
- Google did not like the re-indexation of the 260 pages (the ones covering 90-95% of traffic). Traffic vanished again.
- I noindexed these pages again, and traffic came back.


Reading the above and other comments you have made, I guess that you are not giving your changes sufficient time to take effect.

I personally would have re-indexed 10 pages or so, beefed up their content with totally new and relevant information, then waited three months to see what happens.

There's no easy way to make large changes to a site; it takes work, planning and a lot of patience to see what does or does not work. Making easy, quick fixes for a week or two and then rolling them back is the worst way to go forward.
