
Google SEO News and Discussion Forum

Understanding Panda - Thin Content vs Low Interest Content
getcooking
msg:4451892
4:17 pm on May 10, 2012 (gmt 0)

I've got a 17-year-old site that was hit by Panda 1.0 and every iteration since. Yay me.

I've identified the problem areas and have been working to fix things up, but so far no recovery. While I do still have some thin content on the site that hasn't been beefed up yet, I also noticed that a lot of my pages target very low volume, obscure search terms. They are perfectly valid in our niche, but Google's AdWords keyword tool shows they only yield a few hundred queries globally per month. It made me start to wonder if that was my problem - not so much thin content as having too much low interest content. Could having too many pages that look targeted to obscure terms be hurting our higher volume terms? What's interesting is that the low volume pages rank very well for those obscure terms - usually in the top three positions, and frequently at #1. Yet our higher volume terms have all been demoted by Panda. Thoughts?

And a related question: what are current thoughts on noindexing offending pages for Panda recovery? Some people say it works; for others it doesn't. My plan was to noindex low volume and thin content pages until I could either develop them or merge/301 them, but I don't want to make Panda more angry at me either.
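For reference, the usual mechanism for this kind of selective noindexing is a robots meta tag in each page's head (or an X-Robots-Tag response header). A minimal sketch in Python - the flagged-path set is hypothetical and would come from your own analysis:

```
# Sketch: emit "noindex, follow" for pages flagged for Panda cleanup.
# "follow" keeps internal links crawlable while the page sits out of the
# index; a later merge can replace this with a 301 redirect.

# Hypothetical set of URL paths flagged as thin or low-volume.
FLAGGED_PATHS = {
    "/widgets/obscure-term-a/",
    "/widgets/obscure-term-b/",
}

def robots_meta(path):
    """Return the robots meta tag to place in <head> for this URL path."""
    if path in FLAGGED_PATHS:
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'

print(robots_meta("/widgets/obscure-term-a/"))
# -> <meta name="robots" content="noindex, follow">
```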

 

tedster
msg:4451976
9:00 pm on May 10, 2012 (gmt 0)

Interesting idea! I'd say check your actual search results traffic. If a page is getting even a trickle, then it's OK as long as the actual traffic spends some time on the page. After all, obscure content is one of the joys of the web.

cr1t1calh1t
msg:4451993
9:34 pm on May 10, 2012 (gmt 0)

Very interesting insight, getcooking. I can see how having, as you say, too many pages targeted to obscure terms with very low search volume could get your site profiled as a content farm, or as having thin content.

This might explain why ecommerce and directory sites were hit by Panda no matter how high quality their content might have been. The sheer volume of the obscure or lesser-trafficked pages and search terms overwhelms the best parts of the site - the higher volume pages... hmm...

I'll warn anyone reading this post, though - take it all with a grain of salt. I'm so turned around by pandas, above the folds, and penguins - I don't know whether I'm coming or going. I spent the last months of 2011 and the first few of 2012 overhauling a site from a Panda hit, only to see a modest gain and then get womped by the above-the-fold update. I just had a different site get hit by Panda 3.5 and 3.6, down more than 50% from its norm, and I'm just crestfallen, battered and beaten. I'm not doing an overhaul on this site, because I have no clue what to overhaul. Honestly, I think Google has it wrong and is still 'thinking' - meaning the machine learning at work is still training and may well improve itself over time. All the work I did on the first site after being hit by Panda was apparent - I had it coming. But with the latest, I really have no clue; I'm not even sure it is Panda, because of all the other updates that happened around the same time...

As for the noindexing question - I really think it helped in my situation. When I was hit by Panda in October 2011, my site had a forum that had been taken over by spammers, cleaned out, then subsequently locked down - so it was a barren wasteland of empty profiles, archive pages, shallow threads, etc. I noindexed the entire forum, and I think it really helped. In that case noindex was a no-brainer, but I would be really reluctant to noindex pages just because they got low volume or were obscure. Strangely, some of the best information and best writing on my sites gets low traffic, and I agree with tedster - obscure content is one of the things that makes the web so special.

beaten, but not broken,

cr1t1calh1t

ken_b
msg:4452012
11:00 pm on May 10, 2012 (gmt 0)

It made me start to wonder if that was my problem - not so much thin content as having too much low interest content. Could having too many pages that look targeted to obscure terms be hurting our higher volume terms?

Looking back at the impact of Panda on my site, this makes a lot of sense.

Even though my main topic is pretty popular, many of my pages featured less sought-after - sometimes rare, even extremely rare - widgets and text related to them. Although I don't recall checking it specifically, I would expect search volume for those pages to be very, very low.

If a page is getting even a trickle, then it's OK as long as the actual traffic spends some time on the page.

Well, that could be an issue for me - these are basically image pages with expanded captions. Maybe I need to expand those captions more.


getcooking
msg:4452046
1:11 am on May 11, 2012 (gmt 0)

I'm so turned around by pandas, above the folds, and penguins - I don't know whether I'm coming or going


I hear ya! While I (fortunately) haven't suffered from anything other than Panda, I can relate to the frustration of not knowing where to begin sometimes - and the frustration of having changes not do a darn thing.

And thank you, tedster, for reminding me to look at other analytics for these pages. Bounce rate and time on site could be very telling. I'm looking into this now...

I agree that obscure content isn't necessarily bad content, but it makes me wonder if I've crossed some sort of threshold quantity-wise. The trickle of traffic we're talking about is only maybe 10-20 pageviews per month per obscure term (I haven't yet identified how many obscure terms we're looking at). If noindexing actually works then my gut is telling me to remove these from Google until I can do something more with them. It pains me to remove pages that actually rank well, but I just have this feeling that they are doing more harm than good. I have no concrete evidence to back this up yet, it's based purely on staring blankly at our site data for the last year. And frankly, nothing else I've done has made a difference (and I've done some extensive changes).
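One way to put numbers on that gut feeling: export per-page organic pageviews and engagement, and flag pages that fall below a floor on both. A rough sketch, assuming a hypothetical CSV export with url, organic_pageviews and avg_time_on_page columns (adjust to whatever your analytics package actually produces):

```
import csv

# Sketch: flag pages getting under ~20 organic pageviews/month that also
# fail to hold visitors. These are the candidates to develop, merge/301,
# or noindex. The CSV column names are hypothetical.

PAGEVIEW_FLOOR = 20   # organic pageviews per month
DWELL_FLOOR = 30.0    # average seconds on page

def panda_candidates(csv_path):
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            views = int(row["organic_pageviews"])
            dwell = float(row["avg_time_on_page"])
            # Low volume alone isn't damning (per tedster above); low
            # volume plus a quick back-out is the worrying combination.
            if views < PAGEVIEW_FLOOR and dwell < DWELL_FLOOR:
                yield row["url"]

# Usage, with a hypothetical export file:
# for url in panda_candidates("pages_last_30_days.csv"):
#     print(url)
```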

Our fortune 500 competitors didn't seem to be hit by Panda, but they also haven't expanded to cover some of these obscure terms. Some of our smaller competitors have tried to target these terms, and I just recently talked to one of these site owners and found out they were also hit by panda. It's like I have all these puzzle pieces, but nothing quite fits together yet.

Bluejeans
msg:4452107
6:51 am on May 11, 2012 (gmt 0)

I understand your reluctance to no-index. I think the massive no-indexing that followed Panda has made it harder to find answers to obscure tech questions.

Anyway. I got hit on the January iteration, no-indexed my forum (also a wasteland) and have seen a slow recovery since. I'm still not where I "should" be but disaster has been averted. I also made other changes though, such as removing top AdSense, but I do think the forum no-indexing helped. Just a sense. Now, I'm trying to expand the captions on my many photo gallery pages. Users love them. I think Google doesn't.

aakk9999
msg:4452217
1:13 pm on May 11, 2012 (gmt 0)

I disagree that low search volume = low interest content that could harm the site. There are many niches where no single term has more than 1,000 searches per month. Many websites focus on such niches, and the cumulative number of searches for low volume keywords within the niche may bring decent traffic. However:
- if, upon landing on such a page, the visitor backs out quickly and/or does not progress to the rest of the site, this could be a 'low quality' signal
- if the site has many low search volume pages, its focus is shifted towards more obscure terms and it might not be seen as relevant enough for more generic, high volume searches

So I think it depends on the focus of the site.

For example, if you want to rank for a country name and your site has lots of individual pages about small towns within that country (low volume searches), you may rank for the towns, but the fact that they all belong to the same country does not mean you will rank for that country (high volume phrase).

Planet13
msg:4452244
2:33 pm on May 11, 2012 (gmt 0)

@ getcooking

I also noticed that a lot of my pages target very low volume, obscure search terms.


I would like to know more about what you mean when you say that your pages *target* low volume search terms.

How much difference in text is there between each low-volume page?

And might it make more sense in today's Google to combine several of them into a slightly broader topic?

I ask this because I often see a lot of the same pages ranking for more varied search terms. So maybe having fewer pages, but with more developed "sections" on each page (for the different search terms), might help out.

getcooking
msg:4452282
4:46 pm on May 11, 2012 (gmt 0)

I would like to know more about what you mean when you say that your pages *target* low volume search terms.

How much difference in text is there between each low-volume page?


The pages in question are category/subcategory pages. When I say "target", I mean the category was developed to house those particular related articles - so visitors could quickly narrow down what they were looking for instead of paging through a broader parent category and perhaps never knowing these specialty pages existed.

I believe the text is different enough on the pages. The subcategories of course build on an element from their parent category, but they include an element that typically the parent category doesn't have (the obscure terms).

An interesting note: visitors are reaching these pages via our site navigation and internal search. So they come to the site from a broader search term and navigate to the more specific pages; they just don't seem to be searching Google for that specific page. I'm not quite sure what to make of that or how it might be affecting things.

I do agree that in some cases it would make more sense to merge some of these into a parent category or broader category. The problem is some of these categories can get unwieldy without some of the more refined pages. We've been around longer than our competitors and as a result have a lot more content than most of them. Managing it all has always been a challenge.

For example, if you want to rank for a country name and your site has lots of individual pages about small towns within that country (low volume searches), you may rank for the towns, but the fact that they all belong to the same country does not mean you will rank for that country (high volume phrase).


This is basically what I'm seeing on our site. To use your example, we rank well for the "small towns" but they don't bring in traffic. The broader you go with terms about the country, the worse our rankings. We rank awful for what would be the country name in this example. This wasn't the case for us pre-Panda. And that's where I get stuck trying to figure out how to remedy it. :)

Planet13
msg:4452353
6:36 pm on May 11, 2012 (gmt 0)

We rank awful for what would be the country name in this example.


Using this example, you do have a "country" page that would serve as a hub to all of the "town" pages, right?

How well developed is that "country" page?

Would it make more sense to have the "country" hub pages on their own subdomain, similar to About.com and their widgets.about.com format?

You have something of an interesting situation: you rank well for the longtail but horribly for the head; MayDay and other updates were supposed to have made that more difficult (from what I understand), and it sure seems to me that high "authority" sites are doing much better on longtail terms.

getcooking
msg:4452359
7:00 pm on May 11, 2012 (gmt 0)

Basically, our homepage would be the main country page in this example. The whole site is about that one country. There is a subpage (which doesn't rank either) that acts as a hub to the town information and other subpages off the homepage that lead to additional (general and related) information about the country - history of the country, weather, languages spoken there, major airports, etc. All related to that one country. There really is no information on the site that isn't related to that one country. Yet, the homepage won't rank for that country name. The town information is the "meat" of the site containing the info that people come to the site for. But if they don't know what town they are looking for (which in our niche is extremely common), they should (in theory) be able to find us by the country name and then browse our info to find the town. But Google thinks otherwise with our site apparently.

It wasn't this upside down before Panda, we ranked better (but not great) for the country name and our more popular towns ranked extremely well (as did longtail searches). It's like all we were left with was longtail post Panda.

Planet13
msg:4452366
7:15 pm on May 11, 2012 (gmt 0)

Yet, the homepage won't rank for that country name.


Could you clarify a little more what you mean by "won't rank", both pre-Panda and post-Panda?

Are you not in the top 20 pages? Not in the top 100 pages? Not in the top 500 pages? Not in the index?

Did you ever rank well and then suddenly drop?

Have you ever filed a reconsideration request (a long shot, but not knowing the ranking history, thought I would ask)?

Also, assuming you have a left-hand category tree (or other type of main navigation on the home page), is the "country" name repeated over and over in it? Is it possible that you might have been flagged as spam long ago?

getcooking
msg:4452368
7:26 pm on May 11, 2012 (gmt 0)

Our homepage is in the index - searching for our domain/trademark has us come up #1 with sitelinks. We're not in the top 20 pages (I stop counting after that) for the country name. We did rank well previously, how long ago I'm not sure - it's been a few years since we ranked well (and by well, I mean in the top 5 pages). I honestly never paid that close attention to that keyword and our ranking until I realized we weren't near the top anymore. I did file a reconsideration request (suggested by our AdSense rep) but was told there was no manual penalty.

We do have the country name in the navigation - but in our niche, there really is only one word to describe our content. I use both the singular and plural of that word on the site and I do vary the anchor text where appropriate in any non-navigation/breadcrumb links. I've looked and looked to see if the site might be overoptimized but I can't figure out how to really deoptimize it any further without it starting to look really strange to the visitors!

aakk9999
msg:4452388
9:23 pm on May 11, 2012 (gmt 0)

All related to that one country. There really is no information on the site that isn't related to that one country.

Yes, you should look at the over-optimisation angle too.

it's been a few years since we ranked well (and by well, I mean in the top 5 pages)


The above does not mean that you ranked well. I am wondering how much traffic you actually got from being in the top 5 pages (top 50 positions) - and I assume you were not on page 1, because otherwise you would have said so.

What is interesting is that you say it has been a few years since you ranked in the top 50 positions, and only now has this become an issue. To me it almost looks like you see this as a way to recover traffic you perhaps lost on other keywords. If you had lost noticeable traffic because your main keyword dropped, you would have been addressing it a few years back.

So I am thinking that you should work out where you in fact lost traffic over the last 12 months (which wasn't because of your main keyword) and address that, rather than trying to push "country" into the top 50.

getcooking
msg:4452407
10:03 pm on May 11, 2012 (gmt 0)

Actually, I didn't initially bring up the primary keyword not ranking as my issue. The issue is that the broader the keyword, the lower it now ranks; the more specific, the higher it ranks. My ponderings were about whether having too many pages about obscure long tail terms could hurt the broader terms (not just "country name").

So, as an example

country name, ranks poorly
country name + country capital, ranks a little better but not great
country name + big city name, ranks even better
country name + big city name + obscure additional term, ranks quite well
country name + small town + obscure additional term ranks #1

We used to (pre-Panda 1.0) rank great for country name + country capital. That's where most of our traffic came from. We still get traffic to those pages from Google, but our rankings are lower (and traffic is lower, of course). The issue with our homepage not ranking for our key term is definitely something else - it started well before Panda. I'm not saying it's not why we got hit by Panda (if it is, then I'm totally lost as to what I need to do to fix things), but it's definitely not what I'm focusing on right now. Getting back our not-so-longtail traffic is my primary concern.

stever
msg:4452481
6:05 am on May 12, 2012 (gmt 0)

You left out the interesting one there, getcooking:
country name + small town

(I would guess better than country name + big city name but both really not high enough to make a difference?)

Added: I see you say in a post above "To use your example, we rank well for the "small towns" but they don't bring in traffic."

What does 'rank well' mean in terms of the small towns? (#1-3?)
(It would also be interesting to differentiate between popular small towns and small towns that nobody cares about.)

To take an example, Telluride and Nederland are both small towns in Colorado - the search volume and level of competition are very different, however.

Rasputin
msg:4452513
8:31 am on May 12, 2012 (gmt 0)

I have a country-level travel site that escaped from Panda in April 2012, having been hit in April 2011. It was unaffected by Penguin and the above-the-fold updates. The main things that seemed to matter were:

- rewriting ALL thin and low quality content pages (over the course of a year, this was a very big job and involved actually visiting a great number of places)

- deleting any pages that had too little content or too many affiliate links (I am pretty confident that noindex didn't work - many of the pages were noindexed May 2011, but only deleted in March 2012, just before the recovery)

I do have lots of (high quality) mash-up pages that get very little traffic individually, but I kept them because they are useful and overall get decent visitor levels - they are found for obscure terms. So I don't think pages ranking only for occasional or very limited search terms are a problem.

I should add that I make no effort with particular keywords - I just do the best I can with every page - and don't really expect to rank for country + important city; there are just too many sites that can do a better job, run by residents of the city. I do OK (i.e. rank roughly where I think I should rank) for 'country + medium sized town', 'small town', etc.

Lastly, despite the pressure to increase page speed, I have kept, and often increased, the number of images on the pages, since I think visitors like them. So it's pretty slow loading (bottom 25% of sites).

martinibuster
msg:4452516
8:40 am on May 12, 2012 (gmt 0)

Longtail is where the bulk of queries are. That's a good place to be, provided your site offers a lot of long tail query results.

Many of the "short tail" phrases have been losing queries. Take a look at the historical graph of queries on Google Trends: the graph quite often descends from left to right for one and two word queries, meaning fewer people are making those kinds of queries.

Focusing on content, it seems like topically diffuse content on a single domain is more of an issue.

stever
msg:4452523
9:29 am on May 12, 2012 (gmt 0)

I think, as Rasputin implies, that it is hard (and a lot harder than many people who try to do 'just enough to rank' think) to be an expert on places (or niches) when you are spreading yourself too thinly.

And sometimes (and I hold up my own hand here on occasion) the lure of the affiliate buck and the way certain techniques have been shortcuts to ranking can distract from the editorial integrity of the overall product.

On the other hand, if you do concentrate on what you are doing and do valuable 'things' that other people (including Google) aren't - and aren't likely to be able to in the future without an equal amount of work - then you are likely to build your competitive moat. This - especially for those who are working on location-based sites - is your edge in the competition.

I have a number of features on my sites that users find very valuable. It would however be trivial (in their terms) for Google to be able to recreate those features.

There are other valuable features on those sites that it would be extremely difficult for search engines to duplicate since they involved time, value judgements and luck (on some occasions) and those are exactly the kind of things that a machine-learning-led global company is not good at doing.

Other webmasters could, in time, do the same things. (One has in one niche.) But most won't be bothered or will attempt to bluff it at a lower level.

That's moved away a little from the original thread, so, to return to the original point: I also don't agree with the low-interest vs. thin-content notion.

Where I do think it can become a problem is when a) those low-interest pages become a massive proportion of the site and, referring to the points above, b) when those low-interest pages start looking too much like one another.

getcooking
msg:4452886
2:32 pm on May 13, 2012 (gmt 0)

Thank you everyone for all the insight! This might get away from my original questions a little, but I've been going through our stats endlessly the past few days based on all the info here (you gave me a lot to think about), and I've found something interesting that I'm surprised I hadn't noticed earlier - it seems to relate to some of the points brought up here about long tail searches.

In my niche there is an implied term that relates to most of the content on my site. If you are talking about something on my site, it's usually one of these "things". Before the March 23rd Panda update, it looks like most of our category pages (including the lower-trafficked ones) ranked well without searchers having to use that implied term. So we ranked for [keyword keyword]. After March 23 (it was definitely this update that did it), it looks like now we only really rank (in most cases) for [keyword keyword] [implied term]. And, to use the example earlier in this thread, the implied term would be the "country name" that our homepage DOESN'T rank for.

The pages do use this implied term once in the title and once in the meta description, but it's only used when absolutely necessary in the content of the pages themselves. It is not in the H1 tag, and it's not used in anchor text anywhere other than links to the hub page that the silo of categories sits under. Like I said, it's pretty much an implied term, so overusing it sounds weird. Also, since it's implied, most searchers (in my experience) don't use it much when searching (which kinda explains our traffic drops).

I'm hoping someone smarter than me with this stuff (which clearly would be most of you, lol!) might have an idea what, if anything, this might indicate.

Planet13
msg:4452910
4:53 pm on May 13, 2012 (gmt 0)

Hmmm...

Do you have several synonyms of the implied term in the page content?

I know it is hard to figure out what Google considers a synonym and what it doesn't, but use your best guess.

getcooking
msg:4452929
6:13 pm on May 13, 2012 (gmt 0)

Well, there aren't really any direct synonyms for this thing. There are plenty of supporting/related words used in regular discussion about it that Google might consider to fall into the same category as synonyms.

I was going to use the example of a website about flowers to describe what I'm talking about, but now you've got me thinking. My example was going to be that "flowers" was the implied term (forgive me - I know nothing about gardening, so this is purely for illustrative purposes). I was going to offer that you wouldn't use the phrase "marigold flowers", since "marigolds" would be sufficient and imply that your page was about these types of flowers - that being the most common use of the word marigold. However, that made me wonder if using the implied term in my title tags for those pages is overkill. Because it's implied, maybe it shouldn't be there? It reads naturally that way (with the implied term), but maybe it's not needed in Google's eyes?

The singular form of the implied word is used in the titles of the "article" pages (and once in the meta description, but not in the H1 tag or anchor text to those articles), and those pages have never had trouble ranking for what they are - but those are definitely long tail searches, since they are narrowly focused.

I feel like I'm just grasping at straws here. I have no idea what direction to head in anymore!

getcooking
msg:4452948
7:13 pm on May 13, 2012 (gmt 0)

P.S. I should add that it's totally the norm in my niche to use that implied term in title tags, etc. - but I'm wondering if my site is suffering from something else that's maybe putting us over some sort of threshold.

Planet13
msg:4452955
7:48 pm on May 13, 2012 (gmt 0)

Well, it might be an idea to test it out, both with and without the implied term, on certain pages and see how it goes.

But that is the best I can suggest. And yes, I do know what it feels like to grasp at straws, too.

Bluejeans
msg:4453334
3:43 pm on May 14, 2012 (gmt 0)

I'm spending a lot more time grasping at straws than creating content, although I'm sure Rasputin has the right idea. I'll accept at face value that Google's aim is to provide better searches, thus making the web a better place, but I wonder if the result is more webmasters poring over stats and pondering the latest algo than creating content. I know I shouldn't, but...

FaceOnMars
msg:4453824
4:32 pm on May 15, 2012 (gmt 0)

GC: I too have a 17-year-old site which has displayed what appears to be a close similarity to what you've described, and I also wonder if I've crossed some sort of "threshold" for thin/duplicate content. March 23rd was also my first big drop day, about 15% of total traffic, and I lost an additional 10-15% of total traffic over the next two Panda updates combined. I can't help but wonder if there's a clue insofar as we haven't "dropped off the cliff entirely" but are suffering a nagging loss via attrition. Otherwise, I did have one general "country" term which ranked page one, #7, for at least 7 years (and recently went to page #17), but we mostly never really ranked well for most other country terms - rather, we got the bulk of our traffic through med/long tail queries.

My site is more of a specialty directory (vs. straight-out ecommerce) with an internal search, so naturally there are going to be a large number of indexed pages containing listings that correspond to all of the permutations along the primary and secondary categories. While each full listing page (what would be the "product page" on a true ecommerce site) has a fair amount of content, it's possible it has at least one foot in the door of being viewed as "recycled" content - the equivalent of generic product info provided by the manufacturer - although I'm fairly confident that, due to the nature of my particular niche, it has at least a big foot in the other door of being somewhat unique. So I'm not certain the full listing record content is the issue, especially since many listing ("product") pages rank very well on their own.

Rather, I think it may involve how the listing summaries (part of the "listing results" tier of my interface, for any given category/sub-category) are possibly repeated too much across various permutations of category and sub-category pages without sufficient unique contextual content. Oftentimes the activation of a single listing on my system will trigger a new link for a Country + State + Town page, and that particular listing will be the ONLY thing populating the new node if the node didn't exist previously. There are also many instances of Country + State + Activity pages with only one or two listing abstracts populating them. On the whole, there are a lot of these pages, and I'm considering noindexing such nodes unless there are at least 3+ (?) listings per node. I'd prefer not to, since I've heard mixed results about lifting a noindex tag after having it in place, and I also don't want to end up discarding landing pages which garner a decent amount of med and long tail traffic.
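Before committing to a rule like that, it may be worth measuring how much of the site it would actually noindex. A sketch - the node-to-listing-count mapping here is a hypothetical stand-in for a query against the directory database:

```
# Sketch: size up the "3+ listings per node" rule before applying it.

node_listing_counts = {
    "/country/state-a/town-1/": 1,
    "/country/state-a/town-2/": 2,
    "/country/state-a/activity-x/": 1,
    "/country/state-b/town-3/": 14,
}

MIN_LISTINGS = 3

thin_nodes = [n for n, c in node_listing_counts.items() if c < MIN_LISTINGS]
share = len(thin_nodes) / len(node_listing_counts)
print(f"{len(thin_nodes)} of {len(node_listing_counts)} nodes "
      f"({share:.0%}) would be noindexed")
```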

One theory I have is that perhaps there are simply too many very thin pages with duplicate content - which might be creating cumulative baggage toward some sort of "threshold", or a general "dilution" of the relevancy of the overall content on the entire site?

However, I've recently been told by at least one person in the SEO field that Google simply disregards internal duplicate content (vs. penalizing it), but I have my suspicions that this may not actually be the case, even if it is technically true. In other words, while there might not be an explicit penalty, it may have a negative impact through some sort of threshold or dilution effect. I just don't know at this point.

I've done a lot of housecleaning in terms of canonicalization, moving a blog off a sub-domain onto the primary domain, cleaning up duplicate titles/meta tags, noindexing "orphaned" pages (once a listing has been deleted), noindexing deleted listings' pages (which provide minimal information that the listing is no longer active), and moving content above the fold on the front page.
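On the canonicalization item: the standard fix is a rel="canonical" link tag, so that every URL variant of a listing points search engines at one preferred URL. A minimal sketch with illustrative URLs:

```
# Sketch: one canonical URL per listing, so parameterized or alternate
# paths (?sort=, trailing-slash variants, etc.) consolidate onto it.

def canonical_tag(listing_id):
    preferred = f"http://www.example.com/listing/{listing_id}/"
    return f'<link rel="canonical" href="{preferred}">'

# The same tag goes into <head> on /listing?id=42, /listing/42?sort=name, ...
print(canonical_tag(42))
```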

I have also tried adding a substantial amount of new unique content to the home page, and a bit to some of the more key State pages, but have seen mixed results with minimal gain - although I'm crossing my fingers for the next update. Just as I'm hesitant to noindex and redact pages from the site, I'm also hesitant to embark upon a manual content production campaign (unfortunately, visitor-supplied content has its own set of baggage, but it is a possibility), because it would be a vast undertaking without any positive proof that this is the core issue. I know it can't hurt in the long run, but I can't help but wonder if there's a primary underlying issue related to "thresholds" or "dilution" as a product of the nature and/or architecture of the site as a whole.

netmeg
msg:4453825
4:48 pm on May 15, 2012 (gmt 0)

Are you allowing your search results pages to be indexed?

getcooking
msg:4453841
5:31 pm on May 15, 2012 (gmt 0)

Are you allowing your search results pages to be indexed?


Nope. Also, anything with URL parameters (not that there is much of that on the site) is noindexed or blocked in robots.txt, and always has been.

I do have many category sections that span multiple pages. I use Google's recommended rel="prev"/rel="next" tags, but I'm wondering if they still don't like spanned pages like that. Many of the "index" pages of those categories, however, rank great. I tried noindexing the interior pages of the categories for a few months but didn't see any change, so I went back to the prev/next option. I'm wondering if I should just noindex them in case Google doesn't have as good a handle on pagination as they suggest?
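For reference, Google's recommendation at the time was rel="prev"/rel="next" link tags in the head of each page in a paginated series. A minimal sketch of generating them; the ?page= URL scheme is illustrative:

```
# Sketch: rel="prev"/"next" head tags for page N of a paginated category.

def pagination_links(base_url, page, last_page):
    tags = []
    if page > 1:
        prev = base_url if page == 2 else f"{base_url}?page={page - 1}"
        tags.append(f'<link rel="prev" href="{prev}">')
    if page < last_page:
        tags.append(f'<link rel="next" href="{base_url}?page={page + 1}">')
    return tags

for tag in pagination_links("http://www.example.com/widgets/", 2, 5):
    print(tag)
# <link rel="prev" href="http://www.example.com/widgets/">
# <link rel="next" href="http://www.example.com/widgets/?page=3">
```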

FaceOnMars
msg:4453846
5:41 pm on May 15, 2012 (gmt 0)

Not sure if you're addressing me, netmeg, but if so...

I've disallowed crawlers from the directory where both the internal keyword and advanced search scripts reside, so I was under the assumption this would effectively keep results for explicit (POST) queries out. Same goes for "see next page" results. But I haven't explicitly noindexed results from these scripts, on the assumption that robots.txt was keeping them out ... I haven't seen any of these pages actually indexed, but I have about 7,200 total pages indexed (so I'm not 100% sure).
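One caveat worth flagging here: robots.txt stops crawling, not indexing. A disallowed URL that picks up links can still appear in the index as a URL-only entry, and a noindex tag on a robots-blocked page will never be seen. Checking the disallow itself is straightforward with Python's standard robotparser; the domain and paths below are illustrative:

```
# Sketch: confirm the search-script directory is disallowed for Googlebot.
# Note: a disallow prevents crawling, but a blocked URL can still be
# indexed (URL-only) if linked to; guaranteed de-indexing needs the page
# to be crawlable AND carry a noindex directive.

from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()  # fetches and parses robots.txt over the network

for url in ("http://www.example.com/cgi-bin/search.cgi?q=widgets",
            "http://www.example.com/widgets/index.html"):
    state = "crawlable" if rp.can_fetch("Googlebot", url) else "blocked"
    print(url, "->", state)
```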

On the other hand, the "categorical queries" which correspond to my internal link structure were allowed to be indexed ... although canonicalization was somewhat of an issue (until recently cleaned up).

FaceOnMars
msg:4453848
5:51 pm on May 15, 2012 (gmt 0)

getcooking: have you tried playing around with increasing or decreasing the number of results per page for any given category?
