Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Anyone tried slashing pages to improve rankings?

         

londrum

11:34 am on Mar 9, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I have quite a large site, and a good proportion of it is database-driven, so there are lots of paginated index pages (most of them noindexed). There are also lots of business pages with unique content, but not a huge amount of text (maybe 400 words each).

I've never been hit by Panda or anything like that, but after studying all my competitors I'm thinking of getting out the chainsaw and slashing entire sections off my site to cut the number of pages in half.

My competitors all seem to get more traffic with just a tenth of the pages. I'm guessing it must have something to do with link juice. With ten times as many pages I need ten times more backlinks, or better backlinks, to spread the same amount of link juice around. Mine must be very diluted. I figure that if I cut the number of pages down, whatever remains will immediately benefit from more link juice.

Has anyone ever tried something as drastic as this? Did you get any benefit out of it?

aristotle

1:53 pm on Mar 9, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Well it's easy to control which pages get the most link juice. Just use the navigation to point a lot of links at those pages, and remove most of the links to the other pages.

As for deleting pages, the obvious candidates for removal would be pages that don't attract any search traffic and/or pages that don't have any incoming links from other sites.

JesterMagic

2:14 pm on Mar 9, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If the pages are truly unique and would offer value to a visitor I wouldn't remove them. I would do what Aristotle suggests and take a look at your navigation.

I am not one for removing content unless I have a good reason. Even if, say, a business goes under, I would update the page with that information and move it into an "out of business" category (or whatever) that is buried deeper in the navigation but still accessible, in case anyone comes looking for info on that company.

keyplyr

2:56 pm on Mar 9, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I think chasing link juice in today's ranking schema is a futile effort. Time is better spent building new traffic sources through social media and apps.

buckworks

4:49 pm on Mar 9, 2017 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Even though many other things are joining link juice in the SEO mix these days, it still IS part of the mix. So it makes sense to consider whether the juice you already have could be used more effectively.

I agree with the suggestion to rethink your site navigation to see if more juice could be channelled towards the most commercially useful pages.

FWIW my most profitable site ever (since sold) was under 200 pages.

So, I also agree that reducing the number of pages would likely help the cause. Are there any page types whose information could be consolidated so that the info from two or more pages could be presented on one page? Trimming pages wouldn't necessarily have to mean sacrificing content.

londrum

6:38 pm on Mar 9, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



i've pretty much done all the consolidating i can do. i've exhausted that.
i was also thinking i might try and narrow down my subject as well, so it's more tightly focused.

the biggest drop i ever had in traffic was when i took my successful small site and added thousands of pages to it (tens of thousands) -- all at once.
i know now that it was a dopey thing to do, but i still don't think it was a 'penalty'. i think it was more to do with me diluting all my link juice, and expanding the subject matter of my site, so google no longer saw my site as being about 'one thing'.

i'm thinking that slashing a couple of sections off might help for both of those reasons.

lucy24

6:47 pm on Mar 9, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Funny this should come up right now. Last week* I was looking at an obscure directory filled with pages that are thin to the point of emaciation. I decided this is ridiculous, and simply consolidated each of three subdirectories into a single page, with redirects to fragments corresponding to the original page content. It felt good in a weird way to slash the site's overall page count; I think I'll do some more by and by.

:: uneasily wondering if this is, in fact, not inherently beneficial but simply the www equivalent of being a "cutter" ::


* OK, I checked. It was the last week of February, so I've misplaced a week. Nothing new there.
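The fragment-redirect consolidation lucy24 describes could be sketched in .htaccess with mod_rewrite. All paths below are invented for illustration; the point is the NE (noescape) flag, which stops Apache from percent-encoding the "#" so the fragment survives the redirect:

```apache
RewriteEngine On
# Hypothetical example: three old thin pages redirected to the
# corresponding sections of one consolidated page. [NE] keeps the
# "#" in the target from being escaped; [R=301] makes it permanent.
RewriteRule ^oldsection/page-one\.html$ /consolidated.html#page-one [R=301,NE,L]
RewriteRule ^oldsection/page-two\.html$ /consolidated.html#page-two [R=301,NE,L]
RewriteRule ^oldsection/page-three\.html$ /consolidated.html#page-three [R=301,NE,L]
```

Note that the fragment is handled client-side: the browser requests /consolidated.html and then scrolls to the anchor, so each section of the merged page needs a matching id.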

ergophobe

8:49 pm on Mar 9, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Have a look at this discussion

Pruning Low-Quality and Outdated Content
[webmasterworld.com...]

tangor

1:20 am on Mar 10, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



In the case of a bloated site sometimes "noindex" can be a handy tool.

Consolidation of thin/short content into larger collected pages often makes sense not only for site management but user engagement as well.

Not a rule of thumb, but I've always tried to keep articles at a minimum of 800 words (preferably more). Anything less than that is merely a description and often works better in a data table/definition presentation.

Halaspike

1:05 pm on Mar 10, 2017 (gmt 0)

10+ Year Member Top Contributors Of The Month



@londrum yeah i slashed about 50k pages from my site on 27th Feb and since then my traffic and rankings have been improving.

graeme_p

7:15 am on Mar 11, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



What should one do with out of date pages? In one case I am thinking of events that are over. There are no internal links pointing to them, but they are up if requested.

Halaspike

4:23 pm on Mar 11, 2017 (gmt 0)

10+ Year Member Top Contributors Of The Month



@graeme_p i think it's best to "noindex, follow" out of date pages.
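For reference, that advice translates to a robots meta tag in the <head> of each out-of-date page, with the page itself still returning a normal 200:

```html
<!-- Drops the page from the index but still lets crawlers
     follow (and pass value through) its outbound links.
     "follow" is the default, so "noindex" alone also works. -->
<meta name="robots" content="noindex, follow">
```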

lucy24

6:40 pm on Mar 11, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If there are no internal links pointing to a page,* and it's noindexed, then unless you have your own site search ("performances of The Sorcerer in April 2013"), there's no longer any point to the page existing at all.


* If it's accessible through normal site navigation, that counts as a link.

londrum

8:32 pm on Mar 11, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It might have a backlink pointing to it, so it might be worth keeping.

LeoKesler

11:27 pm on Mar 11, 2017 (gmt 0)

5+ Year Member



I have a question about this "slashing" of old pages...

My site has almost 10 years of articles. Some of the old articles get few visitors, and I think several of them could be combined into a single article and updated with some new information.
So what I want to do is join two or more articles into a single one and 301 redirect the old articles to the new one.
By the way, if I am using 301 redirects, can I remove the old articles? Do I need to do anything about the canonical URL?
I am using WordPress with the Simple Redirect Manager plugin, and so far I already have ~100 301 redirects.

Any opinion?

lucy24

1:52 am on Mar 12, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



By the way, if I am using 301 redirect, can I remove the old articles ?

Yes. Once the redirect is in place, the visitor does not know--in fact, cannot know--whether the old material still exists. (You may be thinking of javascript redirects, which are issued after the visitor has arrived on the old URL. Server-based redirects are different.)

And until now I already have ~100 301 redirects.
Any opinion?
Yes: I think you might be able to handle the redirects yourself via htaccess--assuming they follow a consistent pattern--instead of dumping them all on WP.

Oh, oops, you were asking about Google. The one thing I know for sure is: If you've got a lot of redirects, you will periodically find Google asking for nonexistent pages--urls such as jtgykmyubyjvhlh.html--to check for "soft 404s". Google does not approve of these. (In fact, I think they invented the term.) So make sure your site returns a 404 response to requests for nonexistent URLs.
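If the old article URLs share a naming pattern, the htaccess approach lucy24 mentions might look something like this (all directory and file names are invented; where the old-to-new mapping isn't regular, you need one rule per merge):

```apache
RewriteEngine On
# Invented example: two old articles merged into one new one
RewriteRule ^articles/blue-widgets-2009\.html$ /articles/blue-widgets.html [R=301,L]
RewriteRule ^articles/blue-widgets-2012\.html$ /articles/blue-widgets.html [R=301,L]
# Requests that match no rule fall through to the server's normal
# 404 handling - the "hard" 404 that Google's soft-404 probe checks for.
```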

phranque

2:26 am on Mar 12, 2017 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



regarding your paginated content - i would consider using the link rel element to define a collection of documents.

as discussed in this thread:
Pagination rel="next" "prev" ...best use for articles? - Google SEO News and Discussion forum [webmasterworld.com]

This markup provides a strong hint to Google that you would like us to treat these pages as a logical sequence, thus consolidating their linking properties and usually sending searchers to the first page.
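As a sketch, the markup for page 2 of a hypothetical three-page series would be the following; the first page carries only rel="next", and the last page only rel="prev":

```html
<!-- In the <head> of /widgets/page-2.html (URLs are made up) -->
<link rel="prev" href="https://www.example.com/widgets/page-1.html">
<link rel="next" href="https://www.example.com/widgets/page-3.html">
```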

Nutterum

8:02 am on Mar 16, 2017 (gmt 0)

10+ Year Member Top Contributors Of The Month



@lucy24 - I am so frustrated with these Google-invented soft-404 probes. I was doing on-page work for a huge IT-sector forum, and after we restructured the content and categories and implemented the redirect rules, not a full month passed before Google started fishing for soft 404s. I thought people were the cause of this, but no. The server logs never lie! Still, I can't figure out why Google invented this behavior of Googlebot requesting non-existent pages.

lucy24

5:24 pm on Mar 16, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Still, I can't figure out why Google invented this behavior of Googlebot requesting non-existent pages.

It's a programmatic response to encountering a high number of redirects--that is, high in proportion to total requests, not necessarily in raw numbers. They want to make sure that you don't respond to requests for nonexistent pages by globally redirecting to some other page--most often the home page. (I am 100% behind G### on this. As a user, I hate it when a site redirects to the home page. It makes it impossible to know whether I've mistyped the URL, or clicked a link with extraneous punctuation, or done any of the other things that can lead to a URL not resolving. And, of course, it doesn't bring me any closer to the page I actually wanted.)

The correct and appropriate response is to do nothing. Allow the Googlebot to pick up its 404s; it's both what they want and what they deserve. (Admittedly, the two do not always go together.) And if you've expressly deleted a page (no redirect target) return a 410 so they'll stop requesting it sooner.
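In Apache, a deliberate 410 for removed content can be a one-liner with mod_alias (paths invented):

```apache
# Everything under the removed directory returns 410 Gone
RedirectMatch gone ^/retired-section/
# A single deleted page works the same way
Redirect gone /old-article.html
```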

londrum

6:19 pm on Mar 17, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



i'm going to remove 10,000 pages and see what happens. it's three complete directories, so now i'm wondering whether to just 404 them all, or 301 them to the homepage?

i know the correct answer is 404, but these pages have got a lot of backlinks pointing to them, and if i just delete them then presumably i lose the lot. if i 301 them to the homepage then maybe i can keep some of the benefit. but would google frown on that, do you think?

tangor

6:56 pm on Mar 17, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



A redirect should point to content. If there is no content, there should be no redirect. Like lucy24, I agree that Google views this tactic as a soft 404 and penalizes accordingly. IOW, don't do it.

Google and other search engines use crawl budgets, especially for large sites. If you remove 10,000 pages you no longer want crawled, a different 10,000 pages on your site can be crawled instead, and that is a greater benefit than having 10,000 dead URLs redirect to the home page just to chase link juice for a short period. After all, Google (and the others) will soon learn that the content is no longer there.

buckworks

7:36 pm on Mar 17, 2017 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



whether to just 404 them all


Before you eliminate the pages, check in the Search Console to see whether any of them have links coming in from outside sites.

For the pages that do, redirect them somewhere relevant ... preferably more relevant than just the home page! For pages that have no link love, let those go 404 or even better, 410. Either way, be sure to have a user-friendly custom error page.
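Putting that into .htaccess terms, a sketch with invented paths (mod_alias applies these directives in the order they appear, so the specific redirects must come before the catch-all):

```apache
# Pages with real external links: redirect each to the most relevant survivor
Redirect 301 /dir-a/blue-widgets.html /widgets/blue.html
Redirect 301 /dir-a/red-widgets.html /widgets/red.html
# Everything else in the pruned directory: gone for good
RedirectMatch gone ^/dir-a/
# User-friendly error pages for anyone who follows a dead link
ErrorDocument 404 /errors/not-found.html
ErrorDocument 410 /errors/gone.html
```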

lucy24

8:44 pm on Mar 17, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



i know the correct answer is 404

It isn't. If you have intentionally removed content (not the same thing as removing pages) the correct answer is 410.

Ordinarily, search engines don't send a referer when requesting pages (they sometimes do for non-page content, especially stylesheets). So this is one time when it might be legitimate to send requests to different places depending on referer.
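A mod_rewrite sketch of that referer split, with invented paths, and offered as an experiment rather than a recommendation: requests arriving with a Referer header (usually a human who followed an old backlink) get redirected somewhere useful, while refererless requests (typically search engine bots) fall through to a 410.

```apache
RewriteEngine On
# Non-empty Referer: probably a human following a backlink
RewriteCond %{HTTP_REFERER} !^$
RewriteRule ^retired-section/ /replacement-section/ [R=301,L]
# No Referer (bots, typed-in URLs): return 410 Gone via the [G] flag
RewriteRule ^retired-section/ - [G]
```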

koan

6:13 pm on Mar 18, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



but these pages have a got a lot of backlinks pointing to them and if i just delete them then presumably i lose the lot.


If you use the "noindex, follow" value for the robots meta tag, your pages will be deleted from google's index yet still distribute any link juice you might have to your other pages.

lucy24

7:56 pm on Mar 18, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



You can only do the "noindex" thing ("follow" isn't strictly needed, as it's the default) if you keep the pages alive, allowing Googlebot to see them while returning a 301/404/410 to all other visitors. That seems a bit sketchy. In particular, I'm pretty darn sure they would quickly figure it out if humans get a 410 while the Googlebot gets either a 200 or a redirect to the home page. Google may be sneakier than bing when it comes to humanoid visits, but they definitely still do it.

[edited by: Robert_Charlton at 11:37 pm (utc) on Mar 18, 2017]
[edit reason] Fixed typo at poster's request [/edit]

deriklogov

12:52 am on Apr 3, 2017 (gmt 0)

5+ Year Member Top Contributors Of The Month



@londrum Have you had any luck with traffic improvement after removing those pages? I've got the same issue. I decided to remove 300k pages with noindex (which I did on Mar 5), but it's been almost a month and those pages are still in the index. I was even tracking Googlebot: it visited one of those pages on Mar 5 and didn't come back afterwards, but the page is still indexed.

lucy24

2:40 am on Apr 3, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



With that many pages, you may have better luck going into GSC and removing them from the index right away. That's assuming they're grouped in convenient directories; I don't think you even can type in 3 lakhs of separate URLs in the "remove from index" area.

londrum

12:29 pm on Apr 3, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@derik. Nothing to report yet. I've removed about 5,000 of them and most of them are out of the index, but my rankings haven't changed. It's been a couple of weeks.

I didn't noindex mine. I just removed them and let them go to 404.

ergophobe

8:14 pm on Apr 3, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>>go to 404

Not 410? That might clean things up faster.

ergophobe

8:15 pm on Apr 3, 2017 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Oh wait, I see that 404/410 discussion already sort of happened above.

I haven't tested it, but in theory at least a 410 should result in quicker cleanup.