
Anyone tried slashing pages to improve rankings?

11:34 am on Mar 9, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 12, 2006
posts:2710
votes: 116


I have quite a large site, and a proportion of it is database-driven. So there are lots of paginated index pages, for example (most of them noindexed). Lots of business pages with unique content, but not a huge amount of text (maybe 400 words on each).

I've never been hit by Panda or anything like that, but after studying all my competitors I'm thinking of getting out the chainsaw and slashing entire sections off my site to cut the number of pages in half.

My competitors all seem to get more traffic with just a tenth of the pages. I'm guessing it must have something to do with the link juice. With ten times as many pages I need ten times more backlinks, or better backlinks, to spread around the same amount of link juice. Mine must be very dilute. I figure that if I cut the number of pages down, whatever remains will immediately benefit from more link juice.

Has anyone ever tried something as drastic as this? Did you get any benefit out of it?
1:53 pm on Mar 9, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member aristotle is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 4, 2008
posts:3673
votes: 374


Well, it's easy to control which pages get the most link juice: just use the navigation to point a lot of links at those pages, and remove most of the links to the other pages.

As for deleting pages, the obvious candidates for removal would be pages that don't attract any search traffic and/or pages that don't have any incoming links from other sites.
2:14 pm on Mar 9, 2017 (gmt 0)

Preferred Member

10+ Year Member Top Contributors Of The Month

joined:Feb 5, 2004
posts: 619
votes: 108


If the pages are truly unique and would offer value to a visitor, I wouldn't remove them. I would do what Aristotle suggests and take a look at your navigation.

I am not one for removing content unless I have a good reason. Even if, say, a business goes under, I would update the page with that information and move it into an "out of business" category (or whatever) that is buried deeper in the navigation but still accessible, in case anyone comes looking for info on that company.
2:56 pm on Mar 9, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member keyplyr is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 26, 2001
posts:12913
votes: 893


I think chasing link juice in today's ranking schema is a futile effort. Time is better spent building new traffic sources through social media & apps.
4:49 pm on Mar 9, 2017 (gmt 0)

Moderator

WebmasterWorld Administrator buckworks is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Dec 9, 2001
posts:5838
votes: 165


Even though many other things are joining link juice in the SEO mix these days, it still IS part of the mix. So it makes sense to consider whether the juice you already have could be used more effectively.

I agree with the suggestion to rethink your site navigation to see if more juice could be channelled towards the most commercially useful pages.

FWIW my most profitable site ever (since sold) was under 200 pages.

So, I also agree that reducing the number of pages would likely help the cause. Are there any page types whose information could be consolidated so that the info from two or more pages could be presented on one page? Trimming pages wouldn't necessarily have to mean sacrificing content.
6:38 pm on Mar 9, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 12, 2006
posts:2710
votes: 116


I've pretty much done all the consolidating I can do. I've exhausted that.
I was also thinking I might try to narrow down my subject as well, so it's more tightly focused.

The biggest drop I ever had in traffic was when I took my successful small site and added thousands of pages to it (tens of thousands), all at once.
I know now that it was a dopey thing to do, but I still don't think it was a 'penalty'. I think it was more to do with me diluting all my link juice and expanding the subject matter of my site, so Google no longer saw my site as being about 'one thing'.

I'm thinking that slashing a couple of sections off might help for both of those reasons.
6:47 pm on Mar 9, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15943
votes: 890


Funny this should come up right now. Last week* I was looking at an obscure directory filled with pages that are thin to the point of emaciation. I decided this is ridiculous, and simply consolidated each of three subdirectories into a single page, with redirects to fragments corresponding to the original page content. It felt good in a weird way to slash the site's overall page count; I think I'll do some more by and by.
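In case it helps anyone, the redirects were nothing exotic. Roughly this in .htaccess (a minimal sketch; the paths here are invented, and the NE flag is needed so Apache doesn't escape the "#" in the target):

    RewriteEngine On
    # Each old thin page points at its section on the consolidated page.
    RewriteRule ^widgets/history\.html$ /widgets/overview.html#history [R=301,NE,L]
    RewriteRule ^widgets/specs\.html$ /widgets/overview.html#specs [R=301,NE,L]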

:: uneasily wondering if this is, in fact, not inherently beneficial but simply the www equivalent of being a "cutter" ::


* OK, I checked. It was the last week of February, so I've misplaced a week. Nothing new there.
8:49 pm on Mar 9, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member ergophobe is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 25, 2002
posts:8639
votes: 287


Have a look at this discussion:

Pruning Low-Quality and Outdated Content
[webmasterworld.com...]
1:20 am on Mar 10, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member tangor is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 29, 2005
posts:10600
votes: 1128


In the case of a bloated site, "noindex" can sometimes be a handy tool.

Consolidation of thin/short content into larger collected pages often makes sense not only for site management but user engagement as well.

Not a rule of thumb, but I've always tried to keep articles at a minimum of 800 words (preferably more). Anything less than that is merely a description and often works better in a data table/definition presentation.
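To illustrate (content invented): a dozen sub-400-word company stubs might work better merged into one page as a definition list, with an id on each entry so the old URLs can still redirect to the right spot:

    <!-- one consolidated page instead of a dozen thin ones -->
    <dl>
      <dt id="acme-widgets">Acme Widgets</dt>
      <dd>Founded 1998. Blue widgets. Out of business since 2015.</dd>
      <dt id="widgetco">WidgetCo</dt>
      <dd>Founded 2003. Red widgets. Still trading.</dd>
    </dl>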
1:05 pm on Mar 10, 2017 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:Apr 6, 2016
posts:154
votes: 21


@londrum Yeah, I slashed about 50k pages from my site on 27th Feb, and since then my traffic and rankings have been improving.
7:15 am on Mar 11, 2017 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member graeme_p is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 16, 2005
posts:3022
votes: 214


What should one do with out-of-date pages? In one case I am thinking of events that are over. There are no internal links pointing to them, but they are still served if requested.
4:23 pm on Mar 11, 2017 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:Apr 6, 2016
posts:154
votes: 21


@graeme_p I think it's best to "noindex, follow" out-of-date pages.
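That's the standard robots meta tag, placed in each page's <head> (though as others note later in the thread, "follow" is the default anyway):

    <meta name="robots" content="noindex, follow">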
6:40 pm on Mar 11, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15943
votes: 890


If there are no internal links pointing to a page,* and it's noindexed, then unless you have your own site search ("performances of The Sorcerer in April 2013"), there's no longer any point to the page existing at all.


* If it's accessible through normal site navigation, that counts as a link.
8:32 pm on Mar 11, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 12, 2006
posts:2710
votes: 116


It might have a backlink pointing to it, so it might be worth keeping.
11:27 pm on Mar 11, 2017 (gmt 0)

New User

joined:Mar 8, 2017
posts: 11
votes: 1


I have a question about this "slashing" of old pages...

My site has almost 10 years of articles. Some old articles get few visitors, and I think some of them could be combined into a single article and updated with some new information.
So what I want to do is join two or more articles into a single one and make a 301 redirect from the old articles to the new article.
By the way, if I am using 301 redirects, can I remove the old articles? Do I need to do anything about the canonical URL?
I am using WordPress and the Simple Redirect Manager plugin, and so far I already have ~100 301 redirects.

Any opinions?
1:52 am on Mar 12, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15943
votes: 890


By the way, if I am using 301 redirects, can I remove the old articles?

Yes. Once the redirect is in place, the visitor does not know--in fact, cannot know--whether the old material still exists. (You may be thinking of javascript redirects, which are issued after the visitor has arrived on the old URL. Server-based redirects are different.)

And so far I already have ~100 301 redirects.
Any opinions?
Yes: I think you might be able to handle the redirects yourself via htaccess--assuming they follow a consistent pattern--instead of dumping them all on WP.
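For example, if the old article URLs share a pattern, one mod_rewrite rule in .htaccess can stand in for dozens of individual plugin entries (a sketch; the URL structure is invented):

    RewriteEngine On
    # Send every old-style article URL to its new home, keeping the slug.
    RewriteRule ^old-articles/([^/]+)\.html$ /articles/$1/ [R=301,L]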

Oh, oops, you were asking about Google. The one thing I know for sure is: if you've got a lot of redirects, you will periodically find Google asking for nonexistent pages--URLs such as jtgykmyubyjvhlh.html--to check for "soft 404s". Google does not approve of these. (In fact, I think they invented the term.) So make sure your site returns a 404 response to requests for nonexistent URLs.
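Easy to check from the command line (any nonsense path on your own domain will do; example.com is a placeholder):

    curl -sI https://www.example.com/jtgykmyubyjvhlh.html

The first line of the response should say 404 Not Found, not 200 or a 301 to the home page.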
2:26 am on Mar 12, 2017 (gmt 0)

Administrator

WebmasterWorld Administrator phranque is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Aug 10, 2004
posts:11875
votes: 246


Regarding your paginated content, I would consider using the link rel element to define a collection of documents, as discussed in this thread:

Pagination rel="next" "prev" ...best use for articles? - Google SEO News and Discussion forum [webmasterworld.com]

As Google describes it: "This markup provides a strong hint to Google that you would like us to treat these pages as a logical sequence, thus consolidating their linking properties and usually sending searchers to the first page."
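A sketch of the markup, as it would appear in the <head> of page 2 of a three-page series (URLs invented):

    <link rel="prev" href="https://www.example.com/widgets?page=1">
    <link rel="next" href="https://www.example.com/widgets?page=3">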
8:02 am on Mar 16, 2017 (gmt 0)

Preferred Member from BG 

5+ Year Member Top Contributors Of The Month

joined:Aug 11, 2014
posts:547
votes: 175


@lucy24 - I am so frustrated with these Google-invented soft 404 pages. I was doing on-page work for a huge IT-sector forum, and after we restructured the content and categories and implemented the redirect rules, not a full month passed before Google started fishing for soft 404 pages. I thought people were the cause of this, but no. The server logs never lie! Still, I can't figure out why Google invented this behavior of Googlebot requesting such non-existent pages.
5:24 pm on Mar 16, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15943
votes: 890


Still, I can't figure out why Google invented this behavior of Googlebot requesting such non-existent pages.

It's a programmatic response to meeting a high number of redirects--that is, high in proportion to total requests, not necessarily in raw numbers. They want to make sure that you don't respond to requests for nonexistent pages by globally redirecting to some other page--most often the home page. (I am 100% behind G### on this. As a user, I hate it when a site redirects to the home page. It makes it impossible to know whether I've mistyped the URL, or clicked a link with extraneous punctuation, or done any of the other things that can lead to a URL not resolving. And, of course, it doesn't bring me any closer to the page I actually wanted.)

The correct and appropriate response is to do nothing. Allow the Googlebot to pick up its 404s; it's both what they want and what they deserve. (Admittedly, the two do not always go together.) And if you've expressly deleted a page (no redirect target), return a 410 so they'll stop requesting it sooner.
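In .htaccess the 410 is a one-liner, in either flavor (the directory name is invented):

    # mod_alias: a whole removed directory answers 410 Gone
    Redirect gone /old-events
    # mod_rewrite equivalent, if you're already using RewriteRules:
    # RewriteRule ^old-events/ - [G,L]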
6:19 pm on Mar 17, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 12, 2006
posts:2710
votes: 116


I'm going to remove 10,000 pages and see what happens. It's three complete directories, so now I'm wondering whether to just 404 them all, or 301 them to the homepage?

I know the correct answer is 404, but these pages have got a lot of backlinks pointing to them, and if I just delete them then presumably I lose the lot. If I 301 them to the homepage then maybe I can keep some of the benefit. But would Google frown on that, do you think?
6:56 pm on Mar 17, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member tangor is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 29, 2005
posts:10600
votes: 1128


A redirect should be to content. If there is no content, then there should be no redirect. Like lucy24, I agree that G views this tactic as a soft 404 and penalizes as a result. IOW, don't do it.

G and the other SEs use crawl budgets for (especially) large sites. If you remove 10,000 pages you no longer want crawled, that means a different 10,000 pages on your site will now be crawled, and that would be a greater benefit to the site than having 10,000 dead URLs redirect to a home page just to chase link juice for a short period of time. After all, G (and others) will soon learn that the content is no longer there.
7:36 pm on Mar 17, 2017 (gmt 0)

Moderator

WebmasterWorld Administrator buckworks is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Dec 9, 2001
posts:5838
votes: 165


whether to just 404 them all


Before you eliminate the pages, check in the Search Console to see whether any of them have links coming in from outside sites.

For the pages that do, redirect them somewhere relevant ... preferably more relevant than just the home page! For pages that have no link love, let those go 404 or, even better, 410. Either way, be sure to have a user-friendly custom error page.
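Setting those up takes two lines in .htaccess (the file names are placeholders):

    ErrorDocument 404 /errors/not-found.html
    ErrorDocument 410 /errors/gone.html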
8:44 pm on Mar 17, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15943
votes: 890


I know the correct answer is 404

It isn't. If you have intentionally removed content (not the same thing as removing pages) the correct answer is 410.

Ordinarily, search engines don't send a referer when requesting pages (they sometimes do for non-page content, especially stylesheets). So this is one time when it might be legitimate to send requests to different places depending on referer.
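A sketch of what that could look like with mod_rewrite (paths invented): requests that arrive with a referer, which in practice means humans following old backlinks, get sent somewhere relevant, while referer-less requests fall through to a 410:

    RewriteEngine On
    # Humans following a backlink: redirect to a related live section.
    RewriteCond %{HTTP_REFERER} !^$
    RewriteRule ^retired-section/ /related-section/ [R=302,L]
    # No referer (typical of search engine bots): the content is gone.
    RewriteRule ^retired-section/ - [G,L]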
6:13 pm on Mar 18, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:June 18, 2005
posts:1868
votes: 90


but these pages have got a lot of backlinks pointing to them, and if I just delete them then presumably I lose the lot.


If you use the "noindex, follow" value for the robots meta tag, your pages will be deleted from Google's index yet still distribute any link juice you might have to your other pages.
7:56 pm on Mar 18, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15943
votes: 890


You can only do the "noindex" thing ("follow" isn't strictly needed, as it's the default) if you keep the pages alive, allowing Googlebot to see them while returning a 301/404/410 to all other visitors. That seems a bit sketchy. In particular, I'm pretty darn sure they would quickly figure it out if humans get a 410 while Googlebot gets either a 200 or a redirect to the home page. Google may be sneakier than Bing when it comes to humanoid visits, but they definitely still do it.

[edited by: Robert_Charlton at 11:37 pm (utc) on Mar 18, 2017]
[edit reason] Fixed typo at poster's request [/edit]

12:52 am on Apr 3, 2017 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:Nov 9, 2016
posts: 79
votes: 1


@londrum Have you had any luck with traffic improvement after removing those pages? I've got the same issue. I decided to remove 300k pages with noindex (which I did on Mar 5), but it's almost a month later and those pages are still in the index. I was even tracking Googlebot: the bot visited one of those pages on Mar 5 and didn't come back after, but the page is still indexed.
2:40 am on Apr 3, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15943
votes: 890


With that many pages, you may have better luck going into GSC and removing them from the index right away. That's assuming they're grouped in convenient directories; I don't think you can even type in 3 lakhs of separate URLs in the "remove from index" area.
12:29 pm on Apr 3, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 12, 2006
posts:2710
votes: 116


@derik. Nothing to report yet. I've removed about 5,000 of them, and most of them are out of the index, but my rankings haven't changed. It's been a couple of weeks.

I didn't noindex mine. I just removed them and let them go to 404.
8:14 pm on Apr 3, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member ergophobe is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 25, 2002
posts:8639
votes: 287


>>go to 404

Not 410? That might clean things up faster.
8:15 pm on Apr 3, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member ergophobe is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 25, 2002
posts:8639
votes: 287


Oh wait, I see that the 404/410 discussion already sort of happened above.

I haven't tested it, but in theory at least, a 410 should result in quicker cleanup.