
Forum Moderators: Robert Charlton & goodroi


How should I treat expired pages on my site?

10:50 am on Aug 28, 2009 (gmt 0)

New User

5+ Year Member

joined:Aug 27, 2009
posts: 1
votes: 0

I manage a site with a lot of temporary, limited time user generated pages that expire. Right now we keep the pages, telling the visitor that the information has expired.

But I'm considering handling this a different way, because:

We get a big load of new user generated content every day, so theoretically the number of pages on my site becomes huge over time. (Theoretically, because in practice not all of it gets crawled and indexed by Google.)

It takes a lot of resources for Google to crawl all these "old" pages, which takes crawl resources away from other, more important parts of the site.

Google can't/won't keep all these pages in its index and simultaneously index all of our new content. We are facing that problem now.

I guess the easiest way to solve this would be to remove the expired pages and return a 404 or 410 response to Google, so they would be dropped from the index and not crawled again.
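That removal approach could be sketched roughly like this (a minimal sketch, not the site's actual code; the page store and function names are hypothetical):

```python
from datetime import date

# Hypothetical page records; a real site would load these from its database.
PAGES = {
    "/deal/123": {"expires": date(2009, 8, 1)},    # already expired
    "/deal/456": {"expires": date(2009, 12, 31)},  # still live
}

def status_for(path, today):
    """Return the HTTP status to serve for a user-generated page.

    410 Gone tells crawlers the page was removed deliberately and won't
    return, which can get it dropped from the index faster than a 404.
    """
    page = PAGES.get(path)
    if page is None:
        return 404  # never existed
    if page["expires"] < today:
        return 410  # expired on purpose: signal permanent removal
    return 200      # still valid content
```

The choice between 404 and 410 matters mainly for how quickly the URL is dropped; both stop it from being re-crawled indefinitely.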

But that way you lose all the link juice that accumulates when people link to their own content from other sites.

So my best guess is to do a 301 redirect to a search result page for a similar search. This way you keep the link juice within your site. When the user (coming from the search engine) arrives on this page, you could, e.g., show a small popup that says: "This content has expired, but here are a lot of other pages with similar ......". This popup would naturally appear only when the user was redirected from an expired page.
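The redirect idea could be sketched like this (all names and the query-string format here are hypothetical assumptions, not an existing implementation):

```python
def redirect_for_expired(path, keywords):
    """Build a 301 response pointing at a related on-site search.

    A flag in the query string lets the landing page show the
    "this content has expired" notice only to redirected visitors.
    """
    query = "+".join(keywords)
    location = f"/search?q={query}&from_expired=1"
    return 301, location
```

The `from_expired=1` flag is just one way to trigger the popup only for redirected visitors; the referer-sniffing mentioned above would be another.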

But I am a little worried that the large number of redirects will have a negative effect in Google, and I don't know if there are any other disadvantages. How should I handle this? What is your suggestion? Thanks in advance.

[edited by: tedster at 11:24 am (utc) on Aug. 28, 2009]

2:08 pm on Aug 28, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:May 26, 2000
votes: 0

It's an interesting question. My first observation is that Google is pretty good at knowing which pages are short-term. They are also getting good at knowing when a search term requires fresh results.

Because of all the above, it's very likely that the backlinks you are concerned about will begin to lose power quite quickly, even if the webmaster leaves them online after the expiration date. So "squeezing too hard" on the potential link juice is probably not a worthwhile prospect.

But there is something to it, and I understand why you would want to take a look at various approaches.

Have you considered a hybrid? The day after the content "expires", you add an extra notice to the page that the date has passed. Still keep the url live for another 30 or 60 days, and then return a 404 or 410 status. That would make extended use of the available link juice, even as it decays, but it would also avoid many problems.
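That hybrid could be sketched as a simple state function (a sketch only; the 60-day grace period is one of the two values suggested above):

```python
from datetime import date, timedelta

GRACE_DAYS = 60  # assumed grace period; 30 or 60 days as suggested

def hybrid_status(expires, today):
    """Hybrid approach: keep the URL live with an 'expired' notice
    for a grace period after expiration, then return 410 Gone."""
    if today <= expires:
        return 200, "live"
    if today <= expires + timedelta(days=GRACE_DAYS):
        return 200, "live with expired notice"
    return 410, "gone"
```

This lets inbound links pass value during the grace window while the notice keeps visitors informed, and the eventual 410 cleans the URL out of the index.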

Note that Google has made it clear that they don't want to see your site's search results indexed in their search results - so I would not suggest going in that direction for your expired content. Offering a lot of urls that are only dynamically generated search results based on the referer could result in penalties after a while.

Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.
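For example, if the internal search results live under a /search path (the path here is an assumption about your URL structure), a rule like this would keep them out of the crawl:

```
User-agent: *
Disallow: /search
```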

Google Webmaster Guidelines [google.com]

2:18 pm on Aug 28, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Feb 12, 2006
votes: 92

I've got a similar problem on my site.
I list a load of upcoming events, and as the events pass the pages become useless, so I don't want them crawled. But they've quite often got a few backlinks attached to them.

I'm lucky though, because as the events pass, the links to them drop off the calendar, so all my in-site links disappear.
That means they never get crawled, but they still remain in place when people visit them.

I've got a little script that checks the date and attaches a message to the top saying the event is out of date, and suggesting alternative pages to visit.
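A script like that might look roughly like this (a sketch under assumptions; the poster's actual script isn't shown, and these function and parameter names are made up):

```python
from datetime import date

def page_banner(event_date, today, alternatives):
    """If the event date has passed, return a notice plus suggested
    alternative pages to show at the top of the page; otherwise None."""
    if today <= event_date:
        return None
    links = ", ".join(alternatives)
    return f"This event is out of date. You might try: {links}"
```

The key point is that the page itself stays live for visitors while the vanished in-site links keep crawlers away.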

