cheesy_snacks - 5:05 pm on Dec 15, 2011 (gmt 0) [edited by: tedster at 7:30 pm (utc) on Dec 15, 2011]
Hi guys, I'm trying to reduce the number of low-quality pages on my site.
To identify them I'm using the following process:
- Run a site:www.example.com search in Google
- Make a note of all the pages listed
- Navigate to the final page of results, where Google shows the message:
"In order to show you the most relevant results, we have omitted some entries very similar to the 100 already displayed.
If you like, you can repeat the search with the omitted results included."
- Click the link to repeat the search with the omitted results included, go back to page 1, and compare the two lists to see which pages have been added.
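The comparison step above can be sketched as a quick script, assuming you've pasted each set of results into a text file, one URL per line (the filenames here are just placeholders):

```python
# Compare the two site: result lists to find the pages Google
# initially omitted as "very similar" -- candidates for low quality.

def read_urls(path):
    """Return the set of non-empty, stripped lines (URLs) in a file."""
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

def omitted_pages(first_pass_file, full_results_file):
    """URLs that only show up once omitted results are included."""
    return sorted(read_urls(full_results_file) - read_urls(first_pass_file))
```

Then `omitted_pages("first_pass.txt", "with_omitted.txt")` gives you the list of pages to review.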
My first question: is this the easiest way of identifying low-quality pages?
Once these pages have been identified what is the best way to deal with them?
- Simply remove the files from the server?
- Add a noindex, nofollow robots meta tag to the page?
- Remove the pages using webmaster tools?
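For reference, if you go the meta tag route, this is the standard robots meta tag; it goes in the <head> of each low-quality page:

```html
<!-- Tells search engine crawlers not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```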
Any guidance appreciated!
[edit reason] switch to example.com [/edit]