| 5:44 am on Feb 15, 2012 (gmt 0)|
Or maybe deleting pages that don't rank and trying again from scratch would be better?
| 5:50 am on Feb 15, 2012 (gmt 0)|
I work mainly from observations in GWT. They can sometimes be misleading, but with time you get the hang of it.
| 10:04 am on Feb 16, 2012 (gmt 0)|
By clicking on any keyword you can see which pages rank for it. Use this to make sure you are targeting the same page for those key terms, and not spreading your SEO efforts across other pages, which can lead to cannibalization.
Also, from the HTML errors report, make sure that pages with duplicate, short, or long titles and descriptions get fixed, and try to correct links pointing to 404 pages. Both of these contribute to a better-optimized site.
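To illustrate the kind of title audit that report performs, here's a small Python sketch. The sample pages and the length thresholds are my own assumptions for demonstration, not Google's official limits:

```python
from collections import defaultdict

# Hypothetical (url, title) pairs, as a crawl of your own site might collect them.
pages = [
    ("/red-widgets", "Red Widgets - Example Shop"),
    ("/blue-widgets", "Red Widgets - Example Shop"),  # duplicate title
    ("/about", "About"),                              # short title
    ("/faq", "F" * 90),                               # overly long title
]

MIN_LEN, MAX_LEN = 10, 70  # assumed thresholds, not official values

def audit_titles(pages):
    """Flag duplicate, short, and long titles across a list of pages."""
    by_title = defaultdict(list)
    for url, title in pages:
        by_title[title].append(url)
    duplicates = {t: urls for t, urls in by_title.items() if len(urls) > 1}
    short = [u for u, t in pages if len(t) < MIN_LEN]
    long_ = [u for u, t in pages if len(t) > MAX_LEN]
    return duplicates, short, long_

duplicates, short, long_ = audit_titles(pages)
print(duplicates)  # the two widget pages sharing one title
print(short)       # ['/about']
print(long_)       # ['/faq']
```

The same grouping trick works for meta descriptions; just swap in (url, description) pairs.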
| 10:29 am on Feb 16, 2012 (gmt 0)|
I sure hope 404s don't have an impact. I get hundreds daily from poorly formed links on scraper sites, or from partially shown text URLs that Google still crawls, all ending up as a never-ending onslaught of 404s. Perhaps you just mean internal 404s?
| 11:32 am on Feb 16, 2012 (gmt 0)|
@santapaws: Yes, primarily the internal ones should be sorted out. You can also try getting external links pointed to the right URLs, especially if they come from good (authority/contextual) sites.
| 11:53 am on Feb 16, 2012 (gmt 0)|
Yes, it's just that for me hundreds of 404s come from Google simply crawling truncated text URLs on scraper sites. No chance of getting that cleared up, unfortunately.
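One practical way to separate the internal 404s worth fixing from scraper-driven noise is to bucket them by referrer in the server access log. A minimal Python sketch, assuming a combined log format and made-up hostnames and log lines:

```python
import re

SITE = "example.com"  # assumption: your own hostname

# Hypothetical access-log lines in combined format.
log_lines = [
    '1.2.3.4 - - [15/Feb/2012:05:44:00 +0000] "GET /old-page HTTP/1.1" 404 0 "http://example.com/blog" "Mozilla"',
    '5.6.7.8 - - [15/Feb/2012:05:45:00 +0000] "GET /trunc HTTP/1.1" 404 0 "http://scraper.example.net/copy" "Googlebot"',
    '9.9.9.9 - - [15/Feb/2012:05:46:00 +0000] "GET /ok HTTP/1.1" 200 0 "-" "Mozilla"',
]

# Pull out the request path, status code, and referrer from each line.
LOG_RE = re.compile(r'"(?:GET|POST) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ "(?P<ref>[^"]*)"')

def bucket_404s(lines, site):
    """Split 404 hits into internal (your own referrers) and external."""
    internal, external = [], []
    for line in lines:
        m = LOG_RE.search(line)
        if not m or m.group("status") != "404":
            continue
        bucket = internal if site in m.group("ref") else external
        bucket.append(m.group("path"))
    return internal, external

internal, external = bucket_404s(log_lines, SITE)
print(internal)  # 404s you can fix by correcting your own links
print(external)  # scraper / truncated-URL noise you can largely ignore
```

The internal list is the one worth acting on; the external list at least lets you see which scrapers are generating the crawl noise.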