seoholic - 8:33 am on Dec 2, 2010 (gmt 0)
I run a legitimate two-year-old site and have earned some trust and authority within my niche. My site is a content aggregator, and currently I get most of my content via direct input from some bigger content creators.
My site offers additional value by optimising the content and presenting it in a unique way.
One section of my site aggregates news:
I started posting and archiving feeds on my site; currently it holds a few thousand articles.
Now I want to take my site to the next level:
I have the opportunity to get 200k news articles from a topic-related news archive, roughly 200 words each, without any obligation to link back to the archive.
My plan is to create a new page for each news item and put some links to my existing content in them (wiki style).
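To make the wiki-style linking concrete, here is a minimal sketch of what I mean: link the first occurrence of known keywords in an article to my existing pages. The keyword-to-URL table and function name are just made-up examples, not my real setup.

import re

# Hypothetical mapping from niche keywords to existing pages on my site.
EXISTING_PAGES = {
    "solar panel": "/encyclopaedia/solar-panel",
    "feed-in tariff": "/encyclopaedia/feed-in-tariff",
}

def add_wiki_links(article_html, max_links=5):
    """Link the first occurrence of each known keyword to the matching page."""
    linked = 0
    for keyword, url in EXISTING_PAGES.items():
        if linked >= max_links:
            break
        pattern = re.compile(r"\b" + re.escape(keyword) + r"\b", re.IGNORECASE)
        article_html, count = pattern.subn(
            lambda m: '<a href="{0}">{1}</a>'.format(url, m.group(0)),
            article_html,
            count=1,
        )
        linked += count
    return article_html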
I also want to optimise the content by replacing words with synonyms, deleting filler words, and removing useless sentences such as "please come to our event" from these old news items. To do this I will have to hire a computational linguist.
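Roughly the kind of cleanup I have in mind, as a sketch; the synonym table and boilerplate patterns below are placeholders that the linguist would have to supply properly.

import re

# Placeholder synonym table and boilerplate phrases, for illustration only.
SYNONYMS = {"purchase": "buy", "utilise": "use"}
BOILERPLATE = [r"please come to our event[^.!]*[.!]", r"click here for more[^.!]*[.!]"]

def clean_article(text):
    # Drop boilerplate sentences from the old news.
    for pattern in BOILERPLATE:
        text = re.sub(pattern, "", text, flags=re.IGNORECASE)
    # Replace words with simpler synonyms, whole words only.
    for word, synonym in SYNONYMS.items():
        text = re.sub(r"\b" + word + r"\b", synonym, text, flags=re.IGNORECASE)
    # Collapse the whitespace left behind.
    return re.sub(r"\s{2,}", " ", text).strip()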
I want to build an "encyclopaedia" by identifying relevant words and word combinations. Each entry's only content will be quotes from my optimised articles plus relevant links to my existing content and to the news articles.
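As a sketch of the term identification, assuming simple frequency counts over single words and two-word combinations (the stopword list is just an example; a real version would need better filtering):

import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "for", "on"}

def candidate_terms(articles, top_n=1000):
    """Count words and two-word combinations across all articles and
    return the most frequent ones as encyclopaedia entry candidates."""
    counts = Counter()
    for text in articles:
        words = [w for w in re.findall(r"[a-z]+", text.lower()) if w not in STOPWORDS]
        counts.update(words)
        counts.update(" ".join(pair) for pair in zip(words, words[1:]))
    return [term for term, _ in counts.most_common(top_n)]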
Only one of my competitors is publishing all 200k articles. He ranks for 200k+ keywords (Searchmetrics) and has more than 10 times as many backlinks as I do. My competitor's site is 10+ years old and has been returning server errors (5xx) on at least 10% of requests for years(!) because of performance problems.
1) Is it possible to outrank such a site with very similar content (original news + links)?
2) If the answer to 1) is yes: would you advise showing both versions of the article (on the same page or on different pages?) or just one version (and which one)?
3) What are your experiences with using computational linguistics to rewrite large amounts of high-quality content Google already knows, and how did Google react?
4) Which publishing mode should I use (one post every x minutes or bigger batches / starting with the oldest or the newest news)?