Msg#: 4177074 posted 4:17 am on Jul 27, 2010 (gmt 0)
I have not been able to read all the messages here, but what I do not see much of is a discussion of business models.
People come here with a traffic-drop problem, and others try to help by asking technical questions about the website in question.
I often wonder what the business model of the dropped site is, because Google appears to be grading business models. For example, I have a prospective client who wants me to fix his traffic loss. He has 20 domains, one for each brand, with no real information on any of his sites, and he is selling some information about each brand. I can see some value in his service, but I am sure Google considers it a spam service: 20 near-identical, interlinked sites all selling the same product. In my opinion, no amount of tweaking to this site or that site is going to fix his problem.
Msg#: 4177074 posted 6:31 am on Jul 27, 2010 (gmt 0)
What a great question - and I'd say you're absolutely right: Google does look at what we might call the "business model" and fits sites into various classifications according to what it can measure. We already know that:
1. Google likes brands.
2. Google hates thin affiliates.
3. They are trying to get a handle on content mills.
The idea that you can make money just by running a script and generating pages and even whole websites is one they'd like to stamp out - you might call it the "automated model".
Msg#: 4177074 posted 9:57 am on Jul 27, 2010 (gmt 0)
The question you always have to ask yourself is: would you find the site useful? If your honest answer is "not really", then it is unlikely to do well in the listings. Certainly Google specifically looks for 'bad' indicators that suggest a site has little use, such as 20 domains all pointing to each other.
However, the thing that always has to be considered is where the line is drawn. Google is an automated system and likely scores sites on the probability of their being useful to a searcher. Say this is a score from 1 to 10 and the line is drawn at 5: any site scoring less than that doesn't feature - it is blocked by the spam filter. If the site gets through the spam filter, it is then scored by the ranking algo to determine its position.
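The two-stage model described above - a spam threshold first, then ranking for the survivors - can be sketched in a few lines. Everything here (the indicator names, the weights, the scores) is invented purely for illustration; nothing is based on Google's actual factors.

```python
SPAM_THRESHOLD = 5  # the "line drawn at 5" from the post

def usefulness_score(site):
    """Return a 1-10 usefulness score from made-up 'bad'/'good' indicators."""
    score = 10
    if site.get("interlinked_duplicate_domains"):
        score -= 4  # e.g. 20 domains all pointing at each other
    if site.get("thin_content"):
        score -= 3
    if site.get("unique_content"):
        score += 1
    return max(1, min(10, score))

def rank(sites):
    """Stage 1: drop sites below the spam threshold.
    Stage 2: order the survivors (a stand-in for the ranking algo)."""
    survivors = [s for s in sites if usefulness_score(s) >= SPAM_THRESHOLD]
    return sorted(survivors, key=usefulness_score, reverse=True)

sites = [
    {"name": "brand-hub", "unique_content": True},
    {"name": "thin-affiliate", "thin_content": True,
     "interlinked_duplicate_domains": True},
]
print([s["name"] for s in rank(sites)])  # the thin affiliate never ranks
```

The point of the sketch is the structure, not the numbers: a site blocked at stage 1 gets no benefit from ranking tweaks, which is why eliminating the 'bad' factors has to come first.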
This is automated, so yes, you could improve the sites, but you would need to eliminate the 'bad' factors. When it comes to automated sites, generally they are 'bad' - but that doesn't mean all automated sites fall into this category. One site we run is fully automated; however, the value isn't in the site but in the algo we have developed for it (the site was created as a test bed), which can take a paragraph of text and rewrite it - essentially an AI that 'understands' what is being said. In essence, this turns what would be a duplicate site, which Google penalises, into a completely unique site. However, for the tens of thousands of hours and £££££ it has taken so far to develop this script, you could just hire some people to rewrite the site by hand......
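For contrast, here is the crudest possible version of automated rewriting: a dictionary-based synonym swap. The poster's system reportedly "understands" the text; this sketch does not, and shallow substitution like this is exactly the kind of rewrite a duplicate-content check can still catch. The word list is made up; treat it only as an illustration of why real rewriting is expensive.

```python
# A deliberately naive rewriter: swap each word for a fixed synonym if one
# exists. Grammar, sense, and context are ignored entirely.
SYNONYMS = {"cheap": "inexpensive", "buy": "purchase", "fast": "quick"}

def rewrite(text):
    return " ".join(SYNONYMS.get(word, word) for word in text.split())

print(rewrite("buy cheap widgets fast"))
```

The gap between this and a rewriter that preserves meaning across whole paragraphs is where the "tens of thousands of hours" go.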
The answer is: think unique, think useful, and don't try to outsmart Google, as they have made it virtually impossible.
I'd be more inclined to believe that Google and other SEs look for a clear pattern of behaviour that can be readily identified in code and therefore recognised by the algo.
An algorithmic solution is scalable: sites fitting a certain HTML code profile get nuked :)
Sites in certain Gwmt profiles get undue attention - for example, folk who have submitted reconsideration requests, have already been penalised for something, or have received links from the same pages that link to other penalised sites.
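The "HTML code profile" idea is easy to automate, which is what makes it scalable. One hedged sketch of the concept: reduce each page to its tag skeleton and flag pages that share an identical skeleton, since the same template appearing across many supposedly different sites is a machine-detectable signal. The parser approach and the flagging rule here are my own illustration, not anything Google has documented.

```python
# Flag pages whose HTML tag skeletons are identical - a toy version of
# matching sites against a "code profile".
from collections import defaultdict
from html.parser import HTMLParser

class TagSkeleton(HTMLParser):
    """Collect the sequence of start tags, ignoring attributes and text."""
    def __init__(self):
        super().__init__()
        self.tags = []
    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

def profile(html):
    parser = TagSkeleton()
    parser.feed(html)
    return tuple(parser.tags)

def flag_template_clones(pages):
    """Group page URLs by identical tag skeleton; return groups of 2+."""
    groups = defaultdict(list)
    for url, html in pages.items():
        groups[profile(html)].append(url)
    return [urls for urls in groups.values() if len(urls) > 1]

pages = {
    "brand-a.example": "<html><body><h1>A</h1><p>buy</p></body></html>",
    "brand-b.example": "<html><body><h1>B</h1><p>buy</p></body></html>",
    "unique.example":  "<html><body><article><p>essay</p></article></body></html>",
}
print(flag_template_clones(pages))  # the two brand clones group together
```

A real system would fuzz the match (near-identical skeletons, shared boilerplate ratios) rather than require exact equality, but the scalability argument is the same: one cheap pass per page, no human review needed.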