|A Suggestion About Google SERPS|
| 6:25 pm on Oct 18, 2011 (gmt 0)|
I've been reading all the posts on Panda, and darned but some of them are "hostile", one reason why I've chosen not to participate.
I have a "theory" about what's happened, that fits the available data, and Google behavior. It's not particularly good news.
Simply put, Google has crossed the border from the world of manageable algorithms that are predictable and controllable into the world of uncontrollable and unpredictable algorithms, typical of super complex systems.
Super complex systems are interesting because they are characterized by so many variables, and so many interactions between them (i.e. signals), that if you change any one of them, the results are no longer predictable. IF you try to improve things overall, the chances are you will make things worse, not better.
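A toy sketch of that point (my own illustration, nothing to do with Google's actual algorithm): once signals interact multiplicatively, even a modest bump to one weight can flip which page ranks first, in a way that is hard to foresee from the weight change alone. All pages, signals, and weights below are made up.

```python
# Toy ranking function with one interaction term between two signals.
# Purely illustrative -- the signals, weights, and pages are invented.
def score(x1, x2, w1, w2=1.0):
    return w1 * x1 + w2 * x2 + 0.5 * x1 * x2  # interaction term

pages = {"page_a": (0.9, 0.15), "page_b": (0.5, 0.55)}

def rank(w1):
    # order pages by score, highest first
    return sorted(pages, key=lambda p: score(*pages[p], w1), reverse=True)

print(rank(1.0))  # ['page_b', 'page_a']
print(rank(1.2))  # ['page_a', 'page_b'] -- one weight nudged, order flips
```

With hundreds of interacting signals instead of two, the direction of each such flip becomes practically impossible to anticipate, which is the "tweak it and it gets worse" behavior described above.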
There's a point where systems become SO complex that nobody, not even the "owners" can control them. Kind of like a version of the Forbin Project.
Google has hit the wall, and so have webmasters, because no amount of fiddling will result in CONSISTENT improvements (better SERPS for Google, and better rankings for webmasters).
It's not about smarts. It's not about agendas, or even money. It's about a system that is out of control.
The implications are pretty obvious:
There's no point trying to figure out how some sites are ranked high and others not. It will be different for each site, and there are NO general rules or guides.
Blaming Google is pointless. Attributing evil motives is pointless, because even if they are perfectly intentioned, THEY don't have the capability to control their own super complex system.
SERPS will get worse and worse in terms of value for searchers. As Google tweaks and becomes even more complex, results will get worse. It's interesting that search engines that appear to operate with less complexity and fewer signals appear to be providing better results. (your mileage may vary - Google serps are not useless to me).
There are some potential solutions, and Google needs to look at them -- what they have in common is more simplification, not less. I suspect Google IS on a downward slide now. I don't see a way out so long as they try to run a search engine that indexes the whole web in its entirety, because the WEB changed, but they have tried to band-aid things by adding complexity. It doesn't work.
I'm interested in comments on this.
| 7:59 pm on Oct 18, 2011 (gmt 0)|
Your thinking is very much parallel to mine, coachm - with one caveat. I still think Google will modify their system to bring it back into a more controlled situation. I use Google search all the time. Even with the obvious anomalies and wrong notes that appear more often than they used to, I still find what I'm looking for almost all the time.
The long term "answer" will involve moving away from band-aid solutions toward a top-to-bottom rethinking of their scoring methods. As you said, a simplification.
| 8:02 pm on Oct 18, 2011 (gmt 0)|
I believe 100% in the coachm theory that it is no longer possible to predict what the SERPS will produce.
Putting myself in Google's position, I too would be pissed off with so many so-called SEO experts trying to game the results. Google have now knocked that profession on the head and made it redundant.
That seems to have screwed up some of their SERPS, agreed. But in a year's time SEO experts will no longer be able to convince anyone of their use. So maybe Google plan that they can then get back to "business as usual" after that time?
And taking a step back and being entirely impersonal, why should I, you or anyone else be able to SEO our pages so that we get to the top of the SERPS? That's not good for the users of search engines. They want impersonal, good search results. And maybe, just maybe, we (as SEO manipulators) have been messing them around.
The new Google SERPS have stopped our ability to do that. Maybe that's good for us all. Posted from "not sure dithering".
| 8:27 pm on Oct 18, 2011 (gmt 0)|
coachm, true to a point but they have made quite a few exceptions, namely for brands. When you say we can't hit this and that, what's left? Us.
Second, we are up against math: Google has said that they want to send more visitors to "high quality sites", so others will be left to die sooner or later. And what counts as a "good quality" site, we know.
And even worse for us, Google is seeking to send fewer and fewer visits to webmasters. They say this in their 10-K filings with the SEC, and the latest numbers show it: they had an increase of 13% more clicks within Google than the previous 3 months. A 13% increase in just 3 months! [webmasterworld.com...]
So many of the Panda demotions might not be demotions at all, just fewer clicks being sent down.
I would buy your theory 100% if all types of sites were being hit, but it's month 8 and many small businesses have gone BROKE because of lost SERPs. A 10% increase in traffic to, say, Sears can bankrupt 100 small stores. And yes, it's easy to say don't depend on Google -- like telling Google not to depend on electricity, or on the Windows operating system because Microsoft can pandalize you from their system.
As for Google fixing it (whatever fixing means) one day? Hah! They could have given Panda less clout -- say, instead of the "normal" 60% drop, sites would have suffered a 15% or 30% drop until Panda was fixed (if it's broken) -- but they chose not to. In fact they tightened the screws pretty much after every update.
I am adding back all the pages I deleted; stupid me could have gotten some referrals meanwhile, but I listened to Google.
Btw: Panda is not / was not inevitable. Everyone knows that site-wide penalties or promotions are very biased and don't produce the best results. They punish many good pages and promote many bad pages. My competitors with hundreds of thousands of pages have a license to do whatever they want -- add all the junk they can find, because it will rank for many months and possibly years.
[edited by: walkman at 8:39 pm (utc) on Oct 18, 2011]
| 8:38 pm on Oct 18, 2011 (gmt 0)|
i actually think that creating an algo should be getting easier, not harder, because half the stuff that they used to rate sites on in the old days no longer applies.
e.g. on-page stuff is practically dead now, apart from obvious things like titles and maybe pictures.
even links are going to fall by the wayside soon, because 99% of users aren't in a position to drop links. it's a webmaster thing, so it doesn't make sense to use it as a ranking signal.
it's all going to be about social signals. and they are a lot easier to count up. how many people talk about this page, compared to that page? how many people bookmark this page, compared to that page? etc.
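The counting really is trivial. A minimal sketch of the idea (the event data, URLs, and weights here are entirely hypothetical, not any real Google feed):

```python
from collections import Counter

# Hypothetical stream of (url, action) events from shares and bookmarks.
events = [
    ("example.com/a", "mention"), ("example.com/a", "bookmark"),
    ("example.com/b", "mention"), ("example.com/a", "mention"),
]

mentions = Counter(url for url, action in events if action == "mention")
bookmarks = Counter(url for url, action in events if action == "bookmark")

def social_score(url, mention_weight=1.0, bookmark_weight=2.0):
    # A simple weighted tally -- easy to compute, though a tally this
    # naive would also be easy to inflate with fake accounts.
    return mention_weight * mentions[url] + bookmark_weight * bookmarks[url]

print(social_score("example.com/a"))  # 2 mentions + 1 bookmark -> 4.0
```

The hard part, of course, is not the arithmetic but trusting the events being counted.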
given that google follows a sizeable chunk of the web through its browser, analytics, ads, +buttons and everything else, the data should be falling in their laps.
other than the fact that the web has got so big now that it's impossible to crawl in its entirety (which is presumably why google has been so hot on trying to get us to remove dud pages and speed the other ones up), why should it be any harder to rank pages than it was 5 years ago?
what has actually changed? web pages are still the same as they ever were. it's still pages, text and pictures.
if the algo is getting worse then i reckon it's because they are overcomplicating it.
| 8:42 pm on Oct 18, 2011 (gmt 0)|
I had that thought as well - but apparently (I'm basing this on comments here) the noise factor from Panda is high - which sites pop up to the top or get filtered is not consistent. Tons of low quality sites filtering to the top, tons of good quality sites being filtered. And that apparently changes daily.
So it may be that it's too complex to understand, but the results don't seem like something Google would want.
And even if it is overly complex, there should be some generalizations we can make as to intent and implementation, i.e. "lots of unique, well written content". But that doesn't even seem to be the key.
Link building long ago got too complex for me to worry about individual solutions. Nevertheless I still know that relevant authority links help - and that's good enough to take action on. What's the action to take in terms of content and quality signals post-Panda?
I'm with tedster. I figure Google is going to either revert, change, or tighten the amplitude of this update.
| 8:53 pm on Oct 18, 2011 (gmt 0)|
I disagree. Google will hire quality checkers. There you have groups of humans verifying quality results. No getting around any algo in that respect. I'm sure every employee has a trash button they can press at any time during their day to say a site sucks and shouldn't rank where it is.
Also, they aren't relying on an algo as much because they have this thing called the Chrome web browser. That can collect more than enough data about behaviour. They have 97% market share in mobile search, so I'm sure there are many signals that will be used more and more.
In my opinion, it's more about "people powered" than "algo" powered. They could in theory hire 5,000 people who take the top keyword searches and do that 365 days a year, and ensure their quality results that way.
Because rankings make no sense now, it's the most feasible direction I see. For all anyone knows, a lot of humans could be the source of the recent flux. You can't figure it out because humans are actually controlling the results by saying which are good and which are bad. What are those recent hires in the UK doing with their time? Sure there are billions of sites, but I see only important searches getting Pandalized.
Lastly, when you bang your head on a wall, you realize it hurts and you stop doing it. However, when you do something and it increases your profits, you aren't going to go back to the way it was before when you made much less money. As in, hey, if we keep doing this, we are increasing revenues. Something is working here, we just can't nail it down in the complex algo right now. Don't worry, don't hurry. All things are looking good right now. One little QA department really isn't calling the shots at Google. Be honest about that. Sure they are the face, but they aren't the boss.
| 8:58 pm on Oct 18, 2011 (gmt 0)|
That is like saying "the result will be a win, or a loss, or a draw".."predictions" don't come any safer or more all-encompassing than that..where do you want me to send the check, Madame Zaza..or doubtless you'd prefer cash..really, wheel..c'mon.
|I'm with tedster. I figure Google is going to either revert, change, or tighten the amplitude of this update. |
| 9:19 pm on Oct 18, 2011 (gmt 0)|
I probably wasn't clear. I'm saying that I suspect Panda will have to be restricted eventually.
Until then, all this up and down, I think, is bad for Google. I don't think they'll keep up this up-and-down yoyo forever. By mid-2012 it'll likely have settled down as the result of some overt action taken by Google.
| 1:16 am on Oct 19, 2011 (gmt 0)|
I think Panda is fully under control and is doing fairly precisely what they wanted it to do, ab initio.
However, methinks evolving corporate priorities and emergent realities are modifying the instructions to the "panda".
| 7:21 am on Oct 19, 2011 (gmt 0)|
|its all going to be about social signals. and they are a lot easier to count up. how many people talk about this page, compared to this page? etc. how many people bookmark this page, compared to this page? etc. |
Huh? Sounds great... There's no way people will be able to game that sort of system...