There seems to be a lot of consensus that the shuffling is about links and link value. I am in a highly competitive industry and I definitely concur. I've spent the past three days doing in-depth backlink analysis on the competitor sites that jumped ahead of our site (pushing us to #11 from #6), and they exhibit obvious link building practices that Google supposedly frowns upon... mainly link purchases. I'm coming across a lot of run-of-site links. We have been steadily cleaning up our paid links, which seems to have been a mistake.
I've had this nagging thought in the back of my head that just gains more credence with each new Google update. I think Matt Cutts accomplishes through PR and dictum what Google is NOT able to accomplish algorithmically. You can only program a machine to do so much, evidenced by the fact that we still don't have robot servants or cars that can drive themselves.
So how could the largest ad agency in the world (oops, I mean search engine) control the factors that they can't through algorithms? Why not create some sort of demi-god that respectable, white hat SEOs will flock to and follow without question? I think a lot of us have been duped and now the spammers and less-than-white hat SEOs are reaping the benefits.
Seeing as how many of us are seeing poor quality sites with poor quality backlinks beating out older, quality sites, is it too far-fetched to suggest that maybe Google has turned off a big portion of the algorithms that try to filter out paid links? Perhaps, after several months of running a PR campaign against them, they feel that enough sites have cleaned up those links? Or maybe the only real filter they have is the "Report Paid Links" database that they've been building?
PageRank isn't the biggest PR in SEO anymore, it's Press Relations and we all know what that's about...how to "spin" things.
Think about it guys, Google isn't out to screw us all over. Calm down, take a Vicodin, and wait it out.
Or maybe the only real filter they have is the "Report Paid Links" database that they've been building?
Are you suggesting that they are letting all the "black hat" SEOs' sites rise to the top so we get all annoyed and report them to Google, thereby doing all the work for them?
Why, that is a diabolical yet genius plan!
Interesting theory. I may slip some tinfoil on my head and join you in that theory ;-).
I have noticed that, at present, changes mostly start at data centers 72.14.207.XX (example [184.108.40.206...] ) and then migrate to the rest of data centers. That might change of course.
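As an aside, for anyone tracking which datacenter range a given IP falls into, here is a quick sketch in Python. The helper name and the /24 block boundary are my own illustrative assumptions, not anything Google documents:

```python
import ipaddress

def in_datacenter_block(ip: str, block: str = "72.14.207.0/24") -> bool:
    """Return True if the given IP address falls inside the given CIDR block."""
    return ipaddress.ip_address(ip) in ipaddress.ip_network(block)

print(in_datacenter_block("72.14.207.99"))  # True: inside 72.14.207.0/24
print(in_datacenter_block("72.14.205.1"))   # False: different /24 block
```

This only groups IPs by block, of course; which datacenter actually answers your query depends on Google's routing at the time.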
To locate just one of our results I have to use a three-word keyphrase, and even then it shows up at the bottom of the first page out of only 800+ results. The most remarkable thing, however, is that the #1 result is a Live Search Local listing in San Francisco!
Are we all sure there's no new algo update slightly out of control! :-))
What do you make of that?
Are we all sure there's no new algo update slightly out of control! :-))
I tend to believe Matt Cutts that there is no algo update at present.
However I do believe that a software update (infrastructure update) is taking place. I wouldn't be surprised to see Google running into a few issues along the way.
During the latest BigDaddy software update the folks at the plex had a few issues [webmasterworld.com] too.
However I do believe that a software update (infrastructure update) is taking place.
This may be a naive question, but how would a software update completely lose certain sites for specific keyphrases and bounce others all over the place?
Google's algorithm-only approach is probably reaching its limits, as revealed by the currently exploding number of spammy domains and inbound links submitted by bots on a multinational, industrial scale.
You can't stop the rain... and Google's engineers might no longer be able to hide their main weakness.
How many links can they control? A few million newly added per month reads like mission impossible.
Mainly niche keywords with little traffic, and individually monitored big-keyword result pages, show some decent top-20 results.
Anything popular will get attacked by more spam domains and inbound links.
I think Matt Cutts accomplishes through PR and dictum what Google is NOT able to accomplish algorithmically. You can only program a machine to do so much, evidenced by the fact that we still don't have robot servants or cars that can drive themselves.
The reason we don't have them is because it is not cost effective to mass produce them for resale.
Budgetary constraints are not Google's problem! But yes, they control us MOSTLY through the stuff Google reps say on boards like these and at conferences. They tell us that they penalise things so we don't even try them; then they don't have to work on perfecting an algorithm for it. Guess the Indian SEO firms didn't read the announcements and went ahead anyway. Good for them, caught Googy with their pants down ... or pulled back the curtain ... one or the other ...
Lots of speculation: is it an update, a tweak, a glitch, or an algo change?
I do not see any real reason why Matt would reveal anything about the inner workings of Google's ranking updates in a forum full of heavyweight SEOs.
One could say he is doing his job best when he hampers the ability of people who use tactical advantages to rank better...
Just my 0.02 :)
I desphinned because the title says "Google Confirms Algorithm Change" and I didn't. Here's what I said over on the SERoundtable post:
"Just to be clear, the reason that I asked people to send in queries was because we *were not* seeing major differences between data centers. So this is more for seeing what people are talking about and less for collecting feedback on some big update."
Reseller, you mentioned the Bigdaddy update. It's interesting that my site experiencing the rankings drop also got spanked out of the SERPS for about three weeks during Bigdaddy.
Very interesting, indeed.
While in netmeg's case [webmasterworld.com], 135 sites were not affected by BigDaddy and are not affected by the current Update Dewey!
So in the two cases (yours and netmeg's), we see sites "reacting" in the same way to both Update BigDaddy and Update Dewey!
We know Update BigDaddy was a software update (infrastructure update). And I believe that the current Update Dewey is a software update too ;-)
Just like what occurred during Update BigDaddy, I expect Update Dewey to display all kinds of strange, unstable SERPs during April. Maybe there will come a time when some sites get hit very hard too.
However, when this thing is over we should hopefully see better-quality, stable SERPs.
Let's all pray and wish the youngsters at the Google Search Quality Team good luck. Because just like during Update BigDaddy, those youngsters will need all the luck they can get ;-)
May the sun shine over GooglePlex :-)
Well, that's obviously not what he said then.
Sheesh. He said they didn't think datacenters had big differences. Datacenters don't mean crap.
They've had an algo tweak that tossed things around, that's all. There seem to be some odd results from the tweak, and they want to see what those are. But they update their algorithm every week, so let's be productive and focus on what just happened.
"In the summer of 2003... Google switched to an index that was incrementally updated every day (or faster)."
From my perspective it looks as if Google is using some of the newly acquired Doubleclick data and technologies now.
We also figured out a 301 exploit which is all over the blackhat boards and notified the Google security team and search team.
This exploit came out at exactly the same time as this update started to roll. The two are tied to a very big change in Google's system.