So your thought is that we went down on the old algo just by chance ... not because they changed it too?
With OLDGOOGLE ... we ranked, say, 5th the day before yesterday, then 11th yesterday (THAT WENT DOWN); with NEWGOOGLE we ranked 170th the day before yesterday, then 7th yesterday (that went up) ...
What I am wondering is ... IF there really are two separate layers of tests now ... not just one new algorithm, as some writers postulate?
My idea was ... why would they be tinkering with an old algorithm unless they really are using a two-step process now: the old algo PLUS something additional? But maybe 6 other sites just made changes that put them ahead of us under the old algorithm ... and it was not changed ... I am skeptical that they have just one algorithm myself ... since we still have not shown up in the other dozen search terms for our sites, just back in one term ... and that could mean that they just quit applying step 2 to that term?
That could explain why we showed back up in ONE TERM only, and not the dozens of others we normally rank on ... if they select which terms use which algo ...
BUT ... if there are only two algos ... how can we rank differently on the same one? ... unless both changes are just coincidental ...
maybe i am missing something elementary here ...
Established sites ranked 5th to 15th or so often get pushed back when fresh junk gets introduced, but then slowly crawl back up a few spots as Google more accurately weighs the value of the fresh junk, usually deciding it is actually lightweight on other algorithm factors (besides anchor text and page title), and so it drops down. So the established site works its way back up again, and lo and behold Google introduces some more fresh junk, and the process occurs again. This will happen under any ranking algorithm I suppose, but the old Google was particularly susceptible to it because it was so reliant on anchor text. It would see a "fresh" page with tons of self-generated links and immediately vault it towards the top.
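The cycle described above can be sketched as a toy scoring model. To be clear, this is purely illustrative: the weights, the two-week decay constant, and the very idea of an additive "freshness boost" are my own assumptions, not anything Google has confirmed.

```python
import math

def toy_score(anchor_text_score, other_factors_score, age_days):
    """Toy ranking score with a freshness boost that decays over time.

    All numbers here are invented for illustration -- this is NOT
    Google's actual formula, just a model of the described behavior.
    """
    freshness_boost = 2.0 * math.exp(-age_days / 14.0)  # fades over ~2 weeks
    return anchor_text_score + other_factors_score + freshness_boost

# A "fresh junk" page: strong anchor text, weak on everything else.
fresh = [toy_score(3.0, 0.5, d) for d in (0, 7, 30)]
# An established page: balanced factors, freshness boost long gone.
established = [toy_score(2.0, 2.5, d) for d in (365, 372, 395)]

for d, f, e in zip((0, 7, 30), fresh, established):
    print(f"day {d:3}: fresh={f:.2f}  established={e:.2f}")
```

Under these made-up weights, the fresh page vaults past the established one on day 0, then sinks below it within a month as the boost decays, matching the "crawl back up" cycle described.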
It seems to me that theoretically that is a different event than searching for "joe" and "joe -asdasdad" to bring up the old Google ... since even though you are searching for the same thing using different terms, you are not trying to EXCLUDE anything in the first example ...
BUT ... if there was ONLY one algorithm going on before, why different results ...?
so, maybe there always has been some filtering going on and now they just changed that a lot?
Google has taken great measures to not allow one site to damage another, so no, I doubt it. More probable is a decrease in the value now given to anchor text, since it is so easy to manipulate.
Following on from your 'buy whatever', 'buy a whatever' question I have noticed a similar one:
'kw1 AND kw2' elicits the following message from Google:
"The "AND" operator is unnecessary -- we include all search terms by default."
Again, this suggests that Google should just do the 'kw1 kw2' search; but if you try that search, it gives different results from 'kw1 AND kw2'.
To explain the 'kw1 kw2 -waffle' search results, it has been suggested that because you know what you don't want (waffle in this case), you definitely know what you DO want (kw1 kw2), so Google does an exact match and not a broad match (broad match being suspected to be the key to the Florida results - including by me).
In your example of the search with 'a' in it, it could be argued that when you use the 'a', Google suspects that you want an exact match, so again it does an exact match and not a broad match.
Both the above situations are cases where (it is suggested) Google uses the context of the search to decide whether to do broad matching or not.
However, the AND term cannot be explained in this way. Google sees AND as an operator (not another search term) and tells the user that by default all search terms are connected using the AND operator. So 'kw1 kw2' SHOULD evaluate internally to 'kw1 AND kw2' - the context of the search should be the same in both cases. Yet both cases DO give different results.
I am having problems resolving this situation.
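The contradiction can be made concrete with a toy sketch of the normalization step Google's message implies. This is entirely my guess at how a query parser *might* canonicalize queries, not Google's actual code; but if anything like it were the whole story, the two queries could not return different results.

```python
def normalize(query: str) -> list:
    """Toy query normalizer: drop the redundant AND operator.

    Hypothetical sketch -- if Google's parser worked this way,
    'kw1 kw2' and 'kw1 AND kw2' would be identical after parsing
    and would have to produce identical result sets.
    """
    return [term for term in query.split() if term != "AND"]

print(normalize("kw1 kw2"))      # ['kw1', 'kw2']
print(normalize("kw1 AND kw2"))  # ['kw1', 'kw2']
```

Since both queries normalize to the same term list, any difference in results must come from a later stage (e.g. a broad-match decision keyed off the raw query text rather than the parsed terms), which is exactly the puzzle being described.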
I confess I am not an expert on Google's search methods ... and I have not studied the "and" thing ... but I thought they just started the "broad" matching thing last month ... before that I thought "buy whatever" was always an exact match ... I do know for sure, for example, that when I searched for, say, "widget for sale" I always got different results from a search for "widgetS for sale" ... today I understand the default includes both ...
Based on what I am seeing ... it appears to me that Google is looking MUCH closer at the content of the sites linking to you ... and valuing those incoming links much differently than before ... so maybe your links in are just not being counted like before ... and unlike the other poster, I doubt there is a penalty for them.
It seems that links from sites NOT CLOSELY RELATED in content are not valued like they were before ... (i.e. an incoming link from a big widget site does not appear to help a travel site like it used to)
I also think there is a LOT more going on besides that ... but that is at least one element which I personally think is a big piece of the puzzle ...
I have seen the same thing ... it seems they are NOT revising the PAGE RANK you see on their toolbar by way of this incoming-link re-analysis ... it stays the same, the page just doesn't show anymore ... so this revised measure is invisible, it seems ...
We had a PR6 site disappear also ... it had lots of links from high page rank sites ... but most were in a different general classification of content ...
Also, in our line of business, it seems the sites that got pushed UP ... to fill the vacuum of content sites that fell out ... were what I consider to be fairly spammy DIRECTORIES ... with a lot of mostly NO TRAFFIC sites linking to them ... but they have thousands of links in, and the sites linking in are of a similar classification ...
So the moral I draw from that is Google is valuing a lot of links from no-traffic sites with similar content more than a lot of links from high-traffic sites which are in unrelated categories ... or at least pretty different categories ...
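That theory can be expressed as a toy model where a link's value is the source's authority scaled by topical similarity. The formula and every number below are invented for illustration (nobody outside Google knows the real weighting); the point is only that under such a scheme, volume of on-topic links can swamp a handful of off-topic heavyweights.

```python
def toy_link_value(source_authority, topic_similarity):
    """Hypothetical link value: authority scaled by how topically
    related the linking site is (0.0 = unrelated, 1.0 = same topic).
    Invented for illustration -- not a known Google metric."""
    return source_authority * topic_similarity

# Many low-authority, same-topic directory links...
related_links = sum(toy_link_value(0.5, 0.9) for _ in range(200))
# ...versus a handful of high-authority but off-topic links.
unrelated_links = sum(toy_link_value(6.0, 0.1) for _ in range(10))

print(related_links, unrelated_links)  # 90.0 vs 6.0
```

Under these assumed weights, 200 small related links outscore 10 big unrelated ones by 15 to 1, which would match the observation of spammy same-category directories floating up past well-linked content sites.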
I've heard different stories but this is consistent with what we see ...
Again, however ... it is a big puzzle ... and this is only one piece ... I doubt if that alone would make a site disappear ... in cases where a site disappeared I think there are also other factors ... maybe one or more of 3 or 4 others ... when they combine, you disappear ... and they are all pretty clean stuff ... we see a number of "black hat" sites doing fine, so it seems the new system leans most on CLEAN optimizing.
The other problem is that you cannot control what directories, websites, etc. link to you. I do not reciprocate links, so I thought I should be credited for all the related links pointing to my site. I have about 100 category links pointing to me with PR4 or better; however, before I got kicked out of Google it seemed the majority of the links Google showed pointing to my site were from unrelated categories. Go figure.
Thank you much in advance.
"This will happen under any ranking algorithm I suppose, but the old Google was particularly susceptible to it because it was so reliant on anchor text."

Anchor text and fresh listings are two completely different things. They are not part of the same equation, as the anchor text is factored in at a separate time IMO.
Our sites were clean also ... highly optimized ... but clean practices, like you read about around the net ... but we had them disappear. ONE that we "DE-optimized" as a test is starting to show back up ... good placement on one keyword today, and later today some other keywords are showing up for it also, but with POOR PLACEMENT in the results ... like spot #60 compared to top 10 before ... (we did rank well on dozens of keywords, both before last month and before the last year, when it seems to me that employing more optimization techniques really became essential just to keep up with the Joneses, so to speak). The other sites are still GONE ... so clean "high optimization level" techniques themselves could at least be a potential issue ... however, we have seen competitors with similar "optimization levels" stay there ... also some competitors with invisible images with alt tags loaded with keywords are still there, for example ... but I noticed the black hat sites were not ALSO doing ALL the other optimization things at the same time ... like heavy use of links with keyword anchor text ... or hyphenated file names including keywords ...

The best spam I have seen lately seems to have thrived with the new Google, and it is totally contrary to Google's policies about building pages with REAL CONTENT for users ... it seems if you make a page with just TWO WORDS ON IT ... in H1, bold, and if those are your keywords ... you don't need much else to thrive today ... but how useful is that for a user? I have seen those sites rise up lately ... our sites are real "wordy" so there may be a lot of keyword usage on a page naturally ... but as a % of the total text content ... it is nowhere near the 2-word page, of course ...
Add this all up and it looks entirely possible that there is either some randomized pouring out or rotating of some sites, OR some kind of "good site indexing" going on which essentially exempts some sites from these events or the new rigors ... my personal guess is the latter ... they could be using some historical feedback from their toolbars to assist in developing this "good site" index (kinda like the Alexa rankings) ... that could explain why some popular sites stayed in and at the top when they look no different than others that got poured out ... my money is on some feature like that being in place, which no one will be able to find out about unless you get a boiler room of people using the Google toolbar to pore over your sites enough to tip the balance of their historical data ... and see if things change for you then ...
I would be curious to know if anyone out there could ask GOOGLE if they would go on the record to state whether
1) there is any "random rotation" being employed in the rankings now, or
2) any feedback from their toolbar is being used to affect rankings now ... if the toolbar results are being used, this is something which could really skew results dramatically ... and it is also subject to manipulation. First, it is not statistically valid as a sample, since MANY people do NOT want to use the toolbar, and people who choose to use it naturally have different proclivities than those who do not ... so maybe the different personality types like different kinds of web sites too. Second, black hats can put the toolbar on a robot to feed Google some bogus data ... we have in fact seen competitors do that with the Alexa toolbar to affect their ranking there ... and it is not that hard with a THIN volume of users visiting sites that do not have HUGE traffic to start with. Two or three robots can make a HUGE difference after it is EXTRAPOLATED to the entire internet population based on a SMALL SAMPLE.
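The extrapolation worry in point 2 is simple arithmetic, and a short sketch makes it vivid. All the numbers here are hypothetical (panel size, population, visit counts are invented for illustration); the mechanism is just sample-to-population scaling.

```python
# Toy illustration of small-sample extrapolation: a few bot visits in
# a thin toolbar panel blow up into huge estimated traffic.
# Every number below is invented for illustration.

panel_size = 10_000              # hypothetical toolbar users sampled
population = 100_000_000         # hypothetical total web users
scale = population / panel_size  # each panelist "stands for" 10,000 users

real_visits_in_panel = 3         # genuine visits to a low-traffic site
bot_visits_in_panel = 3          # a couple of robots running the toolbar

honest_estimate = real_visits_in_panel * scale
gamed_estimate = (real_visits_in_panel + bot_visits_in_panel) * scale

print(honest_estimate)  # 30000.0 estimated visits
print(gamed_estimate)   # 60000.0 -- three bots doubled the estimate
```

With a scaling factor of 10,000x, three robot visits add 30,000 "visits" to the population estimate, doubling a small site's apparent traffic, which is exactly why a thin, self-selected panel is both statistically shaky and cheap to manipulate.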
So, you can see, I am a bit skeptical about a lot of what I have been reading ... particularly since we similarly test-"de-optimized" another site and it is NOT appearing again anywhere yet (with all things being as close to "equal" as the real world allows with the site which is showing back up). Also, I do think there is a fair probability that there are a lot of "PR" (and I do not mean page rank) explanations floating around, and the bottom line may have no relationship to those explanations at all ...
The good news is I am still seeing additional keywords starting to show up for the one site, so maybe this will continue for a few days ... it seems like they are processing a complete update early this month?