This 215 message thread spans 8 pages.
Google Updates and SERP Changes - June 2010
| 11:10 pm on Jun 12, 2010 (gmt 0)|
< continued from [webmasterworld.com...] >
I'm currently looking at an e-comm site where sales for the four complete weeks since May 16th are down 77% compared to the previous four weeks.
So far, I have no additional words of wisdom to add to this thread, and not much of a clue how to fix it.
Traffic levels haven't fallen by much; instead, it's as if Google is sending completely the wrong type of visitors.
The number of pages reported by a site: search kept falling, but in recent days it has started to go up. Sales returned for two days, and then dried up again.
[edited by: tedster at 12:05 pm (utc) on Jun 15, 2010]
| 4:21 pm on Jun 22, 2010 (gmt 0)|
Backdraft7, I know Google is aware of this one site in particular. They list some 3k products - identical products on over 15 sites - and rank for all of them. No action has ever been taken. In fact, they're being rewarded more each day as I watch the number of indexed pages grow.
This is what drives me nuts. I think it's all been one big scare tactic by Google. They willfully let people trample their WMGL, and we can now verify that some sites are knocking it out of the park with mirrors/spam/duplicates in Mayday/Caffeine.
If I'm a single company selling the same product on 15 different sites with different keyword combinations, isn't that a violation of the WMGL? It may well be, but it's a very rewarding tactic right now.
| 4:49 pm on Jun 22, 2010 (gmt 0)|
Thanks for sharing the Bruce Clay video - I learned a lot.
I agree it makes sense that G does not have all the filters on at once, both so they can better measure the results of their updates and to keep the data centers moving quickly as they adjust.
| 6:07 pm on Jun 22, 2010 (gmt 0)|
|Is there anything special with the former Caffeine IP these days? I think it's just back in the main pool for Google now. |
Still different, Tedster: [188.8.131.52...]
| 6:15 pm on Jun 22, 2010 (gmt 0)|
I think this is appropriate to this thread, as to the condition of G being somewhat broken:
I just had my WMT account show crawl errors - I had zero until an hour ago, and now there are 8, all 404s. The notable thing is that when I click the "linked from" link, the pages shown are from as long ago as 2006 - pages that do not exist and have not existed for a long time. And no, they were not properly 404'd back then.
So it seems that G is not only reindexing the entire web but also every page they have ever seen in the past. Is it safe to say they never delete anything and are now reviewing everything? That would explain the chaos, and also shed a glimmer of hope that it will all get closer to "normal".
I would say that if they are re-hashing every bit of info they have ever seen along with what they currently see, they could be happy with the way things are "working", and it's just a matter of time until it all reconciles...?
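For anyone seeing the same thing in their crawl-error report, a quick sanity check is to confirm those long-dead URLs now return a hard HTTP 404 (or 410) rather than a "soft 404" (a 200 OK serving a not-found page), which can keep Googlebot revisiting them for years. This is just a rough Python sketch - the helper names are mine and the commented URL is a placeholder, not a real page:

```python
# Check whether old URLs are "properly 404'd" in the HTTP sense.
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def status_of(url, timeout=10):
    """Return the HTTP status code for a HEAD request to url."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        # urllib raises for 4xx/5xx; the code is still what we want.
        return e.code

def is_properly_gone(code):
    """True if the status tells crawlers the page is really gone."""
    return code in (404, 410)

# Usage (substitute real URLs from your WMT crawl-error report):
# for url in ["http://www.example.com/old-page-from-2006.html"]:
#     code = status_of(url)
#     print(code, is_properly_gone(code), url)
```

If a supposedly deleted page comes back 200 or 301, that, not Google's memory, may be why it keeps resurfacing.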
| 6:21 pm on Jun 22, 2010 (gmt 0)|
|The idea is that Google would get a created long tail term because folks would type in equipment, then not finding what they wanted would type in tools, still not finding what they wanted, they finally type in hammer, and collect data then come up with the new long tail result equipment tools hammer - again I ask who searches like that? |
From the data I've read, most people...
They usually type a query, then if they don't find what they're looking for on the first page they try again by adjusting the query.
His example is also just that - an example - so it might not be one search right after the other. It might be that a few days ago you searched for "nail", "lumber", etc., and today you realize you need something to put one into the other and search for it. When the system is in place and everything is up and running, you would get different results than the person who searched for "90s music", "emcee" (not usually spelled out), etc.
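To make the refinement idea concrete, here is a minimal Python sketch of grouping one user's queries into "refinement chains" - runs of searches close together in time - which could then feed a candidate long-tail phrase. The 30-minute session window, function names, and sample log are all my assumptions for illustration, not anything Google has published:

```python
# Group a time-ordered query log into session-based refinement chains.
from datetime import datetime, timedelta

SESSION_GAP = timedelta(minutes=30)  # assumed session window

def refinement_chains(queries):
    """queries: list of (timestamp, query) sorted by time.
    Returns lists of queries split wherever the gap exceeds SESSION_GAP."""
    chains, current, last = [], [], None
    for ts, q in queries:
        if last is not None and ts - last > SESSION_GAP:
            chains.append(current)
            current = []
        current.append(q)
        last = ts
    if current:
        chains.append(current)
    return chains

log = [
    (datetime(2010, 6, 22, 9, 0), "equipment"),
    (datetime(2010, 6, 22, 9, 2), "tools"),
    (datetime(2010, 6, 22, 9, 5), "hammer"),
    (datetime(2010, 6, 22, 14, 0), "90s music"),
]

for chain in refinement_chains(log):
    print(" -> ".join(chain), "| candidate long tail:", " ".join(chain))
```

The poster's point survives the sketch: only the first chain ("equipment", "tools", "hammer") looks like refinement toward one intent; the afternoon "90s music" search is a separate session and should not be folded in.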
< continued here: [webmasterworld.com...] >
[edited by: tedster at 6:56 pm (utc) on Jun 24, 2010]