Yeah, ask Yahoo! and some of the other SEs what happens when they serve ads and don't have great results... The idea that Google is serving bad results on purpose, with Bing blasting away at them, is not very well thought through IMO. How many search engines that serve(d) ads failed to grow or retain market share because, ads or not, they didn't have the best results? And do you think Google wants to take the chance of following in their footsteps? Personally, I doubt it...
The two results sets make sense to me from what I'm seeing too.
I was seeing this on the Caffeine IP for over a week and today I have also seen it on Google.com.
A pro link builder acquaintance said that for the last year or so he suspected that one day G weighted links from relevant sites higher, and the next day a link from any site, even one outside the niche, would have good value. So on alternate days the site would do better from on-topic site links, and the next day all links were taken into account. I have felt there was something to what he was saying, and based on what you are saying it now seems more possible.
Are your anchor text links mostly from relevant or related sites?
@scottonline, interesting, but keep checking, as Monday/Tuesday this week was our best in months! It could still be total coincidence; the only way to find out would be to switch it back. If I were in your shoes I would be reluctant to do that until everyone knows things have settled down. Good luck ;)
I have noticed something interesting.
If you do a search for widgets for example and look at page 1 of the SERP and then go to page 2 of the SERP, page 1 will appear purple at the bottom of the SERP. If you do a search for widgets again later on, the pages that you visited earlier will appear purple at the bottom.
Sometimes, though, I see the opposite: I do a search for widgets, look at page 1 of the SERP, go to page 2, and page 1 does not appear purple at the bottom of the SERP. And when I search for widgets again later on, the pages I visited earlier do not appear purple at the bottom either.
Could this be some type of test by Google to analyze web surfing patterns?
I added my heartfelt 2 cents a day or two back, and now, for those who are keeping stats on G's gyrations, here are some observations since 6-11-10
for my main and most important two-word key term.
On 6-11, 6-12, and the morning of 6-13, the #1 site was new to the first-page SERP and had only one word of the search term in its title (the plural of the word), and, this is the important part, OUT OF 18,400,000 results.
Our website was #8 and the global brand-name site was #2.
Evening of 6-13: everything the same, except there were 31,400,000 results.
6-14: all the same, but our site moved up to #7.
6-15, 11 AM: the global brand-name site moved to #1, where it had been since Mayday started; there were 32,300,000 results.
6-15, 11:30 PM: all the same, except the results were 29,700,000.
6-16: the global brand name was #1 with an indented page under it in the AM, but held just the top spot by PM; our site was back to #8; there were 20,600,000 results.
Evening of the 16th: I experimented with my page title and removed the key two words of the search term; the change was uploaded at 9:30 PM.
6-17, AM: the global brand name held at #1; we were at #7, but the title change had not been picked up yet; there were 20,600,000 results.
6-17, 4 PM: they were showing our new keywordless title page.
The global brand name was still #1; our site was #8, without any mention of the two-word key term in the title.
But catch this: there were only 13,900,000 results.
So it took less than 24 hours for our home page without keywords to be indexed; that's pretty fast, I think, and it did cause me to be booted off the 1st page. But what was up with the total number of results? How does Goog go from 18.4 million to 32.3 million and back to 13.9 million in the past 6 days or so?
It doesn't look like G has settled down yet, no? Do you guys get that much variation in total results?
I meant to say it DID NOT cause me to be off the 1st page. That surprised me a little, so I learned that I didn't need the main two-word key term in my title in order to be shown on the 1st page of the SERPs. And all this time it was beaten into my head for years that you MUST have the keywords you want to rank highly for in the page title. Oh well, another one bites the dust.
Got a feeling Google is finally going to do a SERP update this weekend... wouldn't it be nice? The last real one was only 6 months ago... anyone else got that feeling? ;)
Add to the list one more webmaster hit June 2nd. No changes or link campaigns or anything to the main sites.
I truly miss the good-old-days when Google delivered search results, not this useless voodoo hoodoo.
1) Searched for a driver: the first 30 results from Google had nothing but spam/pay-for-free-content; I thought the driver hadn't been released yet. The first 4 results in Bing delivered the driver, no nonsense.
2) Searched for a preview of a sports game (country1 vs country2). There are thousands of previews; Google returned pages about travel in country2. Bing gave me the best source in the field.
3) Searched for a very, very well known product. Google returned spam and a link to the manufacturer (the end user rarely needs that information). Bing returned local sites selling the thing.
4) As a super bonus, Google once returned a result that didn't have the search term anywhere! It's as if it doesn't matter what the user wants; Google just decides to give you what they want to give you, and the search term is irrelevant.
Since I'm one of those perverts who actually wants relevant information when using a search engine, I have turned, surprisingly quickly, into a regular Bing user.
Strange times. Even non-tech people I know are starting to look for an alternative, and month after month the industry news is the same: Bing gains share... It reminds me of AltaVista: industry standard, too big to fail, stopped delivering good results, started adding extra stuff to their homepage. Then a couple of jackasses made a simple search engine and it took over effortlessly. I have laughed at the previous "Google will perish just like AltaVista" theories, but if average people are beginning to move...
Tinfoil Hat Anyone?
After spending the last 2 weeks looking through each of the sites for common factors:
Affected sites had the same...
- WMT Account
- Adsense pub-id, with at least 1 ad or Google custom search per site
- Domain registrar
- Domain owner info in Whois
- Funneled through same SMB Gmail account
Has anyone else looked at the tin-foil hat theory? Haven't I read that Google MIGHT use things like Whois profiles to better their serps? (Obviously, nobody knows for sure)
It would not at all be hard for G to mix a Whois or WMT flag into the algo.
- WMT Account - CHECK
- Adsense pub-id, with at least 1 ad or Google custom search per site = CHECK
- Domain registrar CHECK
- Domain owner info in Whois - PRIVATE
- Funneled through same SMB Gmail account - WE USE Y MAIL NOT G
Yet we lost over 90% on our biggest site and around 20% on some other ones.
Nobody else has mentioned this, but last night at about 9pm GMT I saw a shift in my longtail traffic. No, my longtail traffic is still down about 25%, but there was a big spike in my conversion rate, noticeable, like 25%, and that has mostly held through this morning. I think these SERPs are a moving target and lots of changes in the longtail SERPs are still underway; the Google engineers can't really be pleased with the longtail SERP quality. Maybe some more data got dumped in, or maybe somebody nudged a dial on trust/longtail. Better conversions on longtail mean better SERPs. I can't help but wonder about the data: they say Caffeine is up and it will be faster, but lots of us have seen big reductions in bot activity. How can that be consistent with better SERPs, especially on longtail? Maybe there are still some kinks in Caffeine and they can't really run the expanded dataset (the one with all the longtail stuff) yet. I don't believe the payday theory; ultimately they will get the longtail SERP quality issue fixed.
I have one KW that is I think tripping the algorithm and subsequently being filtered ~10 places. If I open up browser A, it will be in position 5 and then open up browser B, it'll be in 15th. This is what is making me think it is a keyword specific filter. It only applies to this one term.
Does that mean that different DCs are applying filters? This has happened for two weeks now.
I hope we are right, wingslevel, as I also believe that Google is still broken. Google has always been the place to get the most accurate results, and that's not the case now. I agree with you and can't imagine Google being satisfied with the search results. I'm sure they are really happy about the masses who are joining and activating their AdWords accounts (more revenue $$), but in the long run, if they don't get things under control, people will start flocking to other search engines, i.e. Bing and Yahoo. Google is smart and needs us to succeed, just like we need Google to be really successful online.
btw, today our Googlebot activity has gone down to zombie-like numbers once again; it's at about 5% of what it used to be prior to Caffeine.
|If I open up browser A, it will be in position 5 and then open up browser B, it'll be in 15th. |
Maybe one browser is logged into a Google account and the other is not. Also, each has its own Google cookies unless you take care to remove them.
|If I open up browser A, it will be in position 5 and then open up browser B, it'll be in 15th. |
Not logged in, with personalised search turned off. Happens at work, and at home.
This week had picked up a bit, but today we're seeing a drop in sales conversions again with "normal" traffic levels, as if we're getting more zombie traffic. We haven't seen a double-digit sales day in exactly one month.
Just 2 cents here: with 100,000s of small servers doing the job, I suspect that the "infrastructure update" at big G is never finished. I also doubt that they switch off the old machines quickly... In my opinion we are looking at 2 infrastructures right now and will be for a few months to come.
So there are 2 infrastructures / search architectures in place, and within them, different versions of the index are live. I suspect the SERPs we see vary strongly by factors even Google doesn't have 100% control over.
If they do load balancing with standard Cisco hardware, I believe the balancers scatter each and every request around...
"The Caffeine update is through" IMHO just means that the majority of requests land on the new infrastructure, but if that infrastructure does not respond fast enough for the balancers, we land on the older boxes...
I think I read that the data storage formats of Big Daddy and Caffeine are not compatible, that each contains so much data they cannot both be stored in the same DC, and that Google did not double the size or number of its data centers, so IMO it's not likely they still have both running...
My guess is the data is stored in the Caffeine format now, but there could be two different versions of the algo running against it, say 'Original Caffeine' and 'Updated Caffeine', for testing purposes.
I really doubt they are still storing data in the Big Daddy format, but anything is possible, I guess.
Just to lighten it up a bit... Maybe this is all part of the cloud transition, and they have hired former weathermen to handle the algo?
You know... The Weatherman only has to be right 50% of the time, and even then, they are always right!
On a serious note, this was the worst week I have seen since the transition on June 2-4. Month over month, and YoY, traffic from Google ONLY is down 90% or more!
Traffic from other engines has followed an almost flat ebb/flow of our historical data.
Just makes no sense.
Bowdeni, you are correct in what you are seeing. I call it the kiss of death for the keyword when I see it occurring: either it returns to normal, or the keyword gets lost in the shuffling and ends up with a much lower ranking. IMO Google is monitoring traffic patterns over extended periods of time to see if the page is deserving. Some of these things are much more sophisticated than they were a year ago. As I've always said, this type of test can produce erroneous conclusions because it assumes temporary increases in rankings can produce increased traffic or sales. The fact is, unstable results more often than not decrease traffic and sales.
Mad is right on that one, pontifex; the data formats of Big Daddy and Caffeine were incompatible, so a spillover wouldn't make much sense in that regard.
I have been thinking about co-occurrence currently being more important than the occurrence itself, and adding to that the Udi interview that described language understanding as being at a toddler level. From that point of view, I suppose I'm not surprised to see some of the crazy "~keyword -keyword" results. The semantic siloing is getting polluted too easily, making it difficult to really zero in on intent, hence the elimination of secondary searches.
Long term I think it'll get fixed, but long term in natural language understanding is 5-10 years away; short term, we're going to have to see that variable tweaked back a bit, or perhaps a greater bias placed on the theme of backlinks in order to take out some of the injected noise.
Mike McKnight - I've noticed quite a few hits on our sites, and SERP results for reasonably strong searches, that don't have the keywords in the title at all but have an exact-match phrase (and not even always exact; sometimes the words are split up in different places) just in the body text.
|language controls being at a toddler level |
No Joke: Out of curiosity, I tried 'gah' twice, followed by 'goo' twice (no quotes), and they got it right on... Didn't complete my search; didn't change my results to something I didn't search for, and gave me an exact match first, so I guess they were telling the truth... The algo realized I knew exactly what I was looking for because I was searching at the level it understands... Toddler! LMAO
pontifex, I surely have the same doubt you do.
They are probably observing which gains them more revenue, the old structure or the new...
ok, so here 2 more cents regarding solely the algo at the moment:
I don't know if the WebProNews interview with Matt from the 11th of June has already been discussed; the threads are becoming quite HUGE again.
Matt states in the video that the "Mayday Update" ( @dusky: ;-) ) was an algo update to better filter out "questionable content". At least, that is what I understood.
With that, the "soft 404" errors suddenly show up in the WMT. Always communicated with the mantra "it is good for the user".
I think they internally already know that the algo is not perfect, and they mix different result sets to find out which appears to be the best under whatever their criteria are.
On my eCommerce site, the SWF files with short previews of the products (which are crucial information for potential buyers) are marked as "soft 404s": another flaw with Google. These short files are CRITICAL to selling our products, and now Google sees them as a potential negative factor? Matt even states it is good to remove them in general.
The "thin content" or "questionable content" detection mechanisms are just another half-baked idea, like the "restaurants soho" error they fixed in the meantime. So the fluctuations in the SERPs might be a testbed, and we are the beta users... again.
"We are the biggest Kingmaker on the web" - yeah!
Like supporting a monarchistic culture is a good idea? I would have preferred: "We are the mirror for the democracy of the web" as a picture.
Bottom line: Google's results at the moment are a vehicle to
a) filter out content they consider "questionable or poor" automatically (where collateral damage happens and is accepted)
b) make kings of the web under their moral guidelines, which they do not take lightly (oh, I morally shot you in the head - sorry)
That Matt Cutts video about Mayday [webpronews.com] was touched on earlier - but not a lot.
One comment from Matt:
|If you are affected you might ask yourself "how do I make sure that other people and sites and users, everyone, knows that my content is high quality. |
Matt goes on to mention factors like "how long do people stay" and if the "content across your entire site" tends to be lower in quality - in other words some kind of aggregated metric across the site, and not just the page.
And yes, I also noticed the "soft 404 reporting" coincidence, with swf files showing up as well as other false positives for soft 404 that might also be false positives for the Mayday algorithm.
On one site I see a lot of soft 404 errors where the site 302-redirects to a log-in challenge page (served as a 200). A log-in challenge "should be" a 401 Unauthorized, I guess, but apparently this site's configuration has been picked up as a signal of many low-quality pages.
Seems like Mayday's idea of "site quality" might be generating a lot of false positives by confusing technical missteps with low quality, at least if the soft 404 report is involved in any way.
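For what it's worth, the log-in-wall case described above is fixable on the site side: serve the challenge page with a real 401 status instead of a 302 to a page that returns 200. A minimal WSGI sketch (the app and its trivial auth check are hypothetical, just to show the status-code difference):

```python
# Hypothetical sketch: return 401 for an auth wall instead of
# redirecting to a challenge page served as 200, which a soft-404
# detector can misread as many thin/duplicate pages.
def app(environ, start_response):
    # Stand-in auth check; a real site would validate a session.
    authed = environ.get("HTTP_AUTHORIZATION")
    if not authed:
        # Tell crawlers explicitly: this is an auth wall, not content.
        start_response("401 Unauthorized",
                       [("Content-Type", "text/html"),
                        ("WWW-Authenticate", 'Basic realm="members"')])
        return [b"<html><body>Please log in.</body></html>"]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<html><body>Members-only content.</body></html>"]
```

With this shape, every protected URL answers for itself with a 401 rather than funneling into one shared 200 page.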
Just a quick one before I watch some more football (after England's dismal performance yesterday!). Best week in months, and weekend sales are also the best in months (and it's only Saturday!). I haven't checked stats etc., but on sales alone, great news. It will be interesting to see whether things remain settled next week or whether we end up back at square one. Anyone else seen an improvement this week?
We have been redirecting our homepage for technical reasons for quite some time to something like
and we saw a drop recently for some keywords. I'm not sure if that would fall in the "soft 404" type category or not, but we are scratching our heads a little.
The technically correct status for redirecting the domain root is a 302, not a 301. If you're using a 302, you're OK on that score.
From what I've seen so far, a soft 404 warning needs to be a redirect from MANY urls to the same final page. Just one to one is not a problem.
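The many-URLs-to-one-page pattern described above is easy to check for in your own crawl or log data before Google flags it. A minimal sketch; the threshold is a made-up illustration, not a number Google has published:

```python
from collections import Counter

def likely_soft_404_targets(redirects, threshold=10):
    """Flag final URLs that many source URLs collapse into.

    `redirects` is an iterable of (source_url, final_url) pairs
    collected from your own logs or a crawl of your site; any final
    URL receiving `threshold` or more sources matches the pattern
    that seems to trigger the soft-404 report.
    """
    counts = Counter(final for _source, final in redirects)
    return {url for url, n in counts.items() if n >= threshold}
```

For example, a dozen retired product URLs all redirecting to `/login` would be flagged, while a one-to-one redirect would not.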
|Matt goes on to mention factors like "how long do people stay" |
What is the implication by Matt Cutts here?
Is it that google is now able / willing to measure visitor time on page and bounce rate on our sites?
If so, how are they doing it? Through google analytics data?
What about sites that DON'T have GA installed?
And couldn't webmasters hire a couple of kids / bots to browse their sites for long periods of time, thus increasing time on page and pages visited and reducing the bounce rate?
Or maybe they are able to measure the rate at which a user clicks on a link in the google SERPs and then clicks the back button to return to the same google SERP page?
If that were the case though, I would click on the link to my competitor's pages from the google SERP and then immediately click back to the SERP to lower their value...
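Nobody outside Google knows how, or whether, they measure dwell time, but the metrics being speculated about above are simple to define. A sketch of the classic bounce-rate calculation as an analytics package would compute it (the data shape is illustrative, and nothing here is confirmed as a ranking input):

```python
def bounce_rate(sessions):
    """Fraction of sessions that viewed exactly one page.

    `sessions` maps a session id to the list of pages viewed in that
    session. This mirrors what a typical analytics tool reports; it is
    NOT a confirmed Google ranking signal.
    """
    if not sessions:
        return 0.0
    bounces = sum(1 for views in sessions.values() if len(views) == 1)
    return bounces / len(sessions)
```

A session that lands on one page and leaves counts as a bounce; a two-page visit does not, regardless of how long either lasted, which is one reason bounce rate alone is a noisy quality proxy.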