Forum Moderators: Robert Charlton & goodroi
We need to keep this thread focused on the following:
- Changes in your own site's ranking in the SERPs (lost or gained positions, or disappearance of the site).
- Changes you have noticed in the new SERPs (both google.com and your local Google site), especially with regard to the nature of the top 10 or 20 ranking sites.
- Stability of the SERPs, i.e., whether you get the same SERPs when you run the same query within the same day or over 2-3 successive days (both google.com and your local Google site).
- Effective ethical measures to deal with the above-mentioned changes.
Thanks.
One of my sites found its way to page 1 on Yahoo for a 100,000-result "five word pop culture phrase", bringing significant traffic to my site.
This same "five word phrase in quotes" does not even show up in the first 200 results on Google.
In several respects, Google is broken, and I will let the world know, and start looking for new alternatives in addition to Yahoo (& MSN).
>Sorry to 'break the rhythm' of this thread, but I am dealing with the consequences of Bourbon by deciding to tweak my text for Yahoo.<
Good point!
Maybe we can add a new point to our list of how to deal with the consequences of Bourbon:
- Do nothing, as more changes are probably on the way
- Make subtle page changes and monitor SERP changes
- Do a 301 redirect to resolve yoursite.com vs. www.yoursite.com (the canonical URL problem)
- Remove 302 redirects
- Remove duplicates
- Optimize your site for other search engines (like Yahoo, MSN, etc.)
- Transfer your affected site to a spare/emergency domain
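For the canonical 301 item in the list above, here is a minimal sketch of one common way to do it on Apache (this assumes mod_rewrite is enabled and that www is the version you want to keep; yoursite.com is a placeholder):

```apache
# Permanently (301) redirect all non-www requests to the www host
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yoursite\.com$ [NC]
RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=301,L]
```

The R=301 flag is what matters here: a 302 (the default) would leave the canonical question unresolved.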
Any other suggestions?
Re: 64.233.183.99 & 64.233.183.104
These are the DCs that GG said to watch for new SERPs; however, IMO they look like they have followed the other DCs this morning rather than leading the way.
Have not heard from GG in a while in the forums :( - hope he is ok and not put off by all the negativity.
>Have not heard from GG in a while in the forums :( - hope he is ok and not put off by all the negativity.<
GG is used to the "negativity", and we are used to coding/decoding GG's lines ;-)
I guess the majority on these forums highly appreciate GG's feedback.
Btw, word on the street is that GG will be at the PubConference. Look for a "Larry" there ;-)
Yep.
>>>Look for a "Larry" there
Larry? Lol - don't think so.
I had fully functional shops (running off the same product database as other sites) and will most likely be running them off a new URL with Google excluded. Or do people here recommend putting them in a subdomain with Google excluded? Maybe I should just kill it altogether :(
I think many of us have questions for GoogleGuy (it's a shame so many people have posted naff questions on the thread).
Once that issue is sorted out hopefully traffic will return for some of us. :) Some of us need a really good crawl though.
If so - well done Google (esp GG for taking feedback to where it matters)
We also need people to stop using 302s.
Then we need people to stop copying and pasting content.
Then we need a good crawl and things to be sorted out.
Then, maybe then, I might make the money I did 3 months ago.
Right now, it looks like I've lost 20,000 a year and my retirement fund.
Absolutely fantastic.
IMO it looks like Google are sorting out Canonical URLs. I would suggest that [64.233.161.105...] is one of the more advanced dcs for sorting out this issue.
I think not! [64.233.161.105...] is terrible for me. Even though it's very US .com focused, it still has my site penalised.
I renamed several pages that were being hit hard by scraper sites and had gone URL-only. This was in an attempt to break them loose from the scrapers.
I didn't rewrite the content of the pages, so that is likely why Google thinks the scraper sites' content is 'original': their pages were created earlier.
So now I'm adding/rewriting content on the pages - even though all main pages have been fully indexed within last 10 days.
I also noticed something on one page in particular... This page only had 3 paragraphs of text (~250 words), several bullet lists, and several screenshot images.
A scraper site had copied 1.5 paragraphs and used it on 3 of their pages. This is the only 'main page' that is not indexed and it is obvious that G thinks the scraper site has the original content.
So I've had to rewrite the content and add some. The moral of this story is that long pages (>700 words) likely won't be affected by a scraper taking a large paragraph, but smaller pages will. I guess they have a percentage-duplicate test.
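The percentage-duplicate guess above can be sketched with a simple word-shingle overlap. This is a hypothetical illustration, not Google's actual method, and the 5-word shingle size is an arbitrary choice:

```python
# Hypothetical sketch of a "% duplicate" test: what fraction of a page's
# 5-word shingles also appear in a copied paragraph? (Not Google's method.)

def shingles(text, k=5):
    """Return the set of k-word shingles in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def duplicate_pct(page_text, copied_text):
    """Fraction of the page's shingles that also occur in the copied text."""
    page = shingles(page_text)
    if not page:
        return 0.0
    return len(page & shingles(copied_text)) / len(page)
```

With the same 50-word paragraph copied, a 100-word page comes out roughly 48% duplicate while a 350-word page comes out around 13%, which would fit the pattern of only shorter pages getting filtered.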
But did the scraper site also copy your page's Title or Description and make it part of the content? This is very common for the scraper pages. If so, Google can figure out eventually that your page is the original. Unless the scraper copies your Title and Description and makes an identical page, your page should win out in the long run if a "sort" is being done.
That's why I believe Google is and has been doing a long winded bubble sort of web pages, actually checking uniqueness of Titles and Descriptions. It's going to take a long time, they will keep kicking sites out of the index and then crawl them back in when they think they have a true original Title and Description. The scraper sites help by providing a link (or at least what looks like a link) to the source of the original Title and Description.
I know this idea is far-fetched, but more and more the symptoms and evidence bear it out. My primary site was down to 20% indexed pages; now it's back up to 60%. Before the update it was 100% indexed for 2 years.
Given all this I realized there is one important thing to do!
1. Make sure your site doesn't look in any way like a scraper. When thinking about this, I realized I had a lot of index pages linking to my articles. There was a paragraph describing each article, and what did I put at the end?
Read more .... LINK
It looks like a copied snippet, and I said, that's not a good idea! Even though my content is all original, in some cases my index pages looked a little like scraper pages. So needless to say, a lot of "....." strings are gone from my pages.
2. I'd also think at this time a sitemap.txt or sitemap.xml would help Google sort out what is truly the correct URL to use to fetch your pages. You're telling Google "www.google.com", not "google.com", when you ultimately submit the link to your sitemap.txt or .xml. This can help Google get rid of those redundant copies of your own pages (although I don't think they've started yet..... (oops, scraper-like)).
SEE Google:
[google.com...]
then sign up if you haven't and submit a sitemap.
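To make the canonical-host point concrete, a minimal sitemap file might look like this (yoursite.com is a placeholder; the namespace shown is the one from the sitemaps.org protocol, and the key point is simply that every <loc> uses your preferred www host):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap sketch: every URL uses the preferred www host -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yoursite.com/</loc>
  </url>
  <url>
    <loc>http://www.yoursite.com/articles/example.html</loc>
  </url>
</urlset>
```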
To sum up, and add to the thread's to-do list:
1. Be patient, Google is sorting out originals by Title and Description?
2. Don't look like a scraper
3. Help Google out with a sitemap.
Just to follow up: if my crazy theory is correct, here are some don't-do's:
1. Don't change your page's file name or path.
2. Don't change your titles (for now)
3. Don't change your descriptions (for now).
4. Only change your content on existing pages a little.
These are the only things, at least today, that make your site unique and are easy to sort.
Count me in the very happy camp too. I had no dupe or canonical issues, though; just my domain was "sandboxed". Now I'm doing very well on all DCs.
My suggestion is to wait till GoogleGuy announces that the fat lady has sung. Anything you do while in panic mode will come back to bite you.
if that taft tool is accurate...thank google for filters. my site goes from top of the serps to oblivion with the filters off.
OldPro, I don't think it is. I get the exact same results with OR WITHOUT the filter! Figure that one out.
Clint - it could be my 301 kicking in or it could be Google sorting things out.
I tried it with the www and got over 200 results, with again only one page of 5 hits showing this time.
Check out some of the results G shows for the NON www search:
The first two hits are on my domain, two of my pages.
The 3rd hit looked like a 302, but it's not...
[TheWazzooDomain...]
The 4th hit is a payment link from some long ago defunct payment processing service! It's not even on my domain!
The next hits are as follows:
MyDomain.com=20/
Similar pages
MyDomain.com%5B7/
Similar pages
MyDomain.com%5B1/
Similar pages
www.MyDomain.com%09/
Similar pages
ALL of these 4 links above are BAD LINKS, clicking them gives you "the page cannot be displayed". How the heck did those get in there and what does that mean?
Then the 9th and final hit is a 302 (hijack?)
"HTTP/1.1 302 Found". URL is like:
[SomeOtherSubdomain.OtherMainDomain.com...] . Note the ?u= and the %2F in that URL, which DO NOT exist in the real URL. Also, that full link works, it goes to the correct webpage on my site.
Doing the command with the www, shows the exact same results for 1st, 2nd and 3rd spots; 4th is another one of the www.MyDomain.com%09/ and 5th is the same as the last result (9th) above.
Does this tell anything of use to anyone?
Thanks.
I question this, as AdSense earnings on my site follow the same pattern routinely: 1st 10 days - not so good; 2nd 10 days - usually the best of the month; last 10 days - back to not so good.
So, this could be just the pattern repeating. Also, I've changed some of the pages (actually redid nine main pages and changed their names, titles, KWs), began using dynamic content instead of straight HTML for news articles.
Next steps are publicity and marketing the site via anything but Google. My site is still way down in the SERPs for many queries. Unrelated sites appear well before me and the 1st page of results are generally less-than-fully-relevant, though I am seeing fewer scrapers.
My tactic is now to ignore Google almost completely. If traffic begins to come from there, fine. If not, fine also because I am using other traffic-generating methods.
>Clint - it could be my 301 kicking in or it could be Google sorting things out.<
Well that's good for you. They haven't sorted out JACK for me. :( :( :( I still have been erased from existence. I'll give them till next weekend, and if I don't have my life back then, I WILL raise a stink and ruckus the likes of which G and the national media have never seen, and I will NOT be alone. It will be time for someone to put a collective representative face on the countless thousands of victims of this atrocity. I will have no other choice, and nothing else to lose. Hopefully it will never come to that.
>OldPro, I don't think it is. I get the exact same results with OR WITHOUT the filter! Figure that one out.<
i get totally different results with the filters off for my keyphrase. right now i am analysing why this is so... just in case the tool is accurate. so far, i have identified 3 or 4 possibilities, but don't want to post them until i am somewhat sure my assumptions are correct.
my sector is unique in that it was one of the first "commercial" sectors on the WWW. Lots of old sites, including mine. the one possibility i am confident of is that there seems to be some sort of "grandfathering/legacy" type filter.
On another note, the aftermath of Bourbon seems to have strong undertones of the Hilltop algo. This could explain the fallback on DMOZ, the Yahoo directory and the late-1990s AltaVista crawl of "expert sites". It could also explain why sites with crosslinking schemes and canonical issues were hit the hardest... and the fact that the PageRank algo seems to have little influence on the SERPs.
If they are giving hilltop more influence in their overall algo, this could explain the new google evaluation lab project...to develop yet another set or sub-set of "expert sites".
I have no idea if the tool is even halfway accurate, but the results of using it were astoundingly different for my sector. Assuming it is accurate, then it could be that different filters are being applied to different keyword clusters since you saw no change.
We are all shooting in the dark at a moving target here in trying to come up with how to deal with Bourbon. The few things we have identified as possible negatives are... www and non-www situations, 302s, crosslinking schemes and scraper-generated duplicate content. At least with those things we can adjust. On-page factors are of little importance at this point.