The BigDaddy update has affected many sites, as you might have noticed in many threads.
It isn't enough, IMO, to keep talking about dropped sites, lost pages and lost rankings. We need to discuss measures and steps to bring dropped pages and sites back to the paradise of Google, so that our sites rank well on the SERPs again.
I guess not all publishers of dropped pages and sites can afford the "let's wait and see" approach. Let's talk solutions and tips. Most importantly, ethical solutions and tips within the limits of the Google Webmaster Guidelines.
For example, I'm now sure through practical testing that BigDaddy is a duplicate content killer, and you risk having most "suspected" pages deindexed. Is there anything we can do to deal with such a problem?
So tell us: what have you done, or what do you suggest doing, to deal with the consequences of BigDaddy? For example:
- dropped pages
- dropped sites
- penalized sites
- lost rankings
- possible canonical issues
- filing reinclusion requests
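On the duplicate content point: nobody outside Google knows how their filter actually works, but you can get a rough feel for whether two of your own pages might look like duplicates by comparing their visible text yourself. A minimal, purely illustrative sketch using word shingles and Jaccard similarity:

```python
def shingles(text, k=5):
    """Break text into overlapping k-word shingles."""
    words = text.lower().split()
    if len(words) < k:
        return {tuple(words)} if words else set()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b):
    """Jaccard similarity of two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Pages scoring close to 1.0 against each other are near-duplicates;
# identical text scores exactly 1.0.
```

Run it over your own page texts pairwise; high-scoring pairs are the ones worth rewriting or consolidating first.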
I also had a lot of new content placed on the affected site, but with no success. I removed everything that might be considered duplicate content by an algorithm, but so far my former main site stays down. All pages are in the index, though; they just rank very low. Still hoping for success after another update later.
Every time I start musing, "Hmmmm... wonder if it's time to start devising a plan of action to recover lost ground," you post, as you just did.
I can date significant loss of Google referrals since April 11th.
I see two causes that may or may not be related:
1.) Big Daddy flux overall
2.) Google's robots.txt issue, where for no apparent reason robots.txt is reported as blocking Googlebot from indexing sites, sitemaps and, in my case, individual pages, one of which has been my highest-trafficked and highest-earning page for quite some time. And that's even though I have not yet submitted a sitemap.
Although Vanessa Fox posted on "Inside Google Sitemaps" that the robots.txt issue had been resolved, their diagnostic tool still shows those three URLs as blocked.
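If you suspect a robots.txt problem, you don't have to rely solely on Google's diagnostic tool: Python's standard library can parse a robots.txt and tell you what Googlebot would be allowed to fetch under the standard rules. A minimal sketch (the robots.txt content and example.com URLs are placeholders, not anyone's real site):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt; in practice, paste in your own site's copy.
robots_txt = """\
User-agent: Googlebot
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

allowed = rp.can_fetch("Googlebot", "http://www.example.com/index.html")
blocked = rp.can_fetch("Googlebot", "http://www.example.com/private/page.html")
print(allowed, blocked)  # prints: True False
```

If this says your pages are allowed but the diagnostic tool says they're blocked, the problem is likely on Google's side, as in the case above.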
From my experience I can only give the following ideas:
1) Home Page canonical problems.
Change the name of your home page (to another default supported by your host) and then change all internal links.
So, if your home page was "index.htm", change the name to "index.html" and change ALL internal links to point to your domain, e.g. "www.mysite.com". Create a new "index.htm" with a single link to "www.mysite.com" saying the home page has moved. Do not 301 or 302 redirect; Google is not dealing with redirects well at the moment. Use Matt Cutts' blog as an example.
In my experience over the past few weeks, this causes Google to recrawl fresh, as it cannot use a cached copy of "index.htm" now that the page no longer exists. You can see the result by putting AdSense on the page to check whether it is picked up, now that Googlebot and the Mediabot use the same cache.
2) Dropped Pages
Look at your web page template. If you can, reduce the amount of whitespace, HTML comments and any other "white noise" in your page templates, thereby increasing the content density relative to the page size (sounds mental, I know, but it has worked, especially on blogs). The same goes for head tags: get rid of anything but title, keywords and description.
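One way to put a number on "content density" is to measure what fraction of the raw page size is actual visible text. A rough sketch using only Python's standard library; what ratio you should aim for is anyone's guess:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self.skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if not self.skip_depth:
            self.chunks.append(data)

def content_ratio(html):
    """Fraction of the raw HTML that is visible text (0.0 to 1.0)."""
    if not html:
        return 0.0
    parser = TextExtractor()
    parser.feed(html)
    return len("".join(parser.chunks).strip()) / len(html)
```

Run it on a page before and after trimming the template; if the ratio barely moves, the trimming probably wasn't worth much.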
And more, just need to take a break!
3) Do a site:www.mysite.com search.
If your home page is not at the top, there could be a problem. (I know some sites do OK regardless, but it is a sign of impending doom for many of the smaller guys.)
This could be a canonical home page problem, or simply that you have many more links to internal pages than to the home page.
If the search returns far fewer pages than your site actually has, there may also be a problem; follow steps 1 and 2 and check for recrawling. After a week or two I would contact Google or enroll in Sitemaps (which, I should say, may shortly be the only option).
4) Addition to canonical problem 1.
Make sure that any other domain names redirect to your main site with a 301, and that includes the non-www version of your site. If your domains are handled by a registrar, that can be quite difficult, so a good alternative is to put up a holding page for each domain (including the domain minus the www, e.g. "mysite.com") with a link to "www.mysite.com".
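To verify what your bare domain actually returns, you can make a single request without following redirects and inspect the status code and Location header. A hypothetical sketch ("mysite.com" is a placeholder; the network call is commented out so the checking logic stands on its own):

```python
import http.client
from urllib.parse import urlparse

def redirect_info(url):
    """Make one HEAD request (no redirect following); return (status, location)."""
    parts = urlparse(url)
    conn = http.client.HTTPConnection(parts.netloc, timeout=10)
    try:
        conn.request("HEAD", parts.path or "/")
        resp = conn.getresponse()
        return resp.status, resp.getheader("Location")
    finally:
        conn.close()

def is_canonical_301(status, location, canonical_host="www.mysite.com"):
    """True only for a permanent (301) redirect pointing at the canonical www host."""
    return status == 301 and bool(location) and canonical_host in location

# Example (needs network access):
# status, location = redirect_info("http://mysite.com/")
# print(is_canonical_301(status, location))
```

A 302, or a 200 serving the same content on both hostnames, is exactly the kind of thing the canonical advice above is trying to avoid.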
If you suspect penalisation (point 3), then getting into Sitemaps may be a good idea, as apparently Google is feeding back penalties through this programme.
5) I cannot stress the duplicate content filter enough. Check that, after optimising the HTML as I mentioned, there is enough content on the page; maybe out of a 20k page, 5-10k should be textual content (only a guess), but upping this will only help.
6) With all the above in mind, file the reinclusion request. I think the URL can be found in the webmaster section of Google's site; I know I can't post URLs here.
7) Depending on what you find (i.e. point 3 may look bad if your home page is not in the first few pages of results), get a few more links to your domain from relevant websites. If Google has a problem with your site, MSN and Yahoo probably don't!
Google is having real problems with redirects so if you change something like this it could cause issues.
The rule is always to point the non-www to the www version with a 301. However, in the current climate I am in favour of using a holding page with a link on it, just like Matt Cutts' blog.
Just my experience at present.
Many thanks for your generous contributions and great ideas and tips for dealing with the consequences of the BigDaddy update.
In my first post, I mentioned that my testing has shown BigDaddy to be a duplicate content killer.
In that connection, I saw our generous host, Brett Tabke, posted something relevant:
"Back from fresh recon mission to the 'plex. Deepsearch says - not an update at current, but more of the playing we've done over the last 3 weeks. He/she also says, dupe content filters may be being fine tuned on some sites........."
So we really need to pay more attention to the duplicates issue and find solutions and tips to deal with it.
As I mentioned in my first post, you might need to file a reinclusion request as one way to deal with the consequences of BigDaddy.
If your site has been subjected to a spam penalty, filing a reinclusion request wouldn't hurt ;-)
Matt Cutts has been kind to post today on his blog a comment including some relevant info to this thread:
Matt Cutts Said,
May 5, 2006 @ 9:03 am
...., last week when I checked there was a double-digit number of reports to the email address that GoogleGuy gave (bostonpubcon2006 [at] gmail.com with the subject line of “crawlpages”).
There will be cases where Bigdaddy has different crawl priorities, so that could partly account for things. But I was in a meeting on Wednesday with crawl/index folks, and I mentioned people giving us feedback about this. I pointed them to a file with domains that people had mentioned, and pointed them to the gmail account so that they could read the feedback in more detail.
So my (shorter) answer would be that if you’re in a potentially spammy area, you might consider doing a reinclusion request–that won’t hurt. In the mean time, I am asking someone to go through all the emails and check domains out. That person might be able to reply to all emails or just a sampling, but they are doing some replies, not only reading the feedback.
And here is a great post including tips in connection with filing a reinclusion request:
Filing a reinclusion request
Thanks, Matt. Much appreciated.
Things have changed since my last above post on this thread.
Our kind fellow member lgn1 posted some wise words today [webmasterworld.com]:
"At least things are stable enough now to do SEO optimization for the bad data centers."
And Google's latest generic feedback [mattcutts.com] supports lgn1's statement.
There is no need any more to sit back and watch your grass grow. It's time for action. It's time to save your websites.
Whether your pages have been dropped, whether your whole site has been dropped, whether you have lost rankings, or whether your site keeps flipping in and out of Google's index.
Time for new thoughts and tips on how to deal with The Consequences of BigDaddy.
Let's contribute to this thread and help each other. Many of our affected fellow members need all the help we can give.
And as it was once said: Give of yourself and you shall be rewarded 10-fold.
God bless our WebmasterWorld community.
Google says sitemaps are for pages that can't be found by other means.
A sitemap will NOT get your site/pages indexed quicker (if at all).
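For what it's worth, the Sitemap file itself is trivial to produce; whether submitting one is worthwhile is the separate question being debated here. A minimal sketch of the XML format (the example.com URLs are placeholders):

```python
from xml.sax.saxutils import escape

def make_sitemap(urls):
    """Build a minimal Sitemap 0.9 XML document from a list of page URLs."""
    entries = "\n".join(
        "  <url><loc>%s</loc></url>" % escape(url) for url in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries +
        "\n</urlset>"
    )

print(make_sitemap(["http://www.example.com/", "http://www.example.com/about.html"]))
```

The protocol also allows optional lastmod, changefreq and priority elements per URL, omitted here for brevity.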
What counts now are *quality* inbound links, and *quality* content. I had one of my sites fully indexed within 2 weeks after getting a link from an on-topic .edu page. Other sites I have don't even appear in the SERPs after months, despite a bunch of inbound links.
For my part, right now my 500 unique visitors a day see the Yahoo search box on my site instead of the Google box. Not only does Yahoo have 25X more of my pages indexed, so the site search actually works, but maybe having users get used to seeing Yahoo again will help them gain a bit more share.
Just doing what I can to help.
Reseller, I doubt the SERPs are stable. Just a few days ago, my site, which took a hit on 22 September and a BigDaddy kick after that, popped up for two days with the same number of visitors as before 22 September (a peak of 300%).
But you are right about not waiting. Maybe the up-and-down movements in the SERPs will never stop. Adding quality to sites will never hurt, updates or not.