
Google SEO News and Discussion Forum

    
Dealing With The Consequences of BigDaddy
Solutions and Tips!
reseller




msg:767247
 6:10 am on May 3, 2006 (gmt 0)

Hi Folks

The BigDaddy update has affected many sites, as you may have noticed in many threads here.

It isn't enough, IMO, to keep talking about dropped sites, lost pages and lost rankings. We need to discuss concrete steps for bringing dropped pages and sites back into Google's good graces so that they rank well on the SERPs again.

I guess not all publishers of dropped pages and sites can afford the "let's wait and see" approach. Let's talk solutions and tips. Most importantly, ethical solutions and tips within the limits of the Google Webmaster Guidelines.

For example, I'm now sure through practical testing that BigDaddy is a duplicate content killer, and you risk having most "suspected" pages deindexed. Is there anything we can do to deal with such a problem?
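One practical starting point is to measure how similar your own pages are to each other before Google does. Below is a minimal sketch using word shingles and Jaccard similarity; the file names and the 0.5 threshold are hypothetical illustrations, and nobody outside Google knows what the real filter actually measures.

```python
import re
from itertools import combinations

def shingles(text, size=5):
    """Split text into overlapping word n-grams ("shingles")."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets."""
    return len(a & b) / len(a | b) if (a or b) else 0.0

# Hypothetical page files; in practice you would fetch pages and strip the HTML first.
pages = {}
for name in ["page1.txt", "page2.txt", "page3.txt"]:
    with open(name, encoding="utf-8") as f:
        pages[name] = shingles(f.read())

for (n1, s1), (n2, s2) in combinations(pages.items(), 2):
    sim = jaccard(s1, s2)
    if sim > 0.5:  # arbitrary threshold, not Google's
        print(f"{n1} / {n2}: {sim:.0%} shingle overlap - possible duplicates")
```

Pages that share most of their shingles (boilerplate-heavy templates, syndicated text, printer-friendly copies) are the obvious candidates to rewrite or consolidate.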

So tell us: what have you done, or what would you suggest doing, to deal with the consequences of BigDaddy? For example:

- dropped pages

- dropped sites

- penalized sites

- lost rankings

- possible canonical issues

- filing reinclusion requests

Thanks!

 

Tinus




msg:767248
 9:24 pm on May 3, 2006 (gmt 0)

Sitting and waiting is never a good solution to any problem. But making changes while Google is changing is like shooting in the dark: you don't know beforehand what you will hit.
My main site went down completely on September 22nd last year. So while waiting I created new content for two other sites I already had. I also have a freelance author writing quality content now. Those sites provide most of my traffic at the moment. What I learned from all this is:
- branding is everything for an online business
- success in Google search should be considered a bonus, not the basis for a business
- advertisements are a way to stay in business and become less dependent on search engines

I also placed a lot of new content on the affected site, but with no success. I removed everything that an algorithm might consider duplicate content, but so far my former main site stays down. All its pages are in the index, though; they just rank very low. Still hoping for success after another update.

montefin




msg:767249
 12:59 am on May 4, 2006 (gmt 0)

Compliments on your timing yet again, reseller.

Every time I start musing, "Hmmmm... wonder if it's time to start devising a plan of action to recover lost ground," you post, just as you did here.

I can date a significant loss of Google referrals to April 11th.

I see two causes that may or may not be related:

1.) Big Daddy flux overall

2.) Google's robots.txt issue, where for no apparent reason robots.txt is read as blocking Googlebot from indexing sites, sitemaps and, in my case, individual pages -- one of which has been my highest-trafficked and highest-earning page for quite some time -- even though I have not yet submitted a sitemap.

Although Vanessa Fox posted on "Inside Google Sitemaps" that the robots.txt issue had been resolved, the diagnostic tool still shows those three URLs as blocked.
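For anyone who wants to double-check what a robots.txt file actually permits, Python's standard library parses it much the way well-behaved crawlers do. A minimal sketch with a hypothetical domain and page URLs; it shows what the file says, not how Googlebot happens to be (mis)reading it:

```python
from urllib import robotparser

# Hypothetical site and pages; substitute your own.
rp = robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()  # fetches and parses the live robots.txt

for url in ["http://www.example.com/",
            "http://www.example.com/sitemap.xml",
            "http://www.example.com/top-earning-page.html"]:
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "BLOCKED"
    print(verdict, url)
```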

Swanson




msg:767250
 1:16 am on May 4, 2006 (gmt 0)

Excellent idea, good to see a positive plan.

From my experience I can only give the following ideas:

1) Home Page canonical problems.

Change the file name of your home page (to another default file name your host supports) and then update all internal links:

So if your home page was "index.htm", rename it to "index.html" and change ALL internal links to point to your domain, e.g. "www.mysite.com". Create a new "index.htm" containing a single link to "www.mysite.com" saying the home page has moved. Do not 301 or 302 redirect - Google is not dealing with redirects well at the moment. Look at Matt Cutts' blog as an example.

In my experience over the past few weeks, this makes Google recrawl fresh, since it has no cache for "index.htm" once that page no longer exists. You can see the result by putting AdSense on the page and checking whether it gets picked up, now that Googlebot and the Mediabot use the same cache.
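For what it's worth, the holding page described above can be a bare link, deliberately not a redirect. A minimal sketch that writes such a page; the domain and wording are hypothetical placeholders:

```python
# Writes a bare-bones "we've moved" holding page as index.htm.
# The domain and text are hypothetical; adapt to your own site.
HOLDING_PAGE = """<html>
<head><title>Our home page has moved</title></head>
<body>
  <p>Our home page has moved:
     <a href="http://www.mysite.com/">www.mysite.com</a></p>
</body>
</html>
"""

with open("index.htm", "w", encoding="utf-8") as f:
    f.write(HOLDING_PAGE)
```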

2) Dropped Pages

Look at your web page template. Reduce the amount of whitespace, HTML comments and any other "white noise" in your page templates, thereby increasing the content density relative to the page size (sounds mental, I know, but it has worked, especially on blogs). The same goes for head tags: get rid of anything but title, keywords and description. A sketch of this kind of clean-up follows below.
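Here is a minimal sketch of that clean-up: strip HTML comments and collapse whitespace runs with a couple of regular expressions. The file name is hypothetical, and a real template may need gentler handling (a <pre> block, for instance, should keep its whitespace):

```python
import re

def strip_white_noise(html):
    """Remove HTML comments and collapse runs of whitespace."""
    html = re.sub(r"<!--.*?-->", "", html, flags=re.DOTALL)  # HTML comments
    html = re.sub(r">\s+<", "> <", html)    # whitespace between tags
    html = re.sub(r"[ \t]{2,}", " ", html)  # runs of spaces and tabs
    return html

with open("template.html", encoding="utf-8") as f:
    original = f.read()

cleaned = strip_white_noise(original)
print(f"before: {len(original)} bytes, after: {len(cleaned)} bytes")
```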

More to come - I just need to take a break!

Swanson




msg:767251
 1:40 am on May 4, 2006 (gmt 0)

3) Dropped or penalised sites:

Do a site:www.mysite.com search.

If your home page is not the top result, there could be a problem (I know some sites do OK anyway, but for many of the smaller guys it is a sign of impending doom).

This could be a canonical home page problem, or simply that you have many more links to internal pages than to the home page.

If the search returns far fewer pages than your site actually has, there may be a problem - follow steps 1 and 2 and check for recrawling. After a week or two I would contact Google or enroll in Sitemaps (which, I should say, may soon be the only option).

4) An addition to canonical problem 1.

Make sure that any other domain names redirect to your main site with a 301, and that includes the non-www version of your site. If you are using a domain registrar's forwarding, that can be quite difficult, so a good alternative is to put up a holding page on each domain (including the domain minus the www, e.g. "mysite.com") with a link to "www.mysite.com". A sketch for checking the redirect follows below.
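For anyone who wants to verify what their non-www host actually returns, here is a minimal sketch using only Python's standard library. "mysite.com" is a hypothetical placeholder; ideally you see status 301 with a Location header pointing at the www version:

```python
import http.client

# Hypothetical domain; substitute your own.
conn = http.client.HTTPConnection("mysite.com")
conn.request("HEAD", "/")
resp = conn.getresponse()

print("status  :", resp.status)                 # want 301, not 302
print("location:", resp.getheader("Location"))  # want http://www.mysite.com/
conn.close()
```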

If you suspect penalisation (point 3), then getting into Sitemaps may be a good idea, as Google is apparently feeding back penalty information through that programme.

5) I cannot stress the duplicate content filter enough. After optimising the HTML as I mentioned, check that there is enough content on the page - maybe 5-10k of a 20k page should be textual content (only a guess), but upping this will only help. A sketch for measuring that ratio follows below.
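To put a number on that text-to-page ratio, the standard library's HTML parser can pull out just the visible text. A minimal sketch with a hypothetical file name; the 5-10k out of 20k figure above is a guess, not a known threshold:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.skip = False
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip = True

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.skip = False

    def handle_data(self, data):
        if not self.skip:
            self.chunks.append(data)

with open("page.html", encoding="utf-8") as f:
    html = f.read()

parser = TextExtractor()
parser.feed(html)
text = "".join(parser.chunks).strip()
print(f"{len(text)} bytes of text in a {len(html)} byte page "
      f"({len(text) / len(html):.0%} content density)")
```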

6) With all the above in mind, file the reinclusion request - I think the URL can be found in the webmaster section of Google's site; I know I can't post URLs here.

7) Depending on what you find (e.g. point 3 looks bad if your home page is not in the first few pages of results), get a few more links to your domain from relevant websites - if Google has a problem with your site, MSN and Yahoo probably don't!

cbartow




msg:767252
 2:49 am on May 4, 2006 (gmt 0)

Interesting ideas... now I'm pondering pointing everything to domain.com instead of www.domain.com to see if it will fix the problem.

Now it's a decision: do I wait it out longer, or actually try this?

Swanson




msg:767253
 3:03 am on May 4, 2006 (gmt 0)

Re-pointing things at this stage could be a problem.

Google is having real problems with redirects, so changing something like this now could cause issues.

The rule is always to point the non-www version to the www version with a 301. However, in the current climate I favour using a holding page with a link on it - just like Matt Cutts' blog.

Just my experience at present.

cbartow




msg:767254
 3:06 am on May 4, 2006 (gmt 0)

The other problem is that all the link building I did would go down the tubes, since those links point to the www version.

At least the supplemental results have been gone today, at least for my sites, so maybe this is a good sign.

reseller




msg:767255
 7:58 am on May 4, 2006 (gmt 0)

Hi Folks

Many thanks for your generous contributions and great ideas and tips for dealing with the consequences of the BigDaddy update.

In my first post, I mentioned that my testing has shown BigDaddy to be a duplicate content killer.

In that connection, I saw that our generous host, Brett Tabke, posted something relevant:

"Back from fresh recon mission to the 'plex. Deepsearch says - not an update at current, but more of the playing we've done over the last 3 weeks. He/she also says, dupe content filters may be being fine tuned on some sites........."

msg #:72
[webmasterworld.com...]

So we really need to pay more attention to the duplicate content issue and find solutions and tips for dealing with it.

Thoughts?

montefin




msg:767256
 8:52 am on May 4, 2006 (gmt 0)

Ahhh... just checked, and Googlebot is no longer misinterpreting robots.txt. Hopefully, traffic to those pages will return after a while.

At least now I can focus on whether, and how, BigDaddy may be affecting Google referrals and AdSense impressions, clicks and earnings here.

tigger




msg:767257
 9:11 am on May 4, 2006 (gmt 0)

One thing I'm a little confused about: how can we work out the possible consequences of BigDaddy when Google has some, let's call them, technical problems right now?

All I'm doing is getting on with adding content and hoping that once Google sorts itself out, this will all just be a bad memory.

reseller




msg:767258
 6:11 pm on May 5, 2006 (gmt 0)

Hi Folks

As I mentioned in my first post, you might need to file a reinclusion request as one way to deal with the consequences of BigDaddy.

If your site has been subjected to a spam penalty, filing a reinclusion request wouldn't hurt ;-)

Matt Cutts was kind enough to post a comment on his blog today that includes some information relevant to this thread:
[mattcutts.com...]

===============================================
Matt Cutts Said,
May 5, 2006 @ 9:03 am

........

...., last week when I checked there was a double-digit number of reports to the email address that GoogleGuy gave (bostonpubcon2006 [at] gmail.com with the subject line of “crawlpages”).

I asked someone to read through them in more detail and we looked at a few together. I feel comfortable saying that participation in Sitemaps is not causing this at all. One factor I saw was that several sites had a spam penalty and should consider doing a reinclusion request (I might do it through the webmaster console) but even that wasn’t a majority. There were a smattering of other reasons (one site appears to have changed its link structure to use more JavaScript), but I didn’t notice any definitive cause so far.

There will be cases where Bigdaddy has different crawl priorities, so that could partly account for things. But I was in a meeting on Wednesday with crawl/index folks, and I mentioned people giving us feedback about this. I pointed them to a file with domains that people had mentioned, and pointed them to the gmail account so that they could read the feedback in more detail.

So my (shorter) answer would be that if you’re in a potentially spammy area, you might consider doing a reinclusion request–that won’t hurt. In the mean time, I am asking someone to go through all the emails and check domains out. That person might be able to reply to all emails or just a sampling, but they are doing some replies, not only reading the feedback.

====================

And here is a great post with tips on filing a reinclusion request:

Filing a reinclusion request
[mattcutts.com...]

Thanks, Matt. Much appreciated.

reseller




msg:767259
 6:37 am on May 19, 2006 (gmt 0)

Hi Folks

Things have changed since my last post above in this thread.

Our kind fellow member lgn1 posted some wise words today [webmasterworld.com]:

"At least things are stable enough now to do SEO optimization for the bad data centers."

And Google's latest generic feedback [mattcutts.com] supports lgn1's statement.

There is no longer any need to sit back and watch your grass grow. It's time for action. It's time to save your websites.

Whether your site or pages have been dropped, whether you lost rankings, or whether your site keeps flipping between indexed and deindexed on Google.

It's time for new thoughts and tips on how to deal with the consequences of BigDaddy.

Let's contribute to this thread and help each other. Many of our affected fellow members need all the help we can give.

And as was once said: give of yourself and you shall be rewarded tenfold.

God bless our WebmasterWorld community.

whatcartridge




msg:767260
 7:36 am on May 19, 2006 (gmt 0)

Just a note:

It is well worth enrolling in the Sitemaps program for the benefits of the Sitemaps control panel. However, unless you use some weird JavaScript menus, there is absolutely no point in submitting a sitemap to Google: if Googlebot can find a link to a page on your site or elsewhere on the net, it will crawl that page.

Google says sitemaps are for pages that can't be found by other means.

A sitemap will NOT get your site/pages indexed quicker (if at all).
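For anyone who enrolls and wants to submit a sitemap anyway (say, for pages reachable only through JavaScript menus), the file format is simple. A minimal sketch with hypothetical URLs; it uses the sitemaps.org 0.9 schema, so check Google's documentation for the exact schema they currently expect:

```python
# Emits a minimal sitemap.xml; the URLs are hypothetical placeholders.
URLS = [
    "http://www.mysite.com/",
    "http://www.mysite.com/products.html",
    "http://www.mysite.com/contact.html",
]

entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in URLS)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```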

What counts now is *quality* inbound links and *quality* content. I had one of my sites fully indexed within two weeks of getting a link from an on-topic .edu page. Other sites of mine still don't appear in the SERPs after months, despite a bunch of inbound links.

hvacdirect




msg:767261
 7:50 am on May 19, 2006 (gmt 0)

I'm still figuring out how to get back into the index, but I'm keeping detailed notes so I can work out which of the things I did actually worked.

For my part, my 500 unique visitors a day now see the Yahoo search box on my site instead of the Google box. Not only does Yahoo have 25x more of my pages indexed, so the site search actually works, but maybe having users get used to seeing Yahoo again will help them win back a bit of share.

Just doing what I can to help.

Tinus




msg:767262
 10:40 am on May 19, 2006 (gmt 0)

"At least things are stable enough now to do SEO optimization for the bad data centers."

Reseller, I doubt the SERPs are stable. Just a few days ago, my site that took the September 22nd hit and the BigDaddy kick popped back up for two days with the same number of visitors as before September 22nd (a peak of 300%).

But you are right about not waiting. Maybe the up and down movements in the SERPs will never stop. Adding quality to sites will never hurt, updates or not.
