A subdomains fix for Panda - Matt Cutts suggestion to HubPages
willybfriendly
msg:4339168
7:24 pm on Jul 13, 2011 (gmt 0)

In May, Edmondson wrote an email to Google engineers...and asked whether he should break up his site into “subdomains,”...In June, a top Google search engineer, Matt Cutts, wrote to Edmondson that he might want to try subdomains, among other things.

The HubPages subdomain testing began in late June and already has shown positive results. Edmondson’s own articles on HubPages, which saw a 50% drop in page views after Google’s Panda updates, have returned to pre-Panda levels in the first three weeks since he activated subdomains for himself and several other authors.


[blogs.wsj.com...]

 

indyank
msg:4341077
4:26 am on Jul 19, 2011 (gmt 0)

A very interesting post and discussion on this subdomain theory here - [seobook.com ] . Someone is taking on Aaron Wall on this subject (he looks more like a Googler, and is surely a Google fan). He seems to suggest that there are no recoveries yet because sites are still breaking the "rules".

As the discussion extends to other things, the guy also hints that affiliate sites are bad for the internet economy!

[edited by: indyank at 5:14 am (utc) on Jul 19, 2011]

walkman
msg:4341084
5:02 am on Jul 19, 2011 (gmt 0)

Indy, he does not seem to be a Googler, but a G fanboy. He mentioned 'I advise my clients', so I doubt he is from Google.
Edit: Google his name; he's an SEO, but a 'good one', I guess, so no Panda for his clients. That's what he is trying to say.

Plus, Googlers usually disclose the relationship and, most importantly, do not challenge people like Aaron Wall. They'd rather hang with the fanboys on G+ now, Twitter yesterday, and so on, while looking down upon the serfs. They have essentially corrupted the top bloggers, so no one asks them any tough questions. Ask a not-so-nice question and you are no longer part of the 'club', and that can affect incomes and clients, so we have to understand it.

Google pushed this as attacking "low quality" sites and many bought it, as Google keeps destroying sites with no salvation in sight.

netmeg
msg:4341250
2:56 pm on Jul 19, 2011 (gmt 0)

Umm. Just about the last thing I think anyone would call Aaron Wall is a Google fanboy.

walkman
msg:4341253
3:01 pm on Jul 19, 2011 (gmt 0)

@netmeg
Read the post and the comment to which I replied. That was directed at a commenter on Aaron's blog.

thegypsy
msg:4341255
3:09 pm on Jul 19, 2011 (gmt 0)

whew... cause ya knocked me offa my chair with that one. Might link to the comment or give the username next time. Makes things a little clearer.

/thread jack

walkman
msg:4341256
3:13 pm on Jul 19, 2011 (gmt 0)

whew... cause ya knocked me offa my chair with that one. Might link to the comment or give the username next time. Makes things a little clearer.
Put down the cup of coffee, step away from the computer, wait 3 hours till decaffeination (?) kicks in and read it again :)

This is what indyank said (my bolding):
A very interesting post and discussion on this subdomain theory here - [seobook.com ] . Someone is taking on Aaron Wall on this subject (he looks more like a Googler, and is surely a Google fan). He seems to suggest that there are no recoveries yet because sites are still breaking the "rules".

See my answer in this context: "That guy taking on Aaron Wall is not a Googler but a G fanboy..."

dataguy
msg:4342522
1:45 am on Jul 22, 2011 (gmt 0)

I'm surprised we're not hearing more about this right now. I know there is a certain type of site that fits the HubPages model (a large UGC site whose pages can be divided by member), but this discovery is pretty big. I know the HubPages experience can reliably be reproduced; I guess no one wants to talk about it publicly.

walkman
msg:4342557
4:38 am on Jul 22, 2011 (gmt 0)

dataguy, we don't know how it will hold up, and changing locations etc. has its own pitfalls. Will the scraper be seen as the original now, for example?

SEOPTI
msg:4342577
5:08 am on Jul 22, 2011 (gmt 0)

Parasite hosting is also a fix for Panda.

dataguy
msg:4342688
1:18 pm on Jul 22, 2011 (gmt 0)

dataguy, we don't know how it will hold up, and changing locations etc. has its own pitfalls. Will the scraper be seen as the original now, for example?

That's why this should be tested and discussed. 301s work pretty reliably these days, and from what I understand a proper 301 will retain the original discovery date. I've played with 301s since Panda first hit in February, but not with subdomains. I started experimenting with subdomains last Friday, and some of the new URLs appeared in G's SERPs in less than 12 hours, which blew me away.

This siloing effect has restored traffic to many pages to pre-Panda levels, while at the same time traffic to weaker pages has stayed the same or gone down slightly. At least I now know which pages are the weaker pages. I've spent years trying to figure out which pages Google liked and didn't like. By siloing each member on a subdomain, I can get a much better idea now.

I should reiterate that I don't think this is a 'trick' to restore rankings. I see this as the transformation of my site into a Blogspot- or WordPress-type model where each member is isolated on their own subdomain. If you don't have a website where content from each member can be isolated, then that's something completely different from what I'm talking about.

The big question is what happens the next time the Panda data is updated. That's why I really want to be able to compare notes.

walkman
msg:4342689
1:25 pm on Jul 22, 2011 (gmt 0)

dataguy, how many "good" pages do you have, as far as you can tell?
Do you think Google 'stamped' each page it knows is bad, or what? If not, why do they need Panda to run?

If your model is with different authors, you should definitely try it, of course.

dataguy
msg:4342792
5:25 pm on Jul 22, 2011 (gmt 0)

Thanks for asking, walkman. Here are my stats.

I got subdomains working last Friday and picked which accounts to move to subdomains on Saturday, Sunday and Monday, so fewer than 5 days have passed since some of these changes were made.

I've only switched over about 4% of the accounts, but here's what I have comparing Google traffic yesterday to a week earlier:

1,280 accounts each with a separate subdomain
7,911 pages belonging to accounts with subdomains

7/21 compared to 7/14 (same day of the previous week):
542 pages gained Google referrals, for a total increase of 5,481 referrals per day.
386 pages lost Google referrals, for a total decrease of 680 referrals per day.
6,983 pages showed no change (most of them had 0 Google referrals on both dates).
Net gain: 4,801 Google referrals per day, 4-6 days after changing to subdomains.

Another way:
Last Thursday 7/14:
2,772 Google referrals without subdomains

This Thursday 7/21:
7,573 Google referrals on the same content, now with subdomains

That puts daily Google referrals at 273% of the previous level (a 173% increase), 4-6 days after changing to subdomains.

Max Google referral increase: 423% (better than pre-Panda)
Max Google referral decrease: 100% (que sera sera)
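
As a quick sanity check on the arithmetic in those figures, here is a minimal sketch in Python; every number is copied from the post above, nothing is new data:

# Sanity check on the referral figures quoted above.
gains, losses = 5481, 680
print(gains - losses)  # 4801: net new Google referrals per day

before, after = 2772, 7573
print(round(100 * after / before))             # 273: new traffic as a percentage of old
print(round(100 * (after - before) / before))  # 173: the percentage increase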

indyank
msg:4342802
5:43 pm on Jul 22, 2011 (gmt 0)

dataguy, that is nice information.

I've spent years trying to figure out which pages Google liked and didn't like. By siloing each member on a subdomain, I can get a much better idea now.

What do you see now? How do you think Google differentiates the good from the bad?

You said that you had experimented with 301s earlier; what did you find then?

walkman
msg:4342811
5:48 pm on Jul 22, 2011 (gmt 0)

Very interesting. I hope it works, something has to :)

Interesting that some 'bad' content lost even more; maybe the main domain's stats lifted it up a bit.
Even more interesting is how Google can recognize 'bad' and 'good' content almost immediately. Or can it?

Please let us know how you do as time passes: whether Panda affects them, a new-content boost, or whatever...

conroy
msg:4342830
6:07 pm on Jul 22, 2011 (gmt 0)

How is Google picking up these subdomain pages? Are they linked from your main pages, or completely separate, link-wise, from the main domain site? Any external links into the subdomains?

indyank
msg:4342833
6:11 pm on Jul 22, 2011 (gmt 0)

The only problem with this is that it won't work for all. Being a UGC site, it was possible for dataguy to move pages to subdomains on the basis of authors.

What will really help is his understanding of the good and bad pages as perceived by Google: what are the features that distinguish a good page from a bad one?

suggy
msg:4342846
6:45 pm on Jul 22, 2011 (gmt 0)

What are the features that distinguish a good page from a bad one?


Precisely.

Dataguy, have you got a feel or first impression based on what's gone up and what's gone down?

dataguy
msg:4342860
7:44 pm on Jul 22, 2011 (gmt 0)

How is Google picking up these subdomain pages? Are they linked from your main pages, or completely separate, link-wise, from the main domain site? Any external links into the subdomains?

There are plenty of external links to many of the original URLs. I suspect that the reason the URLs changed in the SERPs so quickly is that they had backlinks from outside of the site. There are still links to the new URLs from various places on the site as well.

My gut feeling is that under Panda, good content is pulled down by bad content, and bad content is pulled up by good content. The net result of splitting author accounts into subdomains is more traffic for the good content and slightly less traffic for the bad content. That is perfectly acceptable to me.

We have been trying to determine good content vs. bad by using referral stats and crawl frequencies. Our good content is crawled as much as 30 times per day, despite the fact that it hasn't changed in years.
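
As an illustration of that kind of crawl-frequency measurement, here is a minimal sketch that counts Googlebot hits per URL in a combined-format access log; the log path, format, and user-agent check are assumptions for the example, not details given in this thread:

import re
from collections import Counter

# Combined log format: ip - - [date] "GET /path HTTP/1.1" 200 1234 "referer" "user-agent"
LINE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+" \d+ \S+ "[^"]*" "(?P<ua>[^"]*)"')

def googlebot_hits(log_path):
    """Count Googlebot requests per URL path in one log file."""
    hits = Counter()
    with open(log_path) as log:
        for line in log:
            m = LINE.search(line)
            if m and "Googlebot" in m.group("ua"):
                hits[m.group("path")] += 1
    return hits

# On this theory, the most-crawled pages are the "good" content:
# for path, n in googlebot_hits("access.log").most_common(20):
#     print(n, path)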

For those of you who are still skeptical and think I deal only in thin spam content, here is an example for you:

With over 50,000 members contributing their writing, we've received all kinds of content over the years. We had paid editors (before Panda made us let them go) try to separate good from bad by rating content from 1 to 10, but there's more to it than can be determined by human moderation.

Of course we could tell which articles promoted affiliate products or the latest get-rich-quick schemes. But beyond that, we would get articles from outside the U.S. that were unique, exclusive, and written in perfect English, alongside articles on the same subject matter written in the U.S., and almost every time Google would favor the articles written in the U.S. These articles were otherwise indistinguishable from each other, but somehow, probably using some sort of semantic algorithm, Google would favor one over the other.

The thing is, we could tell only some of the time when articles originated outside of the U.S., so while our editors were trying to pick the best articles, how could they do so?

It seems to me that breaking each account out into its own subdomain will negate this issue. Google can favor one author over the next for whatever reason it wants, and that's fine by us. We are, after all, trying to give the best user experience.

I've worked practically non-stop since Feb 23 to figure out how to meet our users' needs and Google's needs at the same time. This sure seems like a workable solution.

walkman
msg:4342866
8:06 pm on Jul 22, 2011 (gmt 0)

We have been trying to determine good content vs. bad by using referral stats and crawl frequencies. Our good content is crawled as much as 30 times per day, despite the fact that it hasn't changed in years.

Is that accurate, as far as you're concerned or can see? I ask because if I have, say, 1,000 pages, I get 700-800 Googlebot visits each day, almost each one asking for a page. I can't figure this thing out; even Matt Cutts said that 'low quality' pages will be crawled less often, but...

seodudez
msg:4342871
8:10 pm on Jul 22, 2011 (gmt 0)

dataguy:

You had mentioned that you tried subdomains previously and it didn't work - see [webmasterworld.com...]

Why do you think it will work this time?

seodudez
msg:4342914
9:29 pm on Jul 22, 2011 (gmt 0)

thegypsy/Dave:

I noticed this comment in your "Panda Fix" post: "Google is certainly aware of this situation and as such I'd likely advise just cleaning house instead."

Are you suggesting that Google will ultimately close the loophole for sites like WordPress, Blogspot, Posterous, and Tumblr?

Or are you saying that the subdomain solution will not work after the next Panda update because the overall domain is pandalized?

Or?

My thought is that I have been waiting forever for my site to be cleared of pandalization (and I swear we should be clear of any penalties), and this solution might move it forward, but I don't want to do it if my overall domain is going to bring down my subdomains in the next Panda update.

MarvinH
msg:4343022
9:37 am on Jul 23, 2011 (gmt 0)


Last Thursday 7/14:
2,772 Google referrals without subdomains

This Thursday 7/21:
7,573 Google referrals on the same content, now with subdomains


Hi Dataguy,

Do you think it is possible that Google is sending more traffic to your new URLs only because they are new? New-page effect?

It will be interesting to see how these new URLs perform in a few weeks or months.

walkman
msg:4343030
10:30 am on Jul 23, 2011 (gmt 0)

MarvinH, apparently it didn't work the first time around for him.

Dataguy on July 5th: "Ditto on tedster. I tried subdomains and they seemed to work for about a week. Then I discovered I wasted a week. "
[webmasterworld.com...]

MarvinH
msg:4343036
11:36 am on Jul 23, 2011 (gmt 0)

Thanks, walkman. That's valuable feedback.

dataguy
msg:4343039
12:11 pm on Jul 23, 2011 (gmt 0)

MarvinH, apparently it didn't work the first time around for him.

The first time around was me trying to remove what I guessed was weak content from a different site, in the hope that the rest of the site would escape the effects of Panda. As far as I could tell, this had no effect at all. If you recall, that's what both conventional wisdom and MC said to do from the beginning of Panda until the article on HubPages turned up.

The differences between that and what I'm doing now are:
1. It's a different, stronger, and larger site.
2. I'm separating the content by author account, not just guessing at which pages are weak.
3. I started with author accounts I knew were strong, changing the URLs on strong content instead of hoping that the old URLs would gain strength after weak content was removed.

I think these things make a big difference. When I first heard of the HubPages experiment I was very skeptical because I thought I had already experimented with what they were doing. I'm glad I got over my skepticism.

Do you think it is possible that Google is sending more traffic to your new URLs only because they are new? New-page effect?

It will be interesting to see how these new URLs perform in a few weeks or months.

You're exactly correct, @MarvinH. That's the big question, and it's why I've committed only a small percentage of my pages to this experiment. I was hoping to get some feedback from others experimenting with subdomains so we could compare notes.

conroy
msg:4343043
12:44 pm on Jul 23, 2011 (gmt 0)

There are plenty of external links to many of the original URLs.

Dataguy, did you 301 the previous URLs to the new subdomain URLs, or otherwise redirect them?

zoltan
msg:4343044
12:46 pm on Jul 23, 2011 (gmt 0)

dataguy, what did you do with the old URLs? I assume you previously had URLs like www.yourdomain.com/article/123456.html and now have author.yourdomain.com/article/123456.html. Did you 301 redirect www to author?

dataguy
msg:4343049
1:17 pm on Jul 23, 2011 (gmt 0)

@conroy & @zoltan, I used standard 301s from the old URLs to the new URLs.
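
For anyone wanting to try the same thing, here is a minimal sketch of that kind of 301 as a small Flask handler; the domain, URL pattern, and author_for_article lookup are hypothetical stand-ins, not dataguy's actual setup:

from flask import Flask, abort, redirect

app = Flask(__name__)

def author_for_article(article_id):
    # Hypothetical lookup; a real site would query its own database.
    return {"123456": "janedoe"}.get(article_id)

@app.route("/article/<article_id>.html")
def old_article_url(article_id):
    author = author_for_article(article_id)
    if author is None:
        abort(404)
    # 301 (permanent) so search engines transfer credit to the new URL.
    new_url = "http://%s.yourdomain.com/article/%s.html" % (author, article_id)
    return redirect(new_url, code=301)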

walkman
msg:4343051
1:34 pm on Jul 23, 2011 (gmt 0)

Amazing. Looks like Google has gotten faster at processing 301s and giving credit within a couple of days.

zoltan
msg:4343053
1:45 pm on Jul 23, 2011 (gmt 0)

Thanks, dataguy. I assume you have a member name for all your authors. Or did you 301 redirect to something like accountid.yourdomain.com? This approach can work well if you have a unique member name for each author; otherwise, a redirect like 1338779.yourdomain.com might look quite silly...
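
A minimal sketch of deriving a usable subdomain label from a member name, with the numeric account id as a fallback; the thread doesn't say how any site actually did this, so treat it as illustrative only:

import re

def subdomain_label(member_name, account_id):
    """Make a DNS-safe label from a member name, or fall back to the id."""
    label = re.sub(r"[^a-z0-9-]", "", member_name.lower().replace(" ", "-"))
    label = label.strip("-")[:63]  # DNS limits a label to 63 characters
    return label or str(account_id)

print(subdomain_label("Jane Doe", 1338779))  # jane-doe
print(subdomain_label("###", 1338779))       # 1338779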

seodudez
msg:4343057
2:34 pm on Jul 23, 2011 (gmt 0)

DataGuy:

I am going to do a similar test. However, I want to ask you: do you have an index on your site? Did you change that too, or just do a 301 redirect for those specific pages?

I have a people-search site, and we have an index/directory/sitemap of names/pages that is linked from our home page, similar to how LinkedIn and Facebook have it. We also have an XML sitemap. I assume you have something like this as well? Did you change all these indexes/sitemaps to point to the pages at the new subdomains, or did you just set up 301 redirects for these pages?
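
For what it's worth, rewriting an XML sitemap's <loc> entries for moved pages might look something like this minimal sketch; the author_for_path helper is a hypothetical stand-in, not anything described in the thread:

import xml.etree.ElementTree as ET
from urllib.parse import urlparse

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def subdomain_url(url, author_for_path):
    """Rewrite www.example.com/... to author.example.com/... when the page moved."""
    parts = urlparse(url)
    author = author_for_path(parts.path)  # hypothetical per-page lookup
    if author is None:
        return url  # leave pages that were not moved
    host = parts.netloc.replace("www.", "%s." % author, 1)
    return parts._replace(netloc=host).geturl()

def rewrite_sitemap(in_path, out_path, author_for_path):
    ET.register_namespace("", NS)  # keep the standard sitemap namespace
    tree = ET.parse(in_path)
    for loc in tree.getroot().iter("{%s}loc" % NS):
        loc.text = subdomain_url(loc.text, author_for_path)
    tree.write(out_path, xml_declaration=True, encoding="utf-8")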
