
Google SEO News and Discussion Forum

    
My site has 11 links over 3 spots on page 1 - one of them with four "wwww"
kamikaze Optimizer

10+ Year Member



 
Msg#: 3522210 posted 5:58 am on Dec 7, 2007 (gmt 0)

Here is an interesting one:

My site is showing on page one for my two word KP in three different positions for a total of 11 links.

Number 1 position is the normal "Sitelinks" position, where Google shows my most popular 9 links (and my DMOZ title/description). This is not new; you have all seen it by now.

And the number 2 position is another link to an internal page on my site, also very normal and showing my onsite title/description (but I have never understood why Google does this, since the same link is already included in the Sitelinks mentioned above...)

The real kicker is...

I now also have the number 10 spot, going to my main index page, with my onsite title/description showing, but it is displaying 4 "w"'s in front of the URL, not the normal 3, such as WWWW.example.com.

My .htaccess file forces the 4 "w"'s to redirect to the normal WWW.example.com, which would never allow any spider or browser to see any content at WWWW.example.com.

Let's discuss.

:)

 

tedster

WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member



 
Msg#: 3522210 posted 7:15 am on Dec 7, 2007 (gmt 0)

Well, clearly the four wwww's are a mistake. However, according to Matt Cutts here at PubCon, in just a few weeks all subdomains will cluster as one in a search result. So the most any search will show then is two results for any domain name. At that point the four wwww's will drop out for you.

In the meantime, it sure looks like somehow a wire got crossed, doesn't it? I'd double check that the wwww to www redirect is really delivering a 301 status.
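One way to run that check, as a rough sketch only (the hostname below is a placeholder, not the poster's real site): request the wwww host without following redirects and look at the status line and Location header.

```python
# Minimal check (hostname is a placeholder): ask the wwww host for its
# headers without following redirects, then inspect status and Location.
import http.client

conn = http.client.HTTPConnection("wwww.example.com", timeout=10)
conn.request("HEAD", "/")
resp = conn.getresponse()
print(resp.status, resp.reason)        # want 301 Moved Permanently, not 302
print(resp.getheader("Location"))      # should point at http://www.example.com/
conn.close()
```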

kamikaze Optimizer

10+ Year Member



 
Msg#: 3522210 posted 9:05 am on Dec 7, 2007 (gmt 0)

Hi Ted;
I'd double check that the wwww to www redirect is really delivering a 301 status.

Based on some new info, I might have an issue here; I will let you know.

all subdomains will cluster as one in a search result. So the most any search will show then is two results for any domain name

Are you saying that all subdomain links will be included in the 'Sitelinks'?

tedster

WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member



 
Msg#: 3522210 posted 9:49 am on Dec 7, 2007 (gmt 0)

Sitelinks is a separate thing, a set of expanded internal links just for the #1 spot on certain specific searches. But that doesn't enter into this picture. Here's what I am saying.

Here's what happens now. The first step of results retrieval for any single search still has no limit on how many urls can be returned from a domain. In the early days of Google, a domain could even have all 10 first page spots and still keep on going. It could even be embarrassing!

Today, the preliminary, raw retrieval of roughly 1,000 results still puts no limit on how many urls can be returned from a given domain. But there's a further processing step - a filter kicks in. That filter is supposed to ensure that only 2 urls maximum from any domain will actually be shown.

If those two urls happen to be on the same page, then they will cluster together on that page rather than show at their "true" algorithmically determined position. But through all the total pages of any search result, any single domain is supposed to show up a maximum of 2 times.

Now here's where we've been able to game the current situation. Subdomains are treated like a separate domain, and so you can get two results for www.example.com, two more for sub1.example.com, two more for sub2.example.com, and so on.

Matt Cutts mentioned that Google is working on code to eliminate that possibility for most domains. That is, Google plans to treat most subdomains essentially like any other url on the main domain, and they will limit that domain, INCLUDING all its subdomains, to two positions total on any given search.
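Purely as an illustration of that difference (this is not Google's code; the URLs and the crude domain rule below are invented), a toy version of the two-per-site cap might look like this:

```python
# Toy sketch only -- nothing here is Google's actual code. It shows the effect
# of capping results at two per "site", keyed either by the full host (how
# subdomains are treated today) or by a crude registrable domain (roughly the
# planned behaviour). All URLs are made up.
from urllib.parse import urlparse

def cap_per_site(ranked_urls, per_host=True, limit=2):
    kept, counts = [], {}
    for url in ranked_urls:
        host = urlparse(url).hostname
        # Naive "registrable domain": last two labels. Breaks for .co.uk and
        # friends -- the kind of exception raised later in this thread.
        key = host if per_host else ".".join(host.split(".")[-2:])
        counts[key] = counts.get(key, 0) + 1
        if counts[key] <= limit:
            kept.append(url)
    return kept

results = [
    "http://www.example.com/a", "http://www.example.com/b",
    "http://sub1.example.com/c", "http://sub1.example.com/d",
    "http://sub2.example.com/e", "http://other.example.net/f",
]
print(cap_per_site(results, per_host=True))   # all five example.com URLs survive
print(cap_per_site(results, per_host=False))  # only two example.com URLs survive
```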

At that point, the whole subdomain vs. subdirectory decision will lose most of its importance - and your wwww urls will not show up, even though they may still be causing you trouble behind the scenes.

Miamacs

5+ Year Member



 
Msg#: 3522210 posted 12:21 pm on Dec 7, 2007 (gmt 0)

When checking the current handling of subdomains, I did a query on Google for site:yahoo.com

(I guess it's all right, since the subdomain thread mentions a query for Google already; we need to be fair with these two.)

The first result was:

@www.yahoo.com

Yes, that's an @ in the subdomain (host... hehe).
It disappeared within a few days, and while it outranked www.yahoo.com in the site: search, it never made it to the regular SERPs.

...

I wonder what that was.

confuscius

5+ Year Member



 
Msg#: 3522210 posted 1:32 pm on Dec 7, 2007 (gmt 0)

Does this mean that a certain auction site and a certain bookshop will be treated in this way? I certainly hope so.

What are the odds on there being some exceptions?

What about subdomains like .org.uk or .co.uk or say .us.com where the websites on these subdomains are separate websites in their own right?

This is going to be fun!

Paul

blend27

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3522210 posted 2:40 pm on Dec 7, 2007 (gmt 0)

kamikaze Optimizer, you might want to check on the wildcard cases. I have a suspicion that your site will also resolve at asdf.yourdomain.tld and thisainthere.yourdomain.tld, which I would assume would be treated as duplicate content if it resolves with the same content.

We had an issue like this a while back and it took out a healthy chunk of our site from the index.
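A rough sketch of that kind of wildcard check (the domain and the nonsense labels below are placeholders): see whether made-up subdomains resolve at all and, if they do, whether they serve the same content as the canonical host instead of redirecting to it.

```python
# Placeholder domain and labels. Checks for wildcard DNS and duplicate content.
import socket
import urllib.parse
import urllib.request

DOMAIN = "example.com"
CANONICAL = "www." + DOMAIN
canonical_body = urllib.request.urlopen("http://" + CANONICAL + "/", timeout=10).read()

for label in ("asdf", "thisainthere"):
    host = label + "." + DOMAIN
    try:
        socket.gethostbyname(host)                    # does wildcard DNS answer?
    except socket.gaierror:
        print(host, "does not resolve - no wildcard problem")
        continue
    resp = urllib.request.urlopen("http://" + host + "/", timeout=10)
    if urllib.parse.urlparse(resp.url).hostname == CANONICAL:
        print(host, "redirects to the canonical host - fine")
    elif resp.read() == canonical_body:
        print(host, "serves duplicate content - potential problem")
    else:
        print(host, "resolves but serves different content")
```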

blend27

tedster

WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member



 
Msg#: 3522210 posted 6:49 pm on Dec 7, 2007 (gmt 0)

I should mention that I posted what I heard here at PubCon reluctantly, because I tend not to like discussing vaporware in any form - and until this change actually happens it's only hearsay or vaporware, even if this source should know what's being planned.

The clear need for some exceptions for some domains could make this change quite problematic and delay or even negate its "launch".

Are you saying that all subdomain links will be included in the 'Sitelinks'?

Subdomain links can already show up as Sitelinks. This happens for several of my clients who have a key subdomain with a prominent link on their home page. But it will never be "all" subdomains. Sitelinks has a limit of 8 total.

confuscius

5+ Year Member



 
Msg#: 3522210 posted 10:31 pm on Dec 7, 2007 (gmt 0)

Matt Cutts on a Sphinn story:

"This isn't a correct characterization of what Google is looking at doing. What I was trying to say is that in some circumstances, Google may move closer to treating subdomains as we do with subdirectories. I'll talk about this more at some point after I get back from PubCon."

Clear as mud then as usual!

Miamacs

5+ Year Member



 
Msg#: 3522210 posted 11:15 pm on Dec 7, 2007 (gmt 0)

...

Sure it is.

It's gonna be manual.
And evaluated on a case by case basis. *smirk*

I bet they already have a few targets in mind.

tedster

WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member



 
Msg#: 3522210 posted 2:48 am on Dec 8, 2007 (gmt 0)

Matt pulled me aside at the pub this afternoon to add some information. Here's what I understood:

This change will NOT mean that it's 100% impossible to rank subdomain urls in addition to urls from the main domain. The current plans are to make it harder to rank a third url, then even harder to rank a fourth, and so on with an increasing "damping factor".
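Just to put toy numbers on the idea of an increasing damping factor (nothing below reflects Google's real scoring; the factor and the domain rule are invented):

```python
# Invented numbers, purely illustrative: the first two URLs from a domain keep
# their score, and each additional one is damped harder, so a third or fourth
# URL can still rank -- it just becomes progressively less likely.
from urllib.parse import urlparse

def dampen(results, factor=0.5):
    counts, damped = {}, []
    for url, score in results:
        # Crude "domain" key: last two host labels (again, breaks for .co.uk).
        domain = ".".join(urlparse(url).hostname.split(".")[-2:])
        n = counts.get(domain, 0)
        penalty = factor ** max(0, n - 1)   # 1, 1, 0.5, 0.25, ... for successive URLs
        damped.append((url, score * penalty))
        counts[domain] = n + 1
    return sorted(damped, key=lambda item: item[1], reverse=True)
```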

Matt also did a video interview with Michael McDonald of WebProNews this afternoon, where he planned to bring more clarity to this issue. When that video goes live, we'll have even more direct information.

My apologies for getting the details a bit messed up the first time around.

kamikaze Optimizer

10+ Year Member



 
Msg#: 3522210 posted 4:25 am on Dec 8, 2007 (gmt 0)

Hi blend27:

Thank you for the input, I am looking into it.

It appears that my issue is due to the fact that I have set up 302 redirects (temporary) and not 301 redirects (permanent).

I intend to fix that this weekend.

On the whole subdomain / Matt Cutts comments (which are worthy of a new thread on their own), I welcome this change.

I would love to see Google start with *.blogspot.com

Granted, BlogSpot has some great sites, but it has also been a breeding ground for some awful spam.

Maybe some kind of a Trust/PR factor would work well here.

[edited by: tedster at 6:33 am (utc) on Dec. 8, 2007]

webastronaut

5+ Year Member



 
Msg#: 3522210 posted 11:25 pm on Dec 8, 2007 (gmt 0)

I'm really confused about this new Google concept myself, because of all the blog hosts, including blogspot, that treat a subdomain as a new user's website account, as do the auction sites and whatnot. What about all of Google's, Yahoo's, and MSN's subdomains?

I have sites 10+ years old that have many "mysubdomains.mymain.com" hosts, because I thought it was logical and easy for the web surfer (10 years ago) to build these large sites with the subdomain structure.

Many clients have been enjoying their number 1 listings on Google.com, with 6 to 8 subdomains from their site listed as those fancy "extra links". So are these sites going to get the shaft from Google because they have more than 2 subdomains?

tedster

WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member



 
Msg#: 3522210 posted 5:21 pm on Dec 9, 2007 (gmt 0)

I didn't hear anything about getting the shaft - just lowering the current subdomain advantage at getting more than 2 positions on any one search result. Sitelinks are all in the #1 spot - so that counts as the first position with one more left before the damping would kick in, as I understand it.

webastronaut

5+ Year Member



 
Msg#: 3522210 posted 7:58 pm on Dec 9, 2007 (gmt 0)

That makes more sense to me, tedster, as so many sites also get #2 listings after the expanded #1 listing. I really want to watch this new Google approach to subdomains evolve, because it can indeed clean up duplicated pages in a big way, but at the same time it could start killing off the innocent, like any war does.

confuscius

5+ Year Member



 
Msg#: 3522210 posted 2:33 pm on Dec 10, 2007 (gmt 0)

Matt Cutts has now written about this on his blog.

I love the notion of 'really relevant' results as opposed to 'relevant' results - so does this mean that the plethora of results from a certain auction site for millions of long-tail terms has ALREADY disappeared?

You tell me!

Looks like we have some manual manipulation of search results - am I wrong?

Paul

tedster

WebmasterWorld Senior Member, WebmasterWorld Top Contributor of All Time, 10+ Year Member



 
Msg#: 3522210 posted 4:32 pm on Dec 10, 2007 (gmt 0)

Well, algorithms are written manually, so the line between manual and algorithmic does get blurred a bit. The thing I note is that this change has already happened. So whatever parts of the sky will fall because of it -- well, they've already fallen.

Here's what Matt wrote [mattcutts.com]:

In the last few weeks we changed our algorithms to make that less likely to happen in the future.
