
Google News Archive Forum

    
Every SE likes my site but Google
What should I do?
viggen




msg:121691
 9:30 am on Dec 27, 2003 (gmt 0)

One of my websites ranks average to very good on every search engine except Google. At Google only the main page is indexed; the rest (10 pages) are listed as URL-only with no description. (Googlebot visits regularly.) This has been going on for about 10 months. The site has PageRank 0 with about 20 incoming links (some PageRank 6).

As far as I know I don't have anything on this site that violates Google's TOS, so I can only assume it must be a penalty on the domain or something. (I contacted Google, and the reply was to be patient, the engineers are looking into it.)

I was thinking of getting a new domain, but I'm hesitating because of what would happen to my good rankings in the other search engines. So what should I do: forget Google, as I have for the last 10 months (I do get very good traffic from Inktomi, and it took a while to get this far), or forget all the others and think about Google?

The domain expires at the end of January, so I need to make a decision. Any advice is appreciated.

 

Marcia




msg:121692
 11:45 am on Dec 27, 2003 (gmt 0)

>>The domain expires end of january and well, I need to make a decision.

If you're getting good Inktomi traffic, why give it up, and the others as well? Why not renew the domain, keep it going for the others, and do another site with new and different content and a different type of structure for Google?

>>rest (10 pages) are just with url no description

The one time I've personally seen that was when a site was moved and accidentally slipped out of the index, but there could be other reasons, especially if you're being visited regularly.

Something isn't right. Have you checked whether any other sites out there are redirecting to you without your knowledge, or have duplicated your content on any other domains?

Spica




msg:121693
 2:22 pm on Dec 27, 2003 (gmt 0)

If this site is doing well in Inktomi, it wouldn't make sense to give it up at this point given the likelihood that Yahoo will soon switch to displaying Inktomi results.
As Marcia mentioned, making a different site optimized for the "new Google" is what many people are now considering. It's a shame, but it's also the unfortunate consequence of Google's latest algo.

dwilson




msg:121694
 2:32 pm on Dec 27, 2003 (gmt 0)

Are you linking to a "bad neighborhood"? (I think that means participating in a link farm.)

viggen




msg:121695
 2:38 pm on Dec 27, 2003 (gmt 0)

>>Are you linking to a "bad neighborhood"? (I think that means participating in a link farm.)

No, definitely not.

Thanks for all the other answers. About making a new site just for Google: that would then be duplicate content, right? The last thing I need is for the other search engines to ban me for doing that.

jsnow




msg:121696
 2:58 pm on Dec 27, 2003 (gmt 0)

I have a similar question along the same lines. I am putting together a site specifically for the UK market, as some of the messages are distinctly different; however, there is a lot of content that would be the same.

To what extent do you need to change things to avoid a duplicate content penalty:

Structure
Filenames
File sizes
Graphics

And of course content!

As a side point, I have the opposite problem: doing well in Google but terribly in all the others!

div01




msg:121697
 4:45 pm on Dec 27, 2003 (gmt 0)

Viggen,

I have a site with the exact same symptoms. It's had me scratching my head for the last 6-8 months.

bekyed




msg:121698
 5:02 pm on Dec 27, 2003 (gmt 0)

>>To what extent do you need to change things to avoid a duplicate content penalty:

Structure
Filenames
File sizes
Graphics

You have just answered your own question :)

Bek.

Essex_boy




msg:121699
 5:20 pm on Dec 27, 2003 (gmt 0)

I know Mr G says that you shouldn't do this, but I had the same problem. The answer: submit EACH page of your site. Problem over.

bekyed




msg:121700
 5:29 pm on Dec 27, 2003 (gmt 0)

Or, to save time, have a sitemap page and submit that. That's what we do with our ecom sites; it should be crawled by Mr G no problem.

Bek

viggen




msg:121701
 5:38 pm on Dec 27, 2003 (gmt 0)

@bekyed and Essex_boy

Done that. I submitted the sitemap about 6 months ago; Google doesn't care, the sitemap is not listed.

bekyed




msg:121702
 6:09 pm on Dec 27, 2003 (gmt 0)

sticky me the url

bek.

jsnow




msg:121703
 7:10 pm on Dec 27, 2003 (gmt 0)

When I said "to what extent", I know that each page has to have some change, but to what extent?

bekyed




msg:121704
 11:05 pm on Dec 27, 2003 (gmt 0)

Sorry jsnow,

As every SEO worth his/her salt knows, it is trial and error trying to rank high.
There is no magic formula, as the SERPs indicate: some pages that are thick with keywords rank high, some with no keywords beat high-ranking pages, some have hidden text, etc.
Follow Brett's advice on building pages and you won't go far wrong.

Bek

bekyed




msg:121705
 11:07 pm on Dec 27, 2003 (gmt 0)

The link to Brett's guide is here:

[webmasterworld.com...]

Gus_R




msg:121706
 12:49 am on Dec 28, 2003 (gmt 0)

viggen

If it's the site in your profile, all pages are PR0.
You're probably linking to an external penalized site.

Marcia




msg:121707
 1:57 am on Dec 28, 2003 (gmt 0)

>>As every SEO worth his/her salt knows, it is trial and error trying to rank high.

Sorry bekyed, I'm afraid I'll have to respectfully disagree with that as a blanket statement. First off, not everyone around is a professional SEO, and there are certain things that, when done, can and do reliably contribute to high rankings. While it's true that a certain amount is trial and error, individuals who have done a certain amount of trial-and-error testing over time end up with mileage and perceptions that are not necessarily at the same level as everyone else's.

The point being, it is not ALL trial and error, though for some individuals it's more so than for others. Also, there are certain things that aren't even subject to trial and error, which in some cases can be a perilous route to take; the safer way is taking calculated risks based on known probabilities.

>>linking to an external penalized site

Gus, one or even a few wouldn't do it if there are enough offsetting factors, like a variety of good inbound links and a variety of outbound links.

Getting back to the specific issue, there might just be something else involved.

viggen, there's another domain with the exact same URL, even with the .com, except that a two-letter country code follows it. No connection, no funny stuff, and it's in another language. But the presence of the .com in the URL (as opposed to something like .co.uk) may possibly be causing Google some type of error or confusion in including your site.

This is why those papers out there are so fun and exciting to read. Check this out in Hilltop: A Search Engine based on Expert Documents [cs.toronto.edu]:

2.1 Detecting Host Affiliation

We define two hosts as affiliated if one or both of the following is true:

They share the same first 3 octets of the IP address.
The rightmost non-generic token in the hostname is the same.

We consider tokens to be substrings of the hostname delimited by "." (period). A suffix of the hostname is considered generic if it is a sequence of tokens that occur in a large number of distinct hosts. E.g., ".com" and ".co.uk" are domain names that occur in a large number of hosts and are hence generic suffixes. Given two hosts, if the generic suffix in each case is removed and the subsequent right-most token is the same, we consider them to be affiliated.

E.g., in comparing "www.ibm.com" and "ibm.co.mx" we ignore the generic suffixes ".com" and ".co.mx" respectively. The resulting rightmost token is "ibm", which is the same in both cases. Hence they are considered to be affiliated. Optionally, we could require the generic suffix to be the same in both cases.

The affiliation relation is transitive: if A and B are affiliated and B and C are affiliated then we take A and C to be affiliated even if there is no direct evidence of the fact. In practice some non-affiliated hosts may be classified as affiliated, but that is acceptable since this relation is intended to be conservative.
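
To make that heuristic concrete, here's a rough sketch in Python of how the affiliation check described above could work. This is my own illustration, not code from the paper, and the list of generic suffixes is just an example; the paper derives generic suffixes from how many distinct hosts share them.

# Rough sketch (mine, not the paper's code) of the Hilltop affiliation heuristic.
# GENERIC_SUFFIXES is illustrative only.
GENERIC_SUFFIXES = {".com", ".net", ".org", ".co.uk", ".com.tw", ".co.mx"}

def rightmost_token(hostname):
    """Strip a known generic suffix, then return the rightmost remaining token."""
    host = hostname.lower()
    for suffix in sorted(GENERIC_SUFFIXES, key=len, reverse=True):
        if host.endswith(suffix):
            host = host[:-len(suffix)]
            break
    return host.rstrip(".").split(".")[-1]

def same_ip_prefix(ip_a, ip_b):
    """True if two IPv4 addresses share their first three octets."""
    return ip_a.split(".")[:3] == ip_b.split(".")[:3]

def affiliated(host_a, ip_a, host_b, ip_b):
    """Two hosts are affiliated if either condition from the paper holds."""
    return same_ip_prefix(ip_a, ip_b) or rightmost_token(host_a) == rightmost_token(host_b)

# The paper's example: www.ibm.com and ibm.co.mx both reduce to the token "ibm",
# so they are treated as affiliated regardless of their IP addresses.
print(affiliated("www.ibm.com", "1.2.3.4", "ibm.co.mx", "9.8.7.6"))  # True

The transitive closure the paper mentions could then be built on top of these pairwise checks, e.g. with a union-find over hosts.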

Not saying this is being used for certain, but there has to be some means to identify affiliation, which would be the unique rightmost token in the domains concerned. Could they possibly be considered affiliated in this particular case, even though there isn't the remotest connection between the two?

Let's look at these:
[google.com...]
[google.co.uk...]

And then at this
[google.com.tw...]

Server Response: [google.com.tw...]
Status: HTTP/1.1 200 OK

If you look at the backlinks for that last one (for which the PR is 9, not 10, incidentally), it shows the backlinks for www.google.com; obviously in their case they're aware of the affiliation. But there's no way those sites are pointing to Google's .com.tw domain.

Again, not saying that Google is using this in the particular instance we're now discussing, but given the possibility, even if it's vague, I think I'd be inclined to write to Google again and ask them to take a look specifically at this particular issue. It certainly can't hurt.

I'm inclined to suspect that this may be a possibility in this case, well worth looking into.

viggen




msg:121708
 5:34 am on Dec 28, 2003 (gmt 0)

Thanks, Marcia, for looking so deeply into my site :)

Yes, I was aware of the Taiwanese site, and as it has PageRank, looks like a "normal" business site, and the other SEs didn't seem to get confused, it never crossed my mind that this could be the problem.

If so, what else can I do (hoping that GoogleGuy jumps in and fixes it right away =P) besides contacting the Google technicians, which I already did?

Marcia




msg:121709
 11:28 am on Dec 29, 2003 (gmt 0)

viggen, it's kind of a shot in the dark; I think we all tend to grope and grasp when there doesn't seem to be an answer.

But I definitely wouldn't abandon a domain for a going site. A very spiderable second site with a different type of structure and format seems like it would be a sound solution.

I'm seriously curious whether this situation would exist if the site were not a .com but a .net or .org. Technically, what I was looking into shouldn't be happening, especially considering the language and geographical difference, but I still can't help but wonder.

viggen




msg:121710
 9:35 pm on Dec 29, 2003 (gmt 0)

Yeah Marcia, I won't give up yet, but it's bad not to know. If I only knew the reason, I could make changes; as it is, I guess I have to wait longer (hopefully not another year).

At least being one of the few sites in the world that gets more referrers from WiseNut than from Google makes me kinda special. =P

I hope Google will get it right.

thanks for all your help,

viggen




msg:121711
 7:10 am on Jan 11, 2004 (gmt 0)

Just thought to give you guys an update.

After I contacted the Google technicians and they seemed to fix whatever the problem was, I got spidered and indexed. I now have good rankings on several important keywords, get lots of traffic from Google, and the site shows over 100 backlinks, but it still has PageRank 0 (even after the new update that is happening right now).

So PageRank seems a bit overrated?

viggen




msg:121712
 11:50 am on Jan 28, 2004 (gmt 0)

My site now has PageRank 5, after almost a year with PageRank 0.
Thanks again to everyone. :)
