| 11:45 am on Dec 27, 2003 (gmt 0)|
>>The domain expires end of january and well, I need to make a decision.
If you're getting good Inktomi traffic why give it up, and the others as well? Why not renew the domain, keep it going for the others and do another site with new and different content and a different type of structure for Google.
>>rest (10 pages) are just with url no description
The one time I've personally seen that is when a site was moved and accidentally slipped out, but there could be other reasons, especially if you're being visited regularly.
Something isn't right, have you checked to see if any other sites out there are redirecting to you without your knowledge, or have duplicated your content on any other domains?
| 2:22 pm on Dec 27, 2003 (gmt 0)|
If this site is doing well in Inktomi, it wouldn't make sense to give it up at this point given the likelihood that Yahoo will soon switch to displaying Inktomi results.
As Marcia mentioned, making a different site optimized for the "new Google" is what many people are now considering. It's a shame, but it's also the unfortunate consequence of Google's latest algo.
| 2:32 pm on Dec 27, 2003 (gmt 0)|
Are you linking to a "bad neighborhood"? (I think that means participating in a link farm.)
| 2:38 pm on Dec 27, 2003 (gmt 0)|
|Are you linking to a "bad neighborhood"? (I think that means participating in a link farm.) |
No, definitely not.
Thanks for all the other answers.
About making a new site just for Google: that would then be duplicate content, right? The last thing I need is for the other search engines to ban me for doing that.
| 2:58 pm on Dec 27, 2003 (gmt 0)|
I have a similar question along the same lines. I am putting together a site specifically for the UK market, as some of the messages are distinctly different; however, there is a lot of content which would be the same.
To what extent do you need to change things to avoid a duplicate content penalty:
And of course content!
As a side point I have the opposite problem of doing well in Google but terrible in all the others!
| 4:45 pm on Dec 27, 2003 (gmt 0)|
I have a site with the exact same symptoms. It's had me scratching my head for the last 6-8 months.
| 5:02 pm on Dec 27, 2003 (gmt 0)|
|To what extent do you need to change things to avoid a duplicate content penalty |
You have just answered your own question :)
| 5:20 pm on Dec 27, 2003 (gmt 0)|
I know Mr G says that you shouldn't do this, but I had the same problem. Answer: submit EACH page of your site. Problem over.
| 5:29 pm on Dec 27, 2003 (gmt 0)|
Or to save time, have a sitemap page and submit that. That's what we do with our ecommerce sites; it should be crawled by Mr G no problem.
| 5:38 pm on Dec 27, 2003 (gmt 0)|
@bekyed and Essex_boy
Done that. I submitted the sitemap about 6 months ago; Google doesn't care, the sitemap is not listed.
| 6:09 pm on Dec 27, 2003 (gmt 0)|
Sticky me the URL.
| 7:10 pm on Dec 27, 2003 (gmt 0)|
When I said "to what extent", I know that each page has to have some change, but to what extent?
| 11:05 pm on Dec 27, 2003 (gmt 0)|
Sorry J Snow,
Every SEO worth his/her salt knows that it is trial and error trying to rank high.
There is no magic formula, as the SERPs indicate: some keyword-dense pages rank high, some pages with no keywords beat high-ranking pages, some use hidden text, etc.
Follow Brett's advice on building pages and you won't go far wrong.
| 11:07 pm on Dec 27, 2003 (gmt 0)|
The link to Brett's guide is here.
| 12:49 am on Dec 28, 2003 (gmt 0)|
If it's the site in your profile, all pages are PR0.
Probably you're linking to an external penalized site.
| 1:57 am on Dec 28, 2003 (gmt 0)|
|Every SEO worth his/her salt knows that it is trial and error trying to rank high. |
Sorry bekyed, I'm afraid I'll have to respectfully disagree with that as a blanket statement. First off, not everyone around is a professional SEO, and there are certain things that, given enough time, can and do reliably contribute to high rankings. While it's true that a certain amount is trial and error, individuals who have done enough testing over time end up with mileage and perceptions that are not necessarily at the same level as others'.
The point is that it is not ALL trial and error, though for some individuals it's more so than for others. Also, there are certain things that aren't even subject to trial and error, which in some cases can be a perilous route to take; the safer way is taking calculated risks based on known probabilities.
|linking to an external penalized site. |
Gus, one or even a few wouldn't do it if there are enough offsetting factors, like a variety of good inbound links and a variety of outbound links.
Getting back to the specific issue, there might just be something else involved.
viggen, there's another domain with the exact same URL, even with the .com, except that a two-letter country code follows it. No connection, no funny stuff, and it's in another language. But the presence of the .com in the URL (as opposed to being like co.uk) may possibly be causing Google some type of error or confusion in including your site.
This is why those papers out there are so fun and exciting to read. Check this out in Hilltop: A Search Engine based on Expert Documents [cs.toronto.edu]:
|2.1 Detecting Host Affiliation |
We define two hosts as affiliated if one or both of the following is true:
They share the same first 3 octets of the IP address.
The rightmost non-generic token in the hostname is the same.
We consider tokens to be substrings of the hostname delimited by "." (period). A suffix of the hostname is considered generic if it is a sequence of tokens that occur in a large number of distinct hosts. E.g., ".com" and ".co.uk" are domain names that occur in a large number of hosts and are hence generic suffixes. Given two hosts, if the generic suffix in each case is removed and the subsequent right-most token is the same, we consider them to be affiliated.
E.g., in comparing "www.ibm.com" and "ibm.co.mx" we ignore the generic suffixes ".com" and ".co.mx" respectively. The resulting rightmost token is "ibm", which is the same in both cases. Hence they are considered to be affiliated. Optionally, we could require the generic suffix to be the same in both cases.
The affiliation relation is transitive: if A and B are affiliated and B and C are affiliated then we take A and C to be affiliated even if there is no direct evidence of the fact. In practice some non-affiliated hosts may be classified as affiliated, but that is acceptable since this relation is intended to be conservative.
Not saying this is being used for certain, but there has to be some means to identify affiliation, which would be the unique rightmost token in the domains concerned. Could they possibly be considered affiliated in this particular case, even though there isn't the remotest connection between the two?
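The affiliation test quoted from the paper can be sketched roughly like this. A minimal sketch only: the generic-suffix set here is hand-picked for illustration, whereas the paper derives generic suffixes from their frequency across a large collection of hosts.

```python
# Sketch of Hilltop's host-affiliation test (section 2.1 of the paper).
# GENERIC_SUFFIXES is a hypothetical, hand-picked illustration set; the
# real system treats a suffix as generic when it appears in many hosts.
GENERIC_SUFFIXES = {"com", "org", "net", "co.uk", "co.mx", "com.tw"}

def rightmost_nongeneric_token(host: str) -> str:
    """Strip the longest matching generic suffix, then return the
    rightmost remaining token of the hostname."""
    tokens = host.lower().split(".")
    # Try the longest suffixes first so "co.uk" wins over just "uk".
    for i in range(len(tokens)):
        suffix = ".".join(tokens[i:])
        if suffix in GENERIC_SUFFIXES:
            return tokens[i - 1] if i > 0 else ""
    return tokens[-1]

def affiliated(host_a: str, host_b: str,
               ip_a: str = None, ip_b: str = None) -> bool:
    """Two hosts are affiliated if they share the first 3 octets of
    their IP addresses, or if the rightmost non-generic token of each
    hostname is the same."""
    if ip_a and ip_b and ip_a.split(".")[:3] == ip_b.split(".")[:3]:
        return True
    return rightmost_nongeneric_token(host_a) == rightmost_nongeneric_token(host_b)
```

By this test, "www.ibm.com" and "ibm.co.mx" come out affiliated (both reduce to "ibm"), and so would a .com site and its unrelated .com.tw namesake, which is exactly the scenario being wondered about here.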
Let's look at these:
And then at this:
If you look at the backlinks for that last one (for which the PR is 9, not 10, incidentally), it is showing the backlinks for www.google.com; obviously in their case they're aware of the affiliation. But no way are those sites pointing to Google's .com.tw domain.
Again, not saying that Google is using this in the particular instance we're now discussing, but given the possibility, even if it's vague, I think I'd be inclined to write to Google again and ask them to take a look specifically at this particular issue. It certainly can't hurt.
I'm inclined to suspect that this may be a possibility in this case, well worth looking into.
| 5:34 am on Dec 28, 2003 (gmt 0)|
Thanks, Marcia, for looking deep into my site :)
Yes, I was aware of the Taiwanese site, and since it has PageRank, looks like a "normal" business site, and the other search engines didn't seem to get confused, it never crossed my mind that this could be the problem.
If so, what else can I do (hoping that GoogleGuy jumps in and fixes it right away =P) besides contacting the Google technicians, which I already did?
| 11:28 am on Dec 29, 2003 (gmt 0)|
viggen, it's kind of a shot in the dark, I think we all tend to grope and grasp when there doesn't seem to be an answer.
But I definitely wouldn't abandon the domain of a going site. A very spiderable second site with a different type of structure and format seems like a sound solution.
I'm seriously curious whether this situation would exist if the site were not a .com but a .net or .org. Technically, what I was looking into shouldn't be happening, especially considering the language and geographical difference, but I still can't help wondering.
| 9:35 pm on Dec 29, 2003 (gmt 0)|
Yeah Marcia, I won't give up yet, but it is bad not to know. If only I knew the reason, I could make changes; as it is, I guess I have to wait longer (hopefully not another year).
At least being one of the few sites in the world that gets more referrers from WiseNut than from Google makes me kinda special. =P
I hope Google will get it right.
thanks for all your help,
| 7:10 am on Jan 11, 2004 (gmt 0)|
Just thought to give you guys an update.
After I contacted the Google technicians and they seemed to fix whatever the problem was, I got spidered and indexed, have good rankings on several important keywords now, and get lots of traffic from Google. The site now shows over 100 backlinks, but still has PageRank 0 (even after the new update that is happening right now).
So PageRank seems a bit overrated?
| 11:50 am on Jan 28, 2004 (gmt 0)|
My site now has PageRank 5, after almost a year with PageRank 0.
thanks again to everyone. :)