1. Use Google Sitemaps to select your "Preferred Domain" to eliminate the duplicate content issues with domain.com and www.domain.com.
2. Upload a fresh sitemap so Google takes a look at your site.
3. Get rid of your duplicate meta description tags, and replace them with unique tags for every page.
4. Get rid of all links to any index.html file on your site. www.domain.com/index.html should be changed to www.domain.com/ and www.domain.com/subfolder/index.html should be changed to www.domain.com/subfolder/
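If the site is a pile of static files, step 4 can be semi-automated. Here's a rough Python sketch (standard library only); the href-attribute pattern and the index.html filename are assumptions about your markup, so check the output before uploading anything:

```python
# Sketch: rewrite internal links that point at index.html so they use the
# bare directory form instead (step 4 above).
import re
from pathlib import Path

# Matches href="…/index.html" (optional directory prefix, then the filename).
INDEX_LINK = re.compile(r'(href=")([^"]*/)?index\.html(")', re.IGNORECASE)

def normalize_links(html: str) -> str:
    """Rewrite href="…/index.html" to href="…/" (bare "index.html" becomes "./")."""
    def repl(m):
        prefix = m.group(2) or "./"
        return m.group(1) + prefix + m.group(3)
    return INDEX_LINK.sub(repl, html)

def normalize_site(root: str) -> int:
    """Apply the rewrite to every .html file under root; return files changed."""
    changed = 0
    for page in Path(root).rglob("*.html"):
        original = page.read_text(encoding="utf-8", errors="ignore")
        fixed = normalize_links(original)
        if fixed != original:
            page.write_text(fixed, encoding="utf-8")
            changed += 1
    return changed
```

Run it against a local copy first; a naive regex like this won't catch links built by scripts or spread across attributes.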
That's what I've done this week, and I've seen big results. I've yet to remove most of my duplicate meta descriptions, focusing just on the pages that were big sources of traffic before.
I lost 90% of my traffic on June 28th, regained about half on July 27th, and then lost it all again on August 17th.
After these changes, I'm up to about 70% of my pre-June 28th traffic, and at 100% of my August 17th traffic. The big difference is that I never regained any of my Google Images traffic on July 27th. But that's garbage hits anyway. All my revenue generating traffic has returned!
Thanks to everyone who gave me advice. I couldn't have done it without you! I hope this helps the rest of you still stuck in Google's mess.
I say hopefully because google moves the goal posts all the time. What is working today may well be broken tomorrow. It's like that with google. Just one big moving sticking plaster.
The true test is not to come back in a Data refresh, but to stay there through the next one.
(Not to rain on your parade, but I have been see-sawing for more than a year now with every refresh)
If all of your internal pages link back to /index.html then that will be the one with Pagerank showing.
However, Google usually prefers to list www.domain.com/ and it will be without PageRank, or only have PageRank from some of your incoming links.
Combine the PR by linking only to / or to www.domain.com/ each time.
The trailing / is also required here.
If you can, also add the 301 redirect from non-www to www. That will help as well.
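If you want to sanity-check that redirect without waiting on your host, here's a rough Python sketch using only the standard library. example.com is a placeholder, and the exact target URL the helper function expects is an assumption about your setup:

```python
# Sketch: confirm that the non-www hostname answers with a single 301
# pointing straight at the www hostname.
import http.client

def fetch_redirect(domain: str, path: str = "/") -> tuple[int, str]:
    """HEAD http://domain/path without following redirects; return (status, Location)."""
    conn = http.client.HTTPConnection(domain, timeout=10)
    try:
        conn.request("HEAD", path)
        resp = conn.getresponse()
        return resp.status, resp.getheader("Location") or ""
    finally:
        conn.close()

def is_canonical_301(status: int, location: str, domain: str, path: str = "/") -> bool:
    """True when the response is a permanent redirect to the www URL."""
    return status == 301 and location == f"http://www.{domain}{path}"

# Usage (needs network access):
#   status, location = fetch_redirect("example.com")
#   print(is_canonical_301(status, location, "example.com"))
```

A 302, a redirect chain, or a redirect that drops the path are the usual ways hosts get this subtly wrong.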
But there's just no guarantee that no one will link to you in a way you wouldn't want them to.
On another site my friend runs, the www version had a PageRank of 5, while the non-www had a PageRank of 4. The site also had a mirror on a different domain, with no redirect to the original. That domain was also PageRank 4, both www and non-www.
With the September 30th update I checked these domains (to see if there was an update at all, or if it was only our site) and the non-www version of the main domain is now PR0, down from 4; the mirror site is PR3, down from 4; and the non-www version of the mirror site is PR0, down from 4.
Meaning you will face the canonical issue and the duplicate content problem at some point, regardless of whether you were affected before.
... thanks to g1smd, since the 26th even these sites point to the one and only version, www.example.com. The duplicate meta tags still need to be fixed, though. Perhaps a tad too late.
And you know... I've seen the stats.
With all these changes hanging in the air...
No change in traffic. (Yet?)
Yep. That is a visible improvement. The site has to do better with the fixes in place. I cannot see a way for it to fail to do so. Both "omitted" and "supplemental" are trying to tell you something [webmasterworld.com].
For well over a year traffic to pages in our site that related to country information for the particular country we promote dropped to near zero.
We sat back and assumed that Google was doing things and we would recover. So we have been adding about 5 pages per day, hand done, unique. Our inbound links have grown naturally.
Here is a synopsis of what one expert said when I asked for help.
“I could not find your website in Google search results even when searching on its strong points.
Even with vague keywords that should display your website at a top position, your site is, in some cases, nowhere to be found.
This below proves that Google has demoted your website;
If you do a search for; #*$!#*$!
I would have expected your website to show at the top in Google, or at least on the first page. But it does not.
Other websites that are totally useless to an end user are presented by Google instead, such as the page below. It has a simple link pointing to your website and is otherwise nearly useless to an end user.
Why would an end user want to use Google only to be sent to another directory or another set of search results? It is pointless. Why use Google in the first place if Google relies on other directories or search engines for such a vague keyword as #*$!#*$!, when your website is the best equipped for that combination of keywords?
I cannot see anything really wrong about your website that should warrant it being so low in the rankings and I did not detect any spamming or tricks on your website.
Google has for some reason known only to itself demoted your website in rankings.
Google has tanked your website for no apparent reason.
I can't find anything to fault about your website. I even looked at headers etc and found nothing that looks suspicious or anything done wrong.
PS: like I said, you have a nice site; nothing really wrong to warrant the things I saw. Simple links pointing to you are ranking long before your website. This is wrong of Google.“
Now I offered to pay this man for help. He declined pay saying my site is clean and he could do nothing.
So what to do? Who knows. Follow the rules and get screwed by Google.
After making several public comments I was contacted by Google asking which domain I was saying was penalized. After I told them, I didn't hear anything for a long time, so I finally took matters into my own hands.
On a whim I started looking at sites ranking around my site which were similar to mine. So everything ranked 950 and above for my primary keyword. What I found is that they all looked like my site -- lots of internal links with high keyword density. After checking against some competitors who weren't penalized I found that my keyword density was slightly higher than all the rest.
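For anyone wanting to run the same kind of comparison, here's a rough sketch of a density check. The single-word keyword, the naive tokenizer, and the formula (keyword occurrences divided by total words) are all my simplifying assumptions, not anything Google has published:

```python
# Sketch: crude keyword-density estimate for comparing one page's text
# against competitors' pages.
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the fraction of words in text matching keyword (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)
```

Strip the HTML tags first (only the visible text matters for this comparison), and treat the numbers as relative, not as any kind of known threshold.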
It took a full day but I neutered my site (more like castrated!) with regard to the density of my primary keywords. I also removed a bunch of self-page links (the cookie bar page had a link to itself) and fixed some other minor issues.
This all happened around the 27th-28th and by the 29th I was starting to see my listings reappear in different data centers. They're appearing on about 70% of the datacenters now and spreading more each day.
So, was it my tweaking that removed a Google penalty, or was it Google doing an algo-change data-push thing? No idea; I'm just glad for the traffic again.
Oh, Google did confirm that I didn't have a "manual penalty" against my site and that I hadn't broken any of their guidelines. Too bad they didn't tell me more about the automatic penalty!
For what it's worth...
Where possible, it's often helpful to get someone who knows how to check all those alternate URLs to look over your site.
Simply reading about this in the last 3 months on this forum has given me the tools to fix two sites of mine, both duplicate listings and rankings.
There are signs that all supplementals have disappeared following a tidy-up of meta titles, descriptions, and body content. That was about 3-4 weeks ago now.
Overall, the results on the above site appear to have some filtering applied, which is putting the pages near the bottom of the SERPs. I haven't figured out what's doing this. PR, tags, descriptions, links, and content all look better compared to the sites above us.
Our other sites, which have gone through a similar revamp at similar times, have not been fully indexed. So if they are to return, we have a way to go, per the above.
In my case I find 100 pages of results that list pages in my site. Each is unique, hand done, and not all use the same templates, yet all, I am told, are considered duplicate content.
I use a Google-provided site search engine to check for pages added in the last month and find they are not indexed. That is an easier way to check than to dig through 100 pages of results.
If anyone, including google, can tell me why a few thousand unique pages are considered duplicate content I am all ears. Maybe my problem can help others.
Have you tried a site:domain.com search to see what is actually indexed?
This is frustrating. It varies depending on whether I use www.google.co.uk, www.google.com, www.google.co.za, www.google.com.au, etc.
One thing I noticed - the most accurate results for a site appear to come from the regional Google corresponding to a regional domain, i.e. www.google.co.uk for site:domain.co.uk
[edited by: Whitey at 12:01 am (utc) on Oct. 5, 2006]
For almost a year one of our sites was down about 50% in traffic. Our hosting company assured us that all the 301 etc. ad nauseam issues were taken care of.
Parties on this board looked at my site and told me it was clean. Everything was Google's fault.
In spite of what they said, my html assistants thought we had a duplicate content issue. We were told we were wrong.
I asked for people to sticky-mail me if they were willing to help. Before my request was deleted, one kind fellow, who wishes to remain anonymous, volunteered to help.
Unlike everyone else, he quickly said:
You have a duplicate content problem.
You have one site PR5 and a duplicate PR0.
Your 301 is not properly set up.
Google has not indexed some of your pages, for some reason, but that sometimes happens between data refreshes.
He made some other comments about things we were already working on.
People should stop complaining about Google and look hard at their own site.
We found that our hosting company, despite assuring us things were properly set up, did not have the 301 properly configured.
We are lucky. I wonder how many others were hurt by this host. And how many others out there are being told that their sites are properly done while a quick check by someone who is not pushing his own agenda or theory may show otherwise.
Word of advice. Get more than one opinion and try to find what is wrong with your site. If you are told that it is Google's fault, don't walk away, RUN!
It is only because I listened to my plain-vanilla student HTML programmers that I did not give up, and found someone who really looked beyond his own agenda.
That's the spirit - I have a site which recovered recently, but I did some serious page-by-page checking (painful on a site with thousands of hand-crafted pages) and found many problems, like duplicate descriptions, identical titles, etc., which had crept in through oversight.
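If you have a local copy of the site, a quick script can surface those identical titles and descriptions without checking page by page. This is just a sketch; the regex-based parsing and the flat .html file layout are assumptions about your setup:

```python
# Sketch: find pages that share a <title> or meta description.
import re
from collections import defaultdict
from pathlib import Path

TITLE = re.compile(r"<title>(.*?)</title>", re.IGNORECASE | re.DOTALL)
META_DESC = re.compile(r'<meta\s+name="description"\s+content="([^"]*)"',
                       re.IGNORECASE)

def find_duplicates(pages: dict[str, str]) -> dict[str, list[str]]:
    """Map each duplicated title/description to the pages that share it."""
    seen = defaultdict(list)
    for path, html in pages.items():
        for pattern in (TITLE, META_DESC):
            m = pattern.search(html)
            if m:
                seen[m.group(1).strip()].append(path)
    return {text: paths for text, paths in seen.items() if len(paths) > 1}

def scan_site(root: str) -> dict[str, list[str]]:
    """Read every .html file under root and report shared titles/descriptions."""
    pages = {str(p): p.read_text(encoding="utf-8", errors="ignore")
             for p in Path(root).rglob("*.html")}
    return find_duplicates(pages)
```

Anything the script flags is a candidate for a rewrite; real-world attribute ordering and templating may need a proper HTML parser instead of regexes.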
My site is back with heavy traffic, but I am working on everything I think is not perfect and then will wait for a refresh or two to see the effects.
And yes, getting good advice is the toughest part. You were lucky.
Sorry you had to go through that - [ it often happens to me! ] - just ask them "why", "what", "how", "when" - that should shake out some truth. And it should end up in an understandable technical framework that a novice can follow.
Although Google probably has many bugs, we can only assume it works, and that what we do requires careful technical adjustment. Have a look over here: [webmasterworld.com...]
[edited by: Whitey at 12:11 am (utc) on Oct. 7, 2006]
This thread is now sort of without focus on the opening topic, especially because the details of these fixes are being discussed in several other threads. Also, at least for the moment, the Google search operators used to do the relevant research are not behaving in the same fashion as they were just a short time ago.
So I'm locking this particular discussion. See you around the boards.