So: if your pages are still supplemental, feel free to write to sesnyc06 [at] gmail.com with the subject line of "stillsupplemental" (all one word), and I'll ask someone to check the emails out.
Hope that helps, and I'm glad that lots of people are seeing a full recovery,
GoogleGuy
Googlebot crawls my site for only 20-30 hits and then leaves. What's going on with my site? Is it banned, or do I have to wait for Googlebot to finish crawling other sites?
Thanks in advance...
Are they targeting sites with sitewide templates?
"There is nothing wrong with creating a template, but if you aren't adding useful content it's going to end up in the ghetto/bad neighborhood with lots of other 'useless' sites" - said during the Q&A of the Duplicate Content Issues session at the most recent SES.
Templates simply form the basic structure of the page, ensuring that layout, headers & footers can be branded to produce a corporate feel, as on most good-quality sites.
Sometimes pages with very useful content, or links to forms or other useful content pages, do not need regular updating to stay current and highly relevant to some searches!
If a page is designed as a "Widgets Centre" and has links to all the various widget tools and services, why would that page need updating with new content? It would always be the best page for Widgets.
This also applies to most major homepages. What is Matt saying....
He is saying that the page basically has to have some content (not sure what %) - not just a template. (He says useful content, so perhaps content that isn't repeated elsewhere on the site/web?)
"Adding useful content" should not be read as always adding - i.e. constant updating.
We may be talking about two different templating issues. The most common template is the one that provides the look/feel/navigation for the site.
The other kind provides the content within that template; the most gross example is a scraper site that uses a template to dynamically generate the pages' content from scraped data.
A more common example is a shopping catalog that just reproduces a manufacturer's product/data feed that is also provided by many other shopping sites.
This is affecting many large sites. And helping "copycat" sites.
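To make the feed-driven case concrete, here is a minimal sketch (Python, with invented product data and field names - not any particular site's code) of how such catalog pages get generated. Only the feed fields vary, so every shop publishing the same feed ends up with near-identical pages:

PAGE_TEMPLATE = """<html><head><title>{name} - Widget Shop</title></head>
<body>
<h1>{name}</h1>
<p>{description}</p>
<p>Price: ${price}</p>
</body></html>"""

# Stand-in for a manufacturer's product/data feed (invented example data).
feed = [
    {"name": "Blue Widget", "description": "A blue widget.", "price": "9.99"},
    {"name": "Red Widget", "description": "A red widget.", "price": "12.99"},
]

for product in feed:
    # Only the feed fields change between pages; the template and the feed
    # text are shared with every other shop using the same feed, which is
    # exactly why these pages look like duplicates to a search engine.
    print(PAGE_TEMPLATE.format(**product))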
Yep, I know when Google will take it seriously: when they start losing users to other search engines - and by then it will be too late.
Remember Internet Explorer? Firefox?
Someone mentioned that the percentage of difference in the content of each page might be the issue. I was wondering if anyone can clarify that.
What percentage difference does a page need in order not to be considered duplicate content by Big Daddy?
Let's say in my case the duplicate content is 75% and the difference in content is 25%.
I really need help on this one before I make changes to my pages.
Thanks
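For what it's worth, nobody outside Google knows the real threshold, but you can get a rough feel for how similar two of your pages are with a word-shingle comparison. A minimal sketch, assuming you have plain-text dumps of two pages - the 4-word shingle size, the Jaccard measure, and the file names are my own illustrative choices, not Google's method:

def shingles(text, n=4):
    # Break the page text into overlapping n-word phrases.
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def percent_duplicate(page_a, page_b):
    # Jaccard similarity of the two shingle sets, as a percentage.
    a, b = shingles(page_a), shingles(page_b)
    if not a or not b:
        return 0.0
    return 100.0 * len(a & b) / len(a | b)

# Hypothetical usage with two saved page dumps:
print(percent_duplicate(open("page1.txt").read(), open("page2.txt").read()))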
Yep, I know when Google will take it seriously: when they start losing users to other search engines - and by then it will be too late.
I'm thinking that they're already losing users to other search engines. Over the last few days my traffic has started returning to its previous levels, but those folks are not coming from Google. Yahoo is sending 10 times the traffic, MSN is sending 3-4 times the traffic and ASK is at a tie. Google has almost fallen into the "Other" source of traffic.
I am not sure that running a template has any negative effect with Google.
My site runs primarily on a number of templates, and I am getting steady growth in Google traffic. If the URLs are distinct and the content within the templates is unique, then there shouldn't be an issue with templates, especially if they are bespoke rather than coming from a general template [often seen as asp templates].
This is, IMHO (sorry for the expression), b*llsh*t...
They simply have a major database problem but don't have the courage to admit it.
As university sites are also affected by this "supplemental" / incomplete indexing stuff, I doubt it is something related to trust.
Perhaps I need to explain myself better. My site is a download site; each download has its own detail page, and on each detail page there are 5 links to the newest downloads in that particular download category. This can be seen as duplicate content, since all the downloads in that category carry the same 5 newest-download links. So that's my dilemma :)
Now, these links take up 75% of the page content, and the other 25% belongs to the particular download's details.
So it's not template related - I should have rephrased my post. It's more about the actual content.
So is that too great a percentage taken up by the newest-download links on each detail page?
Thanks again.
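If it helps, here's one rough way to sanity-check that 75/25 split yourself: count how many words of a detail page belong to blocks repeated on every page in the category, versus words unique to that download. The function and the idea of passing in the repeated blocks are hypothetical, just to show the arithmetic:

def boilerplate_split(page_text, shared_blocks):
    # shared_blocks: the text of the repeated "5 newest downloads" section
    # (and any other block that appears on every detail page).
    total = len(page_text.split())
    if total == 0:
        return 0.0, 0.0
    shared = sum(len(block.split()) for block in shared_blocks
                 if block in page_text)
    shared = min(shared, total)
    # Returns (shared %, unique %); e.g. (75.0, 25.0) matches the split above.
    return 100.0 * shared / total, 100.0 * (total - shared) / total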
Perhaps I need to explain myself better. My site is a download site; each download has its own detail page, and on each detail page there are 5 links to the newest downloads in that particular download category. This can be seen as duplicate content, since all the downloads in that category carry the same 5 newest-download links. So that's my dilemma :)
I am a little confused by this. But if you are saying that the 5 newest-download links are different on each page, then that's fine. If you are saying the same 5 download links are on every page, then that would be construed as duplicate.
A new statement about the current developments would be very helpful.
It would also be very helpful if you could answer the following questions:
1. Is the supplemental hell in every case a bug that will be fixed? Or could it be that this is intended long-term for some pages?
2. When can we expect a complete solution to this problem?
One has to see this in context, as I am assuming you have written content that is the same, referring to those download links. Is there other content on the pages? Are there similar images to click on for the downloads? How many pages are you talking about, compared to the total number of pages?
The greater the percentage of duplicate content you have across pages, the more likely you are to be penalised, and maybe even banned at some point.
My domains are all up to date on this DC (data center) and completely indexed.
This dupe thing is weird, since only the homepage is not supplemental... I mean, we have many other pages that are 95% different, yet they are still supplemental. Maybe Google penalizes the domain after x% of pages are penalized? I don't know; I just know that Google is screwing this up.
On the DC mentioned above, at least, my domains are all complete and a current cache is applied. This means that the data Googlebot is fetching is applied to some DCs, but not to the Big Daddy DCs, as these carry old to very old caches.
But this is something done on all the download sites, and I don't see them having the issue I do - though their newest-download links do take up a smaller percentage.
lol, now this is giving me a headache - I'm just going to remove them lol
It sounds like you may have the wrong balance. I don't see why you can't have a small box in the corner of the page with 'latest downloads', but it would probably be to your advantage to have the majority of the page as unique content.
Google is not out to hit sites that have a similar layout throughout, as long as the content is different. For example, a categories or index-type feature isn't going to get you penalised, in my opinion, as long as the internal links are there for a legitimate reason.