| 3:08 am on Mar 16, 2006 (gmt 0)|
are they targeting sites with sitewide templates? I'm curious to see if there is a connection
| 3:26 am on Mar 16, 2006 (gmt 0)|
|are they targeting sites with sitewide templates? |
If they are going to target websites built properly then it's time to switch careers.
| 3:36 am on Mar 16, 2006 (gmt 0)|
I am sure it has nothing to do with templates; the one site of ours that was hit by this uses the same well-designed template as a couple of our other sites. This is something else.
| 3:42 am on Mar 16, 2006 (gmt 0)|
I really want to ask the people suffering supplemental hell about the Googlebot/Mozilla activity: is it crawling your sites more, or less?
Googlebot crawls my site for only 20-30 hits and then leaves. What is going on with my site? Is it banned, or do I just have to wait for Googlebot to come back around?
Thanks in advance...
| 4:23 am on Mar 16, 2006 (gmt 0)|
|are they targeting sites with sitewide templates? |
Mr. Cutts said the following during the Q&A of the Duplicate Content Issues session at the most recent SES:
|there is nothing wrong with creating a template, but if you aren't adding useful content it's going to end up in the ghetto/bad neighborhood with lots of other 'useless' sites |
| 11:25 am on Mar 16, 2006 (gmt 0)|
Hang on a minute,
Templates simply form the basic structure of the page ensuring that layout, headers & footers can be branded to produce a corporate feel to most good quality sites.
Sometimes pages with very useful content, or links to forms and other useful content pages, do not need regular updating to stay current and highly relevant to some searches!
If a page is designed as a "Widgets Centre" and links to all the various widget tools and services, why would it need new content? It would always be the best page for Widgets.
This also applies to most major homepages. What is Matt saying....
| 11:29 am on Mar 16, 2006 (gmt 0)|
>>>>What is Matt saying....
He is saying that the page basically has to have some content of its own (not sure what percentage), not just a template. He says "useful" content, so perhaps content not repeated elsewhere on the site or the web.
"Adding" should not be read as constantly adding new content.
| 1:59 pm on Mar 16, 2006 (gmt 0)|
Read that way it makes more sense!
| 2:01 pm on Mar 16, 2006 (gmt 0)|
>>>>What is Matt saying....
We may be talking about two different templating issues. The most common is a template that provides the look/feel/navigation for the site.
Another kind provides the content within that template; the grossest example is a scraper site that uses a template to dynamically generate page content from scraped data.
A more common example is a shopping catalog that just reproduces a manufacturer's product data feed that many other shopping sites also carry.
| 2:29 pm on Mar 16, 2006 (gmt 0)|
Google isn't taking the supplemental problem seriously. More than two weeks have passed and no improvement is to be seen...
This is affecting many large sites. And helping "copycat" sites.
I know when Google will take it seriously: when they start losing users to other search engines. It will be too late then.
Remember Internet Explorer? Firefox?
| 2:38 pm on Mar 16, 2006 (gmt 0)|
I think this is right on topic. As of two days ago Google dropped 75% of my links, and I think it's a template issue too.
Someone mentioned that the percentage of difference in the content of each page might be the issue. I was wondering if anyone can clarify that.
What percentage difference does a page need in order not to be considered duplicate content by Big Daddy?
Let's say in my case the duplicate content is 75% and the difference in content is 25%.
I really need help on this one before I make changes to my pages.
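Nobody outside Google knows whether there is a fixed threshold, but if you want to measure the overlap between two of your own pages, here is a rough sketch using word-shingle Jaccard similarity (a standard technique in duplicate-detection literature; the function names and the shingle size k=5 are my own choices, not anything Google has published):

```python
def shingles(text, k=5):
    """Return the set of k-word shingles (overlapping word windows) in text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(page_a, page_b, k=5):
    """Jaccard similarity of two pages' shingle sets: 1.0 = identical, 0.0 = no overlap."""
    a, b = shingles(page_a, k), shingles(page_b, k)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)
```

Running this over the visible text of two detail pages (template boilerplate included or stripped out, depending on what you want to measure) at least tells you how similar the pages look to a naive comparison, even if Google's actual algorithm is far more sophisticated.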
| 2:44 pm on Mar 16, 2006 (gmt 0)|
|Yep I know when Google will take it seriously: When they start losing users to other search engines - will be too late then. |
I'm thinking that they're already losing users to other search engines. Over the last few days my traffic has started returning to its previous levels, but those folks are not coming from Google. Yahoo is sending 10 times the traffic, MSN is sending 3-4 times the traffic and ASK is at a tie. Google has almost fallen into the "Other" source of traffic.
| 2:54 pm on Mar 16, 2006 (gmt 0)|
I am not sure that running a template has any negative effect on Google.
My site runs primarily on a number of templates, and I am getting steady growth in Google traffic. If the URLs are distinct and the content within the templates is unique, then there shouldn't be an issue with templates, especially if they are bespoke rather than coming from a general template [often seen as asp templates].
| 3:03 pm on Mar 16, 2006 (gmt 0)|
If Google is starting to penalize the use of templates they can basically forget it, because 80% of the web uses templates (forums, blogs, CMSs etc.).
This is, IMHO (sorry for the expression), b*llsh*t...
They simply have a major database problem but don't have the courage to admit it.
Since university sites are also affected by this "supplemental"/incomplete-indexing stuff, I doubt it is related to trust.
| 3:09 pm on Mar 16, 2006 (gmt 0)|
Eazygoin, Thanks for your reply.
Perhaps I need to explain myself better. My site is a download site; each download has its own detail page, and each detail page has five links to the newest downloads in that download's category. This can be seen as duplicate content, since all the downloads in a given category share the same five "newest" links. That's my dilemma. :)
These links take up 75% of the page content, and the other 25% belongs to the particular download's details.
So it's not template related; I should have rephrased my post. It's more about the actual content.
So, is that too high a percentage for the "newest" links to take up on each detail page?
| 3:25 pm on Mar 16, 2006 (gmt 0)|
|Perhaps i need to explain my self better, my site is a download site, and each download has its detail page and in each detail page there are 5 links to the newest downloads of the particular download category. This can be sein as dupplicate content since all the downloads in that particular category have the last 5 newest downloads. so thats my dilema :) |
I am a little confused by this. But if you are saying that the 5 newest download links are different on each page, then that's fine. If you are saying the same 5 download links are on every page, then that would be construed as duplicate.
| 3:40 pm on Mar 16, 2006 (gmt 0)|
Although the above may be duplicate links, I really doubt G would be penalising sites for stuff like this. Take blogs, for example: each page has a blogroll and links to entries grouped by month. Does G penalise all blogs? No. I think duplicate content detection is much more advanced than simply flagging the same links on each page.
| 3:43 pm on Mar 16, 2006 (gmt 0)|
|If you are saying the same 5 download links are on every page, then that would be construed as duplicate. |
That's exactly my issue, so I guess it's time to take them out.
Is there a fixed percentage that Google uses, that anyone knows of, above which a link is considered duplicate?
| 4:01 pm on Mar 16, 2006 (gmt 0)|
A new statement about the current development would be very helpful.
It would also be very helpful if you could answer the following questions:
1. Is the supplemental hell in every case a bug that will be solved? Or can it be that it is intended, long-term, for some pages?
2. When can we expect a complete solution to this problem?
| 4:03 pm on Mar 16, 2006 (gmt 0)|
security 56 >>
One has to get this in context, as I am assuming you have written content that is the same, referring to those download links. Is there other content on the pages? Are there similar images to click on for the downloads? How many pages are you talking about, in comparison to the total number of pages?
The greater percentage of duplicate content you have on subsequent pages, the more likely you are to be penalised, and maybe even banned at some point.
| 4:12 pm on Mar 16, 2006 (gmt 0)|
Could anybody check if their sites are ok on [22.214.171.124...]
My domains are all up to date on this DC and completely indexed.
| 4:19 pm on Mar 16, 2006 (gmt 0)|
126.96.36.199 is a non big daddy DC.
| 4:24 pm on Mar 16, 2006 (gmt 0)|
We are out of Supplemental land on that DC, only URLs after the 110th out of 579 pages though...
| 4:31 pm on Mar 16, 2006 (gmt 0)|
Google can't go too far on the dupe thing, IMO, without hurting legitimate vendors. Virtually all use templates, and their products sometimes have just a few lines of description. Add the categories, navigation links, maybe a featured product and another promo, and you've got yourself a problem...
This dupe thing is weird, since only the homepage is not supplemental... I mean, we have many other pages that are 95% different, yet they are still supplemental. Maybe Google penalizes the domain after x% of its pages are penalized? I don't know; I just know that Google is screwing this up.
| 4:33 pm on Mar 16, 2006 (gmt 0)|
Yes, I know... but on all the Big Daddy DCs I can access, you have "basura" (read: junk) and lots of outdated stuff.
On the DC mentioned above, at least, my domains are all complete and the current cache is applied. This means the data Googlebot is fetching is applied to some DCs, but not to BD DCs, as those carry old to very old caches.
| 4:38 pm on Mar 16, 2006 (gmt 0)|
travelin cat: Thanks for the note.
It would be nice to hear if others experience the same. My domain is completely there on this DC, but as I have mentioned earlier, I am not suffering from the "supplemental" problem, just missing pages on BD DCs.
| 4:58 pm on Mar 16, 2006 (gmt 0)|
Eazygoin, yep, I have new content on each detail page, but it takes up only 25%, in contrast to the "latest 5 downloads" content, which takes up 75% of the page and includes each new download's title and half of its description.
But this is done on all the download sites, and I don't see them having the issue I do, though their "new" links do take up a smaller percentage.
lol, this is giving me a headache. I'm just going to remove them, lol.
| 5:33 pm on Mar 16, 2006 (gmt 0)|
It sounds like you may have the wrong balance. I don't see why you can't have a small box in the corner of the page with "latest downloads", but it would probably be to your advantage to have the majority of the page as unique content.
Google is not out to hit sites that have similar layout throughout, as long as the content is different. For example, a categories or index type feature isn't going to get you penalised, in my opinion, as long as the internal links are there for a legitimate reason.
| 5:40 pm on Mar 16, 2006 (gmt 0)|
Dawg, having the same issues. On the DC you posted I have a "normal" number of listings, and browsing through the first 1K, none are supplemental. On the vast majority of other DCs, which I guess are converting to BD, I'm seeing 16K pages indexed, compared to 3.3 million normally. As of two days ago, and for all DCs, my SERPs are terrible as well (like position 90 out of 300 results). Traffic has dropped around 90% since then.
| 5:42 pm on Mar 16, 2006 (gmt 0)|
|We are out of Supplemental land on that DC, only urls after the 110th out of 579 pages though... |
As the DC is not Big Daddy you were never in Supplemental land at that DC. So this cannot be seen as a fix.
| 12:26 am on Mar 17, 2006 (gmt 0)|
|As the DC is not Big Daddy you were never in Supplemental land at that DC. So this cannot be seen as a fix. |
Well I suppose that takes care of that glimmer of hope I had for a moment.
I have 2 problems and am hoping you guys might be able to give me some advice. I'm in the same boat with all of you, except I just took this in-house SEO position 3 months ago & now Google does this. Any feedback would be really appreciated.
1. My homepage is not in the index at all, nor is it in the Supps! The non-www version is in the supplementals, but I've had a 301 in place redirecting to the www-version for about 2-3 months. What should I take from this? Am I just the "unlucky" one to have the homepage out of the index as well?
2. Googlebot/Mozillabot has seemingly stopped crawling our site altogether. Most people here have reported a dramatic increase in crawling activity, but none here :(
We've got a mid-sized site (~150 pages) & just added 75 original content pages/articles which I thought would do the trick because we had gotten sooo close. The site is actually larger than most (now) in this industry space, apart from the lead-gen directory sites (mainly done by <edited>, but that's an entirely different post).
Since these 2 major factors seem to be different for us, I'm just looking for some answers...
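On the 301 in point 1, for comparison: a typical non-www to www redirect on Apache looks something like the following (a sketch only; "example.com" stands in for the real domain, and mod_rewrite must be enabled in the server or .htaccess context):

```apache
RewriteEngine On
# Send the permanent (301) redirect for the bare domain to the www version
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

If your redirect already matches this shape and has been live for 2-3 months, the non-www version lingering in the supplementals sounds more like an index-refresh lag on Google's side than a problem with your redirect.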
<No specifics about the space, please
See Forum Charter [webmasterworld.com]>
[edited by: tedster at 12:39 am (utc) on Mar. 17, 2006]
| This 265 message thread spans 9 pages: < < 265 ( 1 2  4 5 6 7 8 9 ) > > |