| 6:37 am on Aug 12, 2004 (gmt 0)|
Today I have been seeing that some sites now above mine have a good part of my content in their pages, hidden by different techniques; this is not new.
The new part is that those sites' pages are now above mine. That may be some new duplicate rule, or it may be because those sites are more single-topic than mine, so my copied content works better for them.
| 9:40 am on Aug 12, 2004 (gmt 0)|
>>>>the inner pages too have a PR, but the index page doesn't. Has anyone else witnessed the same?
This is a recent phenomenon, Copper. There is a thread about it somewhere, but you are not alone, and as long as you still hold your positions in the SERPs, don't panic.
| 10:01 am on Aug 12, 2004 (gmt 0)|
My site was ranked highly for widgets... a few days ago the backlinks dropped, and so did our position.
But get this: two weeks ago we decided to add a link in our footer back to the home page with 'cheap widgets', and then we lost ALL our rankings for that term...
The PageRank has not dropped as yet, but I think it will.
Well, to say the least, don't try it!
| 10:04 am on Aug 12, 2004 (gmt 0)|
My site was getting approximately 75 unique visits a day last month, but now only 15-20 :(
| 10:51 am on Aug 12, 2004 (gmt 0)|
My guess is that G is trying to save their SERPs from those thousands of nonsense pages generated by a single push of a button, pages that try to cover every possible combination of all the words in the English dictionary.
Like discussed in this thread: [webmasterworld.com...]
I see some success which is good.
Sorry for the webmasters affected, but G does not owe you anything; go and try to add enough uniqueness to each one of your pages.
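For what it's worth, the "single push of a button" pages described above can be pictured with a small sketch. Everything here is invented for illustration (the word lists, the template, the file names); it simply shows how crossing a modifier list with a product list stamps out near-identical doorway pages for every keyword combination, which is the kind of thing the poster is guessing G is now filtering.

```python
import itertools

# Illustrative only: cross every modifier with every product term to
# generate thin, near-identical "doorway" pages. Word lists and the
# template are made up for this example.
modifiers = ["cheap", "discount", "best", "buy"]
products = ["widgets", "blue widgets", "widget parts"]
template = "<title>{kw}</title><h1>{kw}</h1><p>Find {kw} here.</p>"

# One page per (modifier, product) combination, keyed by a file name.
pages = {
    f"{m}-{p}".replace(" ", "-") + ".html": template.format(kw=f"{m} {p}")
    for m, p in itertools.product(modifiers, products)
}
# 4 modifiers x 3 products -> 12 near-identical pages
```

Scale the word lists up to a dictionary and you get the "1000s of nonsense pages" from one button press.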
| 11:47 am on Aug 12, 2004 (gmt 0)|
Remember Brett's recommendation about adding one quality page of unique content a day? After one year you would have 365 pages of completely unique content that should generate substantial targeted traffic....
This is certainly more realistic for how most sites tend to grow... adding 100 or 1,000 pages a day is simply not realistic for unique content, unless one is bringing across some sort of archived unique content....
Google is simply trying to control the hideous onslaught of millions of pages that are designed to manipulate the SERPs...
Continue to develop content for your audience... make it specific and useful to their needs... let the spammers have their day... don't panic.
| 3:02 pm on Aug 12, 2004 (gmt 0)|
>>>>the inner pages too have a PR, but the index page doesn't. Has anyone else witnessed the same?
>>>>This is a recent phenomenon, Copper. There is a thread about it somewhere, but you are not alone, and as long as you still hold your positions in the SERPs, don't panic.
Hiya Copper and Cabbie...
I was the one who mentioned this happening above... the problem is it did affect the SERPs. Now I have my PR back to PR6, but my SERPs appear to be permanently damaged. How is it that I have a PR6 and not one of the pages now listed in front of me has a better PR?
I thought it was maybe backlinks, but that number is totally random among the ranks above me, so it's not that... so basically I am still clueless...
Is anyone not?
Also, on a separate note to what someone just posted...
Unique content I have; of the people above me, some do and some don't... I wish it were that simple. I add a page, or at least add a paragraph to one, every week, and have over 90 pages of unique content on the different types of services we offer... they have a common bond because of location and the overall type of service, but each page truly deals with its own subject within that service... how much more can I really do there without starting to talk about things I just don't have or do?
| 3:05 pm on Aug 12, 2004 (gmt 0)|
Also... in the past you were supposed to have a more general index page with a little of everything, and then link to pages that are more specific about each thing... this is what I have done, because it would be impossible to target all of the keywords properly on one page and get any kind of good rankings...
Now the index page seems to need to have it all for the other pages to even be seen.
| 3:09 pm on Aug 12, 2004 (gmt 0)|
I am afraid to change anything because I don't know if things will return to normal.
What's odd, on another note, is that there are still some searches where pages I haven't spent time "optimizing" (they have no alt tags and the keyword density is not great) are coming up number one.
| 11:04 pm on Aug 12, 2004 (gmt 0)|
I do not know what happened. I guess I am glad I did not quit my day job (I almost did), but my site lost 90% of the traffic and 90% of the revenue. I have paused all of my AdWords campaigns (a 10k-a-month budget) and I am sitting on the side hoping for a miracle. My site is clean with a lot of unique content (almost 100k pages indexed). I survived Florida and the others, and I think this one is the worst; I see a lot of garbage out there in front of me. I hope this one is just a fluke.
| 12:01 am on Aug 13, 2004 (gmt 0)|
I know I have said it a million times, but my competition hasn't moved; this has to be the key. Something in my site, and in whoever else's site fell, is different from the "staying" sites. I don't get it...
My question... I am a consultant; I am paid to handle this for someone else. What do I tell them? People familiar with the industry know these things happen and it's not necessarily our fault, but they don't. How does one explain it to the "unknowledgeable"?
| 1:19 am on Aug 13, 2004 (gmt 0)|
|How does one explain it to the "unknowledgeable"? |
This is the game; always has been, always will be.
Explain that **** happens (although I am sure they are aware of this).
| 1:49 am on Aug 13, 2004 (gmt 0)|
Ok, so what is really going on with Google then?
I still have a PR4. I have no cached images or pages in G anymore. My top 13 search terms are no longer posting for me. My backlinks increased in the last update.
G crawled a deep, deep page in a forum of mine with a php extension. The URL is huge. That is the only cached page, and that was done yesterday.
Traffic has declined drastically. I have corrected everything I believe could be questionable according to Google's TOS.
GoogleGuy, where are you? Please help us....
| 2:00 am on Aug 13, 2004 (gmt 0)|
Googleguy is dealing with politics and scandals, lol
| 2:02 am on Aug 13, 2004 (gmt 0)|
My PR was back on the index page for a day, and now it's gone again, and the small regain of my position became a further nosedive into SE oblivion. Meanwhile I couldn't be doing better in the Yahoo and MSN listings; I am at an all-time high, top three across most of my targeted keywords. Too bad they refer half the number of people that Google used to.
| 3:16 am on Aug 13, 2004 (gmt 0)|
"GoogleGuy, where are you? Please help us.... "
Whatever they did, they did to outsmart us. I doubt GG would tell us how to SEO our way to #1 in Google :). Not to mention the IPO quiet period...
| 3:40 am on Aug 13, 2004 (gmt 0)|
|GoogleGuy, where are you? Please help us.... |
Y'all might want to take a few seconds to click on that "Charter" link towards the top of this page. You might notice this section:
|In the meantime, here are some quick guidelines as to post types and phrases that might trigger an action or prevent a pre-moderated post from being approved: |
# pleas to specific users such as a mod or Google representative.
| 3:58 am on Aug 13, 2004 (gmt 0)|
Just read in a Reuters article:
Google hit by image problem:
"After a series of missteps, Google has gone from media darling to fallen angel with breathtaking speed."
That sounds like my log files!
| 4:04 am on Aug 13, 2004 (gmt 0)|
"My site is clean with a lot of unique content (almost 100k pages indexed)"
Are you serious? 100,000? What kind of content could that be? It takes professional writers years to put together a 300-page novel. Even decent web copy takes at least a little bit of time, even if just a few minutes to an hour per page.
I see VERY FEW sites online that offer genuine content, and by that I mean something that someone has actually put some thought into for the purpose of serving a target audience, versus simply trying to rank in a SERP.
No offense, but every time I hear that someone has just put up 3,000 pages, or runs 100,000 or 200,000 pages, I can't help but think that it is exactly the kind of stuff that Google doesn't want clogging up the SERPs, and I'm not surprised to hear that a site of this sort is being slammed in an algo change.
| 4:27 am on Aug 13, 2004 (gmt 0)|
I have trouble accepting that this is anything to do with duplicate content.
The sites I have built have just under 1,000 pages that have taken the last 18 months to build up by hand. These have been hit with a 70-80% reduction.
The sites I have that run from wholesaler databases (and thus other retailers have access to the content), and which are about 2,000 pages in size, have seen no decline at all.
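Several posts in this thread guess at a new "duplicate rule". Whether Google actually does anything like this is pure speculation, but for readers wondering how near-duplicate pages can even be detected at scale, the standard description in the information-retrieval literature is word-level shingling plus Jaccard similarity. The sketch below is only an illustration of that textbook idea, with made-up example pages; it is not a claim about Google's algorithm.

```python
# Illustrative sketch (not Google's actual method): break each document
# into overlapping w-word "shingles" and compare the sets with Jaccard
# similarity. Example pages are invented.

def shingles(text, w=3):
    """Return the set of w-word shingles in text."""
    words = text.lower().split()
    return {" ".join(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets (0.0 = disjoint, 1.0 = identical)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

page_a = "cheap widgets for sale in our widget store today"
page_b = "cheap widgets for sale in our widget shop today"

# One changed word still leaves most shingles shared, so the score is high.
sim = jaccard(shingles(page_a), shingles(page_b))
```

A threshold on `sim` (say, flag anything above 0.8 against an already-indexed page) is how a "duplicate filter" is usually imagined; a wholesaler-database site whose text also appears on other retailers' sites would score high against them.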
| 5:15 am on Aug 13, 2004 (gmt 0)|
You seem to be incredulous that any site could have 100,000 pages. I have one such site.
I won't tell you what mine is, but let me describe for you hundreds of such sites: newspapers. Every day, a decent-sized newspaper can put out hundreds of unique and very thoughtful web pages. Sure, it takes a staff of dozens to hundreds to do that, but, well, that's what they do. And after a couple of years of doing this, even a moderate-sized daily newspaper can easily put together 100,000 pages of content. And importantly, this is valuable content that is fully Google-worthy.
So, sure, you might be sitting in your bedroom dressed only in your underpants frantically trying to throw together another few pages for the web, but there are plenty of sites that genuinely have a lot of content, and they aren't playing any nefarious search engine games to get there.
| 6:00 am on Aug 13, 2004 (gmt 0)|
|"After a series of missteps, Google has gone from media darling to fallen angel with breathtaking speed." |
Or maybe it is simply the media power and connections of Google's competitors like M$ and Yahoo running a FUD campaign before their competitor's IPO?
| 8:22 am on Aug 13, 2004 (gmt 0)|
|you seem to be incredulous that any site could have 100,000 pages. i have one such site. |
You yourself pointed out earlier in this thread that you have a relatively spammy 200,000+ page site :-)
|i won't tell you what mine is but let me describe for you hundreds of such sites: newspapers. every day, a decent size newspaper can put out hundreds of unique and very thoughtful web pages. |
I do not think that any page of a relevant newspaper was dropped by Google in this algo change (nor do I think you are maintaining such a site).
You, and everyone else here claiming that a real content site of 100,000+ pages was dropped, please post the links so we can see with our own eyes that these are original sites worthy of the top ten!
| 8:47 am on Aug 13, 2004 (gmt 0)|
|you might be sitting in your bedroom dressed only in your underpants frantically trying to throw together another few pages for the web |
I'm glad this is a recognised dress code for webmasters! It can be embarrassing when you have to answer the door to the Jehovah's Witnesses at 2.30 still wearing your boxers, though!
| 8:53 am on Aug 13, 2004 (gmt 0)|
|so, sure, you might be sitting in your bedroom dressed only in your underpants frantically trying to throw together another few pages for the web |
That is very funny.
I agree, big doesn't necessarily mean spam or crappiness.
A well-constructed, DB-delivered page with lots of clever scripting and appropriately placed 'info plug-ins' can be very useful indeed.
Some people do big sites well, some don't. Those that do are likely to survive, and those that don't will be dropped or fall into obscurity, sooner or later.
I once got lazy with a domain. I used a template and sprinkled a mix of keywords here, there and everywhere. The pages were on topic and provided a means of purchasing the keyword-related product. It worked in the sense that it gave the user a laser-targeted route to what they required. It took me a week to script and research, and after 3 months it was doing really OK and earning me money. However, if I'm really honest, it was a crappy website in the sense that it didn't really offer anything that couldn't be found elsewhere, and those I was competing with were offering far greater added value in terms of well-researched supplemental information that went above and beyond what could possibly be required for the user's trip.
Anyway, the website in question doesn't rank for any of its keywords and income is a trickle of what it once was, whereas my ex-competitors are still there.
This game is the same as any other. If you don't work hard enough and keep on working at it, you'll end up losing.
| 9:47 am on Aug 13, 2004 (gmt 0)|
Well, it's not just these huge sites that are hit. My site is about 40 pages, each one done by hand. We are the only ones selling the product, and each page includes a long, detailed description of the items. Apart from one page that we were aggressively collecting links for, we were also hit on terms/pages that hadn't been touched in months.
| 12:00 pm on Aug 13, 2004 (gmt 0)|
Like many others, my two largest sites (about 20K pages) have gone from 1,000 unique visitors per day to about 225; however, my smaller, more content-driven sites have remained the same or gone up.
I know that Google owes us nothing, but my real beef is this: after Florida and Austin, my best-earning sites were hit badly, so I, probably like many others, made these sites much larger in an attempt to grab as much traffic as possible from less competitive keywords... a strategy that worked (until now!). I almost felt guilty at polluting the internet with thousands upon thousands of pages, but it was the only option.
My point is: what have Florida, Austin and this latest tweak really done to improve things, other than panic many webmasters into over-developing sites to try and claw back some traffic?
And the result of this latest tweak? Well, one possibility is to publish 10 more versions of the same site across different domains/servers, all different enough to avoid penalties... ridiculous! More internet pollution and more spammy results. Personally, I preferred Google before November 2003.
| 12:18 pm on Aug 13, 2004 (gmt 0)|
|however I almost felt guilty at polluting the internet with thousands upon thousands of pages, but it was the only option. |
|My point is what has Florida, Austin and this latest tweak really done to improve things other than panic many webmasters into over developing sites to try and claw back some traffic |
Again... this stems from having a weak business plan from the start.
What you're demonstrating is an endless cycle: I've lost ranking because Google changed the algo to deal with spam like mine, so now I'll spam even harder to get some of that traffic back, after which Google will be forced to do another algo change to deal with THAT spam... and over and over and over again.
The fact is... a good many of the webmasters crying about lost traffic are part of the problem that necessitates these changes in the first place!
Meanwhile... those of us who take a legitimate "path" and develop a business model/website that converts targeted traffic regardless of the source are forced to continually endure changes that invariably catch us in the "net" these algo changes throw (i.e. throwing the baby out with the bathwater), and we continue to suffer because of the unethical marketing practices of others.
.... I long for the day when Google and others begin to employ (as I think they must) a full-time "spam busting" department full of actual humans who would manually and permanently review and ban these spam machines. I doubt many who discuss building these 20k+ page sites every month would be able to stand a human review.
| 12:25 pm on Aug 13, 2004 (gmt 0)|
>>> Are you serious? 100,000. What kind of content could that be? It takes professional writers years to put together a 300 page novel. Even decent web copy takes at least a "little bit" of time, even if just a few minutes to an hour per page.
I would agree with ownerrim on this issue. It is very hard for a site to have genuine content across 100,000 pages, all done manually. You must be a big corporation with a lot of staff to attain that status... how much manpower and time does it require?
DB-driven sites are different; you can attain 100k pages, but that's not unique and genuine content. For news sites, many articles are outsourced or rewritten; those are also different.
FWIW, Google gives preference to big sites too, but not to 100k-page sites full of crappy pages and content.
In the end, it does not matter how you view your site; it is Google who decides.
| 12:35 pm on Aug 13, 2004 (gmt 0)|
I think this site is pretty decent and seems to have a lot of pages
[edited by: Brett_Tabke at 3:15 pm (utc) on Aug. 13, 2004]
[edit reason] thanks, but no specifics please ;-) [/edit]
| 12:39 pm on Aug 13, 2004 (gmt 0)|
Well, who knows. I just think the stuff "that's like everything else" and lighter on unique content will always be subject to flapping about in the latest Google wind. Original content tends to be "heavier" in this regard and less subject to being driven by the gales of whatever algo tweak is presently in the works. But it takes a bit more time and considerably more effort to produce this sort of content, which is exactly why you don't see much of it at all on the web. I can't tell you how many times I've looked for something, checked out the first 10-30 SERP results and found paraphrasings of exactly the same crap. What was unique enough to differentiate the different sites? I couldn't see it. And it's this kind of stuff that Google shouldn't want ranking high, because it will kill its value in the eyes of users. So, when everything looks alike (spamlike), I guess that's when they weed. And, since weeds keep coming back, the job of weeding never ends for Google.