Forum Moderators: open
I am talking about traffic of nearly 7k from Google every day, so it's a sizable decrease.
Looking for early answers on how we could check into this.
What's new is that those sites' pages are now ranked above mine. That may be down to some new duplicate-content rule, or it may be that those sites are more single-topic than mine, so my copy works better for them.
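Google has never published any "duplicate rule", so this is speculation, but a common way search engines estimate near-duplicate pages is w-shingling with Jaccard similarity. A minimal sketch of that assumed technique (toy page text, threshold value are illustrative only):

```python
def shingles(text, w=4):
    """Break text into the set of overlapping w-word 'shingles'."""
    words = text.lower().split()
    return {" ".join(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets: |A & B| / |A | B|."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Two pages sharing most of their copy score high and could be
# collapsed into one result; a threshold (e.g. > 0.8) might flag them.
page1 = "cheap widgets for sale in all sizes and colors buy now"
page2 = "cheap widgets for sale in all sizes and colors order today"
sim = jaccard(shingles(page1), shingles(page2))
```

If reusing the same copy across sites triggers something like this, the engine still has to pick one winner, which would explain one site's version outranking another's.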
It was ranked highly for widgets... a few days ago the backlinks dropped, and so did our position.
But get this: two weeks ago we decided to add a footer link back to the home page anchored on 'cheap widgets', and then we lost ALL our rankings for that term...
The PageRank has not dropped as yet, but I think it will.
Well, to say the least: don't try it!
As discussed in this thread: [webmasterworld.com...]
I see some success which is good.
Sorry for the webmasters affected, but G does not owe you anything; go and try to add enough uniqueness to each of your pages.
R
This is certainly more realistic given how most sites tend to grow... adding 100 or 1,000 pages a day is simply not realistic for unique content, unless one is bringing some sort of archived unique content across the void...
Google is simply trying to control the hideous onslaught of millions of pages that are designed to manipulate the SERPs...
Continue to develop content for your audience... make it specific and useful to their needs... let the spammers have their day... don't panic.
>>>>This is a recent phenomenon, Copper. There is a thread about it somewhere, but you are not alone, and as long as you still hold your positions in the SERPs, don't panic.
Hiya Copper and Cabbie...
I was the one who mentioned this happening above... the problem is it did affect the SERPs. Now I have my PR back to PR6, but my SERPs appear to be permanently damaged. How is it that I have a PR6 and not one of the pages now listed ahead of me has a better PR?
I thought it was maybe backlinks, but that number is totally random among the ranks above me, so it's not that... so basically I am still clueless...
Is anyone not?
Also, on a separate note to what someone just posted...
Unique content I have; of the people above me, some do and some don't... I wish it were that simple. I add a page, or at least add a paragraph to one, every week, and I have over 90 pages of unique content on the different types of services we offer. They have a common bond because of location and the overall type of service, but each page truly deals with its own subject within that service... how much more can I really do there without starting to talk about things I just don't have or do?
Now the index page seems to have to rank for the other pages to even be seen.
What's odd, on another note, is that there are still some searches (on pages, for that matter, that I haven't spent time "optimizing": they have no alt tags and the keyword density is not great) that are coming up number one.
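For context on why a PR6 page can sit below lower-PR pages: PageRank only scores the link graph and says nothing about on-page relevance, so it is just one input among many. A minimal power-iteration sketch over a toy, hypothetical link graph (not Google's actual implementation):

```python
def pagerank(links, d=0.85, iters=50):
    """Power iteration over a link graph given as {page: [outlinks]}."""
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}          # start uniform
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}  # random-jump share
        for p, outs in links.items():
            if outs:
                share = pr[p] / len(outs)      # split PR among outlinks
                for q in outs:
                    new[q] += d * share
            else:                              # dangling page: spread evenly
                for q in pages:
                    new[q] += d * pr[p] / n
        pr = new
    return pr

# Toy graph: "home" collects links from the inner pages, so it ends up
# with the highest PR -- yet a query can still rank other pages above it.
toy = {"home": ["about", "services"], "about": ["home"], "services": ["home"]}
ranks = pagerank(toy)
```

The point of the sketch is that two pages with very different PR can trade places in the SERPs whenever the relevance signals, not the link graph, change.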
I do not know what happened. I guess I am glad I did not quit my day job (I almost did), but my site lost 90% of its traffic and 90% of its revenue. I have paused all of my AdWords campaigns (a 10k-a-month budget) and I am sitting on the sidelines hoping for a miracle. My site is clean, with a lot of unique content (almost 100k pages indexed). I survived Florida and the others, and I think this one is the worst; I see a lot of garbage out there in front of me. I hope this one is just a fluke.
My question... I am a consultant; I am paid to handle this for someone else. What do I tell them? People familiar with the industry know these things happen and that it's not necessarily our fault, but they don't. How does one explain it to the "unknowledgeable"?
I still have a PR4. I have no cached images or pages in G anymore. My top 13 search terms are no longer posting for me. My backlinks increased in the last update.
G crawled a deep, deep page in a forum of mine with a .php extension. The URL is huge. That is the only cached page, and that was done yesterday.
Traffic has declined drastically. I have corrected every possible thing I believe could be questionable according to the GTOS.
GoogleGuy, where are you? Please help us....
Whatever they did, they did to outsmart us. I doubt GG would tell us how to SEO our way to #1 in Google :). Not to mention the IPO quiet period...
>>>>GoogleGuy, where are you? Please help us....
Y'all might want to take a few seconds to click on that "Charter" link towards the top of this page. You might notice this section:
In the meantime, here are some quick guidelines as to post types and phrases that might trigger an action or prevent a pre-moderated post from being approved:<snip>
# pleas to specific users such as a mod or Google representative.
Are you serious? 100,000 pages? What kind of content could that be? It takes professional writers years to put together a 300-page novel. Even decent web copy takes at least a "little bit" of time, even if just a few minutes to an hour per page.
I see VERY FEW sites online that offer genuine content, and by that I mean something that someone has actually put some thought into for the purpose of serving a target audience, versus simply trying to rank in a SERP.
No offense, but every time I hear that someone has just put up 3,000 pages or runs 100,000 or 200,000 pages, I can't help but think that it is exactly the kind of stuff Google doesn't want clogging up the SERPs, and I'm not surprised to hear that a site of this sort is being slammed in an algo change.
The sites I have built have just under 1,000 pages, which have taken the last 18 months to build up by hand. These have been hit with a 70-80% reduction.
The sites I have that run from wholesaler databases (and thus other retailers have access to the content), and are about 2,000 pages in size, have seen no decline at all.
you seem to be incredulous that any site could have 100,000 pages. i have one such site.
i won't tell you what mine is but let me describe for you hundreds of such sites: newspapers. every day, a decent size newspaper can put out hundreds of unique and very thoughtful web pages. sure, it takes a staff of dozens to hundreds to do that, but, well, that's what they do. and after a couple years of doing this, even a moderate-sized daily newspaper can easily put together 100,000 pages of content. and importantly, this is valuable content that is fully google-worthy.
so, sure, you might be sitting in your bedroom dressed only in your underpants frantically trying to throw together another few pages for the web, but there are plenty of sites that genuinely have a lot of content and they aren't playing any nefarious search engine games to get there.
>>>>you seem to be incredulous that any site could have 100,000 pages. i have one such site.
You yourself pointed out earlier in this thread that you have a relatively spammy 200,000+ page site :-)
>>>>i won't tell you what mine is but let me describe for you hundreds of such sites: newspapers. every day, a decent size newspaper can put out hundreds of unique and very thoughtful web pages.
I do not think that any page of a relevant newspaper was dropped by Google via this algo change (neither do I think you are maintaining such a site).
You, and all those claiming here that real content sites with 100,000+ pages were dropped, please post the links here so we can see with our own eyes that these are original sites worthy of the top ten!
>>>>so, sure, you might be sitting in your bedroom dressed only in your underpants frantically trying to throw together another few pages for the web
That is very funny.
I agree, big doesn't necessarily mean spam or crappiness.
A well constructed db delivered page with lots of clever scripting and appropriately placed 'info plug ins' can be very useful indeed.
Some people do big sites well, some don't. Those that do are likely to survive and those that don't will be dropped or fall to obscurity, sooner or later.
I once got lazy with a domain. I used a template and sprinkled a mix of keywords here, there and everywhere. The pages were on topic and provided a means of purchasing the keyword-related product. It worked in the sense that it gave the user a laser-targeted route to what they required. It took me a week to script and research, and after 3 months it was doing really OK and earning me money. However, if I'm really honest, it was a crappy website, in the sense that it didn't really offer anything that couldn't be found elsewhere, and those with whom I was competing were offering far greater added value in terms of well-researched supplemental information that went above and beyond what could possibly be required for the user's trip.
Anyways, the website in question doesn't rank for any of its keywords and income is a trickle of what it once was, whereas my ex competitors are still there.
This game is the same as any other. If you don't work hard enough and keep on working at it, you'll end up losing.
I know that Google owes us nothing, but my real beef is this: after Florida and Austin, my best earning sites were hit badly, so I, probably like many others, made these sites much larger in an attempt to grab as much traffic as possible from less competitive keywords... a strategy that worked (until now!). I almost felt guilty at polluting the internet with thousands upon thousands of pages, but it was the only option.
My point is: what have Florida, Austin and this latest tweak really done to improve things, other than panic many webmasters into over-developing sites to try to claw back some traffic?
And the result of this latest tweak? Well, one possibility is to publish 10 more versions of the same site across different domains/servers, all different enough to avoid penalties... ridiculous! More internet pollution and more spammy results. Personally, I preferred Google before November 2003.
>>>>however I almost felt guilty at polluting the internet with thousands upon thousands of pages, but it was the only option. My point is what has Florida, Austin and this latest tweak really done to improve things other than panic many webmasters into over developing sites to try and claw back some traffic
Again...this stems from having a weak business plan from the start.
What you're demonstrating is an endless cycle: I've lost ranking because Google changed the algo to deal with spam like mine, so now I'll spam even harder to get some of that traffic back, after which Google will be forced to do another algo change to deal with THAT spam... and over and over and over again.
The fact is... a good many of the webmasters crying about lost traffic are part of the problem that necessitates these changes in the first place!
Meanwhile, those of us who take a legitimate "path" and develop a business model/website that converts targeted traffic regardless of the source are forced to continually endure changes that invariably catch us in the "net" these algo changes throw (i.e. throwing the baby out with the bathwater), and we continue to suffer because of the lack of ethical marketing practices of others.
... I long for the day when Google and others begin to employ (as I think they must) a full-time "spam busting" department of actual humans who would manually and permanently review and ban these spam machines. I doubt many who discuss building these 20k+ page sites every month would stand up to a human review.
I would agree with ownerrim on this issue. It is very hard for a site to have genuine content across 100,000 pages, all done manually. You would have to be a big corporation with a lot of staff to attain that status... how much manpower and time would it require?
DB-driven sites are different; you can attain 100k pages, but that's not unique, genuine content. For news sites, many articles are outsourced or rewritten; those are also different.
FWIW, Google gives preference to big sites too, but not to 100k-page sites full of crappy content.
In the end, it does not matter how you view your site; it is Google who decides.