"I think some of us are seeing a different variation. I know my site and at least someone else on this thread ranks # 1 if you type in their domain name. Or domain.com. Back actual pages by title, those that used to be in the top 10 SERPS are now about 10 pages back, in the 100+ range. "
This is exactly what I'm seeing. I'm holding many keywords fine but all my big keywords are gone most of the time.
Our site is swinging up and down all day long. Sometimes I search for red widget and we are not on page 1, but we are within the top 4 pages; then I search an hour later and we are blasted back to the bottom of the results.
Other keywords seem stuck at the bottom.
As a result our Google traffic is down more than 80%.
We have lost many keywords that we held #1 on for years. How can we go from #1 to the bottom, or not showing at all, overnight?
I have done nothing to deserve this and can't seem to figure out why it happened.
[edited by: tedster at 7:17 pm (utc) on Mar. 2, 2007]
I have tons of incoming links, some reciprocal, but most of them seem to be one-way inbound links. The linking sites are about all manner of < specifics related to my broader topic >. They're closely enough related to my niche that I don't think they're a problem.
I just don't think that's the problem.
Unlike other webmasters here, I haven't taken a huge drop in traffic. My site is very widely diversified in terms of keywords. In a typical month, searchers will find my site using as many as 20,000 different keywords or phrases.
Traffic to my site is down by about 15% from March of last year. However, the traffic from Google is now at 62% of all my visitors. That's up from about 45% from last year.
Where I'm losing traffic is in the search terms such as "[state] widget stores." I've consistently been #1 for "(insert state here) widget stores." Now I'm all over the map.
That's why I keep going back to previous posters' ideas about this being some geographic experiment by Google. After all, why would all of my results for "Acme Blue Widgets" stay the same, but my results for "[state] widget stores" fluctuate hour by hour?
I don't depend upon people searching for "[state] widget stores" for my traffic. The overwhelming majority of my visitors find my site when searching for "Acme XYZ model widget."
However, my sales pitch to the widget stores is that my site ranks #1 for their state for widget stores, so that makes my sales job more difficult. If I tried to explain how people find my site, the store owners would get confused.
But, having said all this, and in previous posts, I wonder if Google isn't trying to give searchers more localized results. If I just do a search for "[state] widget stores" using the datacenter in my area, I'm always #1. If I access another datacenter, perhaps one that serves < a different region of the country >, I get completely different results.
[edited by: tedster at 5:41 am (utc) on Mar. 11, 2007]
It is very much intentional, and designed to ensure that NO site enjoys (FREE) top ranking for too long.
That's a great theory, except that in the genres I work in, a lot of sites have held their ground for longer than two years.
I also believe link diversity helps keep sites more stable through algo fluxes.
I have noticed on some of our more robust sites that when we see a link area that is weak (in-content links, for example), we try to fill that hole slowly.
Again, no idea if it's going to come back, or if this is a result of a design change we made before all this happened (we took some navigation, outgoing links and extraneous text off of every page).
A lot of our site is technically duplicate content. Maybe taking that additional text off the page has made the pages more duplicate than before? And hence we rank for almost nothing?
Thing is, the more I removed from the pages, the longer visitors stayed on them, and the higher the CTRs we were getting, as well as more page views overall. So I've been reluctant to roll back these changes...
Dec 20th, 2006 tedster wrote:
"This is just my sense of things, but I wouldn't be surprised to see some serious SERPs changes early in the new year -- changes that focus on these duplicate issues that have so far been slipping past the Google radar.
First Vanessa gave Rand a video interview on the topic, and now, soon after that, Adam has given us a more detailed blog post. He's even sharing some useful vocabulary to help further our discussion and comprehension.
You can tell where at least part of the search quality emphasis is right now at Google. So this current focus might also be a bit of a storm warning for the wise. It's happened before. The way I see it, public statements don't just emerge from a vacuum."
There's no doubt in my mind that this latest update is focused on duplicate content.
I am confused because we haven't changed anything on our site. Anyone else seeing this? What can we do to get our pages indexed again?
We have had a Google Sitemap submitted for a year now.
[edited by: tedster at 9:14 pm (utc) on Mar. 11, 2007]
[edit reason] moved from another location [/edit]
And how long ago were we talking about theming?
On Dec 15-20 Google hit me so surgically that any page that didn't mention the central theme was missing or tossed into the supplemental area. Now it seems they've turned the duplicate content filter up so high that only doorway pages and pure text pages can slip through easily. I'm amazed that any form of navigation isn't seen as duplication.
Google will eventually kill off so many web sites they'll be penalized in any search engine. I've concluded the Google solution for anything has a 5-50% accuracy rating with a lot of collateral damage to innocent sites. They then tinker even more until they've screwed up the initial situation and created even newer problems.
They value a wide variety of poor quality links (blogspam) over a smaller number of high quality links (high PR niche authority). They aggressively went down this path -- both in terms of ranking and crawl priorities -- summer 2006, and they are just continuing down it.
Google simply has exercised very bad judgment, and they are one of the most stubborn companies in the world. Getting them to recognize, admit and correct their screwups is a multi-year commitment.
Until a competitor comes along that is even adequate, Google will be happily mediocre.
Well, I think this needs to be understood: can you, for example, bomb a competitor's site just by copying its content and reproducing it several times over, then watch it drop in the SERPs? To me there must be much more to this whole thing than just a duplicate filter. I've seen plenty of people claim to have been whacked with unique content, while at the same time sites with duplicate content still rank.
Yep, I definitely agree with you:
"Until a competitor comes along that is even adequate, Google will be happily mediocre."
It's a classic market condition:
=> A company is the dominant player in its sector...
=> It can then cut its spending on excellence down to the average range while still achieving its revenue objectives... and breathe new life into the bottom line...
=> It can operate its core product in the gray range of acceptance...while working on other initiatives for revenue and market penetration..(online office products, google maps, adwords tweaks...etc..etc..)
Very common ailment in business... "good is good enough"
If you create a domain, or just add a few pages to an existing URL with a copy of the competition's pages, you can get another site out of the index in no time.
Make a meta redirect.
Make a 301 or a 302; the links still work.
I think they have overdone it with all their filters and "rules." If you don't make a site for Google, you are not on the internet.
You can NOT make a site for the user/visitor anymore, because you could get into trouble with Google somehow. It could be links, duplicated content, HTML mistakes...
I REALLY hope we see some competition soon. Many people ask me what I use when I search the net (because I'm a webmaster), and I ALWAYS say live.com or ask.com. Why should I say Google? They just have too much power now.
[edited by: Play_Bach at 2:17 am (utc) on Mar. 12, 2007]
I will say this regarding our current traffic. It doesn't appear to be qualified traffic... meaning .. they are coming but they aren't buying very much... even with purchasing incentives, free shipping, rebates .. none of it is making a difference right now. Economy?
The reason I ask is that none of the "model XYZ widget" pages on my site have been affected. It's now only a few pages that refer to "widget stores" in a particular state that have dropped from the #1 spot to #2 to #6.
I guess I'm lucky but, at the same time, I'm trying to figure out what Google is doing.
I have a site that has consistently ranked in the top 10-20 for a very competitive 2 word finance term that has completely disappeared from the rankings.
Interestingly enough, we are showing up around #12 when I search for "finance term" as opposed to finance term w/o quotes... where we are not in the top 1000.
Not sure what this may mean, or if it's permanent.. just wondering if anyone else has seen anything similar over the past 24-48 hours.
[edited by: tedster at 5:34 am (utc) on Mar. 12, 2007]
I would guess the new filter was taken off or adjusted again, which is why the site recovered for the day.
Problem is that a percentage of good sites always gets hit when they try rolling something new out - imo the Google SERPs have been in decline since September.
Also, we have to ask ourselves this - has all this tinkering by Google improved their SERPs? IMO the SERPs are worse than ever now, so was this a problem that needed fixing? I don't think so!
Very good one. This does seem to be what is happening.
"The problem? The problem is Google is a not very good search engine (even if miles better than the competition). They value a wide variety of poor quality links (blogspam) over a smaller number of high quality links (high PR niche authority)."
Absolutely true. Quantities of crap links too often beat smaller numbers of quality links.
[edited by: tedster at 5:44 pm (utc) on Mar. 12, 2007]
[edit reason] fix link [/edit]
I am starting to wonder if Google may have suffered some damage from a virus at the end of 2006 or something. Yahoo is just #$@&*'d - the top few results are okay, but then nonsense. They are either making algo changes or the link farmers are having a bigger field day than usual. If it's dupe content, what are the odds they are both going to nail me at the same time after a steady 3 years in the top ten-to-twenty? I have a couple of similarly named, similarly themed pages definitely pushing the envelope, but from what I can see many authority sites doing, it seems fair game.
The only thing that makes any sense at all is this: I just used Xenu and it shows a lot of 301s and 302s, mainly embedded objects like Flash and video files that call up plugins with new paths. The Flash is used on many pages, as is the dmoz attribution. Could this have such a negative impact - being counted as broken links - and trigger a penalty?
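For anyone wanting to reproduce that Xenu-style check by hand, here is a rough sketch of the logic: follow each redirect's Location target and record the hops. This is not Xenu's actual code - the "server" is simulated with a plain dict so the idea can be shown offline; a real checker would issue HEAD requests against the live site instead.

```python
# Sketch of a redirect-chain checker. The responses dict stands in for
# the web server: url -> (status code, Location target or None).

def follow_redirects(url, responses, max_hops=10):
    """Return the list of (url, status) hops until a non-redirect or a loop."""
    chain = []
    seen = set()
    while url not in seen and len(chain) < max_hops:
        seen.add(url)
        status, location = responses.get(url, (404, None))
        chain.append((url, status))
        if status not in (301, 302) or not location:
            break  # final destination (or a dead end) reached
        url = location
    return chain

# Example: an old Flash path that was moved, then finally resolves.
responses = {
    "/player.swf": (301, "/media/player.swf"),
    "/media/player.swf": (200, None),
}
print(follow_redirects("/player.swf", responses))
```

A chain ending in 200 is just an extra hop; a chain ending in 404 is the genuinely broken link worth fixing.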
You are managing - nearly single handedly - to hamstring the internet!
LINKS are afterall what makes the INTERnet the internet.
Google has managed to devalue virtually every type of link possible. Not only that, Google has managed to make webmasters so fearful that many have seen fit to place nofollow tags on even their longstanding "legitimate", so-called natural outgoing links!
How - will ANY new site EVER get a "natural" link?
To get a "natural link" you must FIRST be listed somewhere on the internet before other sites even know you exist - yet, these days, NO GOOD LINKS = NO INDEX.
Most webmasters hesitate to give links to zero pr websites. This leaves little recourse to new sites. Either they purchase links or they try and get links (garbage links) from blog comments, directories and forums.
Of course Google frowns on all the links mentioned above so where does that leave a new site owner?
It doesn't stop there. Long-standing sites are falling from grace with Google because of what I mentioned previously - the recent popularity of the nofollow tag and the wholesale discarding of many outgoing links (like ships throwing extra baggage overboard to lighten the load).
How many of the older "stable" sites now finding themselves at the bottom of the SERPs are there because the reliable backlinks they once depended on have either been removed or recently been tagged as nofollow?
AGAIN - "natural linking" can never occur when sites can not be found in the search engines.
IF you find your site at -31, or at position 950, or at the bottom of the SERPs, no site will ever "naturally" link to you. You are literally forced into a position where you must "un-naturally" solicit (beg, borrow, or BUY) links in an attempt to regain some standing in the SERPs.
AND thanks again to Google a fairly new phenomenon is arising - site owners are now resorting to selling "good" link space rather than freely linking out to other sites.
I want to be the first to say: Congratulations, Google! You have managed, almost single-handedly, to push NATURAL LINKING to the brink of extinction!
Attempting to reduce excessive on-page spam using filters, and forcing pages to be thematic based on phrase analysis and inbound/internal link analysis
Attempting to force websites to show only unique content
By eliminating the advantage of what Google believes to be unnatural linking schemes
My websites were suffering about 4-5 weeks ago (some more recent than that), some from the so called '950' and others point blank being dropped or simply demoted by 5-10 pages in the serps. Most have returned to previous rankings, and much of this involved hard work and really paying attention to what some of the most active members here were saying outside of the box.
Removed excessive related keyword phrases on pages that were suffering, and made my 'language' as natural as possible.
Extensively checked how I link to each and every page in the site.
Implemented a permalink to the same page on every page.
Scoured for scrapers and tried to remove them as much as possible.
Got high-quality links from an absolutely diverse selection of related websites.
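The "check how I link to each and every page" step above can be sketched roughly like this - a minimal internal-link counter. The sample pages are in-memory HTML strings purely for illustration; a real audit would read the files from disk or crawl the live site.

```python
# Count how many internal links point at each page of a site.
from html.parser import HTMLParser
from collections import Counter

class LinkCollector(HTMLParser):
    """Collects href values of internal <a> tags (paths starting with '/')."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href") or ""
            if href.startswith("/"):  # keep internal links only
                self.links.append(href)

def inbound_counts(pages):
    """pages: {url: html}. Returns a Counter of internal inbound links."""
    counts = Counter()
    for html in pages.values():
        parser = LinkCollector()
        parser.feed(html)
        counts.update(parser.links)
    return counts

pages = {
    "/": '<a href="/widgets.html">Widgets</a> <a href="/about.html">About</a>',
    "/widgets.html": '<a href="/widgets.html">Permalink</a>',
}
print(inbound_counts(pages))
```

Pages with zero or very few inbound internal links are the "weak link areas" mentioned earlier in the thread.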
Here is a common theme I am seeing:
1. On page phrase analysis and detection of spam phrases:
It would seem that on-page phrase analysis to prevent spamming is something I believe is in action here. A really good breakdown of this was posted by Miamacs:
2. Internal website linking - how and where you are linking to your internal pages, rate of change of those links, and again - how the links to various pages correspond with #1 - how Google 'sees' the theme of that page.
3. Duplicate content - which has been covered a lot in this forum - fixing canonicals, improving unique content on each and every page, amalgamating pages that appear to offer the same content, applying unique meta description tags to every 'rankable' page, chasing down (if possible) any scrapers, and placing permalinks if possible in your own page content to your own content - a reference to help Google distinguish where the original content 'may' have come from.
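One of the duplicate-content checks listed above - making sure every 'rankable' page has a unique meta description - can be sketched like this. The regex and the sample pages are illustrative assumptions, not anyone's actual markup; it only catches the common `name=... content=...` attribute order.

```python
# Find pages that share the same meta description tag.
import re
from collections import defaultdict

META_RE = re.compile(
    r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
    re.IGNORECASE | re.DOTALL,
)

def duplicate_descriptions(pages):
    """pages: {url: html}. Returns {description: [urls]} for shared ones."""
    by_desc = defaultdict(list)
    for url, html in pages.items():
        match = META_RE.search(html)
        if match:
            by_desc[match.group(1).strip()].append(url)
    return {desc: urls for desc, urls in by_desc.items() if len(urls) > 1}

pages = {
    "/red.html": '<meta name="description" content="Buy widgets online">',
    "/blue.html": '<meta name="description" content="Buy widgets online">',
    "/faq.html": '<meta name="description" content="Widget FAQ">',
}
print(duplicate_descriptions(pages))
```

Any description that comes back with more than one URL is a candidate for rewriting, per the checklist above.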
A new thread that amalgamates all of these issues into one checklist for the new year would make sense to me.
< continued here: [webmasterworld.com...] >
[edited by: tedster at 5:51 am (utc) on Mar. 13, 2007]