I've been reading through the June 27th/August 17th threads, and I was wondering if somebody could clarify what's actually going on?
Like many of you on the board, my site got trashed in the SERPs on June 27th only to recover a month later. At the time, I thought I had incurred a penalty, and I went to painstaking lengths to remove even the most minute possible violations. I thought that correcting those problems was the reason that I recovered.
So needless to say, I was pretty upset when I got trashed again around the 17th, when I knew my site was in total compliance with Google's guidelines. After visiting this forum, I now see that I was not the only one experiencing this type of problem.
Here are my questions. If any of you can shed some light on these, I would really appreciate it.
1. Why is this happening? It seems like some kind of update, but why are certain sites getting trashed when others are standing firm?
2. Can I expect a recovery similar to the one I had in July?
3. Is there anything I can do to fix this, or am I completely at the mercy of Google on this one?
Thanks for your time!
[edited by: tedster at 6:25 am (utc) on Aug. 22, 2006]
"Sites" do not go supplemental: individual URLs do, or don't.
For active pages, Google usually has both a normal and a supplemental result for every URL. For pages gone 404, expired domains, or redirected URLs, Google holds the old page as a Supplemental Result. For duplicate content, Google holds none, some, or all URLs as normal results, and none, some, or all, other alternative URLs for the same content as Supplemental Results.
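One way to picture the duplicate-content case is as grouping URLs by their content and electing one "normal" URL per group, with the alternatives held as Supplemental. The Python sketch below is purely a mental model with made-up rules (the content hash and the shortest-URL tiebreak are my assumptions); Google's actual duplicate detection and canonical selection are unpublished:

```python
import hashlib

def group_by_content(pages):
    """Group URLs by a hash of their body text (a crude stand-in for
    Google's duplicate detection, which is far more sophisticated)."""
    groups = {}
    for url, body in pages.items():
        key = hashlib.md5(body.strip().lower().encode()).hexdigest()
        groups.setdefault(key, []).append(url)
    return groups

def classify(pages):
    """Pick one 'normal' URL per content group (here: the shortest URL)
    and mark the alternative URLs as supplemental."""
    normal, supplemental = [], []
    for urls in group_by_content(pages).values():
        urls.sort(key=len)
        normal.append(urls[0])
        supplemental.extend(urls[1:])
    return normal, supplemental

pages = {
    "http://example.com/page":             "Widget guide",
    "http://example.com/page?sessionid=1": "Widget guide",
    "http://example.com/other":            "Something else",
}
normal, supp = classify(pages)
print(sorted(normal))  # the URLs kept as normal results
print(supp)            # the duplicate URL held as supplemental
```

Under this toy model, a "site" never goes supplemental as a unit; each URL is classified on its own, which matches the observation above.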
There is much more in the thread at: [webmasterworld.com...]
[edited by: g1smd at 7:12 pm (utc) on Aug. 22, 2006]
"We do, and have done so for nearly 7 years.
We do pay for some traffic, and also have traditional offline advertising. But the bulk of the first time visitors to our website are coming in from the organic results of the major search engines.
It's no different than being in the Yellow Pages, if you ask me."
The point is simple: if you have a sound business plan, it's a huge risk to put all your eggs into one basket - especially when those eggs are going into something you have zero control over, as many of us are finding out with the "refreshes".
instances where I see just the opposite
That's excellent. Just as it was with the infamous "sandbox effect", we are probably seeing a composite effect. That is, several filters in the algo are probably interacting to give what we see as just one footprint. But the exceptions are the way to find the edges, the trip points, for what is going on.
In the first case I mentioned above -- the footer links -- the webmaster backed off on those keyword links and saw upward movement within a few days. Not back to the first page, but a solid jump. This is only one case, but highly suggestive of locating at least one sensitive area. Also, penalties, once applied, may come off in degrees and not all at once. What we observed fits that profile pretty well.
The other case I mentioned, the one with the overloaded main navigation, has not yet made any changes. And they are still suffering from the "27th" disease pretty badly.
My main point is that we will probably not find just one factor at work here, one where everyone will agree "that explains it -- everything I see is summed up there."
My site is getting trashed in Google's SERPs as a result of the July 27th update, and thus I have lost over 80% of my traffic/revenue because of Google's efforts to clamp down on spammers. At the same time, I am having to allocate more and more resources to detecting and blocking site scrapers and referrer log spammers, simply to keep my databases from crumbling under rapid-fire requests from hyper-aggressive bad bots. Yesterday I moved my database to a new server, and today I've already had a referrer spam bot that got past my defenses make so many requests, so quickly, that my limit for simultaneous database connections was exceeded.
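The rapid-fire requests described above are exactly what a sliding-window rate limiter is built to catch. Below is a minimal Python sketch of the idea (the `RateLimiter` name and thresholds are my own invention, not any real product's API); in practice you would enforce this at the web server or firewall layer rather than in application code:

```python
import time
from collections import defaultdict, deque

class RateLimiter:
    """Sliding-window limiter: reject a client that makes more than
    max_requests within window_seconds. A minimal sketch only."""
    def __init__(self, max_requests=30, window_seconds=10.0):
        self.max_requests = max_requests
        self.window = window_seconds
        self.hits = defaultdict(deque)  # client -> timestamps of recent hits

    def allow(self, client_ip, now=None):
        now = time.monotonic() if now is None else now
        q = self.hits[client_ip]
        # Drop timestamps that have aged out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.max_requests:
            return False  # over the limit: block this request
        q.append(now)
        return True

limiter = RateLimiter(max_requests=3, window_seconds=1.0)
results = [limiter.allow("1.2.3.4", now=t) for t in (0.0, 0.1, 0.2, 0.3)]
print(results)  # the fourth rapid-fire request is rejected
```

Rejecting the bot before it ever reaches the database layer is what keeps the simultaneous-connection limit from being exhausted.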
This little war between Google and spammers is costing me thousands of dollars and causing me to lose some really good long term direct advertisers because of my drop in traffic. I am really sick and tired of my site being ground zero for the war between Google and the spammers.
Google needs to get its act together and stop causing collateral damage with updates that seriously harm legitimate websites. At the same time, legitimate websites need to start taking an aggressive stance against spammers: report MFA scraper sites running AdSense ads via the "Ads by Google" link, and file spam complaints via Webmaster Central when these scraper sites appear in searches we conduct.
Last year Bourbon had done severe damage for 6 months.
OK -- so maybe we can look at the old Bourbon conversations and see if there are any common factors between then and now. It would be a hint if there are.
Minor Shuffling - Incremental Indexing - Not enough changes to be an update. [webmasterworld.com] - May 15, 2005
Google Update Bourbon - Part 1 - Has the sandbox been busted? [webmasterworld.com] May 20, 2005
GoogleGuy's posts - advice on Bourbon [webmasterworld.com] June 1, 2005
My main point is that we will probably not find just one factor at work here
Good point, and I think one of the reasons for that is there are a lot of factors at play, and the difference between being behind the line and over it is extremely small when it comes to these things.
Another example -- a main nav with 16 links, target keyword was in 13 of them
Sure, that's obvious, but is the threshold 12 or 11 - or when they institute the filtering, is it 5 or 6? We have been taking a long hard look at our sites and were a little surprised at the number of links with relevant anchor text pointing to inner pages from our home page.
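For anyone wanting to run the same self-audit, counting how many internal links carry the target keyword in their anchor text is easy to script with Python's standard `html.parser`. The threshold question raised above remains guesswork, so this sketch just reports the ratio rather than judging it:

```python
from html.parser import HTMLParser

class AnchorCounter(HTMLParser):
    """Count links and how many carry a target keyword in their anchor
    text -- a quick self-audit, not Google's actual test."""
    def __init__(self, keyword):
        super().__init__()
        self.keyword = keyword.lower()
        self.in_link = False
        self.total = 0
        self.with_keyword = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link = True
            self.total += 1

    def handle_data(self, data):
        # Counts once per text chunk inside a link; fine for simple nav HTML.
        if self.in_link and self.keyword in data.lower():
            self.with_keyword += 1

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_link = False

nav = """
<a href="/widgets/blue">Blue Widgets</a>
<a href="/widgets/red">Red Widgets</a>
<a href="/about">About Us</a>
"""
counter = AnchorCounter("widgets")
counter.feed(nav)
print(counter.with_keyword, "/", counter.total)  # 2 / 3
```

Running it over a saved copy of your own navigation block gives the same kind of ratio tedster quoted (13 of 16).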
I've not read a satisfactory explanation yet of what is happening or how to avoid it. Google must be laughing at us to see our confusion. With my main site well and truly gone, my only consolation is that most of the other sites that used to compete on my keywords have also disappeared, leaving just the big players like Amazon etc. to dominate.
I probably would not date it back that far - but perhaps to December 2004 - and with Matt saying that this is a data refresh to a certain algorithm that has been going on for 1.5 years, perhaps this might be the case.
Whatever - the algo and data refresh, from the outside looking in, look very poor/buggy.
Looking at a site:domain.com search where the homepage was listed first: it went AWOL on June 27th, came back on July 27th, and went AWOL again on August 17th. I bet the next data refresh will have it back at the top again - and when the homepage goes missing on a site:domain.com check, the whole site goes wrong :(
[edited by: Dayo_UK at 8:44 pm (utc) on Aug. 22, 2006]
dayo: homepage goes missing on a site:domain.com check then the whole site goes wrong
Yep, so true! However, it has happened to me 3 times now, and the site always came back with fresh content and links... this time I am back with the homepage, but the overall ranking seems to be very poor, all of a sudden...
Endurance is needed for larger projects... the "real" ranking for bigger projects seems to start after 6-9 months now!
Google needs to ensure that the best content is up front. These algo experts are miles away from that, and only a smart directory, voting, and ranking solution with membership structures could enforce quality.
Just to set my own benchmark, the websites I work on saw significant gains on June 27 and have maintained about half of those initial gains during the subsequent tweaks. In other words I have no axe to grind at this time.
I would like to see a virtual show of hands. Who here has read The Search by John Battelle [battellemedia.com]? Excellent. You can put down your hands. The reason I ask is that there is an excellent description of the Florida update in John's book. He even recaps the reactions here on WebmasterWorld and GoogleGuy’s responses.
Google has come a long way since November 2003. Their communication with webmasters is far better, and you can tell that they genuinely want to be helpful. In 2003 I think most website administrators who had never attended a professional conference would be hard pressed to name a Google employee other than LP, SB, ES & MM, and even then only if they were lucky. Today helpful people like Matt Cutts and Vanessa Fox are quick to participate and help set the record straight... but only up to a very generic and guarded point.
That is where the rub lies. Google’s engineers continue to trust automated systems more than they do their human brethren. In a community that embraces open source and the free flow of information it is ironic that Google continues to be one of the most secretive non-governmental organizations on the planet. If Google wants to organize the world's information and make it universally accessible and useful then Google’s leadership must accept the fact that the knowledge of how to help Google collect and distribute the world’s information is a key piece of information in and of itself. Rather than embrace good people everywhere Google lives in fear of evil.
There is an alternative viewpoint. If Google explains to website authors and search optimizers which factors it favors and which factors it frowns upon then most people will gravitate toward the ethical practices. Yes, there will always be renegades, black hats who take advantage and attempt to skew the system in their favor. But in an open environment it will be harder for these rebels to hide or to blend in. They will become as visible as blackheads on otherwise unblemished skin. And because they will stand out they will be easier to isolate and to counter.
When role model companies like Google turn their backs on their audience and secure their gates, it sets an example that others are, unfortunately, quick to take up. For example, how many businesses will plaster their 1-800 number over all of their printed catalogs and postcards but will not display a phone number anywhere on their web site? Why? Fear. How many websites will not publish their email address, but rather insist that you use a difficult-to-find and highly impersonal form? Why? Fear. Where did they learn these practices? From companies like Google. Good people trust good businesses. When companies hide things like telephone numbers and email addresses, good people do not think it is for good reasons. They think it is because the leaders of these companies do not care, do not want to lower themselves, or have something to hide. Nobody wants to work with companies that behave in this manner.
I do not think Google should be shameful, quite the contrary. Google is a fantastic company, which is why I am amazed that it does not hold its head up high, straighten its back and speak in clear, resonant tones. There are several great individuals working at Google who actively participate in our community. I am thankful to them. But as a company and an industry leader, I wish Google would stand up, open up and stop living in fear of evil.
[edited by: SEOcritique at 3:05 am (utc) on Aug. 23, 2006]
Is anyone missing sites in Google all of a sudden? Happening within the past hour?
[edited by: SEOcritique at 3:53 am (utc) on Aug. 23, 2006]
Are you absolutely sure that the behaviour you are witnessing can be applied to a concept about why URLs (or sites) go supplemental?
The reason I ask is that I will come clean:
Sites where I have on purpose used every bad trick in the book either get removed totally or go supplemental - I continue to receive crap quality traffic on these domains.
Sites that are clean, with no link exchanges, can go supplemental over time if the inbound links themselves become downgraded (i.e. bad link purchasing!)
Sites that have too many pages visible too fast to googlebot can create a sitewide supplemental problem.
Link strength causes pages within a site to go supplemental - i.e. if you have a blog where links rotate (the calendar pushes fresh links to the top of the tree), then when links become archived, the link strength decreases and the page goes supplemental.
The only case I have seen where getting more backlinks gets a page un-supplemental is where link strength has gone down. Imagine that in this case it is a new algorithm keeping known pages in the index but acknowledging that they no longer have enough inbound links to count - in the data warehousing trade we call that "marked for deletion".
I have not witnessed any of the cases you mention, g1smd - in terms of Google keeping 2 copies of a page (one historic, one current). What I am witnessing is a "catch all" of penalties either forced or passive.
Can you give clear examples where this is actually the case and/or quotes from someone at Google that this is the behaviour you can expect?
If you suddenly tell Google there are more pages than they knew about - then surely that must go through the page rank system.
Suddenly your page rank distribution decreases - and rapidly, as Google then recalculates your page rank distribution with what it now knows?
And that is a technical point: if you have an algorithm that takes into account link growth etc., how do you deal with notified link growth (either external or internal)?
And why assume this is treated properly (i.e. logically)? It should not realistically be treated the same as natural link structures - but on the other hand, how can it not be? How do you calculate page rank in that environment?
Well, you can't. The secret is how is all that stuff being used?
And the key is, you have to treat it differently - but how can you really do that, and if you try, how can you be sure that the resultant page rank + extras = natural results? And if you subsequently apply penalties etc., how can you be sure that they are all being applied consistently?
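The dilution argument above can be made concrete with a toy PageRank calculation. This is the textbook power-iteration formula, not Google's production system; it simply shows that suddenly exposing more pages (e.g. via a sitemap) spreads the same total rank more thinly across the site:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Plain power-iteration PageRank over a {page: [outlinks]} dict.
    A toy model of the dilution argument only."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# Home links to two inner pages...
small = {"home": ["a", "b"], "a": ["home"], "b": ["home"]}
# ...then a sitemap exposes two more, halving each inner page's share.
big = {"home": ["a", "b", "c", "d"],
       "a": ["home"], "b": ["home"], "c": ["home"], "d": ["home"]}
print(pagerank(small)["a"] > pagerank(big)["a"])  # True: "a" was diluted
```

In this toy model, page "a" loses rank the moment the crawler learns about "c" and "d", even though nothing on "a" itself changed, which is exactly the recalculation effect being described.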
...one factor that stood out to me was anchor text -- within the site links, not outbounds or inbound from other domains. Internal anchor text, in these cases at least, was heavily packed with target keywords, to the point of looking a bit odd for the normal (read that as non-SEO) visitor.
I have been thinking about the sitewide internal anchor links issue also. I use keywords in site-wide bread-crumb links. Another site I compete with does the same thing. However, they moved up during the last two months, while my site dropped, and now that my site is back up post 8/17, they are back down again.
...we are probably seeing a composite effect. That is, several filters in the algo are probably interacting to give what we see as just one footprint. But the exceptions are the way to find the edges, the trip points, for what is going on.
I agree. I think that there may be trip levels that require multiple factors to come into play. Doing any one or two may not get you hit, but do too many, and you trip a penalty filter that drops you down a bit. Or dampens the value of something else, like your inbound link power.
The composite nature makes it very difficult to figure out exactly what is going on.
The composite nature makes it very difficult to figure out exactly what is going on.
For sure...one place any site owner will attempt to make changes is with the "on page factors"...this is where they have the most immediate control ... count on Google having set up their filters to catch aggressive (and in the real competitive sectors subtle) changes fairly easily...with tedster's example being an easy one to catch (internal anchor text abuse...especially in the footer...this goes back to Florida when the word stemming algo was introduced and really created some chaos with what had been some very stable SERPs)...so that the parameters for ranking were then distributed through the link relationships more prominently...
The "off page factors" will be more difficult to control, and even if a webmaster has set up an entire network...and can tweak both internal and external factors to their liking...Google will probably have identified this network and set some traps out...
The best solution is to minimize any type of SEO strategy to the realm of absolute human usability elements...making the site work correctly for your target demographics...and working to increase your conversions (whatever they may be)...for maximum returns...
Remember...Google's primary interest is revenue...and increasing this per quarter...their decisions will always be focused on how they can achieve this...
while my site dropped, and now that my site is back up post 8/17, they are back down again
Site grouping is not talked about much, yet we know Google has a long-standing algo that picks similar sites/pages. It would not be a surprise if, for very similar sites, they decided to show only one or two. This would explain the yo-yo effect between similar sites: either they get deliberately rotated, or they rotate as ranking factors are adjusted or traffic patterns are analysed.
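Near-duplicate grouping of the kind speculated about here is classically done with shingling plus Jaccard similarity. The sketch below is illustrative only; whatever Google actually uses for similar-page clustering is unknown:

```python
def shingles(text, k=3):
    """Set of k-word shingles -- a standard near-duplicate fingerprint."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity between two shingle sets (0.0 to 1.0)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

page1 = "cheap blue widgets for sale with free shipping today"
page2 = "cheap blue widgets for sale with free delivery today"
page3 = "a history of medieval castles in northern france"

sim_close = jaccard(shingles(page1), shingles(page2))
sim_far = jaccard(shingles(page1), shingles(page3))
print(sim_close > 0.5 > sim_far)  # True
```

Two pages scoring high on a measure like this could plausibly be grouped, with only one shown at a time, which is consistent with the yo-yo effect between similar competing sites.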
In an attempt to correct(!), we rolled our web site changes back to pre-Aug 7th, to no effect.
Early in August we submitted a new sitemap with many more pages. So one possibility (as described earlier) is that some Google algorithm factor has been "diluted" by the increase in the number of pages. We saw the number of pages go to about 300,000+, and then over a period of a few days fall back to 195,000+. Last year we had good rankings with 300,000+ pages, so I am not convinced this is the answer, although, of course, the new changes to the algorithm may be a factor!
Another factor that has crossed my mind is the use of Google AdSense. As many of our pages carry AdSense adverts, the increase in pages submitted might have triggered an AdSense spam flag. In other words, Google believes we are just creating pages for AdSense revenue and penalised us.
Another observation: it appears that for all our major keywords we cannot get above position 30 in the rankings. We see some increases up to position 31, and then they bounce back to 50+. They start increasing up to 31, and then drop back again. Anyone else seeing this effect?