
Google SEO News and Discussion Forum

June 27th, August 17th, What's happening?
Pages are vanishing and reappearing -- even whole sites
DeROK
msg:3055211
3:15 am on Aug 22, 2006 (gmt 0)

< a continued discussion from these threads:
[webmasterworld.com...]
[webmasterworld.com...]
[webmasterworld.com...]
[webmasterworld.com...] >

-----------------------------------------------

Hey everyone,

I've been reading through the June 27th/August 17th threads, and I was hoping somebody could clarify what's actually going on.

Like many of you on the board, my site got trashed in the SERPs on June 27th, only to recover a month later. At the time, I thought I had incurred a penalty, and I went through the site in painstaking detail to remove even the most minute possible violations. I thought that correcting those problems was the reason I recovered.

So needless to say, I was pretty upset when I got trashed again around the 17th, when I knew my site was in total compliance with Google's guidelines. After visiting this forum, I now see that I am not the only one experiencing this type of problem.

Here are my questions. If any of you can shed some light on these, I would really appreciate it.

1. Why is this happening? It seems like some kind of update, but why are certain sites getting trashed when others are standing firm?

2. Can I expect a recovery similar to the one I had in July?

3. Is there anything I can do to fix this, or am I completely at the mercy of Google on this one?

Thanks for your time!

[edited by: tedster at 6:25 am (utc) on Aug. 22, 2006]

 

Rugles
msg:3056044
6:03 pm on Aug 22, 2006 (gmt 0)

Thanks, Tedster.

Not going to affect me then. That person sure is playing with fire.

ashear
msg:3056045
6:04 pm on Aug 22, 2006 (gmt 0)

Month over month, the updates seem to be very consistent in nature. I do not believe that we have seen any major updates, and at this rate this will be going on for months.

wrkalot
msg:3056082
6:30 pm on Aug 22, 2006 (gmt 0)

I'm not saying that your observations about internal anchor text are incorrect, but like every other theory I've heard, I can find instances where I see just the opposite... and this one is no exception (sadly).

handsome rob
msg:3056126
7:05 pm on Aug 22, 2006 (gmt 0)

I was a Google homepager for a long time, but I switched to Yahoo a month or so ago because of how awful Google's SERPs have become, both in my business-related searches and in my personal searches. My income hasn't been hurt by these updates because of our business model. However, speaking solely from the standpoint of a "civilian" searcher, I think Google needs to face the fact that these updates of theirs are a disaster.

g1smd
msg:3056132
7:11 pm on Aug 22, 2006 (gmt 0)

>> Also when you change your site if it got supplemental <<

"Sites" do not go supplemental: individual URLs do, or don't.

For active pages, Google usually has both a normal and a supplemental result for every URL. For pages gone 404, expired domains, or redirected URLs, Google holds the old page as a Supplemental Result. For duplicate content, Google holds none, some, or all URLs as normal results, and none, some, or all other alternative URLs for the same content as Supplemental Results.

There is much more in the thread at: [webmasterworld.com...]

[edited by: g1smd at 7:12 pm (utc) on Aug. 22, 2006]
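
To picture g1smd's point that URLs, not sites, go supplemental, here is a toy model. The grouping rule and the tie-break are invented for illustration; this is not Google's actual implementation, just a way to see how duplicate-content URLs can split into one normal result plus supplemental alternates.

```python
# Toy model of "URLs go supplemental, not sites". The grouping and
# tie-break rules here are invented -- NOT Google's implementation.
from collections import defaultdict

def classify(urls_to_content):
    """Map URL -> page content; return (normal, supplemental) URL lists."""
    by_content = defaultdict(list)
    for url, content in urls_to_content.items():
        by_content[content].append(url)
    normal, supplemental = [], []
    for dupes in by_content.values():
        canonical = min(dupes, key=len)  # toy tie-break: shortest URL wins
        normal.append(canonical)
        supplemental.extend(u for u in dupes if u != canonical)
    return normal, supplemental

pages = {
    "example.com/page": "widget article",
    "example.com/page?sessionid=123": "widget article",  # duplicate content
    "example.com/other": "another article",
}
print(classify(pages))
# (['example.com/page', 'example.com/other'],
#  ['example.com/page?sessionid=123'])
```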

AustrianOak
msg:3056135
7:12 pm on Aug 22, 2006 (gmt 0)

"Well ... so internet business without SE results .. interesting .. concept .. obviously everyone wants away from that, but meanwhile until we get a huge loan for offline ads or wikipedia delivers enough ..."

&

"We do, and have done so for nearly 7 years.

We do pay for some traffic, and also have traditional offline advertising. But the bulk of the first time visitors to our website are coming in from the organic results of the major search engines.

It's no different than being in the Yellow Pages if you ask me."

The point is simple: it's a huge risk to put all your eggs into one basket, even if you have a sound business plan. Especially if the eggs are going into something that you have zero control over, as many of us are finding out with the "refreshes".

night707
msg:3056156
7:27 pm on Aug 22, 2006 (gmt 0)

I am also a June 17, July 27, August 17 victim, with quality content and a long-established site whose structure has not been modified in years. Last year, Bourbon did severe damage for 6 months.

tedster
msg:3056160
7:31 pm on Aug 22, 2006 (gmt 0)

instances where I see just the opposite

That's excellent. Just as it was with the infamous "sandbox effect", we are probably seeing a composite effect. That is, several filters in the algo are probably interacting to give what we see as just one footprint. But the exceptions are the way to find the edges, the trip points, for what is going on.

In the first case I mentioned above -- the footer links -- the webmaster backed off on those keyword links and saw upward movement within a few days. Not back to the first page, but a solid jump. This is only one case, but highly suggestive of locating at least one sensitive area. Also, penalties, once applied, may come off in degrees and not all at once. What we observed fits that profile pretty well.

The other case I mentioned, the one with the overloaded main navigation, has not yet made any changes. And they are still suffering from the 7-disease pretty badly.

My main point is that we will probably not find just one factor at work here, one where everyone will agree "that explains it -- everything I see is summed up there."

KenB
msg:3056183
7:46 pm on Aug 22, 2006 (gmt 0)

I think it is time for legitimate web publishers to declare open war on SERP spammers and MFA scraper sites. I'm sick and tired of them.

My site is getting trashed in Google's SERPs as a result of the July 27th update, and thus I have lost over 80% of my traffic/revenue because of Google's efforts to clamp down on spammers. At the same time, I am having to allocate more and more resources to detecting and blocking site scrapers and referrer log spammers, simply to keep my databases from crumbling under rapid-fire requests from hyper-aggressive bad bots. Yesterday I moved my database to a new server, and today I've already had a referrer spam bot that got past my defenses make so many requests so quickly that my limit for simultaneous database connections was exceeded.

This little war between Google and spammers is costing me thousands of dollars and causing me to lose some really good long-term direct advertisers because of my drop in traffic. I am really sick and tired of my site being ground zero for the war between Google and the spammers.

Google needs to get their act together and stop causing collateral damage with updates that seriously harm legitimate websites. At the same time, legitimate websites need to start taking an aggressive stance against spammers: report MFA scraper sites running AdSense ads via the "Ads by Google" link, and file spam complaints via Webmaster Central when these scraper sites appear in searches we conduct.
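
On the bad-bot side of KenB's post, the usual first line of defense is per-IP throttling in front of the database. A minimal sketch of the idea; the window size and request limit are arbitrary example values, not a recommendation:

```python
# Minimal per-IP sliding-window rate limiter to shield a database from
# rapid-fire bot requests. Window and limit are arbitrary examples.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 10
MAX_REQUESTS = 20

_recent = defaultdict(deque)  # ip -> timestamps of requests in the window

def allow_request(ip):
    """Return False once an IP exceeds MAX_REQUESTS per WINDOW_SECONDS."""
    now = time.time()
    q = _recent[ip]
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()               # discard hits that fell out of the window
    if len(q) >= MAX_REQUESTS:
        return False              # throttle: e.g. answer with HTTP 429
    q.append(now)
    return True
```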

tedster
msg:3056189
7:49 pm on Aug 22, 2006 (gmt 0)

Last year Bourbon had done severe damage for 6 months.

OK -- so maybe we can look at the old Bourbon conversations and see if there are any common factors between then and now. It would be a hint if there are.

Minor Shuffling - Incremental Indexing - Not enough changes to be an update. [webmasterworld.com] - May 15, 2005

Google Update Bourbon - Part 1 - Has the sandbox been busted? [webmasterworld.com] May 20, 2005

GoogleGuy's posts - advice on Bourbon [webmasterworld.com] June 1, 2005

randle
msg:3056209
7:56 pm on Aug 22, 2006 (gmt 0)

My main point is that we will probably not find just one factor at work here

Good point, and I think one of the reasons for that is there are a lot of factors at play, and the difference between being behind the line and over it is extremely small when it comes to these things.

Another example -- a main nav with 16 links, target keyword was in 13 of them

Sure, that's obvious, but is the threshold 12, or 11? Or when they institute the filtering, is it 5 or 6? We have been taking a long hard look at our sites, and we were a little surprised at the number of links with relevant anchor text pointing to inner pages off our home page.

Iguana
msg:3056215
7:59 pm on Aug 22, 2006 (gmt 0)

Everything that is happening sounds like what started on Aug 10, 2004. I refer to a post I made last year [webmasterworld.com...]. That refers to the first five times that a lot of sites (not pages) were buried in the SERPs. It has gone on roughly every month or two since then, and each time there are threads attempting to come up with answers.

I've not read a satisfactory explanation yet of what is happening or how to avoid it. Google must be laughing at our confusion. With what was my main site well and truly gone, my only consolation is that most of the other sites that used to compete on my keywords have also disappeared, leaving just the big players like Amazon, etc., to dominate.

Dayo_UK
msg:3056277
8:44 pm on Aug 22, 2006 (gmt 0)

>>>>>Everything that is happening sounds like what started on Aug 10 2004. I refer to a post I made last year [webmasterworld.com...] That refers to the first five times that a lot of sites (not pages) were buried in the SERPS. It has gone on roughly every month or two since then and each time there are threads attempting to come up with answers.

I probably would not date it back that far - perhaps to December 2004 - and with Matt saying that this is a data refresh of a certain algorithm that has been going on for 1.5 years, perhaps this might be the case.

Whatever - from the outside looking in, the algo and data refresh look very poor/buggy.

Looking at a site:domain.com search where the homepage was first: it went AWOL on the 27th June, came back on the 27th July, and went AWOL again on the 17th August. I bet the next data refresh would have it back at the top again - and when the homepage goes missing on a site:domain.com check, the whole site goes wrong :(

[edited by: Dayo_UK at 8:44 pm (utc) on Aug. 22, 2006]

pontifex
msg:3056314
9:14 pm on Aug 22, 2006 (gmt 0)

dayo: homepage goes missing on a site:domain.com check then the whole site goes wrong

Yep, so true! However, it happened to me 3 times now, and it always came back with fresh content and links... this time I am back with the homepage, but the overall ranking seems to be very poor, all of a sudden...

Endurance for larger projects is needed... the "real" ranking for bigger projects seems to start after 6-9 months now!

P!

night707
msg:3056431
10:46 pm on Aug 22, 2006 (gmt 0)

Google needs to get their act together and stop causing collateral damage with their updates that seriously harm legitimate websites.

Google needs to ensure that the best content is up front. These algo experts are miles away from that, and only a smart directory, voting and ranking solution with membership structures could enforce quality.

james_plato
msg:3056451
11:13 pm on Aug 22, 2006 (gmt 0)

I'm having problems like this with Yahoo. Has anyone seen that? My website keeps popping in and out....

I'm glad I'm not alone on the Google part either...

SEOcritique
msg:3056610
2:46 am on Aug 23, 2006 (gmt 0)

Tedster, thank you for invoking the Florida update (shudder).

Just to set my own benchmark, the websites I work on saw significant gains on June 27 and have maintained about half of those initial gains during the subsequent tweaks. In other words I have no axe to grind at this time.

I would like to see a virtual show of hands. Who here has read The Search by John Battelle [battellemedia.com]? Excellent. You can put down your hands. The reason I ask is that there is an excellent description of the Florida update in John's book. He even recaps the reactions here on WebmasterWorld and GoogleGuy’s responses.

Google has come a long way since November 2003. Its communication with webmasters is far better, and you can tell that they genuinely want to be helpful. In 2003, I think most website administrators who had never attended a professional conference would be hard-pressed to name a Google employee other than LP, SB, ES & MM, and even then only if they were lucky. Today, helpful people like Matt Cutts and Vanessa Fox are quick to participate and help set the record straight... but only up to a very generic and guarded point.

That is where the rub lies. Google's engineers continue to trust automated systems more than they do their human brethren. In a community that embraces open source and the free flow of information, it is ironic that Google continues to be one of the most secretive non-governmental organizations on the planet. If Google wants to organize the world's information and make it universally accessible and useful, then Google's leadership must accept that the knowledge of how to help Google collect and distribute the world's information is a key piece of information in and of itself. Rather than embrace good people everywhere, Google lives in fear of evil.

There is an alternative viewpoint. If Google explains to website authors and search optimizers which factors it favors and which factors it frowns upon, then most people will gravitate toward the ethical practices. Yes, there will always be renegades, black hats who take advantage and attempt to skew the system in their favor. But in an open environment it will be harder for these rebels to hide or to blend in. They will become as visible as blackheads on otherwise unblemished skin. And because they will stand out, they will be easier to isolate and to counter.

When role-model companies like Google turn their backs on their audience and secure their gates, it sets an example that others are, unfortunately, quick to take up. For example, how many businesses will plaster their 1-800 number over all of their printed catalogs and postcards but will not display a phone number anywhere on their web site? Why? Fear. How many websites will not publish their email address, insisting instead that you use a difficult-to-find and highly impersonal form? Why? Fear. Where did they learn these practices? From companies like Google. Good people trust good businesses. When companies hide things like telephone numbers and email addresses, good people do not think it is for good reasons. They think it is because the leaders of these companies do not care, do not want to lower themselves, or have something to hide. Nobody wants to work with companies that behave in this manner.

I do not think Google should be ashamed, quite the contrary. Google is a fantastic company, which is why I am amazed that it does not hold its head up high, straighten its back and speak in clear, resonant tones. There are several great individuals working at Google who actively participate in our community. I am thankful to them. But as a company and an industry leader, I wish Google would stand up, open up and stop living in fear of evil.

[edited by: SEOcritique at 3:05 am (utc) on Aug. 23, 2006]

Tomseys
msg:3056618
2:59 am on Aug 23, 2006 (gmt 0)

Is anyone missing sites in Google all of a sudden? Happening within the past hour?

KenB
msg:3056635
3:11 am on Aug 23, 2006 (gmt 0)

SEOcritique, I liked your last post, and it would be nice if your theories were true in regards to spammers being blackheads. Well, I guess they are like blackheads, but they just aren't as easy for Google to get rid of. :(

Is anyone missing sites in Google all of a sudden? Happening within the past hour?

No, my site is as it has been since its minor recovery on Aug 17th (I do wish there would be a full recovery).

SEOcritique
msg:3056647
3:32 am on Aug 23, 2006 (gmt 0)

Thank you for the kind words, KenB. I'll be quick to point out that it is not a theory but a viewpoint (more than one can be applicable). I cannot say that I have formulated a hypothesis or conducted research to back it up. What I will do is point out that there are two ways to combat yard weeds. One is to fight them individually, to dig them up. You will be on your knees all spring, summer and fall, and you will have lots of divots to commemorate your efforts. The other way is to nurture a lawn of thick grass by aerating, fertilizing and watering. It takes much less effort and looks immeasurably greener.

[edited by: SEOcritique at 3:53 am (utc) on Aug. 23, 2006]

Tomseys
msg:3056648
3:36 am on Aug 23, 2006 (gmt 0)

Strange, most of my sites have disappeared in G, though they still have PR.

[edited by: Tomseys at 3:47 am (utc) on Aug. 23, 2006]

KenB
msg:3056655
3:41 am on Aug 23, 2006 (gmt 0)

SEOcritique, yes, "theory" was a poor choice of words, but I still like your point of view. The lawn analogy is a good one as well. I wish Google would take what you wrote to heart. I've always believed that if you look for evil, you will find evil everywhere. If, however, you look for and nurture good, good will prevail. I also believe that Google's current effort to make a direct assault on SERP spammers is doomed to fail, with only legitimate sites being hurt, because spammers can adapt more easily (after all, they tend to use throwaway domains). If Google did more to nurture good websites and did more to cut off the flow of cash (via AdSense) to scraper/MFA sites, we would see a reduction in the SERP spam problem.

twebdonny
msg:3056656
3:44 am on Aug 23, 2006 (gmt 0)

The Florida Update....that was the beginning of the END!

Swanson
msg:3056669
3:50 am on Aug 23, 2006 (gmt 0)

g1smd, I am interested in your observations about supplementals.

Are you absolutely sure that the behaviour you are witnessing can be applied to a concept of why URLs (or sites) go supplemental?

The reason I ask is that I will come clean:

Sites where I have on purpose used every bad trick in the book either get removed totally or go supplemental - I continue to receive crap-quality traffic on these domains.

Sites that are clean, with no link exchanges, can go supplemental over time if the inbound links themselves become downgraded (i.e. bad link purchasing!).

Sites that have too many pages visible too fast to Googlebot can develop a site-wide supplemental problem.

Link strength causes pages within a site to go supplemental - i.e. if you have a blog where links rotate (the calendar pushes fresh links to the top of the tree), then when links become archived the link strength decreases and the page goes supplemental.

The only case I have seen where getting more backlinks gets a page un-supplemental is where link strength has gone down. Imagine that in this case it is a new algorithm keeping known pages in the index but acknowledging that they now do not have enough inbound links to count - in the data warehousing trade we call that "marked for deletion".

I have not witnessed any of the cases you mention, g1smd - in terms of Google keeping 2 copies of a page (one historic, one current). What I am witnessing is a "catch-all" of penalties, either forced or passive.

Can you give clear examples where this is actually the case and/or quotes from someone at Google that this is the behaviour you can expect?

james_plato
msg:3056670
3:52 am on Aug 23, 2006 (gmt 0)

Most of my site is missing too. And after I started using Google Sitemaps....hmmmm

Swanson
msg:3056673
4:12 am on Aug 23, 2006 (gmt 0)

I do think there could be a relation there with Sitemaps too.

If you suddenly tell Google there are more pages than they knew about, then surely that must go through the PageRank system.

Suddenly your PageRank distribution decreases, and rapidly, as Google recalculates your page distribution with what it now knows.

And that is a technical point: if you have an algorithm that takes into account link growth etc., how do you deal with notified link growth (either external or internal)?

And why assume this is treated properly (i.e. logically)? It should not realistically be treated the same as natural link structures - but on the other hand, how can it not be? How do you calculate PageRank in that environment?

Well, you can't. The secret is how all that stuff is being used.

And the key is, you have to treat it differently - but how can you really do that? And if you try to, how can you be sure that the resultant PageRank + extras = natural stuff? And if you subsequently apply penalties etc., how can you be sure that they are all being applied consistently?
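
Swanson's dilution argument can be sanity-checked with a textbook-toy PageRank computation; this is the classic simplification, nothing to do with Google's production system. Surface a batch of sitemap-discovered pages in a small site graph and the rank of an existing inner page shrinks, because total rank is fixed and every new page takes its share from somewhere.

```python
# Toy PageRank illustrating dilution when a sitemap exposes many new
# pages. Textbook simplification, not Google's production system.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict page -> list of pages it links to (no dangling nodes)."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        nxt = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                nxt[target] += share
        rank = nxt
    return rank

# Before: a three-page site.
before = {"home": ["a", "b"], "a": ["home"], "b": ["home"]}

# After: a sitemap surfaces 20 extra deep pages, interlinked with home.
after = {"home": ["a", "b"] + [f"deep{i}" for i in range(20)],
         "a": ["home"], "b": ["home"]}
after.update({f"deep{i}": ["home"] for i in range(20)})

print(round(pagerank(before)["a"], 3))  # ~0.257
print(round(pagerank(after)["a"], 3))   # ~0.024 -- same page, diluted
```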

egomaniac
msg:3056718
5:26 am on Aug 23, 2006 (gmt 0)

...one factor that stood out to me was anchor text -- within the site links, not outbounds or inbound from other domains. Internal anchor text, in these cases at least, was heavily packed with target keywords, to the point of looking a bit odd for the normal (read that as non-SEO) visitor.

I have been thinking about the sitewide internal anchor links issue also. I use keywords in site-wide breadcrumb links. Another site I compete with does the same thing. However, they moved up during the last two months while my site dropped, and now that my site is back up post-8/17, they are back down again.


...we are probably seeing a composite effect. That is, several filters in the algo are probably interacting to give what we see as just one footprint. But the exceptions are the way to find the edges, the trip points, for what is going on.

I agree. I think that there may be trip levels that require multiple factors to come into play. Doing any one or two may not get you hit, but do too many and you trip a penalty filter that drops you down a bit. Or dampens the value of something else, like your inbound link power.

The composite nature makes it very difficult to figure out exactly what is going on.
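
egomaniac's trip-level idea is easy to state as a toy model. Everything below is hypothetical: the factor names, the thresholds and the "two or more trips" rule are invented for illustration; nobody outside Google knows the real signals or their cutoffs.

```python
# Hypothetical composite filter with trip points. Factor names,
# thresholds and the trip rule are all invented for illustration.

THRESHOLDS = {
    "internal_anchor_keyword_ratio": 0.75,  # e.g. 13 of 16 nav links
    "footer_keyword_links": 10,
    "new_pages_per_week": 5000,
}

def tripped(signals):
    """Factors whose measured value exceeds its (made-up) threshold."""
    return [k for k, limit in THRESHOLDS.items()
            if signals.get(k, 0) > limit]

def penalized(signals, trips_needed=2):
    """No single factor hurts on its own; several tripping together do."""
    return len(tripped(signals)) >= trips_needed

site = {
    "internal_anchor_keyword_ratio": 13 / 16,  # randle's 13-of-16 example
    "footer_keyword_links": 14,
    "new_pages_per_week": 300,
}
print(tripped(site))    # two factors over their limits
print(penalized(site))  # True: the composite filter trips
```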

decaff
msg:3056760
7:09 am on Aug 23, 2006 (gmt 0)

The composite nature makes it very difficult to figure out exactly what is going on.

For sure... one place any site owner will attempt to make changes is the "on page factors"... this is where they have the most immediate control... count on Google having set up their filters to catch aggressive (and, in the really competitive sectors, subtle) changes fairly easily... with tedster's example being an easy one to catch (internal anchor text abuse, especially in the footer... this goes back to Florida, when the word-stemming algo was introduced and really created some chaos with what had been some very stable SERPs)... so the parameters for ranking were then distributed through the link relationships more prominently...

The "off page factors" will be more difficult to control, and even if a webmaster has set up an entire network... and can tweak both internal and external factors to their liking... Google will probably have identified this network and set some traps...

The best solution is to minimize any type of SEO strategy to the realm of absolute human usability elements...making the site work correctly for your target demographics...and working to increase your conversions (whatever they may be)...for maximum returns...

Remember...Google's primary interest is revenue...and increasing this per quarter...their decisions will always be focused on how they can achieve this...

soapystar
msg:3056852
8:36 am on Aug 23, 2006 (gmt 0)

while my site dropped, and now that my site is back up post 8/17, they are back down again

Site grouping is not talked about much, yet we know Google has a long-standing algo that picks out similar sites/pages. It would not be a surprise if they decided, for very similar sites, to show only one or two. This would explain the yo-yo effect between similar sites: either they get deliberately rotated, or they rotate as ranking factors are adjusted or traffic patterns are analysed or used.
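
soapystar's grouping idea, sketched as a toy: keep at most one or two results per similarity group. The grouping key and the limit of two are invented here; whether Google does anything like this is pure speculation.

```python
# Toy "similar results" crowding: show at most two results per group.
# The grouping key and the per-group limit are invented for the example.
from collections import defaultdict

def crowd(results, group_key, per_group=2):
    """results: ranked list; keep at most per_group items per group."""
    shown = defaultdict(int)
    kept = []
    for r in results:
        g = group_key(r)
        if shown[g] < per_group:
            shown[g] += 1
            kept.append(r)
    return kept

ranked = ["a.com/x", "a.com/y", "a.com/z", "b.com/1", "c.com/1"]
print(crowd(ranked, group_key=lambda url: url.split("/")[0]))
# ['a.com/x', 'a.com/y', 'b.com/1', 'c.com/1'] -- third a.com page dropped
```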

webvivre
msg:3056863
8:53 am on Aug 23, 2006 (gmt 0)

We too have been affected by the "changes" - after years of top page-1 rankings, our keywords were dumped to page 4 or 5 on about Aug 7th. We have had top rankings for several years and we are a well-established site (vintage: 2000).

In an attempt to correct(!), we rolled our web site changes back to pre-Aug 7th, to no effect.

Early in Aug we did submit a new sitemap with many more pages. So one of the possibilities (as described earlier) is that some Google algorithm factor has been "diluted" by the increase in the number of pages. We saw the number of pages go to about 300,000+, and then over a period of a few days fall back to 195,000+. Last year we had good rankings and 300,000+ pages, so I am not convinced this is the answer, although, of course, the new changes to the algorithm may be a factor!

Another factor that has crossed my mind is the use of Google AdSense, as many of our pages carry AdSense adverts. The increase in pages submitted might have triggered a "spam" flag on AdSense. In other words, Google believes we are just creating pages for AdSense revenue and has penalised us.

Another observation: it appears that for all our major keywords we cannot get above position 30 in the rankings. We see some increases up to position 31, and then they bounce back to 50+. They start increasing up to 31, and then drop back again. Anyone else seeing this effect?

leeds1
msg:3056889
9:27 am on Aug 23, 2006 (gmt 0)

I've made no changes

I don't have google sitemaps

Homepage MIA for singular term

OK for plural.

(happened before this yr)
