Forum Moderators: Robert Charlton & goodroi


April 2007 Google SERP Changes

         

madmatt69

4:35 am on Apr 1, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



< continued from [webmasterworld.com...] >

So yesterday I had decent traffic for the first time in a few weeks - today it's the lowest I've ever seen it, and that's since launching the site 4 years ago.

No clue why...Can't put my finger on it at all.

[edited by: tedster at 5:42 pm (utc) on April 1, 2007]

trinorthlighting

10:28 pm on Apr 13, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Here is a good example of why you also see fluctuations in Google as well. Hypothetical numbers here:

Every day 10,000 internet pages are added. The bot picks them up, and let's say Googlebot looks at the on-page and off-page factors and thinks 10% (1,000) of them are worthy of the top 10. Now is the time to test those pages to see how people respond to them.

So now Google puts the pages up in the top 10 for a week and measures things such as bounce rates, how many people add to favorites, etc...

So in this scenario you have 30,000 pages a month being tested like this. Now realize these are theoretical numbers - the real ones are probably much greater.
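The arithmetic of this hypothetical "trial ranking" model can be sketched out. Everything here uses the poster's illustrative numbers - none of this is a confirmed Google mechanism:

```python
# Hypothetical sketch of the "trial ranking" scenario described above.
# All constants are the post's illustrative numbers, not real figures.

NEW_PAGES_PER_DAY = 10_000
PROMOTION_RATE = 0.10   # fraction deemed "worthy" of a top-10 trial
TRIAL_DAYS = 7          # how long a page is tested in the top results
DAYS_PER_MONTH = 30

def pages_promoted_per_day() -> int:
    """Pages entering the top-10 trial each day."""
    return round(NEW_PAGES_PER_DAY * PROMOTION_RATE)

def pages_tested_per_month() -> int:
    """Total pages cycled through a trial in a month."""
    return pages_promoted_per_day() * DAYS_PER_MONTH

def concurrent_trials() -> int:
    """Pages under test at any one time, at steady state."""
    return pages_promoted_per_day() * TRIAL_DAYS

print(pages_promoted_per_day())   # 1000
print(pages_tested_per_month())   # 30000
print(concurrent_trials())        # 7000
```

Note the steady-state figure: with week-long trials, 7,000 pages would be churning through the top results at any given moment, which by itself would explain a lot of day-to-day SERP noise.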

Robert Charlton

10:41 pm on Apr 14, 2007 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



So now Google puts the pages up in the top 10 for a week and measures things such as bounce rates...

"Top 10?" You should be so lucky. ;)

I think that Google does give new pages a ranking benefit and looks at what happens to them, but "top 10" for a new page in the other 2-million to 200-million-page queries you may be competing on is pretty unlikely.

kidder

11:22 pm on Apr 15, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It would be nice to get some clear data on the sites that are dropping in and out - to see if we can work out a pattern / cause.

Age =
Number pages =
Links =
Industry =
Competitive factors =
Nature of symptoms =
% of traffic lost =

Marcia

12:17 am on Apr 16, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Every day 10,000 new sites are added, phrases are parsed out of the new pages and added to a taxonomy, and the co-occurrence figures change, resulting in a modification of the "possible" and the "good phrase" lists on a continual basis. At query time, when the query phrase is checked against the "good phrase list" for information gain, pages will move in and out of their previous positions according to how they sit on the posting lists.
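The "good phrase" idea Marcia describes can be illustrated with a toy sketch, loosely modeled on the phrase-based indexing approach described in Google's patent filings: a candidate phrase is "good" when other phrases co-occur with it notably more often than chance predicts (an information-gain test). The corpus, threshold, and scoring below are all invented for illustration:

```python
# Toy co-occurrence / information-gain check over a tiny invented corpus.
# Each "document" is just the set of phrases it contains.

docs = [
    {"credit card", "interest rate", "annual fee"},
    {"credit card", "interest rate", "balance transfer"},
    {"credit card", "annual fee"},
    {"mountain bike"},
    {"mountain bike", "interest rate"},
]

def info_gain(phrase_a, phrase_b, docs):
    """Actual co-occurrence rate divided by the rate expected by chance."""
    n = len(docs)
    p_a = sum(phrase_a in d for d in docs) / n
    p_b = sum(phrase_b in d for d in docs) / n
    p_ab = sum(phrase_a in d and phrase_b in d for d in docs) / n
    expected = p_a * p_b
    return p_ab / expected if expected else 0.0

# A phrase graduates to the "good phrase" list if at least one other
# phrase predicts it well above chance (threshold is arbitrary here).
THRESHOLD = 1.2
phrases = set().union(*docs)
good = {p for p in phrases
        if any(info_gain(p, q, docs) > THRESHOLD for q in phrases if q != p)}
```

With this corpus, "credit card" and its related phrases make the good list while "mountain bike" does not. The point Marcia makes follows directly: as new pages arrive, every co-occurrence figure shifts, so the good-phrase list - and the rankings that depend on it - keep moving.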

Vimes

10:28 am on Apr 16, 2007 (gmt 0)

10+ Year Member



To add to kidder’s list:

Age =
Number pages =
% of text to html on the pages that have dropped =
Links (What type: recips, paid or natural %) =
Anchor text variations % =
Industry =
Competitive factors =
Nature of symptoms =
% of traffic lost =

I’ve noticed that it’s not just low-content pages dropping out of the indices; high-content pages seem to be affected too, regardless of their inbounds and the authority of those inbounds.

Vimes.

kidder

7:28 am on Apr 17, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I have been looking at the number of pages served for a given set of searches - they vary quite a lot from day to day. To me it looks like new data is being pushed out almost daily - so maybe we are stuck in some type of "bad data push," as Matt Cutts said a while back, and the cycle just repeats over and over. A long shot?

adiadi

10:22 am on Apr 17, 2007 (gmt 0)

10+ Year Member



< moved from another location >

We've recently disappeared from SERPs for our best keywords, leaving us with just the home page on the first page. We're aware of some internal Google changes but it started several weeks ago and we're a little worried. Has anyone been in a similar situation?

Thanks!

[edited by: tedster at 3:06 pm (utc) on April 17, 2007]

supermanu

4:05 am on Apr 18, 2007 (gmt 0)

10+ Year Member



Welcome to the club Adiadi...

What is strange is that for your site, only the homepage remains...

For my website it was exactly the opposite: the homepage was gone, but the other pages were indexed correctly.

graeme_p

10:35 am on Apr 18, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



My site fell drastically and stayed down for a full month - all of February. Now it has bounced back to slightly better than it was in January.

I remember other people posting earlier, who both fell and recovered, so there appears to be a good chance that the likes of Adiadi will also see a recovery.

sunny_kat

5:51 pm on Apr 18, 2007 (gmt 0)

10+ Year Member



My website ranks for a highly competitive term in my industry vertical.

Every morning the website disappears from the rankings for 3 hours and then suddenly reappears.

This 3-hour disappearance has been happening continuously for the past 5 days.

Has anyone else experienced a similar problem?

Can we define this as a Google Clowning Update?

[edited by: tedster at 6:10 pm (utc) on April 18, 2007]

sunny_kat

12:02 pm on Apr 19, 2007 (gmt 0)

10+ Year Member



Guys!...No one there with thoughts?

Bewenched

12:56 pm on Apr 19, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



sunny_kat,
We have seen the same thing... a few hours of constant phone calls and good traffic... then nothing.

FredP

10:38 am on Apr 19, 2007 (gmt 0)

10+ Year Member



< moved from another location >

My web site sells a range of products and is very well known in the industry for it. My site is referred to as the authority on these products and even my competitors constantly refer to it for updates.

My prime keyword has been in the top five of google.com and .co.uk for nearly 5 years. I always have fresh content on the site and do not use any spamming techniques. I have no broken links on the site, and I have very high quality links in and out of the site.

However, a couple of weeks ago, my website and a few others in the same or similar ranking positions dropped to pages 3, 4 or 5 for my prime keyword. In our place are some awful websites that use most spamming techniques known to Google, with poor quality content, most of it copied from other websites. Some of my other keywords have kept their positions, but not all.

Can anyone help me understand this? Or can you guide me to an existing thread about this?

Any tips and advice will be highly appreciated.

[edited by: tedster at 5:39 pm (utc) on April 19, 2007]

gehrlekrona

2:44 am on Apr 20, 2007 (gmt 0)

10+ Year Member



FredP, only God and Google can answer that question. Well, maybe just God, because Google sure can't - or won't - tell us anything about why they do all this crap.
I am wondering why Ask.com, Yahoo and maybe MSN can keep their SERP rankings pretty stable but Google can't. I bet Google is experimenting all the time - letting loose some crappy sites even though they know they are crappy, just to see what is going to happen. Maybe they are just gathering data for something, God knows what.
It's pretty tiring and totally impossible to keep up. One day it's up and the next day it's down. Up one week and down the next, up one month... you get the idea.

Marcia

2:49 am on Apr 20, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



From what I've seen on two different sites in just the past couple of weeks, I've got a suspicion that it isn't taking as many links as it used to in order to move upward; I suspect the knobs have been turned a bit for that.

simey

3:06 am on Apr 20, 2007 (gmt 0)

10+ Year Member



It's bizarro world for my serps.
My main site now has all the interior pages buried.

Then there's 2 sites I'd almost forgotten about - deindexed and/or banned a couple of years ago. Now ranking #1 for some terms. lol

andrewshim

7:32 am on Apr 20, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It's pretty tiring and totally impossible to keep up. One day it's up and next day it's down. Up one week and down next, up one month

you got it right there dude...

My site has been cycling at #4, #3, #2, #1 then kaput for 2 days, then back again into the cycle.

4 days ago, it turned different. Was at #1, then kaput and not back yet. Usually I would see my site pop back in a couple of DCs, then quickly jump back in ALL DCs. This time, I see it coming back for less than 25% of the DCs, and go kaput again. Me fears that this is it... they've finally found something they don't like about my site and decided to lock me in the dungeon.

It's gotten to the point where I'm scared of moving up the serps! I'd rather stay at #29 permanently, getting a hundred uniques a day. At least that would be predictable.

kidder

11:18 am on Apr 20, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Things are changing with the index all the time - for 2 days now we have been back at the top of the SERPs. We have tons of supplemental pages showing in the site: command, and going to the end of the results does not give us the usual option to repeat the search - this is something new. Maybe things are "getting sorted," as clearly they are making adjustments; you can bet from the noise on the forums that they are aware of the general unrest among webmasters. It would be nice to get some feedback on this issue.

HIPO

2:54 pm on Apr 20, 2007 (gmt 0)

10+ Year Member



I monitor many sites and I have only seen the -30/-40/-50 penalty applied to whole directories.

Is any one seeing the penalty applied to pages that are placed in the root directory?

skyewalk

3:37 pm on Apr 20, 2007 (gmt 0)

10+ Year Member



My site's main rankings have come back (from 950), but most of my deeper content pages went supplemental last week - and most disappeared from the index this week.

So now I rank well for some big keywords but have lost all my long-tail detailed stuff - a big shame, as that was a lot of traffic, and I had lots of very relevant, detailed content for the more obscure searches.

Anyone had this?

g1smd

3:45 pm on Apr 20, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I have been trying to get more than half a million "duplicate content" and/or "you are not logged in" pages for a particular site OUT of the index for the last couple of years.

The number of listed pages in a site: search has generally been shrinking, with an occasional small temporary climb back up, but it never falls all the way to zero.

The pattern now, and for many months, is that the number shrinks to about 200, then climbs back to about 3000 in a dozen or more steps over a week or two, then falls back to about 200 in a dozen or more steps, and then the cycle repeats.

The URLs in the SERPs are all URL-only entries and they are all disallowed in robots.txt. Google did a good job in removing more than half a million last year. It's just the last few hundred / last few thousand that are the problem now.

potentialgeek

5:32 am on Apr 21, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It's pretty tiring and totally impossible to keep up. One day it's up and next day it's down. Up one week and down next, up one month.

It's the new reality. It's unsettling, but we just have to get used to it. Long gone are the days when there were monthly updates, and everything stayed the same for 30 days.

Expect the fluctuations to continue, so that things are different at the start of the day than at the end. It's tempting to check SERPs daily now, but there are better things to do. SERPs aren't stocks; we don't need the 'quotes' so often.

p/g

decaff

7:15 am on Apr 21, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Here's one for you..

Google knows the usability values of their organic SERPs listings (for competitive verticals down to the niches)..

Of course, they also know, in greater detail, the behavior of their PPC users - CTR, bounce rates, etc.

With this data they can test what happens when a top performing site in the organics is flipped on and off - i.e., what their vast user base (200 million+ queries a day is one figure I have seen) does when they "may not" find exactly what they are looking for in the organics and then turn to the AdWords listings (thus the new "gentler" background colors).

AND toss in Google's data on user agents and screen types: they also know that on some screens - like the laptop I am writing this from - the top-level "Sponsored Listings" label (which, by the way, has recently been reduced to a slightly stealthier shade of gray) blends in more with the organic listings and the white background.

So given all of these calculations, they may have discovered that by setting some switches in the algo, they can have the top performing organic listings pop in and out (at peak traffic times for each sector, as tracked by usability data) and then see an increase in AdWords clicks - which equals more revenue. Plus, when these ads see more clicks, the costs go up for the advertisers. CHA-CHING!

gehrlekrona

2:18 pm on Apr 21, 2007 (gmt 0)

10+ Year Member



decaff,
I think you hit the nail on the head this time! We are too busy checking out the SERPs, but the SERPs are just a "cover up" for Google. They don't care about the SERPs - who's in and who's out. It's all a matter of getting people to click on the AdWords, and the SERPs are just a way for them to get people to go to Google. Guess we have to create 2 web sites out of every idea we have: one that's high in the SERPs on good days, and another crappy site that they show when they want people to click on the AdWords - so when they flip back and forth we'll be on top anyway :)
Google is one of the best business ideas of the past 10 years, and I wish I had come up with the idea of collecting all web sites' data and using it to make money.

decaff

5:02 pm on Apr 21, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Google is one of the best business ideas in the past 10 years and I wish I would have come up with the idea to collect all web sites data and use them to make money...

BINGO! ... it's a stunning business model...and yes..I think any number of people would share your sentiment...

Powdork

6:46 pm on Apr 21, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The URLs in the SERPs are all URL-only entries and they are all disallowed in robots.txt. Google did a good job in removing more than half a million last year. It's just the last few hundred / last few thousand that are the problem now.
Have you tried removing the robots.txt disallow for these pages so Googlebot can see a noindex meta on the page?

thecityofgold2005

8:09 pm on Apr 21, 2007 (gmt 0)

10+ Year Member



I work in a sector where we cannot use AdWords. In the past few months I have seen sites pop in and out exactly as described. For this reason I think the link between this phenomenon and AdWords does not exist.

g1smd

10:04 pm on Apr 21, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>> Have you tried removing the robots.txt disallow for these pages so Googlebot can see a noindex meta on the page? <<

There is no robots meta, and there is no access to add one. I'd like to see how long it takes Google to get it right, doing it the way that it is already done.

decaff

10:41 pm on Apr 21, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I work in a sector where we cannot use AdWords. In the past few months I have seen sites pop in and out exactly as described. For this reason I think the link between this phenomenon and AdWords does not exist.

This phenomenon would be built into the algorithm, and would occur across all sectors, whether they are allowed to run AdWords or not.

Powdork

11:51 pm on Apr 21, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



There is no robots meta, and there is no access to add one. I'd like to see how long it takes Google to get it right, doing it the way that it is already done.
But you know Google will continue to list robots-protected URLs; they just won't visit the page and index the actual content. That's how the robots.txt exclusion works (the URL removal tool notwithstanding - and it's probably not working).
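The distinction being discussed here can be demonstrated with Python's standard-library robots.txt parser: robots.txt blocks *crawling*, not *indexing*. A disallowed URL can still linger as a URL-only listing, because the crawler is never permitted to fetch the page and so never sees any noindex meta tag on it. The URLs and rules below are hypothetical:

```python
# Sketch: a robots.txt Disallow hides a page's noindex tag from the bot.
import urllib.robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

def crawler_can_see_noindex(url: str) -> bool:
    """A noindex meta tag only works if the crawler may fetch the page."""
    return rp.can_fetch("Googlebot", url)

# Disallowed: the page's noindex tag (if any) is invisible to the bot,
# so the URL may persist in the index as a URL-only entry.
print(crawler_can_see_noindex("https://example.com/private/page.html"))  # False

# Allowed: the bot can fetch the page, read a noindex tag, and drop it.
print(crawler_can_see_noindex("https://example.com/public/page.html"))   # True
```

This is why the usual advice for getting already-listed URLs fully removed is to lift the Disallow and serve noindex instead - though as g1smd notes above, that isn't always possible.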

< continued here: [webmasterworld.com...] >

[edited by: tedster at 7:57 pm (utc) on April 25, 2007]

This 180-message thread spans 6 pages.