| 11:17 am on Jun 23, 2008 (gmt 0)|
|what happened and why it happened at all ..... |
No offence intended, however, if we knew the answers to your questions this thread would not exist.
I think I have an idea why I have one site bouncing all over, and as such I am performing an experiment to see IF it resolves the issue.
If it does not, then I am as flummoxed as anyone else as to which stupid dial they turned the wrong way, or at their inability to roll back a bad data push.
| 1:35 pm on Jun 23, 2008 (gmt 0)|
Just to add some perspective to this: when one site rolls back, another site gains. We all win or lose at some point; it's just that we have less long-term free traffic stability than we had years ago from the mighty Google, and it's more evenly spread out now.
I don't think this is a case of a specific adjustment, because IMO the SERPs have been moving around like this for at least 5-6 weeks now.
Don't get me wrong, I fully understand the frustration and pain when a site gets hit; it's not nice and can make you feel most unwell, to say the least! When Google loves you it's your best friend, and when it dumps you it's the partner from hell!
We are at the stage now where I think almost every site is getting a period in or out of the SERPs for top positions. Gawd knows, on some sites it's like watching the ruddy Hokey Cokey! Some sites have a bit more stability than others, but all in all, the SERPs of Google today don't provide anything like the stability of Google two or three years ago, let's face it!
Because we are all so dependent on Google search, with its massive share of the market, this makes it all the more important to rank. But I just don't see the current events and adjustments as a "bad data push" or anything else; that's wishful thinking for those that have fallen back this time around!
I think Google knows exactly what it's doing, and as stated before on other threads, we are now in the age of "everflux". I think it's all now about making Google as profitable as possible whilst delivering mediocre results. None of us can now expect the free traffic stability that we used to get from Google, and it's time to face facts!
The best policy is to try to be less dependent on Google (I know that's very difficult to do, due to its massive reach) and to get visitors in a range of different ways in addition to Google, so that you are less vulnerable to Google's constant fluctuations.
The only way for webmasters to get 100% stability in Google now is by buying AdWords; anything else is a bonus. We no longer see updates like in the past, where every quarter or so the SERPs would change. Now it's continual adjustment, and our businesses need to change to take this into account.
| 10:35 pm on Jun 23, 2008 (gmt 0)|
stockexperts, if you would like to know more about the -950 penalty, take a look in the Hot Topics.
[edited by: SEOPTI at 10:38 pm (utc) on June 23, 2008]
| 5:43 am on Jun 24, 2008 (gmt 0)|
|Not only you, I'm finished completely, they put 16 of my sites in the -950 box on 4 June (all of them). |
SEOPTI, tell us more about your particular situation if possible. I have heard of losses, but when we hear a fellow webmaster has been hit this badly, it warrants a study in my opinion. Many senior members of this forum have helped me recover, most notably from the slaughter of September 2005.
In terms of your websites, let's get some information; we might be able to help. Since there is a common thread here (you), it might actually be easier to assess this with your particular websites, since we might be able to see some common traits. Maybe take a sample set of the websites you own and tell us:
1. What is the age of the websites?
2. What was the average previous ranking for your main keyword? What exact position are the websites in now? Did all pages and keywords get demoted?
3. What are the hosting location and domain registration location?
4. Are the websites 'canonically' correct?
5. Describe any recent promotion of the websites, especially internal and external linking.
6. Are the websites linking to each other in any way?
7. Are all websites listed with you as owner in whois?
8. Did all drop at the same time and date?
9. Are all written using unique content?
10. Do all sites show in position one for a domain.com search?
11. Have you ever received a -950 penalty before on any of these websites?
| 5:59 am on Jun 24, 2008 (gmt 0)|
I second the motion, CainIV. I was going to ask the same question a while back. The fact that so many sites of one webmaster were hit must surely be a good way to determine what the 950 is all about. Is it impossible to find the common denominator that is poisoning your rankings? Sixteen sites... what corners are you cutting, if any? What do they all have in common?
| 6:19 am on Jun 24, 2008 (gmt 0)|
950 penalties don't usually hit for cutting corners. On the contrary, they usually hit full-blown, well-developed pages. Of course, the operative word with 950 penalties is "usually"... they are all over the board, with zilch consistency.
| 7:57 am on Jun 24, 2008 (gmt 0)|
12. Do you have competitors who may have reported you for anything?
Shouldn't this be on a separate thread?
| 11:28 pm on Jun 25, 2008 (gmt 0)|
We are seeing older, stale pages, many with products that no longer exist, being folded into the SERPs. It looks like G is taking another look at pages that have not been ranked for some time, with the algo changes that they said (at SMX West) would be live by August.
Is anyone else seeing not-so-great SERPs?
[edited by: tedster at 3:35 am (utc) on June 26, 2008]
[edit reason] moved from another location [/edit]
| 4:11 am on Jun 26, 2008 (gmt 0)|
Is anyone else seeing crud in the serps for their niche?
| 4:53 am on Jun 26, 2008 (gmt 0)|
Kristos, I have been seeing crud SERPs in my niche... paid, hijacked, reciprocals, foreign language, etc. It seems to happen more on brand searches.
Do you have a reference for those August algo changes? I'd love to research that. Thanks.
| 5:29 am on Jun 26, 2008 (gmt 0)|
It was a personal meeting with the Google reps after hours, amid the lubrication that occurs at the after-hours party (FYI, that's where you get the purest info!).
G is starting to look for videos (instructional, related, properly tagged) and images properly ALTed for blind people (the original intention of the ALT attribute) on every page. (FYI: the ALT on every image should say "image of ****" or "picture of ***"; not keyword stuffed, but actually written for what the internet was designed for.)
Your styling should all be in CSS if you can do it,
so the page that Google sees (without CSS) should look like a news page: h1 at the top, h2, then a paragraph, h3 and a sub-paragraph, etc., etc.
The old internet was designed to pass informational pages from university to university, and that is what they will shortly be looking for in your pages (and are evidently rolling in even now). We have proof that it is working...
Without CSS (in Firefox, e.g. via the Web Developer toolbar: CSS > Disable All Styles), your pages should be bare of all styling: just properly formatted headers, data, and videos and images (properly ALTed).
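The claim above — that with styles disabled a page should read like a news article, with an h1 at the top and no skipped heading levels — can be sanity-checked with a short script. This is a minimal sketch using only the Python standard library; the `outline_ok` helper and the sample snippet are illustrative, not anything from the thread.

```python
from html.parser import HTMLParser

class HeadingOrder(HTMLParser):
    """Collect heading levels (h1..h6) in document order."""
    def __init__(self):
        super().__init__()
        self.levels = []

    def handle_starttag(self, tag, attrs):
        # HTMLParser lowercases tag names, so "h1".."h6" match here
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self.levels.append(int(tag[1]))

def outline_ok(html):
    """True if the first heading is an h1 and no heading skips a level."""
    parser = HeadingOrder()
    parser.feed(html)
    if not parser.levels or parser.levels[0] != 1:
        return False
    # Each heading may go deeper by at most one level at a time
    return all(b <= a + 1 for a, b in zip(parser.levels, parser.levels[1:]))

page = "<h1>Blue widgets</h1><h2>History</h2><p>...</p><h3>Early models</h3>"
print(outline_ok(page))  # True: h1 -> h2 -> h3, no level skipped
```

Running the same check on a page that jumps from h1 straight to h3 returns False, which is the kind of structural problem the poster suggests fixing before worrying about anything fancier.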
| 9:25 am on Jun 26, 2008 (gmt 0)|
Interesting post.
But I wonder if there are more basic patterns across the affected sites.
Before examining the body content elements, how about checking some of the good old "duplicate content issues"?
I've just discovered a large quantity of similar [not the same] meta descriptions on a site that has taken a hit. I mean, maybe 50% of the auto-generated description characters are the same. Previously it survived, but maybe not now.
Maybe Google tweaked the dials another notch on this score. On-page issues would probably be the first place to look, IMO.
Has anyone seen this as a potential factor?
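The "maybe 50% of the characters are the same" situation described above is easy to audit mechanically. Here is a minimal sketch using Python's `difflib`; the page paths and description strings are made-up examples, and the 0.5 threshold simply mirrors the poster's 50% figure.

```python
from difflib import SequenceMatcher
from itertools import combinations

def similarity(a, b):
    """Character-level similarity ratio between two strings (0.0 to 1.0)."""
    return SequenceMatcher(None, a, b).ratio()

def flag_similar(descriptions, threshold=0.5):
    """Return pairs of page IDs whose meta descriptions overlap too much."""
    return [(i, j)
            for (i, a), (j, b) in combinations(descriptions.items(), 2)
            if similarity(a, b) >= threshold]

descs = {
    "/widgets/red":  "Buy red widgets online - free shipping on all widget orders.",
    "/widgets/blue": "Buy blue widgets online - free shipping on all widget orders.",
    "/about":        "The history of our family-run widget workshop since 1972.",
}
# The two templated product descriptions differ by only a few characters,
# so they show up as a near-duplicate pair
print(flag_similar(descs))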
| 5:05 pm on Jun 26, 2008 (gmt 0)|
<quote>how about checking some of the good old "duplicate content issues"</quote>
One of the new factors is the blogosphere, big time!
One of the nuances of blogging is that folks talk about what other folks talk about, and import related news stories into their blogs, each of which is technically duplicate content.
One of our efforts which has done really well in this jumbled-up time is a computer-related site that brings in news feeds related only to the content on each page. This is working very well.
Each news article has its own page on the site, which links back to the source feed with an ORIGINAL ARTICLE LINK carrying a NOFOLLOW.
It has industry information (part numbers, descriptions, etc.) which could also technically be called duplicate content, but G is loving it.
G is an information hound, and if you are providing relevant, related information, it shows you are trying to enhance the user experience. Again: every page has video, many instructional how-tos, some from YouTube but most original videos created for the user.
| 5:31 pm on Jun 26, 2008 (gmt 0)|
Another observation, for what it's worth:
Googlebot is attempting to grab my sitemap 6-7 times a day, where before June 4th it was being spidered about once a day.
EVERY time my main affected site's main page is spidered by Google, I see an initial rise in rankings, and after a 24-hour period (presumably after filters are applied), my ranking sinks to lower positions than ever. Keep in mind, I've made NO changes at all since June 4th, so I find this totally bizarre behavior.
It's almost as if Google has a personal vendetta against my site and is allowing for a slow, drawn-out death. Maybe this drop in ranking is in direct proportion to the tanking of the Google share price :)
| 7:21 pm on Jun 26, 2008 (gmt 0)|
|(fyi - the ALT on every image should say image of ****, or picture of ***, |
An interesting statement; however, can you back this up? I have checked the W3C and Google Webmaster guidelines and cannot find any reference anywhere that says this.
| 7:42 pm on Jun 26, 2008 (gmt 0)|
This was an inside tip from G's folks; actually, it isn't really what you'd call a secret. It's just a great user experience for all users. The original intent of the ALT attribute was not severe keyword stuffing, but a way for a blind person to read a page, so it makes perfect sense that you would want them to hear on their audio page reader "picture of ****" instead of "blue widgets, green widgets, cheap widgets".
Otherwise they can't figure out what it really is.
It was actually shared with me as a way to get our pictures and images into Google's image search (yet another way to drive traffic to our site), and we are seeing very good results from this simple addition of a descriptive phrase for what the image actually is.
This also applies to videos: the text around the video should say "video of ***" or "movie of ***".
I know this is kind of strange to hear for folks like us (I have been in SEO since 1999, when AltaVista was the big dog on the block), but the RICH user experience is what G is looking for.
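Whether or not the "picture of" prefix carries any ranking weight (the tip above is unverified), generating descriptive alts in a template is cheap. This is a hypothetical helper, not anything from the thread; the file paths and subjects are invented for illustration.

```python
from html import escape

def img_tag(src, subject, kind="picture"):
    """Build an <img> tag with a plain descriptive alt, e.g. 'picture of blue widget'.

    No keyword stuffing: the alt simply names what the image shows, which is
    what a screen reader will announce to the visitor.
    """
    alt = f"{kind} of {subject}"
    return (f'<img src="{escape(src, quote=True)}" '
            f'alt="{escape(alt, quote=True)}">')

print(img_tag("/img/widget-42.jpg", "a blue widget on a workbench"))
# <img src="/img/widget-42.jpg" alt="picture of a blue widget on a workbench">
```

For video thumbnails the same helper can be reused with `kind="video"`, matching the poster's suggestion that surrounding text say "video of ***".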
| 7:58 pm on Jun 26, 2008 (gmt 0)|
I do have all my alts completed, always have done from the late 90's, but the thought of adding "image/picture of" to some several tens of thousands of images, both thumbnail and enlargements, does not exactly thrill me!
Hehehe, must find me an SEO apprentice with nothing to do one day:-)
| 1:02 am on Jun 27, 2008 (gmt 0)|
I have read all through this thread (started out reading the June 4 thread), and I was not really intending to post, since I'm not sure that what happened to my site is part of the same algorithm change (if that's what it is). However, then I noticed that a couple of my competitors (at least) seem to have been slapped down by G in a similar fashion to my site, and to me, that looks like a pattern of some kind. Maybe someone with more experience than me can see a clue where I cannot.
Anyway, my site is:
- About 2 years old
- Runs Adsense ads
- PR 4
- About 50 .aspx pages
- About 4,000 dynamic pages
Until June 23, we ranked on page 1, or top of page 2, on Google for our top phrase. Most important, this result on Google pointed to the home page. We also had a long tail, with lots of inside pages ranking very well in Google for hundreds of secondary phrases.
After June 23, we have the following situation:
- For our top phrase, our home page has essentially disappeared from Google. It is not in the top 50 pages.
- But, for our top phrase, we do have an inside page showing up around page 10, and another page later on.
- Yet our home page has not actually been dropped from Google's index: not only is it # 1 if you type in the name of the site, complete with "sitelinks," it even shows up deep in the Google SERPs for some secondary terms!
- Our long tail results seem largely unchanged -- doing very well, for secondary phrases.
This is very weird, because to put it in perspective, this site is all about one topic, really. The site is about "blue widgets," and the domain root was ranking on page 1 for "blue widgets," then suddenly that top phrase, but only that one, has been penalized in some way. Not only that, it has only been penalized in relation to the home page. I don't get it.
I might not have posted, but I see at least two of my competitors, also targeting blue widgets, who have dropped from page 1 or 2 of Google for the same top phrase, down either to about page 20 or even into oblivion. But none of these competitors disappeared: if you type in the site name, there is their home page. Also, for at least one of them, their secondary pages seem to be doing OK as well. Again, a case of the domain page getting seriously damaged for the top phrase.
In the last week or so, two events took place on my site:
- I changed the way that the database handles all the dynamic page meta description and keyword tags, so as to make them all unique, when previously they had all been identical (ie one set of description tags, one set of keywords).
- The site was hacked about a week ago, and the hacker got into the database and inserted the word "hacked" into every title tag. The site has also been getting pounded with malicious adware all week, and the actual webmaster is scrambling to fix.
Like I said, I don't know if this is part of the bigger, June 4 + meltdown, or if it's some random coincidence. But since some of my competitors got slammed also, it can't just be my site. Maybe we all got hacked by the same scumbag. (Can I say scumbag here?).
I do work at link-building, but try to stay white hat. Haven't bought or sold any links for this site, only do recips with good, solid sites, hand-picked, and recips are only a small percentage. Do a bit of original article writing for a couple of the better article distro sites, etc. Have gotten some great, one way IBLs for the site by doing some decent, original research for some of the content on the site. Have had a widget out there for about six months, but it is totally relevant and on topic, utilizing the site's database, and has only been added to relevant sites, as far as I know.
If my SEO has caused this, we are all doomed!
| 3:17 am on Jun 27, 2008 (gmt 0)|
I don't want to raise any personal or G flags.
I have AdSense on many of our sites, and G Analytics on probably too many.
Not to raise fear, but...
remember, G is an information hound.
Some (not the majority) of our sites have G Analytics.
One, and one alone, has shined in this (see preceding posts).
Just keep in mind that G is thinking three steps ahead of most folks (maybe not you all).
Data, data, data: user experience is the focus of their algo.
It's simple for them; they can't help it.
<info>sites that have G Analytics have suffered</info>
G is analyzing everything, and storing everything for future reference.
Not all of our sites, though.
Our sites with a rich user experience have come out in front in this troublesome time. There have been some dips as they re-examine old pages, but things are good.
Please re-examine my previous posts in this thread.
G is concentrating on a RICH user experience,
which means, for most of us, having plenty of media, properly tagged images and media, and making sure our styling is outside of the page, in CSS.
| 6:37 am on Jun 27, 2008 (gmt 0)|
I have taken a good look at one of the sites that has replaced me for my keyword (starting June 4th). The site is entirely reprinted content! I doubt the site is ranking for many other keywords, but for my main keyword it is currently #1 (I'm #14). The keyword is included on the home page three times and is included in the URL (it is also in my URL). The site is very small, only 1 year old, and has only about 30 links pointing to it! It is a nice site, but it makes no sense why it is #1.
| 7:15 am on Jun 27, 2008 (gmt 0)|
Here I am.
Hit hard last night, 26 June.
It seems like a sort of -200 / -250 penalty.
The ONLY thing I did was modify my sitemap after the warning "All the URLs in your Sitemap are marked as having dynamic content".
Now my site (2 years old, lots of links, white hat, AdSense, Analytics, titles and descriptions very similar; it's a travel site) is gone.
Bad situation, but I think we have to deal with it.
Do you see any pattern in my site?
Good luck to all.
| 2:08 pm on Jun 27, 2008 (gmt 0)|
This one is really interesting:
"All the URLs in your Sitemap are marked as having dynamic content"
Where did you see this one, in WMT?
This tells me they are after dynamic sites.
| 2:37 pm on Jun 27, 2008 (gmt 0)|
Do a quick search on Google and you will notice that some webmasters are getting this message under Sitemap status.
The warning scared me, so I dropped the <changefreq>always</changefreq> node for all my pages. Yes, "dynamic pages" in this case seems to mean not dynamic URLs but fresh content, "dynamically" updated.
The warning disappeared; so did my site in the SERPs :-(
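The fix described above — stripping every `<changefreq>always</changefreq>` node out of the sitemap — can be scripted rather than done by hand. A minimal sketch against a toy two-URL sitemap, assuming the standard sitemaps.org namespace; the example.com URLs are placeholders.

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
# Serialize with the sitemap namespace as the default (no ns0: prefixes)
ET.register_namespace("", NS)

def strip_changefreq(sitemap_xml):
    """Remove every <changefreq> element from a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    for url in root.findall(f"{{{NS}}}url"):
        for cf in url.findall(f"{{{NS}}}changefreq"):
            url.remove(cf)
    return ET.tostring(root, encoding="unicode")

sitemap = f"""<urlset xmlns="{NS}">
  <url><loc>http://example.com/</loc><changefreq>always</changefreq></url>
  <url><loc>http://example.com/page</loc><changefreq>always</changefreq></url>
</urlset>"""

cleaned = strip_changefreq(sitemap)
print("<changefreq>" not in cleaned)  # True: nodes removed, <loc> entries kept
```

Whether removing the node actually caused the ranking drop is, of course, exactly the open question in this thread; the script only automates the edit the poster describes.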
| 5:37 pm on Jun 27, 2008 (gmt 0)|
...some (not the majority of our sites) have g analytics...
<info> sites that have g analytics have suffered</info>
So only your sites with analytics got hit? The ones without have retained their positions or am I reading this all wrong? And is it just one or two sites that have shown this behavior or a much larger sample?
| 6:03 pm on Jun 27, 2008 (gmt 0)|
We cannot draw a real conclusion from this, since the site that came through this shining brightly has G Analytics.
But it is a safe bet that, sooner or later, G will start trying to figure out how to use this data as it amasses.
| 8:01 pm on Jun 27, 2008 (gmt 0)|
Since most websites do not use "image of...", I would think the tools that blind people use to listen to a website will add the text "image of..." themselves. Now that would make sense to me.
|so it makes perfect sense that you would want them to hear on their audio page reader "picture of ****" instead of "blue widgets, green widgets, cheap widgets". |
The alt attribute is used on an image, so a tool (or a search bot) already knows it is an image, and the value of the alt attribute is the subject of the image. Why state in the alt attribute of an image that it is an image?
Although you got this as an inside tip from Google, it still does not make sense to me to add more than the subject alone of an image in the alt attribute.
I don't believe Google will give a website that is using "image of widget" a higher value than a website that is using "widget" in the alt attribute.
However, I do believe that Google will give a website a higher value when it is using the alt attribute than when it is not. And by "using" I don't mean entering the same main keyword of the page in each image.
| 8:50 pm on Jun 27, 2008 (gmt 0)|
Only in image results in G, and it is working for us wherever we implement it.
| 8:58 pm on Jun 27, 2008 (gmt 0)|
I've noticed significant serps shake-ups in some of the competitive areas I follow. The pattern I see is some (but not all) of the mom and pop, maybe not so much content, SEOed store sites are down; sites like encyclopedia, .gov, authority content and manufacturer (rather than individual small stores that sell the products) sites are up.
| 11:46 am on Jun 28, 2008 (gmt 0)|
Ok, what happened on the night of the 26th? Has anybody noticed any radical changes?
I'm suddenly getting loads of long-tail searches to a site that has had very little traffic for years. And the long-tail searches are hitting pages that are NOT in ANY WAY optimised for those searches.
On the contrary, they are optimised for e.g. "red widgets", and I'm seeing searches for "fat widgets using red and iron", i.e. totally random!
Anybody else seeing anything similar, or can offer an explanation?
| 3:46 pm on Jun 28, 2008 (gmt 0)|
I can see the change in the rankings for our clients, and can quickly gather that most of our UK SEO clients have better rankings, but they seem to be unstable as of now.
I hope they settle down, given what I just saw right now :).
The main impact I see is from the recent dynamic optimization for the ecommerce sites. Will keep you posted.
| 5:02 pm on Jun 28, 2008 (gmt 0)|
|Re-inclusion requests are really only a last resort. |
Have any of you tried this and, if so, has it worked? Especially if you have a well-aged site that follows the G guidelines. This has been going on, off and on, since April for one of my sites; my options are really narrowing.