I've wondered too why some search terms are affected more than others, and why some result pages are changing around while others barely move.
Has anyone seen any relationship between how popular a search term is and how much movement is going on?
As to when it will end, I don't think we can predict, as nothing quite like this has gone on before.
[edited by: tedster at 2:48 am (utc) on June 1, 2008]
what happened and why it happened at all .....
No offence intended, however if we knew the answers to your questions this thread would not exist.
I think I have an idea why one of my sites is bouncing all over, and as such I am performing an experiment to see IF it resolves the issue.
If it does not, then I am as flummoxed as anyone else as to which stupid dial they turned the wrong way, or why they cannot roll back a bad data push.
Just to add some perspective to this: when one site rolls back, another site gains. We all win or lose at some point; it's just that we have less long-term free traffic stability than we had years ago from the mighty Google, and it's more evenly spread out now.
I don't think this is a case of a specific adjustment, because IMO the SERPs have been moving around like this for some 5-6 weeks now at least.
Don't get me wrong, I fully understand the frustration and pain when a site gets hit; it's not nice and can make you feel most unwell, to say the least! When Google loves you it's your best friend, and when it dumps you it's the partner from hell!
We are at the stage now where I think almost every site is getting a period in or out of the SERPs for top positions. Gawd knows, on some sites it's like watching the ruddy Hokey Cokey! Some sites have a bit more stability than others, but all in all, the SERPs of Google today don't provide anything like the stability of Google two or three years ago, let's face it!
Because we are all so dependent on Google search, with its massive share of the market, this makes it all the more important to rank. But I just don't see the current events and adjustments as a "bad data push" or anything else; that's wishful thinking for those that have fallen back this time around!
I think Google knows exactly what it's doing, and as stated before on other threads, we are now in the age of "everflux". I think it's all now about making Google as profitable as possible whilst delivering mediocre results. None of us can now expect the free traffic stability that we used to get from Google, and it's time to face facts!
The best policy is to try to be less dependent on Google (I know that's very difficult to do, due to its massive reach) and to try to get visitors in a range of different ways in addition to Google, so that you are less vulnerable to Google's constant fluctuations.
The only way for webmasters to get 100% stability in Google now is by buying AdWords; anything else is a bonus. We no longer see updates like in the past, where every quarter or so the SERPs would change. Now it's continual adjustment, and our businesses need to change to take this into account.
Not only you: I'm finished completely. They put all 16 of my sites in the -950 box on June 4.
SEOPTI, tell us more about your particular situation if possible. I have heard of losses, but when we hear a fellow webmaster has been hit this badly, it warrants a study in my opinion. Many senior members of this forum have helped me recover, most notably from the slaughter of September 2005.
In terms of your websites, let's get some information; we might be able to help. Since there is a common thread here (you), it might actually be easier to assess this with your particular websites, since we might be able to see some common traits. Maybe take a sample set of the websites you own and tell us:
1. What is the age of the websites?
2. What was the average previous ranking of your main keyword? What exact position are the websites now? Did all pages and keywords get demoted?
3. Hosting location and domain registration location
4. Are the websites 'canonically' correct?
5. Describe any recent promotion of the websites, especially internal and external linking
6. Are the websites linking to each other in any way?
7. Are all websites listed with you as owner in whois?
8. Did all drop at the same time and date?
9. Are all written using unique content?
10. Do all sites show in position one for a domain.com search?
11. Have you ever received a -950 penalty before on any of these websites?
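Point 4 in the list above (canonical correctness) can be partially sanity-checked offline before worrying about redirects. A minimal sketch in Python, covering only the common www/non-www and trailing-slash variants; the example.com URLs are placeholders:

```python
from urllib.parse import urlsplit

def canonical_key(url: str) -> tuple:
    """Reduce a URL to a (host, path) key, ignoring scheme, the www.
    prefix, and a trailing slash, so common variants can be compared."""
    parts = urlsplit(url.lower())
    host = parts.netloc.removeprefix("www.")
    path = parts.path.rstrip("/") or "/"
    return (host, path)

variants = [
    "http://example.com/",
    "http://www.example.com",
    "http://www.example.com/index.html",  # often treated as a separate URL
]
# Two distinct keys remain: /index.html still needs a redirect or rel link
print({canonical_key(u) for u in variants})
```

This only normalizes the obvious variants; an actual canonical audit would still need to verify that the server 301-redirects the non-preferred forms.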
Is anyone else seeing not-so-great SERPs?
[edited by: tedster at 3:35 am (utc) on June 26, 2008]
[edit reason] moved from another location [/edit]
Kristos - interesting post.
But I wonder if there is a more basic pattern across the affected sites.
Before examining the body content elements, how about checking some of the good old "duplicate content issues"?
I've just discovered a large quantity of similar [not the same] meta descriptions on a site that has taken a hit. I mean, maybe 50% of the auto-generated description characters are the same. Previously it survived, but maybe not now.
Maybe Google tweaked the dials another notch on this score. On-page issues would probably be the first place to decipher this, IMO.
Has anyone seen this as a potential factor?
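The "maybe 50% of the characters are the same" observation can be made measurable. A minimal sketch using Python's standard difflib; the widget descriptions are made-up placeholder data:

```python
from difflib import SequenceMatcher

def description_similarity(a: str, b: str) -> float:
    """Return a rough 0-1 similarity ratio between two meta descriptions."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Two auto-generated descriptions sharing most of their boilerplate:
d1 = "Buy blue widgets online. Free shipping on all widget orders over $50."
d2 = "Buy red widgets online. Free shipping on all widget orders over $50."

ratio = description_similarity(d1, d2)
print(f"{ratio:.0%} similar")  # well above the 50% mentioned above
```

Running a script like this across every page pair would flag which templated descriptions are close enough to worry about, though where Google draws the line (if anywhere) is anyone's guess.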
One of the new factors is the blogosphere, big time!
One of the nuances of blogging is that folks talk about what other folks talk about, and import related news stories into their blogs, each of which is technically duplicate content.
One of our efforts which has done really well in this jumbled-up time is a computer-related site that brings in news feeds related only to the content on each page. This is working very well.
Each news article has its own page on the site; that page links back to the source with an ORIGINAL ARTICLE LINK carrying a NOFOLLOW.
It has industry information (part numbers, descriptions, etc.) which could also technically be called duplicate content, but G is loving it.
G is an information hound, and if you are providing relevant, related information, it shows you are trying to enhance the user experience. Again, every page has video, many instructional how-tos, some from YouTube but most original videos created for the user.
Googlebot is attempting to grab my sitemap 6-7 times a day, where before June 4th, it was being spidered about once a day.
EVERY time the main page of my main affected site is spidered by Google, I see an initial rise in rankings, and after a 24-hour period (presumably after filters are applied), my ranking sinks to lower positions than ever. Keep in mind, I've made NO changes at all since June 4th, so I find this totally bizarre behavior.
It's almost as if Google has a personal vendetta against my site and is allowing for a slow, drawn-out death. Maybe this drop in ranking is in direct proportion to the tanking of the Google share price :)
(fyi - the ALT on every image should say "image of ****", or "picture of ***")
An interesting statement however can you back this up? I have checked W3C and Google Webmaster guidelines and cannot find any reference anywhere that says this.
I know this is kind of strange to hear for folks like us (I have been in SEO since 1999, when AltaVista was the big dog on the block), but the RICH user experience is what G is looking for.
Hehehe, must find me an SEO apprentice with nothing to do one day:-)
Anyway, my site is:
- About 2 years old
- Runs Adsense ads
- PR 4
- About 50 .aspx pages
- About 4,000 dynamic pages
Until June 23, we ranked on page 1, or top of page 2, on Google for our top phrase. Most important, this result on Google pointed to the home page. We also had a long tail, with lots of inside pages ranking very well in Google for hundreds of secondary phrases.
After June 23, we have the following situation:
- For our top phrase, our home page has essentially disappeared from Google. It is not in the top 50 pages.
- But, for our top phrase, we do have an inside page showing up around page 10, and another page later on.
- Our home page has not disappeared from Google -- not only is it # 1 if you type in the name of the site, complete with "sitelinks," it even shows up deep in the Google SERPs for some secondary terms!
- Our long tail results seem largely unchanged -- doing very well, for secondary phrases.
This is very weird, because to put it in perspective, this site is all about one topic, really. The site is about "blue widgets," and the domain root was ranking on page 1 for "blue widgets," then suddenly that top phrase, but only that one, has been penalized in some way. Not only that, it has only been penalized in relation to the home page. I don't get it.
I might not have posted, but I see at least two of my competitors, also targeting blue widgets, who have dropped from page 1 or 2 of Google for the same top phrase, down either to about page 20 or even into oblivion. But none of these competitors disappeared: if you type in the site name, there is their home page. Also, for at least one of them, their secondary pages seem to be doing OK as well. Again, a case of the domain page getting seriously damaged for the top phrase.
In the last week or so, two events took place on my site:
- I changed the way the database handles all the dynamic pages' meta description and keyword tags, so as to make them all unique, when previously they had all been identical (i.e. one set of description tags, one set of keywords).
- The site was hacked about a week ago, and the hacker got into the database and inserted the word "hacked" into every title tag. The site has also been getting pounded with malicious adware all week, and the actual webmaster is scrambling to fix.
Like I said, I don't know if this is part of the bigger, June 4 + meltdown, or if it's some random coincidence. But since some of my competitors got slammed also, it can't just be my site. Maybe we all got hacked by the same scumbag. (Can I say scumbag here?).
I do work at link-building, but try to stay white hat. Haven't bought or sold any links for this site, only do recips with good, solid sites, hand-picked, and recips are only a small percentage. Do a bit of original article writing for a couple of the better article distro sites, etc. Have gotten some great, one way IBLs for the site by doing some decent, original research for some of the content on the site. Have had a widget out there for about six months, but it is totally relevant and on topic, utilizing the site's database, and has only been added to relevant sites, as far as I know.
If my SEO has caused this, we are all doomed!
<info> sites that have g analytics have suffered</info>
G is analyzing everything, storing everything, for future reference
not all sites of ours though.
sites (of ours) with rich user experience have come out front in this troublesome time. there have been some dips as they re-examine old pages, but things are good.
Please re-examine my previous posts in this thread.
G is concentrating on a RICH user experience,
which means, for most of us, having plenty of media, properly tagged images and media, and keeping our presentation out of the page itself, in CSS.
Hit hard last night, 26-Jun.
Seems like a sort of -200 / -250 penalty.
The ONLY thing I did is to modify sitemap after warning "All the URLs in your Sitemap are marked as having dynamic content".
Now my site (2 years old, lots of links, white hat, AdSense, Analytics, very similar titles and descriptions; it's a travel site) is gone.
Bad situation but I think we have to deal with it.
Do you see any pattern in my site?
Good luck to all.
Do a quick search on Google and you will notice that some webmasters are getting this message under sitemap status.
The warning scared me, so I dropped the <changefreq>always</changefreq> node for all my pages; "dynamic content" in this case seems to mean not dynamic URLs but fresh content, "dynamically" updated.
The warning disappeared; so did my site in SERPs :-(
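For anyone rebuilding a sitemap after that warning, here is a minimal sketch of generating one with or without the <changefreq>always</changefreq> node, using Python's standard xml.etree; the example.com URL is a placeholder:

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls, include_changefreq=False):
    """Build a minimal sitemap urlset; optionally mark every URL
    with <changefreq>always</changefreq>."""
    ET.register_namespace("", NS)  # serialize without a prefix
    urlset = ET.Element(f"{{{NS}}}urlset")
    for loc in urls:
        url = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(url, f"{{{NS}}}loc").text = loc
        if include_changefreq:
            ET.SubElement(url, f"{{{NS}}}changefreq").text = "always"
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(["http://www.example.com/"], include_changefreq=True))
```

Note that per the sitemap protocol, changefreq is only a hint to crawlers anyway, so dropping it should be safe; whether it was actually what triggered the "dynamic content" warning is speculation.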
...some (not the majority of our sites) have g analytics...
<info> sites that have g analytics have suffered</info>
So only your sites with analytics got hit? The ones without have retained their positions or am I reading this all wrong? And is it just one or two sites that have shown this behavior or a much larger sample?
...so it makes perfect sense that you would want them to hear on their audio page reader "picture of ****" instead of "blue widgets, green widgets, cheap widgets". Since most websites do not use "image of ...", I would think the tools that blind people use to listen to a website will add the text "image of..." themselves. Now that would make sense to me.
The alt tag is used in an image. So a tool (or a search bot) knows it is an image and the value in the alt tag is the subject of the image. Why tell in the alt tag of an image that it is an image?
Although you got this as an inside tip from Google, it still does not make sense to me to add more than the subject alone of an image in the alt tag.
I don't believe Google will give a website that is using "image of widget" a higher value than a website that is using "widget" in the alt tag.
However, I do believe that Google will give a website a higher value when it is using the alt tag than when it is not. And by "using" I don't mean entering the same main keyword of the page in each image.
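Whatever wording Google actually prefers, a missing or empty alt attribute is at least easy to detect mechanically. A minimal sketch using Python's standard html.parser; the image file names are placeholders:

```python
from html.parser import HTMLParser

class AltAuditor(HTMLParser):
    """Collect the src of every <img> tag whose alt attribute
    is missing or empty."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing_alt.append(attrs.get("src", "?"))

html = '<img src="widget.jpg" alt="blue widget"><img src="logo.gif">'
auditor = AltAuditor()
auditor.feed(html)
print(auditor.missing_alt)  # ['logo.gif']
```

Running something like this over a whole site would at least guarantee every image has some alt text, leaving the "image of..." wording debate as a separate question.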
I'm suddenly getting loads of long-tail searches to a site that has had very little traffic for years. And the long-tail searches are hitting pages that are NOT in ANY WAY optimised for those searches.
On the contrary, they are optimised for e.g. "red widgets", and I'm seeing searches for "fat widgets using red and iron", i.e. totally random!
Anybody else seeing anything similar, or can offer an explanation?
I wish they would settle down, considering what I just saw right now :).
The main impact I see is from the recent dynamic optimization of the e-commerce sites... will keep you posted.