| 11:18 pm on Sep 21, 2006 (gmt 0)|
"No, I don't. What is the problem? A black widget picture should have the alt text of black widget."
The problem is that for a very long time I was on page 3 for my main key phrase, and now I'm on page 27. For a great part of that time that my site was stable in the rankings, I had no ALT text whatsoever. (I simply forgot to add it until recently.)
That could be a coincidence, I realize, but as I said, right now I don't have much to lose by experimenting.
| 11:28 pm on Sep 21, 2006 (gmt 0)|
Not to steer off course with the topic but 72.14.207.x's seem to be reverting to the filtered results that we were seeing pre-August. Those of you that came out of the box in August, are you still showing up on these DC's for your chosen terms? I can't quite pinpoint if it is pre August SERP's or there is a new filter being applied to the result sets.
[edited by: MLHmptn at 11:28 pm (utc) on Sep. 21, 2006]
| 12:30 am on Sep 22, 2006 (gmt 0)|
Another possible test
Just wondering how many sites affected contain and link to pdf files?
| 12:34 am on Sep 22, 2006 (gmt 0)|
|I'm not sure what to put in an ALT for an image of a black widget, other than 'black widget.' |
"Photo" or "image" would work. Remember, the purpose of ALT text is to provide information to blind people who use screen readers or to users who, for whatever reason, have images disabled in their browsers. If you don't have a caption, then "black widget" would be appropriate, but if you do, repetition is an unnecessary burden on users.
As for whether keywords in ALT text will help you, hurt you, or have no effect at all, I'd guess that the answer probably depends on whether other aspects of your pages fit a spam profile. It's not unreasonable to believe that, in some cases, ALT text that reads "black widgets blue widgets red widgets purple widgets widgetology widgetonian widgetville" can be the straw that breaks the camel's back.
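If you want to see where your own ALT text stands before worrying about filters, a quick audit can be scripted with Python's standard html.parser. This is a minimal sketch -- the sample markup, the class name, and the "too many words" threshold are my own illustrative assumptions, not any official spam test:

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Collect <img> tags whose alt text is missing or suspiciously long."""
    def __init__(self, max_words=8):  # threshold is an arbitrary example
        super().__init__()
        self.max_words = max_words
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        alt = attrs.get("alt")
        src = attrs.get("src", "?")
        if alt is None:
            self.flagged.append((src, "missing alt"))
        elif len(alt.split()) > self.max_words:
            self.flagged.append((src, "possibly keyword-stuffed alt"))

sample = '<img src="widget.jpg" alt="black widget"><img src="w2.jpg">'
audit = AltAudit()
audit.feed(sample)
print(audit.flagged)  # only the second image is flagged: it has no alt attribute
```

Run it over your own pages and eyeball whatever gets flagged; the point is just to make the review systematic, not to automate a judgment Google may or may not be making.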
| 1:04 am on Sep 22, 2006 (gmt 0)|
"Photo" or "image" would work for me. I DO have captions, as I said.
"As for whether keywords in ALT text will help you, hurt you, or have no effect at all, I'd guess that the answer probably depends on whether other aspects of your pages fit a spam profile."
Exactly, europeforvisitors. I'm not for a moment saying that everybody should delete their alt text -- just that in MY OWN case, I suspect that repeating keywords in my alt text was what broke the camel's back.
As I said, I've been using keyword-rich titles and descriptions straight from Amazon's pages, and while that really seemed to be working well for some time, all of a sudden not so much.
| 1:15 am on Sep 22, 2006 (gmt 0)|
This whole alt image discussion is a non-starter for me. Yes, of course alt text is a good thing for your site, and alt text should be a brief description of the image. That's it!
Please don't bother replying to that -- I'd be far more interested in whether you think PDF files present a hijack vulnerability.
[edited by: Pirates at 1:19 am (utc) on Sep. 22, 2006]
| 1:49 am on Sep 22, 2006 (gmt 0)|
I'm seeing a huge change in the number of pages returned for searches. Just taking a quick look at one indicator single-word search, the count has gone from over 800 million pages down to around 50 million -- and it looks to be across the board for other searches. Billions of pages are not being returned for some reason.
A database shakeup?
| 2:31 am on Sep 22, 2006 (gmt 0)|
I am not sure, but it looks like a pre-August comeback for me; if I remember right, I'm seeing the old results that were showing up before things turned better for me in August.
One odd thing I have seen overall is that I am getting hits from Google -- just a few -- for a keyword I haven't even targeted, and when I check Google I am in first place out of 29 MILLION pages! It seems not a lot of people search for that, but anyway....
I am getting, THANK GOD!, more hits from MSN and Yahoo lately and even from Ask and Mamma... Not sure if I did earlier or my stats program just didn't show it.....
All in all, traffic is down from 12-15,000 uniques a day to under 1,000, so that really su#*$!. Some of my big competitors seem to have been affected too, but not nearly as much as we have been.
I saw among the Supplemental results that Google had pages that were 6-8 months old and that I don't even use anymore, BUT they have another name with kind-of the same content. Maybe I should clean out my folders and deal with the files that are "not found" instead of having to guess whether they have a duplicate penalty or not.
| 2:32 am on Sep 22, 2006 (gmt 0)|
|if you think pdf files present a hijack vulnerability |
Are you referring to the recently published PDF security vulnerabilities [webmasterworld.com]? I know the report said that we have barely scratched the surface, but as I understand the exploits, the files must be on a Windows box. Also, what has been published so far runs on the client, not the server.
| 2:12 pm on Sep 22, 2006 (gmt 0)|
Definitely seeing a disturbance in the force this morning. Got a high-ranking site that's gone poof on some data centers.
Anyone else feeling the effects of one of these "data pushes" today?
| 2:51 pm on Sep 22, 2006 (gmt 0)|
I have been in the hole since the April "refresh", getting about 1% of my original traffic.
Today 99% of my site has gone supplemental!?
| 3:17 pm on Sep 22, 2006 (gmt 0)|
Every site: command on 22.214.171.124 seems to be returning only three results and then the rest go supplemental. That's the DC I've been hitting through my DSL connection. However, I haven't noticed any significant traffic drops, and my sites are still coming up where I expect them to on key search terms.
My cable connection is hitting 126.96.36.199, and that seems to give the normal results as far as supplementals.
You might check to see which DC is reporting you as 99% supplemental.
| 5:11 pm on Sep 22, 2006 (gmt 0)|
Well, now the 99% supplemental has dropped back to about 10% on regular Google search. I have a total of about 3,000 pages, btw.
On 188.8.131.52 I am still 99% supplemental.
This rollercoaster is nuts.. it's certainly kept us all on our toes all summer long. I just hope I return to my regular (4+ year) results again. Going from 100,000+ hits a month to about 5,000 isn't fun.
[edited by: AustrianOak at 5:12 pm (utc) on Sep. 22, 2006]
| 8:34 pm on Sep 22, 2006 (gmt 0)|
Q1: Is it just me, or do supplemental results now require an even more "obscure query" to return them than before?
Q2: Has anyone found any positive usefulness for supplemental results at all?
| 9:34 pm on Sep 22, 2006 (gmt 0)|
There are many types of Supplemental Results, check the earlier threads for details.
| 11:08 pm on Sep 22, 2006 (gmt 0)|
Thanks for the link, tedster. I was thinking about it on a different level that maybe we could discuss another time. But I think back to another one of your comments -- that the 'plex must be aware. I think you are right, and I am sure there are people far cleverer than myself dealing with the results, so I am going to chill out, have a beer, and let Google do what, in my opinion, they are by far the best at: "filtering spam listings".
| 7:20 pm on Sep 23, 2006 (gmt 0)|
So has anyone who took a major hit on Sept. 15th shown any signs of coming back? Traffic from Google to my site is now down 95% from what it used to be.
| 12:19 am on Sep 24, 2006 (gmt 0)|
Jumping in here, hope you don't mind. :)
My expertise is in another area and not SEO, but my site has original content, no link farming or anything like that - but last Friday it appears all of my Google-ranked pages have totally disappeared, and now the only Google traffic I get is if someone searches on my name or website name. Not good. No other search engines were affected, I still have my top 10 rankings there, and as far as I can see I haven't been banned from Google.
Is everyone as much in the dark about what happened as I am? I've read so many different things in the last 24 hours that people are saying caused it - since the group here is obviously sharp, I'm hoping for some advice on what to do and what the heck caused this.
| 2:38 am on Sep 24, 2006 (gmt 0)|
The mystery for many webmasters is how could they have been thriving on Google traffic for a long time, and then all of a sudden it goes away. They do a site: query and see most of their urls are now a "Supplemental Result" -- or worse still, their pages are just not there.
My personal take on this is that Google is in a long-term, rolling clean-up of the way they handle their index -- a massive reorganization finally made possible by the new elbow room that the Big Daddy infrastructure provides.
Google is not doing this to bedevil webmasters and business owners, although that does seem to happen. They are doing this to make their search results cleaner, less filled with duplicate paths to the same information. This is their core direction -- indexing as much of the world's information as they can, rather than indexing hundreds of ways for searchers to find the same information.
One impression I get is that many web-based businesses do not watch their server logs closely enough. As long as the traffic and conversions are there, they just assume that "nothing is broken." I see this even in very sizable companies. When I finally pry a log or two from them, I see that Google traffic is being sent to all kinds of url variations that are only in the index by accident. In other words, it's like not getting on a scale because you are already sure you don't have a weight problem.
Check out this thread from 2005, brought to our attention by Marcia: What exactly is "duplicate content" [webmasterworld.com]. In that thread, which precedes Big Daddy, people were already talking about the importance of unique titles and meta tags. That bird has now come home to roost.
Also note the discussion about how the duplicate testing will strip out the common page elements -- the template. So the answer is "No", repeating your template elements on many pages will not trip the duplicate filtering. In fact, you're more likely to get into trouble by introducing small variations to the template from page to page, especially on a larger site.
I think that forum members here have done a good job untangling the kinds of issues that cause ranking problems with these last 3 or 4 data refreshes, and how to fix them. The information is here, and I am working to organize it into a couple of solid reference threads so we don't need to keep saying the same things over and over.
One warning; just because you haven't had a problem so far, if you do have one of these situations that we have been dissecting recently, it may well come back to bite you in October or November. The most critical of these are the duplicate problems --
1. unintentionally duplicate urls (canonical issues, query strings, index.html etc)
2. intentionally duplicate content within your domain.
This includes actual page copy, titles, and meta descriptions. Whether the duplicate issues were unintentional (an incorrect url rewrite) or intentional (creating many doorway pages with little content variation), duplicate information looks just one way to an algorithm -- it's duplicate, and Google doesn't want it. An algorithm cannot read your intentions, only your code.
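To make the "unintentionally duplicate urls" idea concrete, here is a small Python sketch that collapses common accidental variations (www vs. non-www hosts, index.html default documents, session-style query strings) down to one canonical form, so you can count how many distinct pages your URL list really represents. The ignored-parameter and default-document lists are illustrative assumptions -- adjust them for your own site:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# These sets are example assumptions -- tune them to your own site's setup.
IGNORED_PARAMS = {"sessionid", "ref", "sort"}
DEFAULT_DOCS = {"index.html", "index.htm", "default.asp"}

def canonicalize(url: str) -> str:
    """Reduce common accidental URL variations to one canonical form."""
    scheme, netloc, path, query, _frag = urlsplit(url)
    netloc = netloc.lower()
    if netloc.startswith("www."):        # pick one host form and stick to it
        netloc = netloc[4:]
    parts = path.split("/")
    if parts and parts[-1].lower() in DEFAULT_DOCS:
        parts[-1] = ""                   # /dir/index.html -> /dir/
    path = "/".join(parts) or "/"
    kept = [(k, v) for k, v in parse_qsl(query) if k.lower() not in IGNORED_PARAMS]
    return urlunsplit((scheme.lower(), netloc, path, urlencode(kept), ""))

urls = [
    "http://www.example.com/widgets/index.html?sessionid=abc",
    "http://example.com/widgets/",
]
print(len({canonicalize(u) for u in urls}))  # both collapse to one canonical URL -> 1
```

If a list of the URLs Google is actually visiting (pulled from your logs) shrinks dramatically after canonicalizing, that gap is roughly the duplicate exposure you'd want to clean up with redirects.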
These issues are a sign of a technically unhealthy website. Whether these factors are creating problems right now or not, business owners and webmasters should at least put together a repair plan -- even if they are not yet willing to execute it. Do the upfront work, and do it now. In the case of a dynamic website, the work may get tricky.
All that said, once in a while I do see a site that just got in trouble that does not seem, at first glance, to have one of these issues. But that is pretty darned rare. It could be a false positive on some algo dial or other, or maybe I just didn't look deeply enough yet. If you're having troubles, then I say suspect yourself first rather than getting defensive about yourself or angry with Google.
And one more idea. If you are not already doing this, this is a good time to start keeping a changelog for your website. If something starts to go south you need to know what changes you made and when, or you'll just be guessing. Do your analysis in a disciplined manner. Understand that just because two things look related, that does not mean the relationship is cause and effect.
As an example of the value of a changelog, sometimes a site decides to add a "custom 404" but they don't execute it properly. This so-called "404" ends up giving a 200 header for any non-existent URL. Now this error may not bite back for several months. In the beginning, sales stats may even look better. But eventually, rankings can collapse under the load of all those duplicate urls pointing to the same content with a 200 OK header. If the company has a changelog, they can look back at their major changes in recent months and see what might have created the problem.
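That soft-404 situation is easy to test for. Here's a minimal Python sketch, not anyone's official tool -- the host and the bogus path are hypothetical placeholders you would point at your own site:

```python
import http.client

def classify_missing_url(status: int) -> str:
    """Interpret the status a server returns for a URL that should not exist."""
    if status == 200:
        return "soft 404 -- every bogus URL becomes duplicate content"
    if status in (301, 302):
        return "redirecting missing URLs -- check where they land"
    if status in (404, 410):
        return "ok -- search engines will drop the URL"
    return f"unexpected status {status}"

def check_site(host: str, bogus_path: str = "/no-such-page-xyz123") -> str:
    """HEAD-request a URL that cannot exist on your own site (host is hypothetical)."""
    conn = http.client.HTTPConnection(host, timeout=10)
    try:
        conn.request("HEAD", bogus_path)
        return classify_missing_url(conn.getresponse().status)
    finally:
        conn.close()
```

Calling something like check_site("example.com") against your own domain takes seconds, and catching a "custom 404" that serves 200 OK early is far cheaper than untangling the duplicate URLs it spawns months later.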
Many people whose sites were hurt recently COULD have seen the writing on the wall by giving some regular attention to their server logs and asking some tough questions about what they showed. You've got to know where your traffic is coming from, and whether it makes sense. I worked with one site recently that thought their conversions were in trouble. A closer analysis showed that half their Google traffic was going straight to pop-up pages that had no navigation back to the main site. Many of their main urls were already in duplicate redundancy country and not ranking well at all. Short-term fix: get some navigation on those pop-up pages. Long-term fix: handle many flavors of duplicate content trouble.
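As a sketch of that kind of log review, here is a small Python example that tallies which URLs are receiving Google referrals from a combined-format access log. The regex and the sample lines are simplified illustrations, not a full log parser:

```python
import re
from collections import Counter

# Matches the request and referrer fields of a combined-log-format line (simplified).
LOG_RE = re.compile(r'"(?:GET|HEAD) (?P<url>\S+) HTTP/[\d.]+" \d+ \S+ "(?P<ref>[^"]*)"')

def google_landing_pages(lines):
    """Count which URLs receive Google referrals -- odd URLs here deserve a look."""
    counts = Counter()
    for line in lines:
        m = LOG_RE.search(line)
        if m and "google." in m.group("ref"):
            counts[m.group("url")] += 1
    return counts

sample = [
    '1.2.3.4 - - [22/Sep/2006] "GET /widgets.html HTTP/1.1" 200 512 "http://www.google.com/search?q=widgets"',
    '1.2.3.4 - - [22/Sep/2006] "GET /popup.html?id=7 HTTP/1.1" 200 128 "http://www.google.com/search?q=widgets"',
    '1.2.3.4 - - [22/Sep/2006] "GET /widgets.html HTTP/1.1" 200 512 "http://example.org/"',
]
# A pop-up or query-string URL showing up here is exactly the kind of surprise
# (like the pop-up pages mentioned above) that this check is meant to surface.
print(google_landing_pages(sample).most_common())
```

Even something this crude, run monthly, would reveal Google traffic landing on URL variations that are only in the index by accident.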
I think next month will bring another flood of sites who start having troubles for the first time, as this Google sea change continues to roll on. I don't think they will be a whole new KIND of trouble, just more of the same.
| 4:28 am on Sep 24, 2006 (gmt 0)|
>> In fact, you're more likely to get into trouble by introducing small variations to the template from page to page, especially on a larger site.
Why would that be, tedster? I am curious...
| 4:37 am on Sep 24, 2006 (gmt 0)|
If the "template" in one case no longer matches the site template, then everything in the eccentric version of the template -- anchor tags, keywords etc -- may get evaluated as completely independent page content. As such it may trip various flags and filters. In short, it can look like an intentional attempt to manipulate Google rankings.
This is supposition on my part, but it fits the Google patterns that I feel relatively certain about. And in one case that I worked with, the oddball template page just would not rank, even though it certainly seemed like it should. But when we conformed that page's template to the site-wide template, it did rank within a few weeks.
I realize that's just one case and not at all a proof of a cause-and-effect relationship. But it was pretty suggestive.
| 4:47 am on Sep 24, 2006 (gmt 0)|
What about sub-templates? Where there is an overall template for the whole site, and sub-templates that vary slightly depending on which section of the site they are used on.
Mostly those variances might be navigational links, but they might also include things like disclaimers that apply only, or primarily to a single section.
I tend to think G can handle that situation fine as long as each section has enough pages to establish a pattern for the sub-template.
| 4:51 am on Sep 24, 2006 (gmt 0)|
I agree that it may seem dubious and add one line to the suspicious count, but what about placing "Related Products" on the side, which in turn varies from page to page? Now I am confused.
My main site has problems ranking for "domain.com" (from #30 to #45; sandbox IMO) since I messed up with links from another site of mine a while back... maybe I am being paranoid about the content, and once the links "penalty" is lifted I will be fine.
| 5:44 am on Sep 24, 2006 (gmt 0)|
|I tend to think G can handle that situation fine as long as each section has enough pages to establish a pattern for the sub-template. |
Absolutely. I never saw anything that even hinted at a problem in this area. What is enough pages? Maybe three or four, something like that. I have one case where there are only four pages in a sub-template, and there was never any ranking problem that we noticed.
|but what about placing "Related Products," on the side which in turn vary from page to page |
I can really relate to that question. As I see it, if the content varies from page to page, then it will not get picked up as template content. Instead it will be evaluated as part of the body content. So if the set of related products is relatively unchanging over several different pages, I'd go to some effort to make sure that the featured product on each of those pages has a solid and unique description -- to avoid duplicate filtering for the urls involved. As always, if the ranking is already fine, then I don't make changes!
I doubt that a Related Products sidebar could look like an attempted manipulation -- unless the information gets interpolated into the main menu, or something funky like that. The bigger concern would be tripping a duplicate flag if the rest of the page is extremely thin.
In fact, I'm watching over a new site launch this week where we struggled with something like the above in trying to make dynamically generated product pages sufficiently unique. I think we've got it nailed, but only Google will tell for sure.
| 9:58 pm on Sep 24, 2006 (gmt 0)|
|The mystery for many webmasters is how could they have been thriving on Google traffic for a long time, and then all of a sudden it goes away. They do a site: query and see most of their urls are now a "Supplemental Result" -- or worse still, their pages are just not there. |
Can I add another scenario to this (the one I was talking about): they do a site: command and see a disproportion between the number of pages in the results and the actual pages on their site (please include PDFs when assessing), and the first result doesn't show the home page but maybe some supplemental page.
Then, using the command I posted earlier, they see a lot of numeric subdomains containing scraped content from their site at the bottom, linked to another numeric subdomain. The results at the top of the page will probably contain either AdSense or a hyperlink to someone that you know in your field, especially if they use the search facility that the numeric subdomains invariably use, and you can bet the whole thing has been paid for by the competitor (look for the numerous listings of one site; others may be mentioned for relevance).
I only say this because I want to separate what I was talking about from what tedster is now talking about, which seems to be more about good site structure and duplicate content within your site that you were unaware of.
As I say, I am in agreement with tedster that anything I have possibly stumbled on would, I am sure, already be being dealt with by Google anyway. So if you're affected in this way, I would just let Google do their job, which they are very good at.
| 10:06 pm on Sep 24, 2006 (gmt 0)|
A big thanks for that, Tedster. Much appreciated food for thought (even more food for thought in your subsequent posts).
[edited by: Patrick_Taylor at 10:11 pm (utc) on Sep. 24, 2006]
| 10:37 pm on Sep 24, 2006 (gmt 0)|
By the way, you may see some big names involved in this. It just shows the big boys spam as much as anyone else, and TrustRank should not be based on brand names (which it isn't -- well done, Google).
| 2:44 am on Sep 25, 2006 (gmt 0)|
Not sure if this was a summer thing, but Google has done relatively major updates each month lately. Call them data refreshes or whatever, but it seems that enough sites have been hit, come back, been hit again, and so on. I hope Google has something better for next month. Will we see it on the 27th (the end of Q3 that Matt Cutts had mentioned), or in mid-October?
| 8:42 am on Sep 25, 2006 (gmt 0)|
Waiting for the data refresh for a couple of my sites too... the old higher rankings made an appearance in a few DCs a week back, only to vanish again. The 27th is the date on which it has happened on two earlier occasions.
| 3:36 pm on Sep 25, 2006 (gmt 0)|
Would have to say, looking at the SERPs this morning, that the refresh has already started -- either that or someone played with an amazing filter. Blogs have gotten a huge boost, and old sites across the board have been replaced by Amazon, .edu sites, and "community boards".
This seems to be extremely widespread from what I see, and not limited to one industry -- at least not across the 15 or so I keep track of with heavy-hitting money words.
| 4:08 pm on Sep 25, 2006 (gmt 0)|
|the refresh has already started |
Started for us late Friday, and still seeing the effects today. Just like on July 27th, sites removed from certain data centers for their main keywords.