At that time, a site: search showed "1 to 3 of about 55". Google hadn't bothered to index all the pages, and had filtered most of what they had indexed out of the search. If you unhid the duplicates using the "repeat this search with omitted results included" option, you could then see all 53 indexed pages, and could see that the snippet was the same for every one of them.
That problem was fixed many months ago. All of the descriptions were changed to be different, and to reflect exactly what was on that particular page. Within weeks Google was showing 150 pages in the site search, and 120 were now treated as unique.
Things remained this way for many months -- until last week.
In 72.14.207.99 and other "experimental" datacentres, the site is now back down to something like "1 to 40 of about 150". The other 110 pages all show exactly the same snippet when you unhide the duplicates.
I don't know if Google is just working with old data for those pages, and needs to update the data with a new crawl, or whether there is some bug in their system. Whatever it is, Google isn't using the latest information on the real page for indexing and evaluation of the pages. Some of the pages have a cache that has jumped back to one from several months ago, but other pages have a cache that is less than a week old.
Waiting to see what happens in the next few weeks, before changing anything. As far as I can tell, there is nothing that needs changing.
Since we have tried to differentiate the content between regional sites [.com, .co.uk, etc.], this was a prompt for us to shift where the navigation appeared on the page. My suspicion was that Google had the ability to strengthen its duplicate content assessment of sites through observation of the menu structure [in our case].
great_9 - I'd be interested to know where on your pages the ALT-Tags are positioned.
and needs to update the data with a new crawl, or whether there is some bug in their system.
Both, I think.
Whitey:
I'd be interested to know where on your pages the ALT-Tags are positioned.
All over the place. HTML 4.01 validation requires img alt attributes to be specified. Descriptions in the SERPs are taken from the navigation menu. The alt tags were "pix" and "pixel" (used to fill table cells to a specific width). It is really strange, since the first time the pages were indexed, Google used the meta descriptions for each page. Also, I forgot to mention that the meta descriptions were unique (the first 100-150 letters of an article -- it's a news site).
I replaced almost all of the alt tags with "". Since Google is on my site every day, if things were as usual I would see the difference for some of the pages in a couple of days; but since things AREN'T as usual at Google...
I'm sick and tired of playing games with Google. I never did any blackhat SEO (at least, not knowingly), and my site has still been dropping in the SERPs for the last year and a half. I think I'll give up doing SEO and create a new layout.
The site has about 100k pages.
Either way, the use of the menus to describe the site is of no value compared to the meta description or the on-page content.
If it is intentional, it may be part of comparing duplicate content page structures - but that's just me speculating.
I did this for a while, but then it didn't seem to matter, because Google went for the description instead.
Maybe, with everything else going on, Google has reverted to picking up the first text in the first cell again.
In the older "BigDaddy" datacentres that page has failed to appear in the index for that search term (but still appears for the other search terms that it has ranked for, for the last 2 years).
I am guessing that those versions of the "BigDaddy" index are not being maintained, and will be phased out soon. Those "BigDaddy" datacentres usually return a higher number of pages, but are littered with ancient supplemental results.
In contrast, the "experimental" datacentres show a lower number of fully-indexed pages, but all of the supplemental results from before June 2005 are now gone. In their place, some sites now show supplemental results that are 2 to 10 months old instead. Supplemental results are for pages that no longer exist, or are ghosts of pages that still exist, where the supplemental result represents a previous version of the content of those pages.
On some, however, they have shifted to terms from a different part of the page - notably the left-hand column [of 3], even if this sits under a single column [1].
I get the feeling something is going on with comparing page structures through the DCs for duplicate assessment and isolating pages that are too similar.
Does anyone know how penalties are being applied or assessed through this fluctuation?
Does anyone know how penalties are being applied or assessed through this fluctuation?
I think it's too early to tell. Maybe at the end of the week. Just maybe. I don't think GoogleGuy is going to join us on this thread.
btw. I've removed all of the ALT tags. I'll come back with SERP results ASAP.
Every page of the site now shows exactly the same snippet, whereas before it showed the text from the meta description - and every one of them was different back then.
This used to be a good test to spot "duplicate content" problems (duplicated title/meta description especially) but now that test is completely useless.
Has Google broken the "site:" test in the same way they broke the "link:" test a year or two back, just to hinder people checking their sites, or do we need Google to refresh their snippet database, or fix a bug in the way that the site: search works?
The site that was showing "1 to 33 of about 150" a while back was up to "1 to 77 of about 150" a couple of days ago, and as of yesterday showed "1 to 142 of about 148".
Google seems to have fixed this so that it works now.
Now... about 6 months ago I changed all the links to point to www.example-domain.com, and Google still hasn't dropped the example-domain.com links :-(
OK... so I added rewrite rules to .htaccess to do a 301 redirect, in order to eliminate the example-domain.com pages from the index and thus remove any duplicate content that Google may think is there.
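For anyone wanting to do the same, a minimal sketch of that kind of rule (assuming Apache with mod_rewrite enabled; example-domain.com is just the placeholder domain used above, not necessarily the exact rules on my site):

# redirect every non-www request to the www hostname with a 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.example-domain.com/$1 [R=301,L]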
Also, I've added a Google Sitemap, and added nofollow,noindex tags for the "print this page" pages.
Old search-engine-friendly pages are also showing up in the SERPs, so I 301-redirected them too, to point to the real pages (example: page1234.html >> 301 >> www.example.com/page.php?id=1234).
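A sketch of that second kind of rule as well (with RewriteEngine On as above; the pageNNNN.html naming pattern is an assumption based on the example, and the real pattern may differ):

# 301 old search-engine-friendly URLs such as /page1234.html
# to the real script, e.g. /page.php?id=1234
RewriteRule ^page([0-9]+)\.html$ /page.php?id=$1 [R=301,L]

With R=301 the old URLs should drop out of the index over time as Google recrawls them and follows the redirect.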
And look at this - Monday morning's stats:
site:www.domain.com 194,000 results
site:domain.com 194,000 results
What's going on? Can't they see that they already have those pages in their index?
And meta descriptions are now back in, I think for all of the pages.
This effect appears much more marked if the target is a phrase - sometimes just a couple of words - that appears in the text but not in the description.
For me, at least, it seems to produce good results. Google usually extracts meaningful text.
If the keyword that produces the hit doesn't appear in the description but does appear in the text, then Google will show text fragments containing the keyword from the page instead of the meta tag description.
Agreed, and that's the correct way, BUT... we were talking about a site: search for our own sites (e.g. site:example.com). The results were there, but instead of the descriptions, Google used the alt tags from the first table cells on the left side.