Forum Moderators: Robert Charlton & goodroi

Duplicate Description and Title Notices in WMT - do they affect ranking?

         

helpnow

4:55 pm on Jul 5, 2008 (gmt 0)

10+ Year Member



Google Webmaster Tools now contains Content Analysis. This is a great new tool. For large sites, some of the information there may previously have been difficult or impossible to discover through any other means.

For example, duplicate meta descriptions and duplicate title tags.

Is there a definite answer yet on this?

Do duplicate meta descriptions affect your site's overall ranking?

Do duplicate title tags affect your site's overall ranking?

The Google help topic is inconclusive - it says duplicates will not affect whether you're indexed, but it does suggest that fixing them may improve your ranking.

We're fixing our site anyway, but I would like to know everyone else's experiences with this...

How important are title tags and meta descriptions for ranking?
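For anyone who wants to audit this outside of WMT, here's a minimal stdlib-only sketch of a duplicate title/description check. It assumes you've already fetched the page sources; the URLs and function name are hypothetical.

```python
# Sketch: find pages that share a <title> or a meta description.
from collections import defaultdict
from html.parser import HTMLParser

class HeadParser(HTMLParser):
    """Collect the <title> text and the meta description of one page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "description":
                self.description = (a.get("content") or "").strip()

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data.strip()

def find_duplicates(pages):
    """pages: dict of url -> html source.
    Returns (titles, descriptions) that are shared by two or more URLs."""
    titles, descs = defaultdict(list), defaultdict(list)
    for url, html in pages.items():
        p = HeadParser()
        p.feed(html)
        if p.title:
            titles[p.title].append(url)
        if p.description:
            descs[p.description].append(url)
    dup = lambda d: {k: v for k, v in d.items() if len(v) > 1}
    return dup(titles), dup(descs)
```

For a large site you'd feed this from a crawl; the point is just that the check itself is trivial once you have the page sources.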

g1smd

10:23 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Yes, there are a number of operations applied to the SERPs at the time of delivery: clustering, filtering, hiding behind the "omitted results" link, snippet generation, bolding of words in the SERP, tagging with "xx ago" information, and many other things.

That's in addition to all the internal stuff that generates the basic list of results for the SERPs and then ranks and sorts them.

Marcia

11:49 pm on Jul 7, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>first sniff those pages appear to be dupes so why waste processing power on them.

And even if the title and description aren't the same, there's a problem when they sniff the first part of the pages as being dupes. And that's not for snippet generation; it's an indexing/scoring issue.

[edited by: Marcia at 11:49 pm (utc) on July 7, 2008]

Whitey

12:19 am on Jul 8, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



A dupe title plus a dupe snippet among pages on different sites is the delivery filter mechanism.

To add to this, I support Marcia's observation above - particularly the content at the top of the page as the third level of duplicate differentiation.

there's a problem when they sniff the first part of the pages as being dupes. And that's not for snippet generation; it's an indexing/scoring issue

Does the position of unique content on the page matter for reducing the overall duplicate content calculation, in conjunction with the meta titles and, maybe/maybe not, the description?

[edited by: Whitey at 12:29 am (utc) on July 8, 2008]

Marcia

12:33 am on Jul 8, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Whitey, I'm seeing some problems with "top of page" content, both when first encountered in source-code sequence and also in the "active window" section, even for pages with a tables layout (no CSS-P) and a left-side mega menu.

The filter was working a little differently last summer, but back then I tested on a site that was practically all filtered out by the "similarity" filter. When I removed a top-of-page JavaScript drop-down menu (which shows up as plain text in the text-only cache, not links) from a few test pages, they bumped right out from behind the "click here for more similar results" link. I tried this on pages one by one.

tedster

12:53 am on Jul 8, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Big drop-down boxes that can be indexed as text often make trouble - ANYWHERE on the page, and especially at the top. Last year I had success changing two such boxes on a sitewide basis so that the options no longer showed up in text. One was a "select country" list and the other was "select product".

Needless to say, the site was struggling to get any traffic for "country product" searches that they hoped to target. I assume that, to the algo, every page looked relevant - and that means no page looked relevant!

In that case, the titles and meta descriptions were generated dynamically, but only those two keywords showed variation. That wasn't enough, until we got the smokescreen out of the way.
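To put a rough number on how much an indexable drop-down dilutes a page, here's a small stdlib-only sketch. This is purely an illustration - Google has never published any such threshold - measuring what share of a page's text sits inside <select> elements:

```python
# Sketch: estimate the fraction of a page's visible text contributed by
# <select> option lists. A large country/product drop-down repeated
# site-wide can swamp the unique copy on every page.
from html.parser import HTMLParser

class TextSplitter(HTMLParser):
    """Split page text into 'inside a <select>' and 'everything else'."""
    def __init__(self):
        super().__init__()
        self.in_select = 0
        self.select_text = []
        self.other_text = []

    def handle_starttag(self, tag, attrs):
        if tag == "select":
            self.in_select += 1

    def handle_endtag(self, tag):
        if tag == "select" and self.in_select:
            self.in_select -= 1

    def handle_data(self, data):
        bucket = self.select_text if self.in_select else self.other_text
        bucket.append(data.strip())

def select_text_share(html):
    """Return the share (0.0-1.0) of text characters inside <select> elements."""
    p = TextSplitter()
    p.feed(html)
    sel = len(" ".join(t for t in p.select_text if t))
    rest = len(" ".join(t for t in p.other_text if t))
    total = sel + rest
    return sel / total if total else 0.0
```

On a page where a 200-option drop-down accounts for most of the text, every page on the site ends up looking largely the same.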

Lorel

1:34 am on Jul 8, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If you don't use a description tag, then Google picks up the first actual text it sees on the page - and if this is the menu or some other text that is repeated on every page, you're right back to the duplicate description problem.

g1smd

2:11 am on Jul 8, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



*** When I removed a top of page JavaScript drop-down menu ***

Shouldn't JavaScript either be hidden in comment tags, or be delivered from an external file? Would that solve the problem?

On a site that I recently worked on, one with accessibility navigation links at the top of every page, the accessibility div wasn't delivered to any bots at all... only to browsers.

tedster

3:33 am on Jul 8, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If you don't use a description tag then Google picks up the first actual text it sees on the page

The description snippet depends on the actual query terms that the user entered. If you only do a site: operator query, then there's no semantic content for the snippet algo to work with - so Google often defaults to the top text for site: queries.

But if the search results are for a query that does have semantic content, then Google will often pick a snippet from somewhere else on the page, where those terms actually appear. However, if the search term is also a link in the menu, then there's always a chance that you'll see that portion of the menu displayed in text.

The point is that there is not just one description snippet for a URL - Google has moved on to a more sophisticated algo. It still has bugs and sometimes gives strange results - so the meta description can be very important.

Even with a meta description, if the page is returned for a search that doesn't have relevant content in the description, the snippet algo may still ignore the meta description content and generate a different snippet.
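As an illustration of the behaviour described above - a toy model, not Google's actual algorithm, and the fallback order here is an assumption - query-dependent snippet selection might be sketched as:

```python
# Sketch: pick a snippet that depends on the query, not a single fixed
# description per URL.
def pick_snippet(query, meta_description, body_text, width=80):
    """Prefer the meta description when it contains a query term;
    otherwise pull a window of body text around the first match;
    otherwise fall back to the top of the page (as with a site: query
    that carries no semantic content)."""
    terms = [t.lower() for t in query.split()]
    if any(t in meta_description.lower() for t in terms):
        return meta_description[:width]
    lower = body_text.lower()
    for t in terms:
        i = lower.find(t)
        if i != -1:
            start = max(0, i - width // 2)
            return body_text[start:start + width]
    return body_text[:width]
```

The takeaway matches the post: a good meta description gets used when it's relevant to the query, and something else is substituted when it isn't.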

Ajaxunion

4:17 am on Jul 8, 2008 (gmt 0)

10+ Year Member



The reason Matt Cutts doesn't have meta descriptions on his blog is that he uses WordPress, and with the default install the post pages don't have meta descriptions.

WordPress is a great blogging app and it doesn't use any meta descriptions...

I'm sure they thought about it while making the app...

And guess what else... Blogger, which is owned by Google, doesn't use meta descriptions either...

There you have it...

Marcia

4:26 am on Jul 8, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You can have a meta description with a WordPress plugin. And you can customize page titles on either platform, though it's easier with WordPress.

But Matt isn't worried about rankings; he doesn't need to be.

tedster

4:53 am on Jul 8, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm sure they thought about it while making the app...

Why would they? Most CMS builders are coders who do not know much about search engines. Whether it's WordPress or any other CMS platform, for some reason the meta description is very often overlooked. I can't count how many clients I've had update their CMS to give them control over the meta description. In fact, that's part of my standard initial discovery these days.

Experiences of success mean a lot more than pointing to what this or that star website is doing. If you're not a star, then you've got to take advantage of every edge you can get.

Whitey

5:36 am on Jul 8, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Shouldn't JavaScript either be hidden in comment tags, or be delivered from an external file? Would that solve the problem?

Does anyone have an answer for this [just so it doesn't get overlooked]? I thought this was an interesting question.

Marcia

7:01 am on Jul 8, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Could it even work, having JS enclosed by comment tags, when the drop-down box has to be visible for visitors to use?

(I never use JS, so I'm ignorant on the subject)

Added:

I just looked at a site that has many dozens of entries in a drop-down list, and the form action is a GET request to a PHP catalog. Of course, I can only see the HTML in the page source code (not the PHP), but the point is that the anchor text of all those links (probably the entire catalog) shows up as straight text in the Google text cache - not as links. On every page of the site.

That can't be a good thing. ;)

[edited by: Marcia at 7:08 am (utc) on July 8, 2008]

g1smd

10:07 am on Jul 8, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



*** Why would they? Most CMS builders are coders who do not know much about search engines. ***

I am having this battle at the moment with a custom-CMS designer.
Client is the content-owner, and we have no access to modify the code running the CMS.

Frustrating isn't the word.

ichthyous

4:41 pm on Jul 8, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Big drop-down boxes that can be indexed as text often make trouble - ANYWHERE on the page, and especially at the top.

I think you're spot on there... I had big nav drop-downs at the bottom of my pages and just removed them all. They added to the page size and watered down the targeted keywords.

ichthyous

4:43 pm on Jul 8, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I have another question... in WMT I have 1,200 pages reported as having dupe descriptions. Most of those are from my online store, not from my main website. My store actually plugs into my main site: people can click a button on a page in the main site and it puts the item in the cart in the online store. I don't spend much time optimizing the store, since it really just operates as a checkout vehicle for me, and since the store app doesn't use URL rewriting.

If I leave those pages and don't correct the dupe descriptions on them, will it *potentially* put a drag on the pages from my main site as well... or are they seen as two separate sites? The URL for the main site is http://example.com/ and the store is http://example.com/store/index.php

[edited by: Receptional_Andy at 7:19 pm (utc) on July 8, 2008]
[edit reason] Switched to example.com (it can never be owned) [/edit]

Marcia

4:59 pm on Jul 8, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



ichthyous, is there any chance you can put custom titles on the store pages and eliminate the meta descriptions? And is there enough PR and link love in the site to support the store pages in the main index?

Question about drop-down lists:

I happen to have a script (which is not available from the programmers any more) that uses a JS dropdown list, but ALSO provides a Perl/CGI alternative for user agents that don't have JS enabled. I'm assuming that the Perl alternative would serve up the links as actual hyperlinks with anchor text.

I've just found it tucked away on my HD, but have never yet used it. Would it be worth a try?

Ajaxunion

7:06 pm on Jul 8, 2008 (gmt 0)

10+ Year Member



Do you think that Blogger also doesn't think about SEO? Doesn't Google want to rank its own products on its own engine?

I don't see meta descriptions on Blogger either!

Receptional Andy

7:27 pm on Jul 8, 2008 (gmt 0)



Do you think that Blogger also doesn't think about SEO? Doesn't Google want to rank its own products on its own engine?

I don't see meta descriptions on Blogger either!

Ajaxunion: if you've ever used AdWords you're likely aware of the potentially dramatic impact different snippets of text have on the number of users who click on one ad rather than another. I consider the meta description element to be an absolute gift for any SEO: those two lines represent the single largest area of page 'real estate' your listing occupies on a page. And you get to choose the text without too much concern for affecting actual rankings!

ichthyous

5:12 am on Jul 9, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



is there any chance you can put custom titles on the store pages and eliminate the meta descriptions? And is there enough PR and link love in the site to support the store pages in the main index?

Actually, upon second look the problem may be simpler. As I mentioned, the store is a separate app, integrated with the main site through a plugin. The plugin adds a link on each full-size image page on the main site; visitors can click it and it passes them to a page in the store where they can add the item to the shopping cart. In Google WMT, the URL in the link to the store item and the store's own page are seen as duplicates. I'm not quite sure how that is happening, but I think adding a nofollow to the store links on the main site might prevent it. Otherwise, I suppose I could block the link URLs in robots.txt. Any other ideas would be appreciated.
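If the robots.txt route is chosen, a minimal sketch - assuming, from the URL given earlier in the thread, that every store page lives under /store/ (and noting that robots.txt-blocked pages can still appear as URL-only listings):

```
User-agent: *
Disallow: /store/
```

That keeps the store's duplicate descriptions out of the crawl entirely, at the cost of the store pages never ranking on their own.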

g1smd

11:19 am on Jul 11, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



One more thing about "similar results filtering"...

A site newly online a few months ago has one internal page that introduces the new website, and outlines a few plans for the website in the future. That page does not mention any of the keywords that pertain to the content of the rest of the site.

That page has shown up like this in a Google site:domain.com search...

Either:
- always listed absolute last in the site:domain.com search.

Or:
- hidden behind the "similar results filter", and then appears half way up the list when &filter=0 is applied.

So, "omitted results" or "filtered results" may not just be "similar results", but may also include "pages that have content that has nothing to do with the theme of the rest of the site".

.

It's odd though, because another page that might have fallen victim to that filtering is the accessibility page, where you can reset the website presentation to be more accessible, select a larger typeface, and so on. However, it may be that "accessibility" is detected as being relevant to a site all about a health-related topic; it might have been filtered had the website been about some other topic. The accessibility page often appears about a tenth or a fifth of the way down the list for a site:domain.com search.

This 51 message thread spans 2 pages.