Forum Moderators: Robert Charlton & goodroi
For example, duplicate meta descriptions and duplicate title tags.
Is there a definite answer yet on this?
Do duplicate meta descriptions affect your site's overall ranking?
Do duplicate title tags affect your site's overall ranking?
The Google help topic is inconclusive - it says duplicates won't affect whether your pages get indexed, but it does say the description may improve your ranking.
We're fixing our site anyway, but I would like to hear everyone else's experiences with this...
How important are title tags and meta descriptions for ranking?
That's in addition to all the internal stuff that generates the basic list of results for the SERPs and then ranks and sorts them.
And even if the title and description aren't the same, there's a problem when they sniff what's the first part of pages as being dupes. And that's not for snippet generation, it's an indexing/scoring issue.
[edited by: Marcia at 11:49 pm (utc) on July 7, 2008]
Dupe title plus dupe snippet among pages on different sites is the delivery filter mechanism.
To add to this, I support Marcia's observation above - particularly the content at the top of the page acting as the 3rd level of duplicate differentiation.
there's a problem when they sniff what's the first part of pages as being dupes. And that's not for snippet generation, it's an indexing/scoring issue
Does the position of unique content on the page matter in reducing the overall duplicate content calculation, in conjunction with the title tags and (maybe, maybe not) the description?
[edited by: Whitey at 12:29 am (utc) on July 8, 2008]
The filter was working a little differently last summer, but back then I tested on a site that was practically all filtered out by the "similarity" filter. When I removed a top-of-page JavaScript drop-down menu (which shows up as plain text in the text-only cache, not links) from a few test pages, they bumped right out from behind the "click here for more similar results." I tried this for pages one by one.
Needless to say, the site was struggling to get any traffic for "country product" searches that they hoped to target. I assume that, to the algo, every page looked relevant - and that means no page looked relevant!
In that case, the titles and meta descriptions were generated dynamically, but only those two keywords showed variation. That wasn't enough, until we got the smokescreen out of the way.
Shouldn't JavaScript either be hidden in comment tags, or be delivered from an external file? Would that solve the problem?
On a site that I recently worked on, one with accessibility navigation links at the top of every page, the accessibility div wasn't delivered to any bots at all... only to browsers.
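To illustrate the two options mentioned in the question (a sketch - the filename is hypothetical): the old HTML-comment trick was meant to hide scripts from ancient browsers, and crawlers may well parse right through it, so the external file is the more reliable route.

```html
<!-- Option 1: wrap the inline script in an HTML comment.
     Old browsers ignored it, but crawlers may still read the text,
     so this is unreliable for keeping menu text out of the page. -->
<script type="text/javascript">
<!--
// drop-down menu code here
//-->
</script>

<!-- Option 2: move the menu code into an external file, so the
     menu text never appears in the page source at all. -->
<script type="text/javascript" src="menu.js"></script>
```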
If you don't use a description tag then Google picks up the first actual text it sees on the page
The description snippet depends on the actual query terms that the user entered. If you only do a site: operator query, then there's no semantic content for the snippet algo to work with - so Google often defaults to the top text for site: queries.
But if the search results are for a query that does have semantic content, then Google will often pick a snippet from somewhere else on the page, where those terms actually appear. However, if the search term is also a link in the menu, then there's always a chance that you'll see that portion of the menu displayed as the snippet text.
The point is that there is not just one description snippet for a URL - Google has moved on to a more sophisticated algo. It still has bugs and sometimes gives strange results - so the meta description can be very important.
Even with a meta description, if the page is returned for a search that doesn't have relevant content in the description, the snippet algo may still ignore the meta description content and generate a different snippet.
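For anyone following along, the element under discussion is just this (the wording below is a made-up example):

```html
<head>
  <title>Blue Widgets - Acme Widget Co.</title>
  <!-- A unique, query-relevant description; Google may use it
       as the snippet, or may generate one from page text instead -->
  <meta name="description"
        content="Hand-built blue widgets in 12 sizes, shipped worldwide. Compare specs and prices.">
</head>
```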
Wordpress is a great blogging app and it doesn't use any meta description...
I'm sure that they thought about it while making the app...
And guess what else... Blogger, which is owned by Google, doesn't use meta descriptions either...
There you have it...
I'm sure that they thought about it while making the app...
Why would they? Most CMS builders are coders who do not know much about search engines. Whether it's Wordpress or any other CMS platform, for some reason the meta description is very often overlooked. I can't count how many clients I've had update their CMS to give them control over the meta description. In fact, that's part of my standard initial discovery these days.
Experiences of success mean a lot more than pointing to what this or that star website is doing. If you're not a star, then you've got to take advantage of every edge you can get.
(I never use JS, so I'm ignorant on the subject)
Added:
I just looked at a site that has many dozens of options in a drop-down list, and the form action is a GET request to a PHP catalog script. Of course, I can only see the HTML in the page source code (not the PHP), but the point is that the anchor text of all those links (probably the entire catalog) shows up as straight text in the Google text cache - not as links. On every page of the site.
That can't be a good thing. ;)
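A setup like the one described might look roughly like this (the names and paths are hypothetical). The option labels read as plain text to a crawler, and a form GET is not a followable link, so none of them pass any link value:

```html
<!-- Crawlers index the option labels as text, not as links -->
<form action="/catalog.php" method="get">
  <select name="product" onchange="this.form.submit()">
    <option value="101">Blue Widget</option>
    <option value="102">Red Widget</option>
    <!-- ...rest of the catalog... -->
  </select>
</form>

<!-- A crawlable alternative: real anchors carrying the same labels -->
<ul>
  <li><a href="/catalog.php?product=101">Blue Widget</a></li>
  <li><a href="/catalog.php?product=102">Red Widget</a></li>
</ul>
```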
[edited by: Marcia at 7:08 am (utc) on July 8, 2008]
If I leave those pages and don't correct the dupe descriptions on them will it *potentially* put a drag on the pages from my main site as well...or are they seen as two unique sites? The url for the main site is http://example.com/ and the store is http://example.com/store/index.php
[edited by: Receptional_Andy at 7:19 pm (utc) on July 8, 2008]
[edit reason] Switched to example.com (it can never be owned) [/edit]
Question about drop-down lists:
I happen to have a script (which is not available from the programmers any more) that uses a JS dropdown list, but ALSO provides a Perl/CGI alternative for user agents that don't have JS enabled. I'm assuming that the Perl alternative would serve up the links as actual hyperlinks with anchor text.
I've just found it tucked away on my HD, but have never used it yet. Would it be worth a shot?
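If the Perl alternative works the way you describe, the markup might be along these lines (the paths and page names are hypothetical) - a noscript fallback that serves real anchors, which is exactly what a non-JS user agent such as a crawler would see:

```html
<!-- JS-capable browsers get the drop-down menu -->
<script type="text/javascript" src="dropdown.js"></script>

<!-- User agents without JS (including most bots) get real links
     served by the Perl/CGI script -->
<noscript>
  <a href="/cgi-bin/nav.pl?page=widgets">Widgets</a>
  <a href="/cgi-bin/nav.pl?page=gadgets">Gadgets</a>
</noscript>
```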
Do you think that Blogger also doesn't think about SEO? Doesn't Google want its own products to rank in its own engine? I don't see meta descriptions on Blogger either!
Ajaxunion: if you've ever used AdWords you're likely aware of the potentially dramatic impact different snippets of text have on the number of users who click on one ad rather than another. I consider the meta description element to be an absolute gift for any SEO: those two lines represent the single largest area of page 'real estate' your listing occupies on a page. And you get to choose the text without too much concern for affecting actual rankings!
is there any chance you can put custom titles on the store pages and eliminate the meta descriptions? And is there enough PR and link love in the site to support the store pages in the main index?
Actually upon second look the problem may be simpler. As I mentioned the store is a separate app, but integrated with the main site through an integration plugin. The plugin adds a link on each full size image page on the main site which visitors can click and it passes them to a page in the store where they can add the item to the shopping cart. In Google WMT the URL in the link to the store item and the store's page are seen as duplicates. I'm not quite sure how that is happening, but I think adding a nofollow tag to the store links on the main site might prevent this. Otherwise, I could block the link urls in robots.txt I suppose. Any other ideas would be appreciated.
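The two options mentioned might be sketched like this (the URL parameters are hypothetical - substitute whatever your store plugin actually generates):

```html
<!-- Option 1: nofollow on the main-site link into the store -->
<a href="/store/index.php?action=add&amp;id=123" rel="nofollow">Add to cart</a>
```

```
# Option 2: block the store's add-to-cart URLs in robots.txt
# (Disallow matches by URL prefix, including the query string)
User-agent: *
Disallow: /store/index.php?action=add
```

Note that nofollow and robots.txt do different things: nofollow only affects that one link, while the robots.txt rule stops the URLs being crawled at all.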
A site newly online a few months ago has one internal page that introduces the new website, and outlines a few plans for the website in the future. That page does not mention any of the keywords that pertain to the content of the rest of the site.
That page has shown up like this in a Google site:domain.com search...
Either:
- always listed absolute last in the site:domain.com search.
Or:
- hidden behind the "similar results filter", and then appears half way up the list when &filter=0 is applied.
So, "omitted results" or "filtered results" may not just be "similar results", but may also include "pages that have content that has nothing to do with the theme of the rest of the site".
It's odd though, because another page that might have fallen victim to that filtering, is the accessibility page, where you can reset the website presentation to be more accessible, select a larger typeface, and so on. However, it may be that "accessibility" is detected as being relevant to a site all about a health-related topic, and it might have received such filtering had the website been all about some other topic. The accessibility page often appears about a tenth or fifth of the way down the list for a site:domain.com search.