
Google SEO News and Discussion Forum

    
Googlebot ignoring robots.txt AND WMT parameter settings
ichthyous - posted 2:06 pm on Aug 9, 2012 (gmt 0)

Lately the number of duplicate page titles and descriptions Google is reporting for my site in Webmaster Tools has been increasing. The great majority of those reported pages are just the same page with navigational parameters added to the end. For example: site.com/pageurl/ and site.com/pageurl/?g2_navId=xfe40ba4a are being considered dupes now when they weren't before. I have blocked Google from indexing the dynamic urls in robots.txt for years, and I set the parameter under "URL Parameters" in Webmaster Tools to "one representative url" as they recommend...also did that a few years ago. Still, Google is now indexing these pages when it wasn't before and considering them duplicates of the page without the dynamic string appended. Other than adding some rewrite rule to my .htaccess file to redirect Googlebot and merge all the pages, is there any way to control this? I believe that the dynamically appended urls are necessary for navigation so I might be limited in using rewrite rules. I was also wondering if all these dupe page titles and descriptions may bring down my quality score and hurt my ranking? Thanks for any advice!
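
For reference, the kind of .htaccess rewrite mentioned above might look roughly like the sketch below. This is a sketch only: it assumes Apache with mod_rewrite enabled and that g2_navId is the only query-string parameter on those URLs. Since those parameters are apparently needed for visitor navigation, a blanket 301 like this could break that, so a rel="canonical" link on the parameterized pages pointing at the clean URL may be the gentler way to merge them.

# Sketch only - assumes mod_rewrite and that g2_navId is the sole parameter
# 301s site.com/pageurl/?g2_navId=xfe40ba4a back to site.com/pageurl/
# (the trailing "?" in the target drops the query string)
RewriteEngine On
RewriteCond %{QUERY_STRING} ^g2_navId=
RewriteRule ^(.*)$ /$1? [R=301,L]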

 

tedster - posted 3:38 pm on Aug 9, 2012 (gmt 0)

Right now, I don't trust WMT to indicate the truth of the data that ranking is calculated from. As you point out, in recent weeks all kinds of reporting anomalies have been showing up. Unless you have actually seen traffic drops that correlate with the reporting changes, I'd simply put the information "on the back burner."

Unfortunately, Webmaster Tools does go through periods where it's not very dependable as a source of data that means "take action on this right away." It seems to be based on a secondary copy of Google's data, not the actual core data used to generate SERPs. And that secondary copy gets, well, very strange at times.

It may just be caused by a reporting idea gone haywire, or a bad data import from another back-end copy. However, when the WMT data gets weird, we need to stay focused on our actual traffic and on treating that traffic well.

ichthyous - posted 3:58 pm on Aug 9, 2012 (gmt 0)

I see...this has been getting worse for some time though, not just the last two weeks. I did see a big drop in traffic right after the update on June 24th, but I had also implemented some page changes, so I can't be sure what the source of the drop was. Traffic is very slowly creeping back up, although it's not what it was before the 24th. In general, does a higher number of pages reported with dupe titles and descriptions have a bigger impact on a site's quality score?

tedster - posted 4:03 pm on Aug 9, 2012 (gmt 0)

It can be hard to analyze things when you just made other changes at the same time. Have you checked to see if Google Search is now sending traffic directly to a URL that you asked them to ignore?

Str82u - posted 5:31 pm on Aug 9, 2012 (gmt 0)

I can say that as a test last month I created a page with a deliberately bad link on it, added that page to robots.txt, THEN uploaded it. Two weeks later both Bing and Google are showing me the 404 for a URL they never should have seen the link to if they obeyed the robots directive.
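
For context, the robots.txt entry in a test like that would be a simple path disallow along these lines (the filename here is a hypothetical stand-in for the page carrying the bad link):

# Sketch only - /link-test-page.html is a hypothetical placeholder
User-agent: *
Disallow: /link-test-page.html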

It seems like traffic is better when the duplicate titles and descriptions are more under control, but I don't think it has a huge effect on SERPs unless those pages are actually included in the index. If you do a site:domainname search, do you see those URLs? If not, it's probably just an issue that WMT is bringing to your attention rather than something to take action on.

ichthyous - posted 7:36 pm on Aug 9, 2012 (gmt 0)

Thanks for the responses. I ran a search using site:mydomainname and it appears that the pages flagged as dupe content in WMT are not appearing in the SERPs...the "canonical" pages (if that's what they are called) are there. So that's a good thing.

I am seeing massive rewriting of all my page titles now. Google has even gone so far as to strip out the first half of every title. For instance, I use "Category Name - Item Name" as my page titles to differentiate each item page, which might be used in different categories. The categories are being stripped and just the item name appears now. Nothing I can do about it, I suppose...it will be interesting to see how it affects the CTR. I have 301'd every broken page I could find on my site and used Page Speed to optimize my pages to the nth degree, hoping that will improve my site's quality score in Google's eyes. I will probably have to start correcting any dupe content issues I see where two different "canonical" pages are being flagged, leaving aside the pages which should be indexed. Thanks for your help
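
As an aside, the per-URL 301s described above are usually one-liners in .htaccess; a sketch with purely hypothetical paths (mod_alias syntax) might look like:

# Sketch only - both paths are hypothetical placeholders
Redirect 301 /old-broken-page.html http://www.example.com/replacement-page.html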

tedster - posted 9:59 pm on Aug 9, 2012 (gmt 0)

it will be interesting to see how it affects the CTR.

Yes, it will. That is the goal of Google's title rewrite algorithm, after all. Internal page differentiation is more of an in-site concern, and first you actually need the traffic for that disambiguation to matter to your visitors.

lucy24 - posted 1:16 am on Aug 10, 2012 (gmt 0)

I believe that the dynamically appended urls are necessary for navigation

You believe? Don't you know? It is your own site, isn't it?

If g### is rewriting page titles as part of an experiment to increase ctr, I must be in the control group :( Or possibly their computer took one look at the page content and threw up its virtual hands in despair. ("Uhm... What, if anything, does this page have to do with lawn mowers?")

Str82u - posted 3:46 am on Aug 10, 2012 (gmt 0)

I am seeing massive rewriting of all my page titles now.

I'm getting that too, but in the opposite way. I have pages that include the site title after a hyphen ("Green Item Page - Widget Website"), and some of the pages that don't use the site title are having it added in the SERPs. For example, internal site search results pages like "Results for Green Widgets" become "Results for Green Widgets - Widget Website".

I've seen something similar in the past and wonder if it's not a subtle hint to start adding the site title to the pages without it.

lucy24 - posted 7:08 am on Aug 10, 2012 (gmt 0)

I've seen something similar in the past and wonder if it's not a subtle hint to start adding the site title to the pages without it.

Nuh-uh. No way. Not falling for that one. As soon as everyone dutifully runs around and adds the site title to all page titles, g### will issue a new directive about Duplicate Content In Titles.

tedster - posted 10:36 am on Aug 10, 2012 (gmt 0)

No way. Not falling for that one.

I completely agree. Google tailors the clickable title to specific search terms, but that doesn't mean we should assume they want to see it used for any and all search terms. I title my pages to help my visitors know what they're about. That's it, plain and simple.

Str82u - posted 11:21 am on Aug 10, 2012 (gmt 0)

@lucy24 and @tedster - thanks for those last two replies; @ichthyous, thanks for pointing out the title rewrites you noticed.

Ralph_Slate - posted 2:12 pm on Aug 10, 2012 (gmt 0)

I'm seeing "duplicate meta description" messages in WMT because Google can't figure out the order of parameters in the querystring. For example, they are treating page.php?a=1&b=2 as a different page than page.php?b=2&a=1.

Yes, it is my own fault for not being consistent in how I internally link to that page, but I'd think that since Google is aware that they are parameters, the order shouldn't matter.
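
For what it's worth, that specific two-parameter case can be collapsed with a rewrite that picks one order as canonical. This is a sketch only: it assumes Apache with mod_rewrite, that page.php takes exactly the parameters a and b, and that making the internal links consistent (or adding a rel="canonical" link) would be the more general fix.

# Sketch only - redirects page.php?b=2&a=1 to page.php?a=1&b=2
RewriteEngine On
RewriteCond %{QUERY_STRING} ^b=([^&]+)&a=([^&]+)$
RewriteRule ^page\.php$ /page.php?a=%2&b=%1 [R=301,L]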

tedster - posted 2:24 pm on Aug 10, 2012 (gmt 0)

Maybe, eventually, they will. But for some websites the order does matter, just as for some websites, the presence or absence of "www" does matter. We shouldn't expect Google to be responsible for our technical set-up. Even when they do it well, the release of a new batch of code can undermine that in a day. So we need to stay sharp ourselves, and not expect Google to look out for our own interests.

ichthyous - posted 5:05 pm on Aug 10, 2012 (gmt 0)

Google tailors the clickable title to specific search terms


Yes, Google adds or omits certain keywords depending on the search. For instance, my titles use "relevant-term-1" and, later in the title, "relevant-term-2". If that 2nd term appears in the search query, Google will include it in the rewrite. That makes sense and may actually help, as it highlights the term people searched for while truncating the overall title to make it more readable. I think they may be taking this a bit too far now, though, as I am seeing simple short page titles being rewritten in a fashion that makes less sense than the original.

As for the appended parameters, Lucy, not all of us coded our sites ourselves. I am using an open source platform that is getting quite old now, and its developers were pretty clueless about SEO to begin with. I had to do a lot of customizing of the code to avoid dupes, but why it needs to append nav parameters to the end of the url under certain circumstances is beyond me. Even more of a mystery is why Google is now indexing them and flagging them as duplicates after six years of following my instructions in robots.txt and the WMT panel.

lucy24 - posted 1:37 am on Aug 11, 2012 (gmt 0)

Tedster? Do we look that much alike? Never woulda guessed it :)
