Forum Library, Charter, Moderators: phranque & physics

Webmaster General Forum

    
Google won't give up indexing my wrong title tags
They've been changed, but how long will they keep this version?
jastra
msg:4059624 - 4:24 am on Jan 13, 2010 (gmt 0)

One of our programmers used a dynamic page from Client A's site as a template for a new site for Client B. Trouble is, he brought along all the head tags too.

As a result, the new client's site got head tags that say "New York Widgets" when the client really has "Miami Widgets."

It's been four months since the head tags were corrected, and we've submitted several updated XML sitemaps since then. Still, Google retains the bad head tags in the SERPs.

How long can stuff like this normally take? And does anybody have a suggestion on how to get Google's attention?
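One way to nudge a recrawl after a fix like this is to regenerate the sitemap with a fresh `<lastmod>` on the corrected pages. A minimal sketch (the URL and dates here are made up, not from the thread):

```python
# Sketch: rebuild a sitemap with a fresh <lastmod> so crawlers can see
# that the pages changed. URLs/dates are hypothetical examples.
from datetime import date
from xml.etree import ElementTree as ET

def build_sitemap(urls, lastmod=None):
    lastmod = lastmod or date.today().isoformat()
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod  # signals the page changed
        ET.SubElement(url, "changefreq").text = "weekly"
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["http://www.example.com/widgets.asp"], lastmod="2010-01-13")
print(xml)
```

Whether Googlebot honors `lastmod` is up to Google; the sitemap is a hint, not a command.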

 

AnkitMaheshwari
msg:4059651 - 6:11 am on Jan 13, 2010 (gmt 0)

Did your page (Client B's site) get indexed after the corrections were made?

If yes, what does the cached snapshot of the page show (Client A's data or Client B's)?

jastra
msg:4059827 - 1:48 pm on Jan 13, 2010 (gmt 0)

Actually, Client B's site has a different look and feel; only the ASP source code was reused. And the old head tags were inadvertently left in.

None of these pages are marked as cached by Google.

jastra
msg:4059847 - 2:13 pm on Jan 13, 2010 (gmt 0)

I reviewed some old notes. On 9 October and 19 November these pages were shown by Google in the SERPs as cached. And, again, they were the correct pages for Client B. But the old head tags, optimized for Client A, were still there in the new pages.

jastra
msg:4061090 - 12:25 am on Jan 15, 2010 (gmt 0)

I believe AnkitMaheshwari was onto something about caching.

Please see my previous: I know that the Client B pages with the wrong <title> tags were indexed on 9 October and on 19 November. But yesterday I noticed in the SERPs that those pages do not display the "cached" link. What does this mean?

Is there any way to get the page versions with the correct <title> tags indexed?

Or is my only option to perhaps change the filenames of the pages, then 410 those old pages?

Anybody?

phranque
msg:4061275 - 10:24 am on Jan 15, 2010 (gmt 0)

do you have a meta robots element on those pages?
is there anything relevant in the robots.txt file?
or are you using any nofollow in your internal linking?
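phranque's three checks can be run offline against a saved copy of the page and the robots.txt text. A quick sketch using only the standard library (the HTML and robots.txt lines below are samples, not the actual site's):

```python
# Offline sketch of the three indexability checks: meta robots element,
# robots.txt rules, and nofollow on internal links. Sample inputs only;
# in practice you'd fetch the live page and robots.txt.
from html.parser import HTMLParser
from urllib import robotparser

class RobotsMetaCheck(HTMLParser):
    def __init__(self):
        super().__init__()
        self.meta_robots = None      # content of <meta name="robots">, if any
        self.nofollow_links = []     # hrefs of links carrying rel="nofollow"
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            self.meta_robots = a.get("content", "")
        if tag == "a" and "nofollow" in (a.get("rel") or ""):
            self.nofollow_links.append(a.get("href"))

html = ('<html><head><title>Miami Widgets</title></head>'
        '<body><a href="/about.asp">About</a></body></html>')
checker = RobotsMetaCheck()
checker.feed(html)

rp = robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /admin/"])  # sample robots.txt lines

print(checker.meta_robots)     # no meta robots element found
print(checker.nofollow_links)  # no nofollow internal links
print(rp.can_fetch("Googlebot", "http://www.example.com/widgets.asp"))
```

If all three come back clean, as jastra reports below, the blocker is elsewhere.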

jastra
msg:4061416 - 3:08 pm on Jan 15, 2010 (gmt 0)

phranque,

No to all three questions.

And the robots.txt file that the designer installed in the root folder doesn't address any of the database pages that are in question.

nomis5
msg:4062148 - 9:23 pm on Jan 16, 2010 (gmt 0)

I've noticed this before, and read about it here: Google is very reluctant to update amended title tags.

They put great weight on title tags, and to prevent them from being manipulated, changes can take many months to take effect.

jastra
msg:4062411 - 2:31 pm on Jan 17, 2010 (gmt 0)

nomis5, I agree. And from what I've also read here, if you already have decent title tags and attempt to tweak or fine-tune them any further, you risk losing rank. There was speculation that Google might view it as gaming their algorithms.

All that even though Google recommends fine tuning title tags in their webmaster guidelines.

If this lasts much longer, I suppose our only recourse is to rename those files and 410 the current pages.
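The rename-and-410 fallback means the old filenames return "410 Gone" while the renamed pages serve the corrected head tags. A hypothetical sketch, shown as generated Apache .htaccess rules (the filenames are invented; the site in this thread runs ASP, so on IIS you'd need the equivalent custom-status mapping instead):

```python
# Hypothetical sketch of the rename-and-410 plan: emit mod_rewrite rules
# that make each old filename return 410 Gone. Filenames are made up.
renames = {
    "widgets.asp": "widgets2.asp",
    "gadgets.asp": "gadgets2.asp",
}
rules = ["RewriteEngine On"]
for old in renames:
    rules.append(f"RewriteRule ^{old}$ - [G]")  # [G] flag = respond 410 Gone
htaccess = "\n".join(rules)
print(htaccess)
```

A 410, unlike a 404, tells the crawler the page is gone on purpose, which tends to get it dropped from the index faster.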

jdMorgan
msg:4062472 - 5:48 pm on Jan 17, 2010 (gmt 0)

The one thing that Google "hates" is large, sudden changes. So I'm going to recommend that you grit your teeth and wait it out instead of giving G yet another reason to think your pages are 'suspicious.' Build links, expand and update content -- anything productive to keep your mind off this embarrassing mistake, and suggest that an official "acceptance testing" program (with a formal list of things to check off, including <title> and <meta name="description"> tags) be put into place for new developments so that this won't happen again.

Keep an eye on Googlebot, and once the pages in question have been spidered a couple of times, this situation should resolve itself. It's very hard to be patient and "do nothing," but that's what it takes in this situation; working on the other aspects of improving the site will help -- both practically and emotionally.

Jim

cnbeg
msg:4062786 - 1:01 pm on Jan 18, 2010 (gmt 0)

3 months.

johnnie
msg:4062822 - 2:05 pm on Jan 18, 2010 (gmt 0)

I've seen some pretty messed up SERPs as well recently. Old titles, old meta descriptions, reverted rankings etc. I think Google is having the hiccups.

jastra
msg:4062835 - 2:20 pm on Jan 18, 2010 (gmt 0)

Yep, jdMorgan -- this event was the seminal moment that made us start a more thorough testing regimen. Since this screwup, every site redesign gets not only its full regular testing but also a complete assessment from an SEO standpoint.

Programmers and designers have been given checklists of items to do and things to look out for. They wouldn't otherwise have knowledge of SEO and search engine visibility issues (and shouldn't be expected to, beyond some basics).

Even our experienced software tester wasn't aware of most of the SEO and search engine visibility issues.

This system is working well now.
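A checklist item like "no leftover branding in the head tags" can be automated. A minimal, hypothetical version of such an acceptance test (the client names and HTML are illustrative, not from the actual sites):

```python
# Hypothetical acceptance-test sketch: fail the pre-launch check if the
# <title> or meta description is missing or still carries the other
# client's branding. Names and markup here are made-up examples.
from html.parser import HTMLParser

class HeadTagAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self._in_title = False
        self.title = ""
        self.description = None
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and (a.get("name") or "").lower() == "description":
            self.description = a.get("content")
    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False
    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit(html, forbidden="New York Widgets"):
    """Return a list of problems; an empty list means the page passes."""
    p = HeadTagAudit()
    p.feed(html)
    problems = []
    if not p.title.strip():
        problems.append("missing <title>")
    if not p.description:
        problems.append("missing meta description")
    if forbidden in p.title or forbidden in (p.description or ""):
        problems.append(f"leftover branding: {forbidden!r}")
    return problems

bad = '<head><title>New York Widgets</title></head>'
good = ('<head><title>Miami Widgets</title>'
        '<meta name="description" content="Widgets in Miami"></head>')
print(audit(bad))    # flags leftover branding and the missing description
print(audit(good))   # passes: empty list
```

Run against every template before launch, this catches exactly the copy-paste mistake that started this thread.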

jastra
msg:4062852 - 3:00 pm on Jan 18, 2010 (gmt 0)

As yet another update, I found that Google has now also cached the pages with the corrected head tags. The cache shows a date of 30 December.

So they currently have two versions in the SERPs.

Clearly, I should have thought to run allintitle on the new head tags too.

g1smd
msg:4062991 - 6:35 pm on Jan 18, 2010 (gmt 0)

When you study how Google works in detail you will see that SERP titles and snippet text come from a different database than that which shows the 'cache' or which actually 'ranks' pages.

There are many times where the SERP title or the snippet shows text which cannot be found in the cache copy or on the real live page. A lot of this is bound up with the treatment of what Google used to call Supplemental Results.

The different databases are updated at different times, and the databases also hold on to old information for well over a year and can show different text depending on the actual search query.

jastra
msg:4063020 - 7:13 pm on Jan 18, 2010 (gmt 0)

Then you're throwing the dice when you let an unoptimized new Web page go live and get spidered. And that's what we did.

Petrogold
msg:4063371 - 6:56 am on Jan 19, 2010 (gmt 0)

"Google 'hates' large, sudden changes"

Google likes new content. Then why would it hate a large change if it's genuine and there's no duplication? Actually, no one really knows their algorithm or SEO.


All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved