
Forum Moderators: phranque


Google won't give up indexing my wrong title tags

They've been changed, but how long will they keep this version?



4:24 am on Jan 13, 2010 (gmt 0)

10+ Year Member

One of our programmers used a dynamic page from Client A's site as a template for a new site for Client B. Trouble is, he brought along all the head tags too.

As a result, the new client's site got head tags that say "New York Widgets" when the client really has "Miami Widgets."

It's been 4 months and several updated XML sitemap submissions since the head tags were corrected. Still, Google retains the bad head tags in the SERPs.

How long can stuff like this normally take? And does anybody have a suggestion on how to get Google's attention?
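For anyone in the same boat, the sitemap angle is worth getting right: a <lastmod> date that reflects when the titles were fixed at least signals to Google that the pages changed. A minimal sketch in Python (the URL and date below are invented placeholders, not the poster's real site):

```python
# Sketch: build a sitemap whose <lastmod> reflects the date the <title>
# tags were corrected, so resubmission signals fresh content.
# The URL and date are placeholders for illustration only.
from xml.sax.saxutils import escape

def build_sitemap(urls, lastmod):
    """Return sitemap XML for the given URLs with a shared <lastmod> date."""
    entries = "\n".join(
        "  <url>\n"
        f"    <loc>{escape(u)}</loc>\n"
        f"    <lastmod>{lastmod}</lastmod>\n"
        "  </url>"
        for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

xml = build_sitemap(["http://www.example.com/widgets.asp"], "2009-09-15")
```

Resubmitting the file in Webmaster Tools after such a change gives the crawler a concrete freshness hint, though as the thread shows it is no guarantee of a fast SERP update.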


6:11 am on Jan 13, 2010 (gmt 0)

5+ Year Member

Did your page (Client B's site) get indexed after the corrections were made?

If yes, what does the cached snapshot of the page show (Client A's site data or Client B's)?


1:48 pm on Jan 13, 2010 (gmt 0)

10+ Year Member

Actually, Client B's site has a different look and feel, only the ASP source code was used. And the old head tags were inadvertently left in.

None of these pages are marked as cached by Google.


2:13 pm on Jan 13, 2010 (gmt 0)

10+ Year Member

I reviewed some old notes. On 9 October and 19 November, Google showed these pages as cached in the SERPs. And again, they were the correct pages for Client B, but the old head tags, optimized for Client A, were still there in the new pages.


12:25 am on Jan 15, 2010 (gmt 0)

10+ Year Member

I believe AnkitMaheswari was onto something about caching.

Please see my previous: I know that the Client B pages with the wrong <title> tags were indexed on 9 October and on 19 November. But yesterday I noticed in the SERPs that those pages do not display the "cached" link. What does this mean?

Is there any way to get the page versions with the correct <title> tags indexed?

Or is my only option to perhaps change the filenames of the pages, then 410 those old pages?



10:24 am on Jan 15, 2010 (gmt 0)

WebmasterWorld Administrator phranque is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

do you have a meta robots element on those pages?
is there anything relevant in the robots.txt file?
or are you using any nofollow in your internal linking?
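The first and third of those checks can be verified mechanically. A rough Python sketch using only the standard library; the sample HTML is invented for illustration, not taken from the client's site:

```python
# Sketch: scan a page's HTML for a meta robots "noindex"/"nofollow" and for
# rel="nofollow" on internal links -- two of the three checks above.
# (robots.txt rules would still need a separate look.)
from html.parser import HTMLParser

class RobotsAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.meta_robots = None      # content of <meta name="robots">, if any
        self.nofollow_links = []     # hrefs of links carrying rel="nofollow"

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            self.meta_robots = a.get("content", "")
        if tag == "a" and "nofollow" in (a.get("rel") or "").lower():
            self.nofollow_links.append(a.get("href"))

# Invented example page, not the client's real markup:
html = ('<html><head><meta name="robots" content="noindex,follow">'
        '<title>Miami Widgets</title></head>'
        '<body><a href="/about.asp" rel="nofollow">About</a></body></html>')

audit = RobotsAudit()
audit.feed(html)
```

On a clean page, `audit.meta_robots` would be `None` and `audit.nofollow_links` empty, which matches the "no to all three" answer that follows.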


3:08 pm on Jan 15, 2010 (gmt 0)

10+ Year Member


No to all three questions.

And the robots.txt file that the designer installed in the root folder doesn't address any of the database pages that are in question.


9:23 pm on Jan 16, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

I've noticed this before, and read about it as well: Google is very reluctant to update amended title tags.

They put great weight on title tags, and to prevent this being manipulated, changes take many months to take effect.


2:31 pm on Jan 17, 2010 (gmt 0)

10+ Year Member

nomis5, I agree. And from what I've also read here, if you already have decent title tags and attempt to tweak or fine-tune them any further, you risk losing rank. There was speculation that Google might view it as gaming their algorithms.

All that even though Google recommends fine tuning title tags in their webmaster guidelines.

If this lasts much longer, I suppose our only recourse is to rename those files and 410 the current pages.
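If the rename-and-410 route is ever taken, the server-side logic is tiny. On the classic ASP stack mentioned earlier it would amount to setting `Response.Status`; as a language-neutral sketch in Python (the paths are invented placeholders):

```python
# Sketch of the rename-and-410 idea: the old filenames answer "410 Gone"
# (the URL is permanently dead), while the renamed replacement pages serve
# normally. Paths are invented for illustration.
RETIRED = {"/widgets.asp", "/products.asp"}   # old filenames to kill

def respond(path):
    """Return the HTTP status line an incoming request path should get."""
    if path in RETIRED:
        return "410 Gone"   # stronger than 404: tells Google not to retry
    return "200 OK"
```

The choice of 410 over a 301 redirect is deliberate here: a redirect would carry the old URL's (mis-titled) history forward, whereas 410 asks Google to drop the old entry outright.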


5:48 pm on Jan 17, 2010 (gmt 0)

WebmasterWorld Senior Member jdmorgan is a WebmasterWorld Top Contributor of All Time 10+ Year Member

The one thing that Google "hates" is large, sudden changes. So I'm going to recommend that you grit your teeth and wait it out instead of giving G yet another reason to think your pages are 'suspicious.' Build links, expand and update content -- anything productive to keep your mind off this embarrassing mistake, and suggest that an official "acceptance testing" program (with a formal list of things to check off, including <title> and <meta name="description"> tags) be put into place for new developments so that this won't happen again.

Keep an eye on Googlebot, and once the pages in question have been spidered a couple of times, this situation should resolve itself. It's very hard to be patient and "do nothing," but that's what it takes in this situation. Working on the other aspects of improving the site will help -- both practically and emotionally...
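The acceptance-testing idea above lends itself to automation. A minimal sketch of a head-tag check using only the Python standard library; the expected values and sample page are invented placeholders:

```python
# Sketch of an acceptance check for <title> and <meta name="description">,
# the two tags that caused this mess. Expected values are invented.
from html.parser import HTMLParser

class HeadTags(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self._in_title = True
        if tag == "meta" and (a.get("name") or "").lower() == "description":
            self.description = a.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def check_head(html, expected_title_word):
    """Fail loudly if the head tags still mention the wrong client."""
    p = HeadTags()
    p.feed(html)
    assert expected_title_word in p.title, f"bad title: {p.title!r}"
    assert p.description, "missing meta description"
    return p

page = ('<html><head><title>Miami Widgets - Home</title>'
        '<meta name="description" content="Widgets in Miami">'
        '</head><body></body></html>')
result = check_head(page, "Miami")
```

Run against every template before launch, a check like this would have caught the "New York Widgets" carry-over before Google ever spidered it.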



1:01 pm on Jan 18, 2010 (gmt 0)

5+ Year Member

3 months


2:05 pm on Jan 18, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

I've seen some pretty messed up SERPs as well recently. Old titles, old meta descriptions, reverted rankings etc. I think Google is having the hiccups.


2:20 pm on Jan 18, 2010 (gmt 0)

10+ Year Member

Yep, JdMorgan -- this event was the seminal moment that made us start a more thorough testing regimen. Since this screwup, every site redesign gets not only its full regular testing but also a complete assessment from an SEO standpoint.

Programmers and designers have been given checklists of items to do and what to look out for. They wouldn't otherwise have knowledge of SEO and search engine visibility issues (and shouldn't be expected to have that knowledge, beyond some basics).

Even our experienced software tester wasn't aware of most of the SEO and search engine visibility issues.

This system is working well now.


3:00 pm on Jan 18, 2010 (gmt 0)

10+ Year Member

As yet another update: I found that Google has now also cached the pages with the corrected head tags. The cache shows a date of 30 December.

So they currently have two versions in the SERPs.

Clearly, I should have thought to run an allintitle: search on the new head tags too.


6:35 pm on Jan 18, 2010 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

When you study how Google works in detail you will see that SERP titles and snippet text come from a different database than that which shows the 'cache' or which actually 'ranks' pages.

There are many times where the SERP title or the snippet shows text which cannot be found in the cache copy or on the real live page. A lot of this is bound up with the treatment of what Google used to call Supplemental Results.

The different databases are updated at different times, and the databases also hold on to old information for well over a year and can show different text depending on the actual search query.


7:13 pm on Jan 18, 2010 (gmt 0)

10+ Year Member

Then you're rolling the dice when you let an unoptimized new Web page go live and get spidered, and that's what we did.


6:56 am on Jan 19, 2010 (gmt 0)

5+ Year Member

"Google "hates" is large, sudden changes"
Google likes new contents. Then why hates large change if genuine & no duplication? Actully, no one knows their algo & SEO.
