Forum Moderators: Robert Charlton & goodroi
A lot of my pages are being indexed, but the title is not being recorded - instead, the URL is being used.
I'm using a PHP-driven site but haven't had any indexing problems to date... anyone have any input on why this is happening and/or how I can stop it from happening in the future?
This is what I noticed:
If it is fully listed, there are 5 lines below the listing:
* Show Google's cache of www.mysite.com
* Find web pages that are similar to www.mysite.com
* Find web pages that link to www.mysite.com
* Find web pages from the site www.mysite.com
* Find web pages that contain the term "www.mysite.com"
There are two types of URL-only listings:
First type shows 4 of the above 5 lines:
* Find web pages that are similar to www.mysite.com
* Find web pages that link to www.mysite.com
* Find web pages from the site www.mysite.com
* Find web pages that contain the term "www.mysite.com"
If you use "site:www.mysite.com", you will see the full listing (title & description).
Second type shows only 3 out of 5:
* Find web pages that are similar to www.mysite.com
* Find web pages that link to www.mysite.com
* Find web pages that contain the term "www.mysite.com"
Doesn't matter what you do, it's still a URL-only listing.
Is the first type above just a transition from a full listing to a URL-only one?
I know the answer to this problem. It has taken me a while to work it out, but I was on the right lines, and the above proves it.
Google's cache is not being recorded properly, most likely because Googlebot has had a problem seeing the content. On its last visit the bot needs to see the content; otherwise it assumes the page is no longer there (like a shopping item that has sold out), so it lists the page but with no title & description.
What could stop it seeing the page on its last visit?
Redirections...
Domains used to point to...
Server issues... host down, host timeouts, etc.
Google has been busy recently trying to reindex everything due to the update - this will put more strain on your server.
Solution - pay for better hosting or move hosts.
Cheers!
I agree that Google's cache is not recorded properly. The home page of my new site, which was initially listed in full with a cache, became a URL-only listing a week later. If I used site:www.mysite.com, it became fully listed, but when I clicked on Google's cache, there was no cache. Today, the home page returned to the full listing with a new date but no cache.
I don't think this problem is due to "redirections" or "domain used to point to". I'm not sure about server issues either.
Now when I type in domain.com I get a url result as well.
When I type in site:domain.com I get title/description for what few pages I had left before... but in a regular search my site won't come up for anything.
I'm at a total loss. It's going to take WAY more content to make my traffic from MSN/Yahoo match what I was getting from Google. I wonder how long my site will disappear for...
Line 1: page title of the individual page
Line 2: meta description of the entire site
Line 3: page URL
It is line two that bothers me. I have actually set the meta description of each page to that of the entire site, but I thought Google should display the first line of relevant text here.
Anyone advise?
cheers
I had a bunch of pages that were:
page?section=1&type=2
They were changed to
section/type/page
The old URLs were replaced with 301 redirects to the new-style pages. Now the old pages are showing up as URL-only results (fairly high, too). In addition, the new pages are also showing up (but not as high).
I'm hoping that the old pages will be replaced once I'm fully recrawled, but there's not much I can do to a page that is just a 301 redirect, so I'm sitting on my hands waiting for the problem to go away.
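For anyone doing the same kind of URL change, a 301 like the one described above is usually set up in Apache's .htaccess. A minimal sketch, assuming Apache with mod_rewrite enabled and that the old query-string values map directly to the new path segments (the parameter names here just follow the example above):

```apache
# Hypothetical .htaccess sketch - URL pattern is an assumption based on the post
RewriteEngine On
# Match the old query string, e.g. /page?section=1&type=2
RewriteCond %{QUERY_STRING} ^section=([0-9]+)&type=([0-9]+)$
# Permanently (301) redirect to the new path style, e.g. /1/2/page
# The trailing "?" drops the old query string from the target URL
RewriteRule ^page$ /%1/%2/page? [R=301,L]
```

The R=301 flag is what tells Google the move is permanent, so the old URL should eventually be dropped in favor of the new one once it is recrawled.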
I noticed many of my pages went URL only on a blog I have.
I realized it must be because of a plugin I used on the site in question. The plugin allows you to give a list of keywords related to the post.
Something like "Read more about: kw1 kw2 kw3"
Clicking on a keyword would bring you to a page like www.site.com/tag/keyword, which the user could visit to find similar content on that keyword.
Now the page would list all entries that had the specific keyword tag, thereby creating 2 copies of the same content. This is why I think many pages went URL only.
My question: what's the best way to fix this? Would blocking the /tag/ folder in robots.txt work?
Keep in mind it is not a real folder, but written with mod_rewrite.
Or do I have to get rid of the whole thing? That wouldn't be such a big deal, but what is going to happen to all my URL-only listings and the /tag/ keyword listings? Should I count on Google to sort it out on its own?
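For what it's worth, robots.txt matches against the URL path the crawler requests, so it shouldn't matter whether /tag/ is a real folder or a mod_rewrite alias. A minimal sketch of the rule in question (assuming the /tag/ prefix described above):

```
User-agent: *
Disallow: /tag/
```

Keep in mind that robots.txt only stops crawling; URLs that are already indexed won't vanish instantly and can linger for a while (sometimes as URL-only entries) before dropping out over subsequent crawls.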
My OLD pages (prior to changing titles, etc. to eliminate any possibility of duplicate content) are currently in the supplemental index.
The new pages (same urls) with new titles, etc are now showing some referrals, and McDar shows that a lot of my content is coming back on a lot of DCs.
So now I have duplicate content for real - pages in the sup index, and now newly cached pages in the real index - both showing different titles, and both having the exact same URLs. Perhaps good for the short term... but then what?
For over a year, we've had ab*ut.com scooping the number 1 serp for our main kw combination (pushing us to #2). During Jagger, they disappeared, then came back, then disappeared, and are now back at #4 with URL-only on the page they used to have as #1. My main site now has #1 and 2 on that search. This is appropriate, because the abo*ut.com page had 5 links to our site on it (amongst others), and G seems to have finally sorted out what is actually the main site for that search.