
Why do most of my indexed pages in Google not have any description?

Indexing problem!

     
12:04 pm on Jan 31, 2005 (gmt 0)

10+ Year Member



Hi folks,
I noticed over the last 3 months that Google has been indexing most of my pages as bare links, without any description or title, and without even a cached copy?! Please help.

Regards.

12:34 am on Feb 18, 2005 (gmt 0)

10+ Year Member



Zeus

Yes, it is also showing up for the site:domain search, and the page description is taken from our homepage even though it's only a link to us. It is supplemental, though.

Been sunk in depths of Google since March 2004.

1:28 am on Feb 18, 2005 (gmt 0)

WebmasterWorld Senior Member zeus is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Send them an email and ask them to remove any link to you, just for safety, because Google is having some big trouble with certain links and redirects. The link may have been on that site at some point, but sometimes you cannot see the link when it is in some kind of directory, because things change.

zeus

11:22 am on Feb 18, 2005 (gmt 0)

10+ Year Member



"Why do most of my indexed pages in Google not have any description?"

Does anyone know of a search command to determine how many pages Google lists as URL only?

Over 2/3 of my pages have had only the URL listed for quite some time.

5:38 pm on Feb 18, 2005 (gmt 0)

10+ Year Member



I don't know of any command that will show that, Jackpot. I'm having the same problem with my site: no description, no title, SERPs gone. I've contacted Google and not received a response. What can I do? I feel so helpless...
6:49 pm on Feb 18, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Does Google ever drop the "supplemental results" pages that have been excluded by robots.txt?
7:09 pm on Feb 18, 2005 (gmt 0)

10+ Year Member



Does anyone know of a search command to determine how many pages Google lists as URL only?

Yes. Find a word that's on every page of yours, but not in the domain name. Maybe something like "reserved" (as in All rights reserved on a copyright notice) or "home page" (as in "Back to home page"). Let's call this the "special term."

Then do this: site:www.mydomain.com "special term"

and this: site:www.mydomain.com -"special term"

The first will show you all the properly-indexed pages.

The second will show you the URL-only pages.

The nice thing about the "site:" command is that Google is not in a position to sabotage it the way they sabotaged the "link:" command. That's because there are thousands of sites out there that have placed a Google search box on their page, and this box has a "search this site only" option on it that utilizes the "site:" command. If Google tried to sabotage the "site:" command, all of these webmasters would start screaming that Google isn't covering their sites very well.
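The two searches above can also be generated programmatically. A minimal sketch in Python (the domain and the "special term" are just the examples from this post, not real values):

```python
from urllib.parse import urlencode

def site_queries(domain, special_term):
    """Build the two Google search URLs described above: one for
    properly indexed pages (special term present) and one for
    URL-only pages (special term excluded via the minus operator)."""
    base = "http://www.google.com/search?"
    indexed = base + urlencode({"q": f'site:{domain} "{special_term}"'})
    url_only = base + urlencode({"q": f'site:{domain} -"{special_term}"'})
    return indexed, url_only

indexed, url_only = site_queries("www.mydomain.com", "reserved")
print(indexed)   # the properly-indexed pages query
print(url_only)  # the URL-only pages query
```

Comparing the result counts of the two queries then gives the ratio of fully indexed to URL-only pages.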

2:08 am on Feb 19, 2005 (gmt 0)

WebmasterWorld Senior Member zeus is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Try site:www.google.com - it has a lot of URL-only results too, and only 610 results are shown, the rest omitted. It's sad to see that Google itself has duplicate content, ha ha.

Results 601 - 610 of about 27,700 from www.google.com

2:46 am on Feb 19, 2005 (gmt 0)

10+ Year Member



When I do allinurl:mydomain, the first results all have titles. Then at some point, everything from there back has no title. That seems like an easy way to determine how many pages do and don't have titles.

I have also observed this problem for many months. I read somewhere here that if Google hasn't re-indexed a page within a certain period of time, no title will be shown.

That seems reasonable, since I get daily visits from Googlebot but have not seen a deep crawl since the summer. Googlebot previously deep-crawled my site monthly, getting almost all of the 17,000 pages, but now it fetches only a few hundred a week. All of the pages show in the index, but only around 800 have titles.

I don't have any of the other problems mentioned here with my site. I wish I knew what to do about it.

3:02 am on Feb 19, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I used to have about 40 pages or more on two websites that had no description, URL only. I redirected www.site1.com and www.site2.com to site1.com and site2.com about two weeks ago. Today I checked the pages, and all but 4 are fully indexed (title, URL, description, cache...). Two of the remaining 4 URL-only pages are www.site1.com and www.site2.com.

Based on these results, I would say that one cause of URL-only indexed pages is some form of duplication.
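For reference, the kind of redirect described above is commonly done with an Apache rewrite rule. A minimal sketch, assuming Apache with mod_rewrite enabled (site1.com here stands in for your canonical, non-www hostname):

```apache
RewriteEngine On
# Send any request for www.site1.com to site1.com with a permanent
# (301) redirect, so both hostnames collapse into one in the index.
RewriteCond %{HTTP_HOST} ^www\.site1\.com$ [NC]
RewriteRule ^(.*)$ http://site1.com/$1 [R=301,L]
```

The 301 status matters: a permanent redirect tells the crawler to transfer the listing to the target URL rather than keep both.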

8:57 pm on Feb 19, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Besides all the reasons mentioned above, I always find no title/no description on pages that were not designed as standalone HTML pages - i.e., small windows that open when one clicks a link for more info. I don't worry about those, however.
10:58 pm on Feb 20, 2005 (gmt 0)

10+ Year Member



I regret to report that Google has not made one
correction to my URL-only problem during the last
30 days.
I guess it is a non-problem for Google.
It's killing me.
11:32 pm on Feb 20, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I know so little it scares me, but... one of my sites consists of database-generated ASP pages, and many hundreds of them show up as URL-only. The only common thread among these pages is that they return few results from the database. I believe all the boilerplate and links on the pages make the changing content such a small percentage that they have dropped into duplicate-content land.
2:54 am on Feb 21, 2005 (gmt 0)

10+ Year Member



Same for a few of my sites... Terrible... Referrals and income are way down...

I'm not sure it's not related to duplicate content either (I do not have duplicate content, but there is a copyright notice on each page, and it seems Google is picking up on that as duplicate content?).

Some of my sites show URL only, some show title with no description and no cache.

Some related threads for you guys:

[webmasterworld.com...]
[webmasterworld.com...]
[webmasterworld.com...]

Cheers
Roel

3:03 am on Feb 21, 2005 (gmt 0)

10+ Year Member



I have a site which was having this problem. It's mostly dynamic content, but the URL appears static. I'd have a dozen or so pages show up as only URLs in Google.

Then I removed the noarchive tag from all the pages, and once Google cached them all, the problem went away. My theory is that if Google attempts to read a page but can't load it for whatever reason, and the page has no cached copy, the current snippet and title are lost.

But that's just my wild theory.

3:40 am on Feb 21, 2005 (gmt 0)

10+ Year Member



I also have product pages that are datafeed and most now show as title only. The product descriptions are not similar on about half of the pages, so I wonder if Google is also looking at the page structure as a factor in determining whether it's a duplicate.

How do you offer 35,000 products for sale without having templates for the pages? The main category and subcategory pages are still ranking well, except for a few that may have the same problem as the pages that are produced by the database.

These are all static HTML pages. In the past, all of the pages were indexed, showed descriptions, and ranked well.

What bothers me is that many of my competitors use the same structure, and a few now have identical pages showing, one as a supplemental, the only difference being that one has a subdomain with a capital letter. These are exactly the same pages, HTML and content, yet they rank #3 and #4 in search results.

3:56 am on Feb 21, 2005 (gmt 0)

10+ Year Member



Just tried this:

Search normally for site:blabla.com and only a few pages show; the rest are just URLs.

Search for site:blabla.com and then add "&filter=0" to the end of the search query -> all pages show as normal!

So might it be a dup issue after all?
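The trick above amounts to appending one extra parameter to the search URL. A small Python sketch of the query construction (filter=0 is the undocumented parameter reported in this thread, an observation rather than a guaranteed interface, and blabla.com is just the placeholder domain from the post):

```python
from urllib.parse import urlencode

def unfiltered_site_query(domain):
    """Build a site: search URL with filter=0 appended, which
    reportedly disables the duplicate/similar-result filtering."""
    params = urlencode({"q": f"site:{domain}", "filter": "0"})
    return "http://www.google.com/search?" + params

print(unfiltered_site_query("blabla.com"))
```

If the filtered and unfiltered result counts differ sharply, that is consistent with a duplicate-content explanation for the missing listings.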

4:36 am on Feb 21, 2005 (gmt 0)

10+ Year Member



&filter=0 didn't do anything to help my no title, no description problem.
11:37 am on Feb 21, 2005 (gmt 0)

10+ Year Member



Zeus
Yes, it is also showing up for the site:domain search, and the page description is taken from our homepage even though it's only a link to us. It is supplemental, though.

Been sunk in depths of Google since March 2004.

Interestingly, two of my sites went this way as well in March 2004 - the cache still shows March 2004. Does anyone know of any filters etc. that were applied at that time?

4:10 pm on Feb 24, 2005 (gmt 0)

WebmasterWorld Senior Member ken_b is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



I'm also being affected by this.

I'm wondering if it could be a sort of penalty for phantom duplicate content. Try a search for

site:www.yourdomain.com and a search for site:yourdomain.com (without the www.) and compare the results.

But the thing is it seems to be intermittent.

9:12 pm on Feb 24, 2005 (gmt 0)

WebmasterWorld Senior Member steveb is a WebmasterWorld Top Contributor of All Time 10+ Year Member



That's relative link rot. Make sure your linking is consistent to start with (not to both / and /index.html, for example), and switch to absolute links, at least for your key pages.
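One way to act on that advice is to rewrite internal hrefs to absolute URLs when pages are generated. A minimal sketch using only the Python standard library (the page URL and markup are made-up examples, and the regex only handles simple double-quoted hrefs):

```python
import re
from urllib.parse import urljoin

def absolutize_links(page_url, html):
    """Resolve every double-quoted href against the page's own URL,
    so /, /index.html, and relative paths all collapse to one
    consistent absolute form."""
    return re.sub(
        r'href="([^"]+)"',
        lambda m: 'href="%s"' % urljoin(page_url, m.group(1)),
        html,
    )

html = '<a href="index.html">Home</a> <a href="../about.html">About</a>'
print(absolutize_links("http://www.example.com/products/page.html", html))
```

For real pages, an HTML parser would be more robust than a regex, but the urljoin step is the core of the fix.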
4:47 pm on Mar 10, 2005 (gmt 0)

10+ Year Member



This is hurting me too. No title, description or cache.
PR 7 site
6 years old

If I search google on site:www.mydomain.com - "some specific info from my site"

all I see is:

www.someotherdomain.com/page.asp?title=target+keyword&url=http://www.mydomain.com

other-domain.com/cgi-bin/tabi/navi/navi.cgi?links=82

www.anotherdomain.com/Redirect.asp?ID=188&url=http://www.mydomain.com%2F

and more of the same

So does this mean these sites are stealing my site's rankings?

4:51 pm on Mar 10, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hi Gerbot

check out this thread: [webmasterworld.com...]

not sure if it applies to your situation but worth a look

Dazz

7:40 pm on Mar 10, 2005 (gmt 0)

10+ Year Member



Hi, does anyone know why Google does not index the whole URL? I have pages that have another URL embedded in their URLs, like this:
www.mysite.com/somepage.asp?id=123&Downloadurl=http://www.anothersitedomain.com/anypage.htm

Google indexes the above URL without including the sub-URL (Downloadurl), like this:
www.mysite.com/somepage.asp?id=123&Downloadurl=

Please help.

12:17 am on Mar 12, 2005 (gmt 0)

10+ Year Member



Did some more research on this and emailed Google on it as well.

I feel this is what is happening:

1) If your site does not have enough authority (i.e. links), then Googlebot may decide not to index it regularly. That gives the effect that after a while your pages in the index "grow old" and therefore get listed by TITLE only (TITLE and URL still showing, but no description) -> solution: add more links, and possibly re-submit to Google via Add URL (if the site has been indexed previously)

2) Another situation is where only the URL is showing, with no TITLE and no DESCRIPTION. This happens when "similar content" has been found; you will often see it on very similar pages -> solution: remove the duplicate content (and look at the forum posts relating to the www / non-www issue)

Also make sure that your robots.txt files etc. are in order and that your site is spider-friendly at all times.

The main reason I came to this conclusion is that I have a directory site that, when I started it, had URL-only listings (i.e. not enough links). Then, as I added links, more and more of the normal content pages (i.e. directory categories) became fully visible (title, description, URL), but the pages that are similar (the "add URL" page for every category, which is nearly identical across categories) are still showing as URL only.

So in any case, it is always good to add links and to get rid of similar pages. It is my belief, however, that with enough authority links it does not matter whether you have similar pages, as your authority has been established. Obviously, not a lot of authority sites will link to you if you have duplicate pages...

8:09 pm on Mar 13, 2005 (gmt 0)

10+ Year Member



Roel, you are repeating what everybody has known since 2002. It's all in the webmasterworld archives. That is not the reason sites are being deleted.
8:24 pm on Mar 13, 2005 (gmt 0)

10+ Year Member



I've experienced all of the problems you people have, and I can say it's a 100 percent waste of time trying to speculate. The sad thing is that the only way you can reliably and consistently keep any website up on Google is to have multiple domain names and separate your content out across them (with some duplicate pages, just rearranged, for more reliability).

All the successful one-domain large businesses on the internet do not rely on Google alone - i.e. eBay, Amazon, Microsoft. They all rely on offline promotion, TV promotion, user recommendation, etc. People like us who do rely on Google alone will have to start thinking outside the box BIG TIME. There is no way I can continue my inconsistent business on the web through Google (unless of course I continue to manage multiple domains, which has proved successful... but that is harder work than working at Wal-Mart for 5 bucks an hour, methinks).

If you look, for example, at all the porn sites and crack sites out there: most (all) of them are not one-domain websites. They are promoted by hundreds of domains, and they separate their content out into hundreds of domains. They break all sorts of rules on Google, and they are the most successful on Google (crack sites come up higher than Microsoft many times). Does that tell you anything? So if one domain goes bonkers on Google, or heck, if 30 domains go bonkers, it doesn't matter, because they still have all that reliability of multiple domains.

8:38 pm on Mar 13, 2005 (gmt 0)

10+ Year Member



Sorry to post so often in this thread but there is nothing wrong with it if I am adding useful info.

Check out this person's quote - it explains exactly the problem I described above, about thinking outside the box and not just utilizing Google.

"Between May and September we had a lot of success with a new site. The pages were slowly taken and cached. A comprehensive site map helped out. Visitors were doubling month on month and sales were high.

However, since September the reverse has become true. Pages steadily fell out of the index, our position slipped, and customers vanished. We had changed nothing and were totally white-hat in everything we did.

It seems that sometime around September our site map stopped being cached fully (too big?) and the spider was too lazy to follow other links outside the site map.

We changed the site map and restructured our site to no avail, as the bot just does not seem to want to know. It randomly visits a handful of pages and randomly caches a fraction of those.

This is so frustrating, because we are a start-up; Google is our sole marketing tool and source of income. Sales are virtually nil now, and it seems that all we can do is wait. What are the rules? What changed?"

There is no way a business should rely on Google alone. I've recognized this but haven't taken enough action. You can't put all your eggs in one basket in this situation (Google is worse than Microsoft, yet Google uses Linux. What a clash this is).

I know why people still use TV and offline marketing, but there have to be other online solutions in addition to Google. No wonder there are so many dot-com roller coasters. I'd say one of the main reasons is that people like myself and the above fellow just focus on Google marketing and pretend that "waiting 3 months" is a viable solution. Google marketing is the only light we see, and boy do we have to start thinking outside of Google... or OUTSIDE the glass box that Google holds us in.

8:34 pm on Mar 19, 2005 (gmt 0)

10+ Year Member



Has anyone detected that Google has addressed this problem?
I still have a huge number of URL-only listings, the same as two months ago.
12:07 am on Mar 20, 2005 (gmt 0)

WebmasterWorld Senior Member billys is a WebmasterWorld Top Contributor of All Time 10+ Year Member



About 20% of my pages have this problem. My assumption is that Google is broken here too. Googlebot just ate my whole site yesterday. If those pages still don't have a description in a day or two, I will post again. (Let's see if it's fixed.)
3:12 pm on Mar 20, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Chances are, the 20% of pages just haven't been revisited lately.

The only URL-only files I have listed now are dynamic pages, which were specifically disallowed via robots.txt.

This 68 message thread spans 3 pages: 68
 
