Forum Moderators: open
I've seen that most replies to questions/issues in forums (in general, not only here) describe the causes of an issue, but very few offer solutions. So here are some suggestions:
- Check your links. If there are two different URIs that point to the same resource, that resource could easily be listed twice. You should check both internal and inbound links.
- About dynamic content: if this is the true cause of your issue, you'll have a lot of work ahead of you. The best I can suggest is to check the names of your dynamic files' variables. If different files use the same variable name for different purposes, any automated system (a robot, for example) will have a lot of trouble dealing with them correctly.
Try to keep your URIs clear, unambiguous, and succinct. If you do, and the bot still has problems crawling you, it's due to an error on G's side and not yours.
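To see how two different-looking links can point at the same resource, here's a minimal sketch of URI normalization. The specific rules (lowercasing the host, collapsing `/index.html` to the directory root) are just illustrative assumptions, not a complete canonicalization scheme:

```python
from urllib.parse import urlsplit, urlunsplit

def normalize(url: str) -> str:
    """Collapse common URI variants into one canonical form (sketch only)."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    netloc = netloc.lower()                  # hostnames are case-insensitive
    if path.endswith("/index.html"):         # default document == directory root
        path = path[: -len("index.html")]
    if not path:
        path = "/"                           # empty path means the site root
    return urlunsplit((scheme.lower(), netloc, path, query, ""))

# Two different links, one resource - a bot without this logic may index both:
print(normalize("http://Example.com/index.html"))  # http://example.com/
print(normalize("http://example.com/"))            # http://example.com/
```

If both your internal links and inbound links already use one consistent form, a crawler never has to guess.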
Greetings,
Herenvardö
ESTIMATE being the joke word. Am I the only one who fails to see why something as simple as "how many pages are in a web site" or "which pages link to your site" has to be an ESTIMATE?
I thought computers could work out anything mathematical in fractions of a second.
I would be happy if I had that many pages, but I don't; I think I have about 2,000.
It could also be a glitch in Google's spidering that will be corrected next month, because they have added so many sites. One thing I don't like: it won't be long before we only see dynamic pages.
The increased number of pages isn't only being reported by users, but also by Google themselves. They happily reported they doubled the size of their index recently. They point out that this mainly impacts terms that are rare - like searching for people's names.
As far as I can tell (and as other people have reported), there are no noticeable differences in the search results for most common searches.
Other people speculate that Google increased its index size today to counter the anticipation around Microsoft's search engine release today, which appears to have turned out to be just a move of their beta search engine to a new domain name (still beta) with a modified interface.
So I see on G's home page. Still, makes me wonder why Google is reporting double the number of pages I actually have right now.
Whether they are counting duplicate pages or not in that new number is hard to say. I agree, though: for some of my sites the number seems much higher than I would have guessed. In my case, I'm guessing it's some dynamically generated pages (i.e., forums), and the count includes different parameters: sorting by date ascending and descending, etc.
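The sort-parameter problem above can be sketched in a few lines. The parameter names (`sort`, `order`, `view`) are hypothetical; the point is that presentation-only parameters don't change the content a crawler sees, so dropping them collapses the variants into one page:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical presentation-only parameters that don't change page content.
PRESENTATION_PARAMS = {"sort", "order", "view"}

def canonical(url: str) -> str:
    """Strip presentation parameters and fix parameter order (sketch only)."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query) if k not in PRESENTATION_PARAMS]
    kept.sort()  # reordered parameters shouldn't create "new" pages either
    return urlunsplit((scheme, netloc, path, urlencode(kept), ""))

urls = [
    "http://forum.example/thread?id=42&sort=asc",
    "http://forum.example/thread?sort=desc&id=42",
    "http://forum.example/thread?id=42",
]
distinct = {canonical(u) for u in urls}
print(len(distinct))  # all three URLs collapse to one page
```

A forum that exposes every sort order as its own URL multiplies its apparent page count without adding any content, which would inflate a naive index tally exactly the way people are describing.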
What will be more interesting is when they do their next ranking update, which I'm very eagerly anticipating.
Not just on the home page, but in their blog:[google.com...] blog/
(remove space - odd that the forum edited that)
Posted a new thread when I saw the blog, then noticed this entry.
Timing is everything. Think maybe Goog is trying to steal MSN's thunder and put some pressure on them at the same time?