Personally, my favorite is the 880M images. The bump from ~400M plus freshening the data makes it much more useful.
You wouldn't say that if it was your site getting a lot of useless traffic because it happens to have some popular search term pics on it.... :-)
The jpg's on our site suck back a lot more bandwidth than the text. Anyway, I shan't ban the imagebot... we still have our heads above water bandwidth-wise. Some of them click on through.
As Google covers more online turf, it is also digging deeper into Web pages. Roughly 40 per cent of the Web pages scanned by Google weren't fully indexed until the latest improvements, Brin said. Now all but about 20 per cent of the Web pages that Google covers are fully indexed. (emphasis added)
Now, is it common in Australian media to use the term "Web pages" meaning "web sites" or are "pages" to be understood as individual pages?
If it's not common in Australian media, then one might wonder if it's common inside Google instead? It's one of two:
(1) the term is used meaning "sites" in which case Google is now indexing deeper, ie. more pages from same site
(2) the term is used meaning "pages" in which case Google is indexing wider, ie. more code than previously is now indexed (eg. invisible page elements, such as scripts, css, and markup tags)
(sidenote: if it's (1), then the 4 billion figure is much larger when translated to page count)
Even with its expanded reach, Google still isn't close to capturing the constantly expanding constellation of online content. By some estimates, there are 10 billion pages on the Web....
Google has made five significant changes to its algorithmic formulas in the last two weeks, Brin said.
Google has been regularly upgrading its search engine since its late 1998 debut with a Web index of 25 million pages, but the potential threats from Yahoo and Microsoft have added more urgency.
"We have decided to put even more energy into our improvements and have turned up the notch on innovation a bit," Brin said.
I'm not too happy about it. I've had to upgrade my web hosting account twice since one of the requirements for Froogle inclusion is allowing the Google ImageBot. My bandwidth requirements have more than doubled.
Froogle has been good to me, but Google Images is sucking up hordes of unnecessary bandwidth.
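For anyone in the same boat who decides the image traffic isn't worth it, the usual fix is a robots.txt rule aimed at Google's image crawler (Googlebot-Image is its documented user-agent token); a minimal sketch, with the directory path purely hypothetical:

```
# Block only Google's image crawler; regular Googlebot is unaffected
User-agent: Googlebot-Image
Disallow: /

# Or, to shield just an image directory (example path):
# User-agent: Googlebot-Image
# Disallow: /images/
```

The catch, as noted above, is that Froogle inclusion requires allowing the image bot, so blocking it means weighing the bandwidth savings against losing those listings.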
75% of my pages show a URL but no cached page.
If you didn't, does your page cache show up in other search engines which have the cache feature, like Gigablast?
Of the 4.28 billion "web pages", how many are URLs only, with no cache?
If Google visits a page where there's no noarchive META tag, it will of course cache it.
Google will not purposely forget to cache a page. It's up to us whether or not we want our cache showing in Google.
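For completeness, the opt-out being discussed is a standard robots meta tag placed in the page's head; a Google-specific variant also exists if you only want to keep Google's copy out:

```html
<!-- Ask all crawlers not to keep a cached copy of this page -->
<meta name="robots" content="noarchive">

<!-- Or target only Google's crawler -->
<meta name="googlebot" content="noarchive">
```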
Personally I believe ATW, Inktomi and Google had the most significant parts of the web indexed several months ago; now we are into a meaningless battle over who has the biggest index, for largely irrelevant reasons.
Quality is what counts.....not quantity. Right now Google has the quantity, but seriously lacks the quality!
one good thing is they have eliminated all spam for my sector since the Brandy update. [thanks to LSI i think]
I can only convert a relevant visitor; a visitor that found one of my sites due to Google's irrelevance costs me money for no return.
I can live with the cost of bandwidth for irrelevant results, but I hate the fact that the Google algo fails to show the most relevant results when used.
A decent percentage of my traffic is now complete junk. Combine this with some of those looking for what I have to sell (who can't find me) and the situation becomes ridiculous!
Search engines are supposed to deliver targeted, relevant results......not junk traffic. Google is now delivering more junk traffic than good relevant traffic......so yes, I'm whining!
why do i see so many anti-G posts? I'm seeing very relevant results here, and even if they don't favor me at times I have to admit they are doing a genius job. or do you just not like them because you aren't #1? lol
No - if only it were so. What has happened, spacehopper, is that many sites that almost entirely disappeared from Google in November 2003, are now back to the top ranking they probably deserved.
Us folks are just trying to figure out whether we've been stuffed big-time, or whether something clever was happening :)
The simple argument is that: if we ranked well then, and then disappeared, and now rank well again, what has happened in the interim?
BTW: You will do very well on these boards if you continue to mildly criticise webmasters, and admire Google.
You'll be a moderator one day!
for example, all our products have a chemical description/keyword. we are coming up quite high for these chemical words.
we are still ranked high for the primary keywords but i can definitely see the irrelevance kicking in.
i think they are still fiddling with the algorithms; im watching my position climb almost one slot on the SERPs per day for the last week now.
i also think they are finding the balance between their new LSI technology which im sure they have, and their more traditional ranking methods.
i had a good laugh 2 days ago when i went to google.com and for a few minutes it was pulling from a rare datacenter. we were like #2 for a highly competitive keyword, but it was a very deep page on our site with no PR and that word only occurred once in the page (title).
im not saying that they dont have stuff to work through, i just still find it a lot better than other places.
im anxious to see what yahoo will look like once it completely drops google.
Because Google cares so much about outbound links, I can find pages featuring Mango Sorbet and Kiwi Smoothie recipes, but not a Mango Kiwi Sorbet recipe.
That's why I'm annoyed.
Note: apologies to any moderators for dropping a specific search phrase, but I really wanted to make sorbet tonight.
Thanks and best regards from Serbia.