| 4:52 am on Jun 11, 2005 (gmt 0)|
What happened to the cool Google swag AdWords advertisers used to get? I need a new beach towel.
| 5:51 am on Jun 11, 2005 (gmt 0)|
I've been doing research on the Holocaust and was disturbed to see revisionist sites getting so much mileage out of Google. I understand Google's philosophy of using computer algorithms to decide the "importance" of a site. However, Google (and the other search engines) are being gamed so that WW2 / Holocaust search terms lead to these sites as part of a hate campaign.
Two of Google's missions argue against allowing these sites to thrive. One is that Google wants people to find reliable information rather than spam. Two is "Don't be evil". Will Google look into addressing this important issue? It particularly concerns me because the lies perpetrated by a few motivated individuals are not being appropriately weighed against the testimonies of hundreds of thousands of witnesses and the experience of millions. Their propaganda sounds very convincing, and many kids today are getting the bulk of their information about history from Google searches.
| 8:32 am on Jun 11, 2005 (gmt 0)|
I run a site composed of subsites on history, literature and travel.
My AdSense earnings seem "tethered" to the ground. When my site gets 10,000 impressions, I make $10 [NOTICE: THESE ARE BOGUS NUMBERS]. When it gets 20,000 impressions, I make $12. When it gets 30,000, I make $12.50. The effect is mostly a falling CPC.
This "tethering" gives me a perverse incentive to remove AdSense: as my site gets more hits, I could take AdSense off pages and put other ads on. I HATE other ads, and I want to fill Larry and Sergey's pockets with gold. It seems strange that Google would incentivize people to remove AdSense. Why is this happening, and how can I "untether" myself?
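To make the "tethering" concrete (using those same bogus numbers), here's the effective CPM - earnings per thousand impressions - falling as traffic grows. A throwaway sketch:

```python
# Effective CPM (earnings per 1,000 impressions) for the bogus numbers
# above: earnings barely move while impressions triple.
data = [(10_000, 10.00), (20_000, 12.00), (30_000, 12.50)]

for impressions, earnings in data:
    ecpm = earnings / impressions * 1000
    print(f"{impressions:>6} impressions -> ${earnings:5.2f}  (eCPM ${ecpm:.2f})")
```

Going from $1.00 to roughly $0.42 per thousand impressions is exactly the perverse incentive I mean.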
Kudos for answering questions; we appreciate it.
| 8:43 am on Jun 11, 2005 (gmt 0)|
GG, I had a great opportunity to speak to a tycoon (tankers, airplanes, etc.). I mentioned to him, "Why don't you go into the Internet business as well? With your money you could build the number one search engine in the world" (he could buy Google 20 times over). It would be a great shake-up of web search if a tycoon spent a few of his trillions to take down your decadent SE.
| 5:44 pm on Jun 11, 2005 (gmt 0)|
Buy Google 20 times?
Google's market cap is currently $78 billion.
A person? According to Forbes magazine's 2004 numbers, the world's two richest people, Bill Gates and Warren Buffett, are worth $46 billion and $42 billion respectively.
A defense company? Well, General Dynamics has a market cap of $22 billion, Northrop Grumman $20 billion, and Boeing $53 billion.
And twenty times? That's $1.56 trillion - almost six times what Microsoft is worth.
| 9:38 pm on Jun 11, 2005 (gmt 0)|
One version of PR is represented in the Google directory, another in the toolbar. If Google is worried about PR misuse, why keep two versions that don't represent "true" PR on public display? Why not turn them off?
| 10:50 pm on Jun 11, 2005 (gmt 0)|
Why not, indeed? Public PageRank was the first big mistake - one tacitly admitted by the current policy of updating toolbar PageRank only once every three or four months.
| 2:53 am on Jun 12, 2005 (gmt 0)|
1. Did the "Bourbon" update target AdSense sites in any way?
I've dominated the SERPs for my niche since 1998, and the only change I made this year was the addition of AdSense. Bourbon buried me, and NO sites above me on any keyword searches have AdSense - I'm the first AdSense site on any of my keyword SERPs. Traffic has gone from 10,000 referrals a day to 20 or 30 uniques, killing my AdSense earnings but, more importantly (and life-altering), killing my business.
2. Would it be better to remove the AdSense code?
Just makes me wonder ....
| 12:47 pm on Jun 12, 2005 (gmt 0)|
My site was penalized for duplicate content on June 4th. Google shows www.myurl.com pages as the main results, but also shows all of the same pages without the www (myurl.com) as supplemental results. Until the 4th, no results from my site without the www were listed on Google. As soon as the non-www pages were indexed, I lost all of my good standings.
I have had an .htaccess file for a year that rewrites the URL to always be www.
So, my question is, how do I go about removing the non-www pages from the index, without losing my www pages?
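For reference, this is the kind of rule I've had in place for that year (example.com stands in for my real domain; it assumes Apache with mod_rewrite enabled) - a permanent 301 so every non-www request lands on the www version:

```apache
# .htaccess - send any non-www request to the www host with a 301
# (example.com is a placeholder for the real domain)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```
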
| 12:35 am on Jun 13, 2005 (gmt 0)|
Same problem here with duplicates (www and non-www). Why do none of the other search engines (at least with my site) have this problem?
Anyway, how can it be a penalty or affect search rankings? It is obvious that they are the same site. I am sure Google has someone smart enough to figure that out.
Think about it: Google has a brilliant search engine that takes all types of variables into consideration when ranking a site, and yet the spiders (and/or search software) are too dumb to figure out that a www site and a non-www site with exactly the same content are the same site? Duh...
| 4:21 am on Jun 13, 2005 (gmt 0)|
My site was sandboxed for duplicate content recently (a misconfigured webserver was the issue). I then changed the IP address and DNS servers of the affected domain name, and now my site has been banned (PR0, no sign of it in Google).
Should I change back to the old DNS servers and old IP to get unbanned until I come out of the sandbox? I already emailed a reinclusion request and am willing to PM the tracking # to GG.
| 5:05 am on Jun 13, 2005 (gmt 0)|
Check out this post from GG... specifically message #12, regarding the proper reinclusion request procedure.
| 7:14 am on Jun 13, 2005 (gmt 0)|
How about making 404 pages disappear a little faster? They seem to linger for ages. Right now I'm seeing a Dec 2004 cache for deleted pages that show up indented under the new ones (different URL path and letter case, and I can't 301 all of them).
The problem is that people click on the second result too and get a 404 error. Websites lose sales and clicks because visitors might give up and go somewhere else, and Google loses too when the user doesn't find what he was looking for right away. Every visitor counts. I know they're labeled "supplemental", but I doubt most users know what that means; they think all listings are the same. Let's be realistic.
I suggest that if Googlebot gets a 404 error on two consecutive crawls, the page should expire. If it was a server or programming mistake (unlikely), it will be picked up on the third visit. If the webmaster didn't catch the error for 2-3 weeks, he deserves to wait a week or two to be indexed again ;)
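To be concrete, the bookkeeping I'm proposing is trivial. Here's a toy sketch (purely illustrative - this is obviously not how Googlebot actually works):

```python
# Toy sketch of the proposed policy: drop a URL from the index after
# two consecutive crawls that return 404; a single 404 gets another
# chance, and a later 200 picks the page up again.

def update_index(index: dict, url: str, status: int) -> None:
    """index maps url -> count of consecutive 404s seen so far."""
    if status == 404:
        index[url] = index.get(url, 0) + 1
        if index[url] >= 2:
            index.pop(url)          # expire: two 404s in a row
            print(f"expired {url}")
    else:
        index[url] = 0              # page responded; reset the counter

index = {"/old-page": 0}
update_index(index, "/old-page", 404)   # first 404: still indexed
update_index(index, "/old-page", 404)   # second 404: expired
```
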
| 7:15 am on Jun 13, 2005 (gmt 0)|
Hi Google guy,
Please excuse me if my question is long and specific, but I think if you answer it you will also answer it for the many webmasters experiencing the same thing.
Can you please give me a clue as to what happened with my site? I used to be ranked 11 for my key term. Then one day I wasn't in the top 1000 at all. The next day I was, then the next I wasn't. About 2 weeks ago I dropped out of the top 1000 entirely, and I still haven't returned. I was hoping that as the Bourbon update came to an end I would regain my position, but I am still not even in the top 1000. It seems like an overly dramatic drop from 11 to nowhere in the top 1000. My site is still in Google's index, because I show up when I search for www.mysite.com, and for some useless keywords, but my visitor numbers are around 10% of what they were a month ago. Can you give me, and many others, any help on how to regain our positions?
| 7:28 am on Jun 13, 2005 (gmt 0)|
I'd still like to know where you get your hair cut!
| 2:22 pm on Jun 13, 2005 (gmt 0)|
Why does Google provide different SERPS based on the browser being used?
I am getting very different results from the same datacenter (I am assuming this because I moused over the cache button) using IE and Firefox. Everything else is the same: same computer, same IP on my end, same search term.
| 11:30 pm on Jun 13, 2005 (gmt 0)|
I think GG has left the room. If so, will someone please close this thread?
| 3:47 am on Jun 14, 2005 (gmt 0)|
An update on my previous post above: another thing that changed recently on my domain was the address of the company (just the street, not the city or state). So technically my whois registration did change; however, the company is the same, same owner, etc. Would this affect my PageRank and status with Google? I went from being in the sandbox with a PR4 (due to duplicate content via a misconfigured webserver) to PR0 with no sign of the site on Google. Thanks!
| 6:33 pm on Jun 14, 2005 (gmt 0)|
Is there any possibility you could put pressure on Google to correct the problem with 302 hijackings? Most content writers are totally unaware of all this technical stuff, and if I hadn't found WebmasterWorld I would have had no idea what happened to my site. Even then, I had to get expert assistance to carry out the instructions you posted in the Bourbon 4 thread.
It doesn't seem fair that only the web savvy people can prevent this problem while most everyday folk with a web site and information to share are left out of the loop.
| 7:09 pm on Jun 14, 2005 (gmt 0)|
First, I'm green at this, so forgive any general ignorance. Our website has a great Google rank. Unfortunately, it now needs to be redirected to a URL on our parent company's website. We've been told that the best way to redirect AND keep our current Google ranking is to use a JavaScript onMouseOver redirect, but I've read that may get us banned. I couldn't tell if that ban was due to redirecting to a shady server or simply the code makeup of the page. My question for you is: *what is the best method of accomplishing a redirect of this nature without losing our current rank?*
Thanks in advance
| 5:14 pm on Jun 15, 2005 (gmt 0)|
Guy from Google,
Below are a few questions centered around the main question.
How can I make my sql/php site more Googlebot friendly?
1) How can I remove a session ID (with mod_rewrite) without getting penalized for "cloaking", while still keeping the URLs friendly for Googlebot? Could you point me in the right direction or provide simple solutions?
2) Will Googlebot, with the help of the Google Sitemap Generator, index all dynamic pages of a website, or is there still a restriction (as stated in [google.com...]: "we limit the number of dynamic pages we index")? Does it still depend on the number of parameters used?
3) http://www.google.com/webmasters/guidelines.html says: "Don't use '&id=' as a parameter in your URLs, as we don't include these pages in our index." In the example URL above, does "unid=" confuse Googlebot (i.e., keep the URLs from being indexed)?
4) I see many of my dynamic URLs in the Google database which were obviously noticed by Googlebot but not indexed. Are these URLs "lost forever", inaccessible to Googlebot, or simply ignored?
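For context on question 1, one direction I've been looking at is turning off URL-based session IDs entirely in PHP's configuration, so Googlebot never sees a PHPSESSID in the URL in the first place (assuming a standard PHP setup; these are stock php.ini settings):

```ini
; php.ini - stop PHP from appending ?PHPSESSID=... to URLs,
; so crawlers like Googlebot get clean, session-free links
session.use_trans_sid = 0
session.use_only_cookies = 1
```
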
| This 201 message thread spans 7 pages |