1- ask one question
2- be brief.
3- no commenting on other posts.
4- no specifics please
5- violators will go posting off ;-)
6- 1 q - 1 q only.
7- thread will be scrubbed of junk/offtopic/etc
After we get 10-20 q's we will submit them to the plex...
I've been doing research on the Holocaust and was disturbed to see revisionist sites getting so much mileage out of Google. I understand Google's philosophy of using computer algorithms to decide the "importance" of a site. However, Google (and the other search engines) are being gamed so that searches on WW2 / Holocaust terms lead to these sites, as part of a hate campaign.
Two of Google's missions argue against allowing these sites to thrive. One is that Google wants people to find reliable information rather than spam. The other is "Don't be evil". Will Google look into addressing this important issue? It particularly concerns me because the lies perpetrated by a few motivated individuals are not being appropriately weighed against the testimonies of hundreds of thousands of witnesses and the experience of millions. Their propaganda sounds very convincing, and many kids today get the bulk of their information about history from Google searches.
My Adsense earnings seem "tethered" to the ground. When my site gets 10,000 impressions, I make $10 [NOTICE: THESE ARE BOGUS NUMBERS]. When it gets 20,000 impressions, I make $12. When it gets 30,000, I make $12.50. The effect is mostly falling CPC.
This "tethering" gives me a perverse incentive to remove Adsense—as my site gets more hits, I could take Adsense off pages and put other ads on. I HATE other ads, and I want to fill Larry and Sergey's pockets with gold. It seems strange that Google would incentivize people to remove Adsense. Why is this happening, and how can I "untether" myself?
Kudos for answering questions; we appreciate it.
Google's market cap is currently 78 billion.
A person? According to Forbes magazine's 2004 numbers, the world's richest persons, Bill Gates and Warren Buffett, are worth 46 and 42 billion respectively.
A defense company? Well, General Dynamics has a market cap of 22 billion, Northrop Grumman 20, and Boeing 53.
And twenty times that? That's almost six times what Microsoft is worth.
1. Did the "Bourbon" update target Adsense sites in any way?
I've dominated the SERPs for my niche since 1998, and the only change I made this year was the addition of Adsense. Bourbon buried me, and NO sites above me on any keyword searches have Adsense - I'm the first Adsense site on any of my keyword SERPs. Traffic has gone from 10,000 refs a day to 20 or 30 uniques.... killing my Adsense earnings, but more importantly (and life-altering) killing my business.
2. Would it be better to remove the Adsense code?
Just makes me wonder ....
I have had an .htaccess file for a year that rewrites the URL to always be www.
So, my question is, how do I go about removing the non-www pages from the index, without losing my www pages?
Anyway - how can it be a penalty or affect search rankings? .... it is obvious that they are the same site. I am sure that they have someone smart enough to figure that out.
Think about it - Google has a brilliant search engine that takes all types of variables into consideration when ranking a site - and the spiders (and/or search software) are too dumb to figure out that a www site and a non-www site with exactly the same content are the same site? duh ....
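For what it's worth, the usual way to enforce the www version on your own side is a 301 redirect in .htaccess. A minimal sketch, assuming Apache with mod_rewrite enabled and using example.com as a placeholder for your own domain:

```apache
# Force the www hostname with a permanent redirect (requires mod_rewrite).
# "example.com" is a placeholder - substitute your own domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The detail that matters is R=301 (permanent) rather than the default 302 temporary redirect, since a 301 is what tells Google to consolidate the non-www pages into the www ones.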
Should I change back to the old DNS servers and old IP to get unbanned until I come out of the sandbox? I already emailed a reinclusion request and am willing to PM the tracking # to GG.
The problem is that people click on the second result too and get a 404 error. Websites lose sales, clicks etc. because visitors might give up and go somewhere else, and Google loses too when the user doesn't find what he was looking for right away. Every visitor counts. I know they're labeled "supplemental", but I doubt most users know what it means, they think all listings are the same. Let's be realistic.
I suggest that if Gbot gets a 404 error during two consecutive crawls, it let the page expire. If it was a server or programming mistake (unlikely), the page will be picked up on the third visit. If the webmaster didn't catch the error for 2-3 weeks, he deserves to wait a week or two to be indexed again ;)
Why does Google provide different SERPS based on the browser being used?
I am getting very different results, from the same datacenter (I am assuming this because I moused over the cache button) using IE and Firefox. Everything else is the same, same computer, same IP at my end, same search term.
Is there any possibility you could put pressure on Google to correct the problem with 302 hijackings? Most content writers are totally unaware of all this technical stuff and if I hadn't found webmaster world I would have had no idea what happened to my site. Even then I had to get expert assistance to carry out the instructions you put in the Bourbon 4 thread.
It doesn't seem fair that only the web savvy people can prevent this problem while most everyday folk with a web site and information to share are left out of the loop.
Thanks in advance
Below are a few questions centered around the main question.
How can I make my sql/php site more Googlebot friendly?
1) How can I remove a session ID without being penalized for "cloaking" (using mod_rewrite), while still keeping the URLs friendly for Googlebot? Could you point me in the right direction or provide simple solutions?
2) Will Googlebot, with the help of Google Sitemap Generator, index all dynamic pages of a website, or is there still a restriction (as stated in: [google.com...] "we limit the number of dynamic pages we index")? Does it still depend on the number of parameters used?
3) http://www.google.com/webmasters/guidelines.html says: "Don't use "&id=" as a parameter in your URLs, as we don't include these pages in our index." In the example URL above, does "unid=" confuse Googlebot (i.e., keep the URLs from being indexed)?
4) I see many of my dynamic URLs in the Google database, so they are obviously noticed by Googlebot, but not indexed. Are these URLs "lost forever", inaccessible to Googlebot, or simply ignored?
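On question 1, here is a minimal sketch of one common approach, assuming Apache with PHP sessions (the script name product.php and the parameter item are hypothetical placeholders, not anything Google prescribes): tell PHP to keep session IDs out of URLs entirely, and use an internal mod_rewrite rule only to map clean URLs onto the real dynamic script.

```apache
# Keep session IDs out of URLs: cookie-only sessions, no trans_sid rewriting.
php_flag session.use_only_cookies on
php_flag session.use_trans_sid off

# Map a static-looking URL to the dynamic script internally, e.g.
#   /product/123  ->  /product.php?item=123
# ("product" and "item" are placeholder names for your own setup.)
RewriteEngine On
RewriteRule ^product/([0-9]+)$ /product.php?item=$1 [L]
```

Because the rewrite is internal (no redirect) and identical for every user agent, bots and visitors see the same URL and content, so this shouldn't read as cloaking; it also sidesteps the dynamic-parameter limits asked about in question 2.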