
Questions for GoogleGuy

   
8:29 am on Jun 2, 2005 (gmt 0)

WebmasterWorld Administrator brett_tabke is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Rules of this thread:

1- ask one question
2- be brief.
3- no commenting on other posts.
4- no specifics please
5- violators will go posting off ;-)
6- 1 q - 1 q only.
7- thread will be scrubbed of junk/offtopic/etc

After we get 10-20 q's we will submit them to the plex...

9:48 am on Jun 2, 2005 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



kbba04527, just search for [brett tabke 26] and it's at #3, I think. :) Here's the direct link:
[searchengineworld.com...]

I think the book Google Hacks also has it toward the end.

It's very deep stuff. If you just re-read those 26 steps once a month, you won't need 90% of the SEO questions/threads here. :)

9:50 am on Jun 2, 2005 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



mickeymart, the remaining changes should decrease spam further. The evaluation indicates better precision/relevance as well--I wouldn't get hung up on thinking of these as spam-only changes.
9:52 am on Jun 2, 2005 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Dayo_UK, if it's what I think it is, I wouldn't ban it. It might accept gzipped pages--check the headers that it sends. That would reduce bandwidth a lot.
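A minimal sketch (Python WSGI, purely illustrative rather than anything Google-specific) of what checking those headers looks like in practice: serve gzip-compressed output only when the crawler's Accept-Encoding request header says it accepts gzip.

```python
# Illustrative sketch: compress the response only for clients -- crawlers or
# browsers -- that advertise gzip support via the Accept-Encoding header.
import gzip


def application(environ, start_response):
    body = b"<html><body>Hello</body></html>"
    headers = [("Content-Type", "text/html")]

    # Check the header the crawler sends before compressing the response.
    if "gzip" in environ.get("HTTP_ACCEPT_ENCODING", ""):
        body = gzip.compress(body)
        headers.append(("Content-Encoding", "gzip"))

    headers.append(("Content-Length", str(len(body))))
    start_response("200 OK", headers)
    return [body]
```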
9:52 am on Jun 2, 2005 (gmt 0)

10+ Year Member



Cheers, GoogleGuy.

I will add, though, that I run four retail/etail sites and all get very good results from Google using standard white-hat tactics, dynamic meta tags and, as the saying goes, content is king: good images and good descriptions.

But it's always good to brush up and keep reading articles, threads and posts.

9:53 am on Jun 2, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Okay GG, I will ping again. I'm just getting canned responses at the moment, which is distressing. But what can you do when it's a free service and they are doing ME the favour? ;)

I was hoping you had a super-secret method of contacting elite members of a department who care about idiots like myself.

9:54 am on Jun 2, 2005 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Bzzzt--Dc71, you used it up on boxers or briefs. Gotta save up the good ones next time. ;)

(Bad design and/or trademarks don't necessarily affect the world of search engine scoring, but both can have real-world impacts, I would imagine.)

9:57 am on Jun 2, 2005 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



shri, for the most part, no--not to the extent that someone else can cause another site damage. But I'm not going to preclude it completely. For example, if someone has reported spam and a person is checking it out, and they have good reason to know that the incoming links are genuinely associated with/created by a site, I think most people would consider it fair to take that into account.
9:58 am on Jun 2, 2005 (gmt 0)

10+ Year Member



Other search engines don't seem to have a problem with my forum, photo gallery or reviews sections.

The photo gallery and reviews sections are my most popular with visitors, but Google hates them.

It's a white-hat content site, and I'm thinking the best thing is to use robots.txt to block just Google from the forum, photo gallery and reviews sections, and then use the URL removal tool.

I removed session IDs for guests on the forum, but the old pages are still in Google's Supplemental Index. I can only write so many words describing each image in the gallery to make the page unique before it looks silly to my visitors, and my reviews section is slowly building content, but I rely on my visitors for that.

Is Google working on better methods of determining what is duplicate content and what isn't?
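For reference, a robots.txt along those lines would only need a Googlebot-specific block; the directory paths below are hypothetical placeholders for the site's actual forum, gallery and reviews URLs:

```
User-agent: Googlebot
Disallow: /forum/
Disallow: /gallery/
Disallow: /reviews/
```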

10:01 am on Jun 2, 2005 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



union_jack, I think the value you give to users is often proportional to the value add of your site: unique content, a unique service (e.g. comparisons of different cell phone or web hosting plans), or even a unique perspective. Too often though, affiliate sites don't add much value to users.

It's fine for someone to say "the user wanted to book a hotel room, and I let him do that." But if 50 other sites are trying to do the same thing, and those 50 sites end up crowding out good results like the hotel's real home page with an actual phone number, then those 50 affiliate sites are clearly hurting diversity, esp. if all the sites are cookie-cutter/templates or nothing but repackaged feeds.

So my answer would be to think hard about your value-add compared to other affiliate sites in whatever niche you're targeting.

10:02 am on Jun 2, 2005 (gmt 0)

10+ Year Member



GG

It seems to me that a lot more manual intervention is going on in the SERPs these days. I am seeing spammers getting hit a lot faster than in the past, which would suggest this is the case. Can you confirm this?

uk web guy

10:02 am on Jun 2, 2005 (gmt 0)

WebmasterWorld Senior Member beedeedubbleu is a WebmasterWorld Top Contributor of All Time 10+ Year Member



GG, you can call me naive if you want, but I just don't see how accepting payment for inclusion would lead to a conflict of interest. Surely it would only lead to better results if sites were screened before inclusion and there was still a free option for non-commercial sites? Google could also profit by $nnn per annum for each commercial site it included.

I think that this is only one question :)

10:04 am on Jun 2, 2005 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Also good points, Marketing Guy. You can be sure that I feel the same way, and I emphasize those points every chance I get. I do think that overall AdSense has been a really nice way for sites to monetize traffic that would have been really difficult to find advertisers for before. And I think AdSense now is better than AdSense a year ago (in terms of the relevance of ads, plus how the system, the payments and everything else work). So I do have hope that as AdSense matures, your concerns (and mine) will be addressed. I think there are lots of ways to potentially make sure that AdSense is useful for everyone except people who want to make spammy or autogenerated sites.
10:06 am on Jun 2, 2005 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



You've stumped me, Maia. Without seeing the site, I couldn't give much advice. But you're in the right place--a lot of members here will give you advice on your site, with some gentle persuasion from you. :)
10:07 am on Jun 2, 2005 (gmt 0)

10+ Year Member



GG, how can we know whether our site was penalized for something (for example, a false spam report from competitors) or not?

We got hit during Allegra (75% of our Google traffic gone), and with Bourbon we lost the remaining traffic from Google.
We still get some visitors; Google now sends us about the same number of visitors as AltaVista. MSN and Yahoo are way above.

We really think that we are completely white hat, and I think it's impossible to lose 30,000 visitors from Google without some kind of penalty.

10:08 am on Jun 2, 2005 (gmt 0)

10+ Year Member



What considerations do I have to make when creating a foreign-language site? (For me, a Korean site for the Korean market...but it applies to any non-English site.)

From what I can tell, there ARE different considerations to make when trying to achieve rankings for my Korean sites in Korean portals/Google Korea.

As you probably know, the whole market and how sites are presented in portals is completely different out here, and many of the things that work on my "western" sites just don't seem to "take" out here (as well).

Note: I'm very white hat about it all...but still...even the basic rules do seem to be a bit different.

[edited by: GrendelKhan_TSU at 10:28 am (utc) on June 2, 2005]

10:08 am on Jun 2, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>> ..if they have good reason to know that the incoming links are genuinely associated with/created by a site.. <<
So what you are saying in essence here, GG, is that if someone files a spam report on a competitor and then blog-spams that competitor's URL, the evaluator of the spam report will likely attribute those links to the owner of the site and drop it from Google's index?
10:12 am on Jun 2, 2005 (gmt 0)



<snip>

[edited by: Brett_Tabke at 1:28 pm (utc) on June 2, 2005]
[edit reason] No specifics. Please reread the first msg of this thread [/edit]

10:12 am on Jun 2, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



GoogleGuy

A question from my 17-year-old daughter.

She and her classmates find it difficult to run effective searches on Google that bring relevant results without having to run several searches, because they don't know the basic elements of how to search. So I showed her the pages below, and she asked whether it's possible to add a direct link to that page next to the Google search box, in addition to the standard links: Images Groups News Froogle Local more »

in Danish:
[google.dk...]

in English:
[google.com...]

Thanks!

10:13 am on Jun 2, 2005 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I am seeing spammers getting hit a lot faster than in the past, which would suggest this is the case. Can you confirm this?

If you go back to, say, Feb 2004, I think spam had gotten to be an issue because we hadn't allocated enough resources to it. Google isn't a monolith; within the company there's always room for different opinions about what the different priorities should be. But I'm happy that Google is paying more attention to this issue now. My opinion is that spam is getting hit faster, and that for the rest of this year it will continue to get harder to spam.

It's funny, because a while ago people would post on WebmasterWorld and say "When is Google going to pay attention to spam?" Recently I saw a post where someone said "Why is Google so harsh on spam lately--why don't they work on other things besides rooting out spam?" I guess you can't please all of the people all of the time. :)

10:19 am on Jun 2, 2005 (gmt 0)



Hi GoogleGuy,

Thanks for your time and effort.

A question about banned sites (we might have just gotten our first one banned in ten years...). Both specific and general.

1. Assume the site command (site:example.com) returns NO pages.

2. When one types the URL (like www.example.com), Google's standard response is:

# If the URL is valid, try visiting that web page by clicking on the following link: www.example.com
# Find web pages from the site www.example.com
# Find web pages that contain the term "www.example.com"

But our site returns ONLY the last two.

Checking many domains we could fish from the net, there seem to be some sites that return all three and some that return only the last two.

- Can you clarify how to detect whether a site is banned or not?
- Can you clarify Google's policy as to when NOT to return any URL when a URL is entered as a search query?
- Are banned sites being crawled?

Thanks again.

10:22 am on Jun 2, 2005 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Oof. I thought I could catch up, but then people kept asking interesting questions that each needed a few links or another paragraph, and I turn around and it's Pi O'clock in the morning and I have to run a whole set of meetings all tomorrow morning. And now my insomnia is cured with a healthy dose of Q&A in this thread, and just answers over here: [webmasterworld.com...]

Thanks for doing this, Brett--this was good. I don't know whether I'll circle back around to this thread--I'm planning on dropping any Bourbon-related index "weather reports" in the thread linked in the previous paragraph. I might put more effort into comments/advice in that other thread.

Apologies if I had typos or anything; I was just typing really fast. It's more fun if you don't proofread it before you submit it, anyway. Maybe I ought to get me one of these blog things; I hear they're really popular with the kids these days. :)

Talk to folks later.

10:22 am on Jun 2, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



GG,

Is there any reason why you didn't answer my question? Do you need me to re-phrase it?

10:26 am on Jun 2, 2005 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



mrMister, I just missed it in my haste. I'd recommend #1 or #4.
10:28 am on Jun 2, 2005 (gmt 0)

10+ Year Member



I have an inkling that one of the reasons a site I work with has suffered *could* be related to the use of affiliate content. (I will of course dance naked on my roof if it bounces back.)

In this update, I seem to have attracted a 'rank minus 70 spots' penalty, even for pages that contain unique, useful content. These pages simply do not rank for anything anymore, unless I unique-stringify the query. I still have the same number of pages indexed, get spidered frequently, have no supplementals in the SERPs, and generally can find no obvious sign (ranking aside!) of any penalty, so I am naturally a little puzzled.

Would blocking Googlebot's access to the affiliate content enable my otherwise healthy (unique) pages to resurface? Or would I need to email a human at the plex and request that the 'offensive' filter be lifted?

[edited by: TravelMan at 10:43 am (utc) on June 2, 2005]

10:30 am on Jun 2, 2005 (gmt 0)



GG, I do appreciate your efforts, but please can you say something about my post? It took me a long time to write. You said you wanted to hear feedback on that issue, and that's what I gave. Please..
10:31 am on Jun 2, 2005 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



ucool, that is the longest post I have ever seen on WebmasterWorld. It scares me. I could print it out and use it for a blanket.

Just kidding. It sounds like the problem was that the search <snip> showed a different URL a while ago. I tried it just now, and it looks like we have only pages from the Supplemental Index now. So we're not really crawling/indexing/serving your site, except for the Supplemental Index. Since you were recently crawled and had good results, it could have been the actions of the previous owner. I would send in a reinclusion request (go to google.com/support and click until you can specify that as a subject line) and put a summary of the info into the report.

[edited by: lawman at 11:53 am (utc) on June 2, 2005]

10:34 am on Jun 2, 2005 (gmt 0)



OK, GoogleGuy! Thanks for the answer! I will do this. I appreciate it's the longest-ever post, but I had to get to you somehow. I have a reinclusion request, but I will do what you said and put these post details with it. :) Is there any way I can flag it for your review? What were your thoughts on full removal? It just seems "dead". To be honest, I'm begging you to just look at the entry in the Google DB. I'm sure it's somehow corrupted, or something. Or is it spam-penalised... Sorry, folks, if this was too specific....
10:35 am on Jun 2, 2005 (gmt 0)

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Looks like Brett put the Q&A up on the front page, but I am going to sneak away to my bed now. :)
10:35 am on Jun 2, 2005 (gmt 0)

10+ Year Member



GG,

Is it okay to have a domain alias, or more specifically to point two domains (one e.g. domain1.com, the other a domain2.ccTLD) to the same content, or would this incur a duplicate-content penalty? Would using 301 redirects be a much better method?

Thanks
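For illustration only (not an official answer): one common way to avoid serving the same content on both hosts is to pick one domain as canonical and 301-redirect the other to it. A minimal sketch, assuming Apache with mod_rewrite and both domains pointing at the same document root; the domain names are the hypothetical ones from the question:

```
# .htaccess in the shared document root (sketch, assuming Apache + mod_rewrite)
RewriteEngine On
# Requests whose Host header is not the canonical domain1.com...
RewriteCond %{HTTP_HOST} !^(www\.)?domain1\.com$ [NC]
# ...get a permanent (301) redirect to the same path on www.domain1.com.
RewriteRule ^(.*)$ http://www.domain1.com/$1 [R=301,L]
```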

10:36 am on Jun 2, 2005 (gmt 0)

5+ Year Member



Duplicate content with regard to a UK/USA site:
Will the google.co.uk index differ from the .com one? I'm thinking of using geo-targeting so US users see $ and UK users see £, but the index will only reflect $ because the bots crawl from a US IP address. Will it work, now or in the future, to help direct users to the correct currency?

A few tests showed that we would receive a penalty on two pages whose only difference was $/£, which I understand, but is there a workaround?

I have many problems with US customers not knowing the difference between $ and £.

A customer sees £500, phones to ask about a product, and when the price is converted you often lose that customer.

Thanks..
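A minimal sketch of the kind of setup being described, purely illustrative: keep a single URL per product so the indexable content is identical for every visitor, and vary only the displayed currency by location. country_for_ip is a hypothetical stand-in for a real GeoIP lookup, and the exchange rate is an example value:

```python
# Illustrative sketch only: one URL per product, currency chosen per visitor.
PRICES_GBP = {"widget": 500}   # base prices kept in GBP
USD_PER_GBP = 1.8              # example exchange rate, not a real quote


def country_for_ip(ip):
    # Hypothetical placeholder: swap in a real GeoIP lookup here.
    return "US"


def display_price(product, visitor_ip):
    gbp = PRICES_GBP[product]
    if country_for_ip(visitor_ip) == "US":
        return "$%.2f" % (gbp * USD_PER_GBP)
    return "£%d" % gbp


print(display_price("widget", "198.51.100.7"))  # -> "$900.00"
```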
