
Meeting with a Google engineer

     
3:45 pm on Sep 26, 2006 (gmt 0)

New User

10+ Year Member

joined:Jan 28, 2003
posts:33
votes: 0


I have a meeting with a Google engineer tomorrow about organic(!) search. I have exactly one hour!

What would you ask him? I have some OK questions, but I think I'm bound to forget something...
I'll post the usable answers here, of course!

1:33 pm on Sept 27, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member whitey is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Jan 28, 2005
posts:3071
votes: 27


>> About the sandbox... it's related to links; older domains benefit from having more links. <<

Hmmm... I can show you a result demonstrating that only the top few ranking sites for the search term in question have any justification for outranking our site, judging by title tag, body content, and linking.

We have an older site. I wonder whether the filters are working correctly and, if they are, what it takes to get out from under them.

Links? We have plenty. New links? That isn't the issue either: we have authority links, trust links, links built at the right pace, deep links.

I'm thinking the recent fixes or mass changes may have tripped some sort of filter for some people.

Do you have any other insights from this engineer that might be more explicit?

[edited by: tedster at 3:12 am (utc) on Sep. 28, 2006]

1:39 pm on Sept 27, 2006 (gmt 0)

Junior Member

5+ Year Member

joined:June 16, 2006
posts:188
votes: 0


Phil -

I guess I'm confused here. You said that you don't want a certain page cached for security/safety reasons.

To fix this, you added the noarchive tag, and now the page is gone from the SERPs in Google. Isn't this what you wanted? Isn't that the point of noarchive?
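For what it's worth, noarchive only suppresses the cached copy; it's noindex that removes a page from the results entirely. If the page has vanished from the SERPs, I'd look for an accidental noindex before blaming noarchive. Here's a quick Python sketch for checking which directives a page actually carries (the URL is a placeholder, and the regex assumes name comes before content in the meta tag):

import re
import urllib.request

def robots_directives(url):
    # Fetch the page and pull out <meta name="robots" content="...">.
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    m = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
        html, re.IGNORECASE)
    return [d.strip().lower() for d in m.group(1).split(",")] if m else []

directives = robots_directives("http://www.example.com/secure-page.htm")
print("noarchive:", "noarchive" in directives)  # cache suppressed only
print("noindex:", "noindex" in directives)      # page dropped from results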

1:45 pm on Sept 27, 2006 (gmt 0)

Full Member

5+ Year Member

joined:June 7, 2006
posts:290
votes: 0


I would have liked to know the best way to get scrapers removed from the SERPs... I follow my own procedures, but if there were a special spam report that incorporated a pseudo-DMCA, that'd be ideal. One site I watch lost just about every ranking it had to a couple thousand overseas subdomain scrapers... annoying.
2:00 pm on Sept 27, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Jan 5, 2006
posts:2095
votes: 2


The Copyscape site is very useful. In the past we have copied and pasted manufacturer descriptions and used them as-is. We are going to experiment with this on a smaller site and see what happens.
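For anyone curious what such a check boils down to, here's a rough Python sketch of the kind of overlap test a tool like Copyscape automates - the two strings and the 0.8 cut-off are made up for illustration:

from difflib import SequenceMatcher

manufacturer_copy = "Widget X features a durable steel frame and a two-year warranty."
our_page_copy = "Widget X features a durable steel frame and a two-year warranty."

# Ratio of matching characters between the two texts (0.0 - 1.0).
ratio = SequenceMatcher(None, manufacturer_copy, our_page_copy).ratio()
print("similarity: {:.0%}".format(ratio))
if ratio > 0.8:  # arbitrary threshold for "substantially duplicate"
    print("This description may be flagged as duplicate content - rewrite it.")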
2:22 pm on Sept 27, 2006 (gmt 0)

Junior Member

10+ Year Member

joined:Nov 30, 2004
posts:42
votes: 0


Tomorrow I'm the one that has a meeting with someone from Google. Keep posting questions; I will use the interesting ones :)
2:51 pm on Sept 27, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:May 29, 2005
posts:866
votes: 0


Nedprof,
This is all becoming very interesting. New users, meetings with Google engineers?...
Well, welcome to WebmasterWorld.
Yeah, and I posted a question already that evidently didn't get covered.
3:38 pm on Sept 27, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member jetteroheller is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Jan 22, 2005
posts:2987
votes: 2


>> Shooting from the hip... on your not ranking well.
1. No duplicate content on your site / did nobody steal your copy? You can check this with Copyscape. <<

I tested briefly with Copyscape and found nothing.

One issue could be this: each of my pages has a contact form.

example.com/folder/file.htm has a link to

cgi.example.com/cgi-bin/formmail.pl?by=example.com/folder/file

Until last Friday, every formmail page used exactly the same title line. My new version since Friday has the title "Contact form". (A quick script for spotting this kind of duplicate-title problem is sketched at the end of this post.)

>> 2. No strange JavaScripts with links built into your site. <<

Could you please be more specific?

>> 3. No "nofollow" tags on your site. <<

Never used them.

>> 4. Don't use your keyword too many times in a URL. <<

I have many different keywords.
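As promised, a small Python sketch for flagging pages that share an identical <title> - the URL list is a placeholder for your own pages:

import re
import urllib.request

urls = [
    "http://www.example.com/folder/file.htm",
    "http://www.example.com/folder/other.htm",
]

titles = {}
for url in urls:
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    m = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    title = m.group(1).strip() if m else "(no title)"
    titles.setdefault(title, []).append(url)

# Any title shared by more than one URL is a duplicate-content risk.
for title, pages in titles.items():
    if len(pages) > 1:
        print("Duplicate title %r on %d pages:" % (title, len(pages)))
        for page in pages:
            print("   " + page)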

4:04 pm on Sept 27, 2006 (gmt 0)

New User

10+ Year Member

joined:June 10, 2002
posts: 21
votes: 0


Please ask how Google treats multilanguage sites:
- Does server location influence rankings on specific datacenters? (In my experience, sites hosted in Italy only rank on google.it, for both Italian and foreign keywords.)
- Is the site's structure important for multilanguage ranking (do we have to use different domains for each language, or is it enough to use different pages)?
- Does the homepage language suggest to Google the main language for that site?
7:46 pm on Sept 27, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:July 3, 2002
posts:18903
votes: 0


>> ask the engineer if something as simple as the www versus NON-www issue could put most of your pages in supplemental. <<

We already know that it can and it does.

The Supplemental Index is a repository for several types of URLs.

Firstly it contains URLs that return Duplicate Content compared to other URLs - www vs. non-www, multiple domains, variable dynamic parameters, capitalisation issues (IIS only), http vs. https, etc. (with most or all of the duplicates in Supplemental, URL-only, or removed from the index).

Next, it contains URLs that are now redirects or are recently 404, and they hang about in the index for a year.

The Supplemental Index also contains the previous version of the content for normally indexed pages. You see this when the exact same URL can be a normal result for some search queries, and a Supplemental Result for other search queries (Supplemental when you use words that were on the old version of the page and are NOT on the new version of the page - you can see that old content in the snippet, even though it is no longer in the cache or on the real page).
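A quick way to test your own setup for the www/non-www case - a Python sketch with example.com standing in for your domain; a healthy configuration has one hostname answering 200 and the other 301-redirecting to it:

import http.client

for host in ("www.example.com", "example.com"):
    conn = http.client.HTTPConnection(host, timeout=10)
    conn.request("HEAD", "/")
    resp = conn.getresponse()  # http.client never follows redirects
    location = resp.getheader("Location") or ""
    print(host, "->", resp.status, location)
    conn.close()
    # Two 200s means Google can index every page under both hostnames -
    # exactly the duplicate-content condition described above.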

9:07 pm on Sept 27, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Apr 15, 2003
posts:908
votes: 12


Ask him how a site can (a) have a respectable PageRank on the root URL, (b) have an ample number of pages from the site, including the root URL, crawled AND cached as normal and updated on a continuing basis, and yet (c) have no pages in the index. I've seen several sites in this condition over the past three months.
11:45 pm on Sept 27, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member whitey is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Jan 28, 2005
posts:3071
votes: 27


Could we expand a bit on these earlier questions, which still leave a few answers open?

>> About the sandbox... it's related to links; older domains benefit from having more links. <<

I've got reservations about the answer because:

I can show you a result demonstrating that only the top few ranking sites for the search term in question have any justification for outranking our site, judging by title tag, body content, and linking.

We have an older site. I wonder whether the filters are working correctly and, if they are, what it takes to get out from under them.

Links? We have plenty. New links? That isn't the issue either: we have authority links, trust links, links built at the right pace, deep links.

I'm thinking the recent fixes or mass changes may have tripped some sort of filter for some people.

Do you have any other insights from this engineer that might be more explicit? I'll show you the example, which will take a minute to understand in specifics.

[edited by: tedster at 3:15 am (utc) on Sep. 28, 2006]
[edit reason] fixed quote [/edit]

1:25 am on Sept 28, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member

joined:July 26, 2006
posts: 1619
votes: 0


I would like to know what Google is doing to combat the site scrapers.

I've even seen them scrape Google search results, strip out the HTML, post the page, and plaster it with AdWords.

Why will AdWords NOT police these sites? Why are they allowing garbage sites to run their ads and chase away their advertisers?

I pulled all our advertising for content match when sites kept coming up in the results running our PPC ads on our own stolen content.

6:55 am on Sept 28, 2006 (gmt 0)

Junior Member

10+ Year Member

joined:Nov 30, 2004
posts:42
votes: 0



>> Nedprof, this is all becoming very interesting. New users, meetings with Google engineers?... Well, welcome to WebmasterWorld. Yeah, and I posted a question already that evidently didn't get covered. <<

I'm reasonably new to this forum, but experienced in SEO/SEM in the Netherlands ;)

I will list all the questions and see which I find interesting.

[edited by: NedProf at 6:56 am (utc) on Sep. 28, 2006]

12:13 pm on Sept 28, 2006 (gmt 0)

Preferred Member

10+ Year Member

joined:Mar 13, 2003
posts:630
votes: 0


I'd ask whether there is something like a sandbox or not.
2:43 pm on Sept 28, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member whitey is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Jan 28, 2005
posts:3071
votes: 27


Added to the above questions on sandbox issues: is there a possible large-scale sandbox malfunction?

http://www.webmasterworld.com/supporters/3085210.htm

I just came across this blog, which suggests to me that filters may be stuck on sites when perhaps there is no reason for them to be in place.

Have a look at this experiment, which shows how, by changing the URLs on a site, the filter that suppresses rankings is released and results return [except where there is a known error]. Four sites reappeared and one held.

How to release the filter on your site [aaronshear.com]

Is there a Google problem with the sandbox, or is this just webmastery getting around the routine?

[edited by: Whitey at 2:45 pm (utc) on Sep. 28, 2006]

4:18 pm on Sept 28, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Feb 18, 2003
posts:745
votes: 0


Phil,

A lot of sites vanished 3 months ago. Could it be a timing issue? Sometimes we tend to blame what we just did to our site for SERP problems, when it could be an issue that has been going on for months or years and finally got spotted by the SEs.

JD

5:44 pm on Sept 28, 2006 (gmt 0)

Junior Member

10+ Year Member

joined:Mar 12, 2003
posts:196
votes: 0


They are very willing to answer questions; they want to learn how to make their results better.

Just ask the right questions.

Don't ask "Why is my site not in the index?" - you can easily find those stock answers in their terms of service and webmaster guidelines.

Ask questions that they can actually answer:

"If I were to do 'blank', what would happen?"

Just my 2 cents.

9:33 pm on Sept 28, 2006 (gmt 0)

Junior Member

10+ Year Member

joined:Feb 6, 2006
posts:42
votes: 0


Is it just me, or is all this a little hard to believe?

If Hecules2 has an "inside" friend, then so be it. But if that's the case, then get information that's worth something - not information that has circulated around the SEO community for years, been posted on Cutts's blog, and is frankly worthless.

GaryTheScubaGuy

9:48 pm on Sept 28, 2006 (gmt 0)

Junior Member

10+ Year Member

joined:Feb 6, 2006
posts:42
votes: 0


Forgot I can't edit in this forum...

>> Hi everyone, I just had my meeting with the engineer. It was pretty cool. He had tips for getting reindexed: fix everything and start using Google Sitemaps. <<

Everything? Wow, that's broad. Start using Google Sitemaps... that's VERY old...

>> Sites that used a lot of spam or are from known spammers will not be reindexed. <<

No kidding? When you say "won't", do you mean "will not" or "aren't"? Because I've been reporting them for over a year and they are still there. Matt Cutts even mentioned one of my concerns in his WebmasterWorld radio interview six months ago, and it's still there.

>> About the sandbox... it's related to links; older domains benefit from having more links. <<

Wow, now this is a revelation.

>> The main goal should be getting links with good link descriptions from relevant sites. <<

So? Any other revelations? Anything else NEW?...

>> And no, he didn't know who Google Guy was... <<

Are you sure this was an engineer and not a Google cook? Who doesn't know?

Seeing this, and the genuine responses, plus the additional posts trying to inflate this ridiculousness even higher, really reinforces why I visit and post here so rarely.

Hecules2, you're one of two things, and for the respect this forum draws, I hope it's the second and not the first. The second would be someone who knows better than to give away valuable information.

GaryTheScubaGuy

9:59 pm on Sept 28, 2006 (gmt 0)

Junior Member

10+ Year Member

joined:Feb 6, 2006
posts:42
votes: 0



>> The Supplemental Index is a repository for several types of URLs... Firstly it contains URLs that return Duplicate Content compared to other URLs - www vs. non-www, multiple domains, variable dynamic parameters, capitalisation issues (IIS only), http vs. https, etc. <<

Webmaster Tools at Google has a tool to merge the two. It will affect your supplementals, but IMHO, and from experience, in a positive way. It merges more than just your supplemental results.

8:38 am on Sept 29, 2006 (gmt 0)

Junior Member

10+ Year Member

joined:Nov 30, 2004
posts:42
votes: 0


Too bad, I have to disappoint you all: it was not a Google engineer but a sales engineer from Google. He knew a lot about Google Maps, Google Analytics, and the APIs. The meeting was a good one, but he couldn't help me with questions about natural search.
12:44 pm on Sept 29, 2006 (gmt 0)

New User

5+ Year Member

joined:Aug 27, 2006
posts: 28
votes: 0


Understood, NedProf. We have a former Google employee working for us, and it's surprising how little one department knows about the other. Heck, I'm shocked at how little they knew about their own department!

Ducki

1:10 pm on Sept 29, 2006 (gmt 0)

Junior Member

10+ Year Member

joined:Jan 22, 2005
posts:104
votes: 0


LOL. That tells us some of the "Google employees" posting in this forum may not have a clue. They are just guessing, like most of us.

Just because you are a "Google engineer" or "Google employee" doesn't mean you know what is going on, or are able to share with us what is really going on. (Corporations do have secrets.)

While they have shared a lot of information with us, it could be mostly speculation or "spin" on their part!

2:48 pm on Sept 29, 2006 (gmt 0)

Junior Member

10+ Year Member

joined:Nov 30, 2004
posts:42
votes: 0


I can confirm that; the guy from Google told me that they know as much as you can read on the web. They have only one advantage: they are connected to the Googleplex, and the Googleplex can confirm answers and fix things.
8:08 pm on Oct 3, 2006 (gmt 0)

New User

10+ Year Member

joined:Nov 16, 2004
posts:4
votes: 0


For anyone still following this thread, I have pulled some questions from the responses here and will put them to Matt Cutts. He will be on our WebmasterRadio.fm show, the Search Pulse, today at 5pm EST. I expect to ask him a handful of questions from this thread.
8:34 am on Oct 6, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member whitey is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Jan 28, 2005
posts:3071
votes: 27


RankSmart - sorry I missed the radio show - did anything get answered?
1:03 pm on Oct 6, 2006 (gmt 0)

New User

10+ Year Member

joined:Jan 28, 2003
posts:33
votes: 0


Really ego-boosting to hear Matt Cutts discussing my thread.
10:14 pm on Oct 6, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:July 3, 2002
posts:18903
votes: 0


Ask him why the site:domain.com search has been totally foobarred for the last week...