Meeting with a Google engineer
Hercules2

10+ Year Member



 
Msg#: 3097682 posted 3:45 pm on Sep 26, 2006 (gmt 0)

I have a meeting with a Google Engineer tomorrow about Organic(!) Search. I have exactly one hour!

What would you ask him? I have some OK questions, but I think I'll forget something...
I'm going to post the usable answers here, of course!

 

Whitey

WebmasterWorld Senior Member whitey is a WebmasterWorld Top Contributor of All Time 5+ Year Member



 
Msg#: 3097682 posted 1:33 pm on Sep 27, 2006 (gmt 0)

About the sandbox... it's related to links; older domains benefit from having more links.

Hmmm... I can show you a result demonstrating that only the top few ranking sites for the identified search term seem to have any justification for outranking our site when you look at title tag, body content, and linking considerations.

We have an older site. I wonder whether the filters are working correctly, and if they are, what it takes to lift them.

Links? We have plenty. New links aren't the issue either - we have authority links, trust links, links built at the right pace, deep links.

I'm thinking that the recent fixes or mass changes may have pushed some sites into some sort of filter.

Do you have any other insights from this engineer that might be more explicit?

[edited by: tedster at 3:12 am (utc) on Sep. 28, 2006]

MrStitch

5+ Year Member



 
Msg#: 3097682 posted 1:39 pm on Sep 27, 2006 (gmt 0)

Phil -

I guess I'm confused here. You said that you don't want this certain page cached, for security/safety reasons.

To fix this, you put the noarchive tag in, and now it's gone from the SERPs in Google. Isn't that what you wanted? Isn't that the point of noarchive?
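For what it's worth, noarchive is only meant to suppress the "Cached" link, not the listing itself, so a page vanishing from the SERPs after adding it would be surprising. Here is a minimal Python sketch (the URL is a placeholder) that fetches a page and reports which robots meta directives it actually carries, so you can confirm the page says noarchive rather than noindex:

    # Minimal check of a page's robots meta directives (the URL is a placeholder).
    # "noarchive" should only remove the Cached link; "noindex" removes the listing.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class RobotsMetaParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.directives = []

        def handle_starttag(self, tag, attrs):
            if tag != "meta":
                return
            attrs = dict(attrs)
            if (attrs.get("name") or "").lower() == "robots":
                content = attrs.get("content") or ""
                self.directives += [d.strip().lower() for d in content.split(",") if d.strip()]

    page = urlopen("http://www.example.com/page.html").read().decode("utf-8", "replace")
    parser = RobotsMetaParser()
    parser.feed(page)

    print("robots directives:", parser.directives or "(none)")
    print("cached copy suppressed (noarchive):", "noarchive" in parser.directives)
    print("page excluded from the index (noindex):", "noindex" in parser.directives)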

JoeSinkwitz

5+ Year Member



 
Msg#: 3097682 posted 1:45 pm on Sep 27, 2006 (gmt 0)

I would have liked to know the best way to get scraper sites removed from the SERPs... I follow my own procedures, but if there were a special spam report that incorporated a pseudo-DMCA, that would be ideal. One site I watch lost just about every ranking it had to a couple of thousand overseas subdomain scrapers... annoying.

trinorthlighting

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3097682 posted 2:00 pm on Sep 27, 2006 (gmt 0)

The Copyscape site is very useful. In the past we have copied and pasted manufacturer descriptions and used them. We are going to experiment with this on a smaller site and see what happens.
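As a rough in-house check before rewriting, something along these lines compares two descriptions for overlap; the strings and the threshold are placeholders, and it is only the crude idea behind what Copyscape automates across the whole web:

    # Rough near-duplicate check between a manufacturer description and your own copy.
    # The two strings and the 0.8 threshold are placeholders for illustration.
    from difflib import SequenceMatcher

    manufacturer_text = "Blue widget with a durable steel frame and a two year warranty."
    our_text = "Blue widget with a durable steel frame and a 2-year warranty."

    ratio = SequenceMatcher(None, manufacturer_text.lower(), our_text.lower()).ratio()
    print(f"similarity: {ratio:.0%}")   # proportion of matching text
    if ratio > 0.8:
        print("too close to the original - consider rewriting this description")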

NedProf

5+ Year Member



 
Msg#: 3097682 posted 2:22 pm on Sep 27, 2006 (gmt 0)

Tomorrow I'm the one who has a meeting with someone from Google. Keep posting questions - I will use the interesting ones :)

texasville

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3097682 posted 2:51 pm on Sep 27, 2006 (gmt 0)

Nedprof,
This is all becoming very interesting. New users, meetings with Google engineers?
Well, welcome to WebmasterWorld.
Yeah, and I posted a question earlier that evidently didn't get covered.

jetteroheller

WebmasterWorld Senior Member jetteroheller is a WebmasterWorld Top Contributor of All Time 5+ Year Member



 
Msg#: 3097682 posted 3:38 pm on Sep 27, 2006 (gmt 0)

Shooting from the hip... on your not ranking well.
1. No duplicate content on your site / did nobody steal your copy? You can check this with Copyscape.

I tested briefly with Copyscape and did not find anything.

An issue could be this:

Each of my pages has a contact form.

example.com/folder/file.htm has a link to

cgi.example.com/cgi-bin/formmail.pl?by=example.com/folder/file

Until last Friday, the formmail page used exactly the same title line for every page.

My new version since Friday has "Contact form" as its title (see the sketch at the end of this post).

2. No strange JavaScripts with links built in on your site.

Could you please be more specific?

3. No "nofollow" tags on your site.

Never used them.

4. Don't use your keyword too many times in a URL.

I have many different keywords.
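To illustrate the duplicate-title point above: the script in question is formmail.pl (Perl), so what follows is only a hedged Python sketch of the same idea - derive a distinct <title> from the "by" parameter and mark the form page noindex, so the many formmail URLs don't sit in the index as near-duplicates sharing one title. The host and parameter handling are assumptions based on the URLs quoted in the post:

    #!/usr/bin/env python3
    # Sketch of the idea only - the real script discussed above is formmail.pl (Perl).
    # Builds a distinct <title> from the "by" parameter and marks the page noindex,
    # so the many formmail URLs are not indexed as near-duplicates with one title.
    import os
    import html
    from urllib.parse import parse_qs

    query = parse_qs(os.environ.get("QUERY_STRING", ""))
    source_page = query.get("by", ["unknown page"])[0]   # e.g. example.com/folder/file
    title = f"Contact form for {source_page}"

    print("Content-Type: text/html; charset=utf-8")
    print()
    print(f"""<html>
    <head>
      <title>{html.escape(title)}</title>
      <meta name="robots" content="noindex">
    </head>
    <body>
      <!-- the actual form fields would go here -->
    </body>
    </html>""")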

nicco

10+ Year Member



 
Msg#: 3097682 posted 4:04 pm on Sep 27, 2006 (gmt 0)

Please ask how Google treats multilanguage sites:
- Does server location influence rankings on specific datacenters? (In my experience, sites hosted in Italy only rank on google.it, for both Italian and foreign keywords.)
- Is the site's structure important for multilanguage ranking (do we have to use different domains for each language, or is it enough to use different pages)?
- Does the homepage language suggest to Google the main language for the whole site?

g1smd

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3097682 posted 7:46 pm on Sep 27, 2006 (gmt 0)

>> ask the engineer if something as simple as the www versus NON-www issue could put most of your pages in supplemental. <<

We already know that it can and it does.

The Supplemental Index is a repository for several types of URLs.

Firstly, it contains URLs that return Duplicate Content compared to other URLs - www vs. non-www, multiple domains, variable dynamic parameters, capitalisation issues (IIS only), http vs. https, etc. (with most or all of the duplicates in Supplemental, URL-only, or removed from the index).

Next, it contains URLs that are now redirects or that recently went 404; they hang about in the index for a year.

The Supplemental Index also contains the previous version of the content for normally indexed pages. You see this when the exact same URL can be a normal result for some search queries, and a Supplemental Result for other search queries (Supplemental when you use words that were on the old version of the page and are NOT on the new version of the page - you can see that old content in the snippet, even though it is no longer in the cache or on the real page).
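For anyone wanting to check their own site against the first category, here is a small Python sketch (the hostnames are placeholders) that requests the common duplicate variants without following redirects and reports whether each one 301s to a single canonical version:

    # Checks whether the usual duplicate hostname/protocol variants 301 to one
    # canonical version. Hostnames are placeholders - adjust CANONICAL and VARIANTS.
    import http.client

    CANONICAL = "http://www.example.com/"      # assumption: the preferred version

    VARIANTS = [
        ("http", "example.com"),               # non-www
        ("https", "example.com"),              # non-www over https
        ("https", "www.example.com"),          # www over https
    ]

    def status_and_location(scheme, host, path="/"):
        conn_cls = http.client.HTTPSConnection if scheme == "https" else http.client.HTTPConnection
        conn = conn_cls(host, timeout=10)
        conn.request("HEAD", path)             # http.client does not follow redirects
        resp = conn.getresponse()
        location = resp.getheader("Location") or ""
        conn.close()
        return resp.status, location

    for scheme, host in VARIANTS:
        status, location = status_and_location(scheme, host)
        ok = status == 301 and location.startswith(CANONICAL)
        print(f"{scheme}://{host}/ -> {status} {location or '(no redirect)'}"
              f" {'OK' if ok else '<- check this one'}")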

rainborick

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3097682 posted 9:07 pm on Sep 27, 2006 (gmt 0)

Ask him how a site can (a) have a respectable PageRank on the root URL, (b) have an ample number of pages from the site, including the root URL, crawled AND cached as normal and updated on a continuing basis, and yet (c) have no pages in the index. I've seen several sites in this condition over the past three months.

Whitey

WebmasterWorld Senior Member whitey is a WebmasterWorld Top Contributor of All Time 5+ Year Member



 
Msg#: 3097682 posted 11:45 pm on Sep 27, 2006 (gmt 0)

Could we expand a bit on these earlier questions, which still leave a few answers open?

About the sandbox... it's related to links; older domains benefit from having more links.

I've got reservations about the answer because:

I can show you a result demonstrating that only the top few ranking sites for the identified search term seem to have any justification for outranking our site when you look at title tag, body content, and linking considerations.

We have an older site. I wonder whether the filters are working correctly, and if they are, what it takes to lift them.

Links? We have plenty. New links aren't the issue either - we have authority links, trust links, links built at the right pace, deep links.

I'm thinking that the recent fixes or mass changes may have pushed some sites into some sort of filter.

Do you have any other insights from this engineer that might be more explicit?

I'll show you the example, which will take a minute to understand in specifics.

[edited by: tedster at 3:15 am (utc) on Sep. 28, 2006]
[edit reason] fixed quote [/edit]

Bewenched

WebmasterWorld Senior Member 5+ Year Member



 
Msg#: 3097682 posted 1:25 am on Sep 28, 2006 (gmt 0)

I would like to know what Google is doing to combat the site scrapers.

I've even seen them scrape Google search results, strip out the HTML, post the page, and plaster it with AdWords ads.

Why will AdWords NOT police these sites? Why are they allowing garbage sites to run their ads and chase away their advertisers?

I pulled all our advertising from content match when sites kept coming up in the results running our PPC ads on our own stolen content.

NedProf

5+ Year Member



 
Msg#: 3097682 posted 6:55 am on Sep 28, 2006 (gmt 0)


Nedprof,
This is all becoming very interesting. New users, meetings with Google engineers?
Well, welcome to WebmasterWorld.
Yeah, and I posted a question earlier that evidently didn't get covered.

I'm reasonably new to this forum, but experienced in SEO/SEM in the Netherlands ;)

I will list all the questions and see which ones I find interesting.

[edited by: NedProf at 6:56 am (utc) on Sep. 28, 2006]

plasma

10+ Year Member



 
Msg#: 3097682 posted 12:13 pm on Sep 28, 2006 (gmt 0)

I'd ask whether there is something like a sandbox or not.

Whitey

WebmasterWorld Senior Member whitey is a WebmasterWorld Top Contributor of All Time 5+ Year Member



 
Msg#: 3097682 posted 2:43 pm on Sep 28, 2006 (gmt 0)

Added to the above questions on sandbox issues: is there a possible large-scale sandbox malfunction?

http://www.webmasterworld.com/supporters/3085210.htm

I just came across this blog, which indicates to me that filters may be stuck on sites when perhaps there is no reason for them to be in place.

Have a look at this experiment, which shows how, by changing the URLs on a site, the filter that suppresses rankings is released and results return [except where there is a known error]. Four sites reappeared and one held.

How to release the filter on your site [aaronshear.com]

Is there a Google problem with the sandbox, or is this just webmastery getting around the routine?

[edited by: Whitey at 2:45 pm (utc) on Sep. 28, 2006]
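Not claiming this is what the linked experiment did, but if you do change a site's URLs, the usual housekeeping is to 301 every old URL to its replacement so existing links and indexed URLs consolidate on the new ones. A throwaway sketch (file names and host are placeholders) that turns a CSV of old,new paths into Apache RedirectPermanent lines:

    # Turns a CSV of "old_path,new_path" rows into Apache RedirectPermanent lines so
    # every old URL 301s to its replacement after a URL change. The file names and
    # host below are placeholders.
    import csv

    NEW_HOST = "http://www.example.com"        # assumption: the canonical host

    with open("url_changes.csv", newline="") as src, open("redirects.conf", "w") as out:
        for row in csv.reader(src):
            if len(row) != 2:                  # skip blank or malformed rows
                continue
            old_path, new_path = (field.strip() for field in row)
            out.write(f"RedirectPermanent {old_path} {NEW_HOST}{new_path}\n")

    print("wrote redirects.conf - include it in the Apache config and spot-check a few URLs")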

jdancing

WebmasterWorld Senior Member 10+ Year Member



 
Msg#: 3097682 posted 4:18 pm on Sep 28, 2006 (gmt 0)

Phil,

A lot of sites vanished three months ago. Could it be a timing issue? Sometimes we tend to blame what we just did to our site for SERP problems, when it could be an issue that has been going on for months or years and finally got spotted by the SEs.

JD

ashear

10+ Year Member



 
Msg#: 3097682 posted 5:44 pm on Sep 28, 2006 (gmt 0)

They are very willing to answer questions; they want to learn how to make their results better.

Just ask the right questions.

Don't ask "Why is my site not in the index?" You can easily find those stock answers in their terms of service and webmaster guidelines.

Ask questions that they can actually answer.

If I were to do "blank", what would happen?

Just my 2 cents

GaryTheScubaGuy

5+ Year Member



 
Msg#: 3097682 posted 9:33 pm on Sep 28, 2006 (gmt 0)

Is it just me, or is all this a little hard to believe?

If Hercules2 has an "inside" friend, then so be it. But if that's the case, then get information that's worth something, not information that has circulated around the SEO community for years, been posted on Matt Cutts' blog, and is frankly worthless.

GaryTheScubaGuy

GaryTheScubaGuy

5+ Year Member



 
Msg#: 3097682 posted 9:48 pm on Sep 28, 2006 (gmt 0)

Forgot I can't edit in this forum...

Hi everyone,

I just had my meeting with the engineer. It was pretty cool. He had tips for getting reindexed: fix everything and start using Google Sitemaps.

Everything? Wow, that's broad. Start using Google Sitemaps... that's VERY old advice...

Sites that used a lot of spam or are from known spammers will not be reindexed.

No kidding? When you say "won't", do you mean they will not be, or they aren't being? Because I've been reporting them for over a year and they are still there. Matt Cutts even mentioned one of my concerns in his WebmasterWorld radio interview six months ago, and it's still there.

About the sandbox... it's related to links; older domains benefit from having more links.

Wow, now this is a revelation.

The main goal should be getting links with good link descriptions from relevant sites.

So? Any other revelations? Anything else NEW?

And no, he didn't know who GoogleGuy was...

Are you sure this was an engineer and not a Google cook? Who at Google doesn't know that?

Seeing this, and the genuine responses, plus the additional posts trying to inflate this ridiculousness even higher, really reinforces why I visit and post here so rarely.

Hercules2, you're one of two things, and for the respect this forum draws I hope it's the second, and not the first.

The second would be someone who knows better than to give away valuable information.

GaryTheScubaGuy

GaryTheScubaGuy

5+ Year Member



 
Msg#: 3097682 posted 9:59 pm on Sep 28, 2006 (gmt 0)


>> ask the engineer if something as simple as the www versus NON-www issue could put most of your pages in supplemental. <<

We already know that it can and it does.


Webmaster Tools at Google has a tool to merge the two. It will affect your supplementals, but IMHO, and from experience, in a positive way. It merges more than just your supplemental results.

NedProf

5+ Year Member



 
Msg#: 3097682 posted 8:38 am on Sep 29, 2006 (gmt 0)

Too bad - I have to disappoint you all: it was not a Google engineer but a sales engineer from Google. He knew a lot about Google Maps, Google Analytics, and the APIs. The meeting was a good one, but he couldn't help me with questions about natural search.

Ducki

5+ Year Member



 
Msg#: 3097682 posted 12:44 pm on Sep 29, 2006 (gmt 0)

Understood, NedProf. We have a former Google employee working for us, and it's surprising how little one department knows about the other. Heck, I'm shocked at how little they knew about their own department!

Ducki

tiori

5+ Year Member



 
Msg#: 3097682 posted 1:10 pm on Sep 29, 2006 (gmt 0)

LOL. That tells us some of the "Google employees" posting in this forum may not have a clue. They are just guessing, like most of us.

Just because you are a "Google engineer" or "Google employee" doesn't mean you know what is going on, or that you are able to share with us what is really going on. (Corporations do have secrets.)

While they have shared a lot of information with us, it could be mostly speculation or "spin" on their part!

NedProf

5+ Year Member



 
Msg#: 3097682 posted 2:48 pm on Sep 29, 2006 (gmt 0)

I can confirm that; the guy from Google told me that they know about as much as you can read on the web. They have only one advantage: they are connected to the Googleplex, and the Googleplex can confirm answers and fix things.

ranksmart

10+ Year Member



 
Msg#: 3097682 posted 8:08 pm on Oct 3, 2006 (gmt 0)

For anyone still following this thread, I have pulled some questions from the responses here and will put them to Matt Cutts. He will be on our WebmasterRadio.fm show, the Search Pulse, today at 5pm EST. I expect to ask him a handful of questions from this thread.

Whitey

WebmasterWorld Senior Member whitey is a WebmasterWorld Top Contributor of All Time 5+ Year Member



 
Msg#: 3097682 posted 8:34 am on Oct 6, 2006 (gmt 0)

RankSmart - sorry I missed the radio show - did anything get answered?

Hercules2

10+ Year Member



 
Msg#: 3097682 posted 1:03 pm on Oct 6, 2006 (gmt 0)

Really ego-boosting to hear Matt Cutts discussing my thread.

g1smd

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member



 
Msg#: 3097682 posted 10:14 pm on Oct 6, 2006 (gmt 0)

Ask him why the site:domain.com search has been totally foobarred for the last week...
