Forum Moderators: Robert Charlton & goodroi
Shooting from the hip on why you're not ranking well:
1. No duplicate content on your site / did nobody steal your copy? You can check this with Copyscape.
I tested briefly with Copyscape and did not find anything.
One issue could be: each of my pages has a contact form.
example.com/folder/file.htm has a link to
cgi.example.com/cgi-bin/formmail.pl?by=example.com/folder/file
Until Friday last week, the formmail page used exactly the same
title line for every page.
My new version since Friday has the title "Contact form".
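A quick way to catch this kind of duplicate-title problem is to fetch a set of pages and group them by their <title> tag. A minimal Python sketch; the fetch callable and any URLs you pass in are placeholders, not part of the original post:

```python
# Minimal sketch: group pages by <title> to spot duplicates.
import re
from collections import defaultdict

def get_title(html):
    """Extract the <title> text from an HTML document, or None."""
    m = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    return m.group(1).strip() if m else None

def find_duplicate_titles(urls, fetch):
    """Map each duplicated title to the URLs that share it.

    `fetch` is any callable taking a URL and returning its HTML,
    e.g. lambda u: urllib.request.urlopen(u).read().decode().
    """
    by_title = defaultdict(list)
    for url in urls:
        title = get_title(fetch(url) or "")
        if title:
            by_title[title].append(url)
    return {t: us for t, us in by_title.items() if len(us) > 1}
```

Any title that comes back mapped to more than one URL is a candidate duplicate worth fixing, as with the formmail page above.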
2. No strange JavaScript with links built into your site.
Could you please be more specific?
3. No "nofollow" tags on your site.
Never used them.
4. Don't use your keyword too many times in a URL.
I have many different keywords.
About the sandbox... it's related to links; older domains benefit from having more links.
I've got reservations over the answer, because I can show you a result demonstrating that only the top few ranking sites for the identified search term seem to have any justification for outranking our site when looking at title tag, body content, and linking considerations.
We have an older site. I wonder if the filters are working correctly, and if they are, what it takes to unblock them.
Links? We have plenty. New links? That isn't happening either: we have authority links, trust links, links built at the right pace, deep links.
I'm thinking that the recent fixes or mass changes may have triggered some people into some sort of filter.
Do you have any other insights from this engineer that might be more explicit?
I'll show you the example; it will take a minute to understand the specifics.
[edited by: tedster at 3:15 am (utc) on Sep. 28, 2006]
[edit reason] fixed quote [/edit]
I've even seen them scrape Google search results, strip out the HTML, post the page, and plaster it with AdWords.
Why will AdWords NOT police these sites? Why are they allowing garbage sites to run their ads and chase away their advertisers?
I pulled all our advertising for content match when sites kept coming up in the results, running our PPC ads on our own stolen content.
I'm reasonably new at this forum, but experienced in SEO/SEM in the Netherlands ;)
Nedprof,
This is all becoming very interesting. New users, meetings with Google engineers?
Well, welcome to WebmasterWorld.
Yeah, and I posted a question already that evidently didn't get covered.
I will list all the questions and see what I find interesting.
[edited by: NedProf at 6:56 am (utc) on Sep. 28, 2006]
http://www.webmasterworld.com/supporters/3085210.htm
I just came across this blog, which indicates to me that filters may be stuck on sites when perhaps there is no reason for them to be in place.
Have a look at this experiment, which shows how, by changing the URLs on a site, the filter that suppresses rankings is released and results return [except where there is a known error]. Four sites reappeared and one held.
How to release the filter on your site [aaronshear.com]
Is there a Google problem with the sandbox or is this just webmastery getting around the routine?
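For reference, "changing the URLs on a site" as in that experiment normally means serving a 301 (permanent) redirect from every old URL to its new one, so the engines transfer what they know to the new addresses. A minimal WSGI sketch; the old-to-new mapping below is hypothetical:

```python
# Sketch: serve 301 redirects from old URLs to their new locations.
# OLD_TO_NEW is a made-up mapping for illustration only.
OLD_TO_NEW = {
    "/widgets.htm": "/products/widgets/",
    "/about.htm": "/company/about/",
}

def redirect_app(environ, start_response):
    """Minimal WSGI app: 301 known old paths, 404 everything else."""
    new_path = OLD_TO_NEW.get(environ.get("PATH_INFO", ""))
    if new_path:
        start_response("301 Moved Permanently", [("Location", new_path)])
        return [b""]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"not found"]
```

In practice the same mapping is usually done with mod_rewrite rules rather than application code; the point is that each old URL must answer with a permanent redirect, not a 302 or a duplicate copy of the page.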
[edited by: Whitey at 2:45 pm (utc) on Sep. 28, 2006]
Just ask the right questions.
Don't ask "why is my site not in the index?" You can easily find those stock answers in their terms of service and webmaster guidelines.
Ask questions that they can actually answer.
If I were to do "blank", what would happen?
Just my 2 cents
If Hercules2 has an "inside" friend, then so be it. But if that's the case, then get information that's worth something, not information that has circulated around the SEO community for years, been posted on Matt Cutts' blog, and is frankly worthless.
GaryTheScubaGuy
Hi everyone, I just had my meeting with the engineer. It was pretty cool. He had tips for getting reindexed: fix everything and start using Google Sitemaps.
Everything? Wow, that's broad. Start using Google Sitemaps... that's VERY old...
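Old advice or not, for anyone who hasn't tried Google Sitemaps, the file format itself is simple: an XML list of your URLs. A minimal sketch in Python; the URL list you pass in is a placeholder for your own crawl of the site:

```python
# Minimal sketch: emit sitemap XML for a list of absolute URLs.
from xml.sax.saxutils import escape

def make_sitemap(urls):
    """Return sitemap-protocol XML for the given absolute URLs."""
    entries = "\n".join(
        "  <url><loc>%s</loc></url>" % escape(url) for url in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        "%s\n</urlset>" % entries
    )
```

The protocol also allows optional per-URL fields such as lastmod and changefreq; the bare loc entries above are the minimum a valid file needs.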
Sites that used a lot of spam or are from known spammers will not be reindexed.
No kidding? When you say won't, do you mean will not or aren't? Because I've reported them for over a year and they are still there. Matt Cutts even mentioned one of my concerns in his WebmasterWorld radio interview 6 months ago, and it's still there.
About the sandbox... it's related to links; older domains benefit from having more links.
Wow, now this is a revelation
The main goal should be getting links with good link descriptions from relevant sites.
So? Any other revelations? Anything else NEW?...
And no, he didn't know who Google Guy was...
Are you sure this was an engineer and not a Google cook? Who doesn't know?
Seeing this, and the genuine responses, plus the additional posts trying to inflate this ridiculousness even higher, really reinforces why I visit and post here so rarely.
Hercules2, you're one of two things, and I hope, for the respect this forum draws, the second and not the first.
The second would be someone who knows better than to give away valuable information.
GaryTheScubaGuy
>> ask the engineer if something as simple as the www versus NON-www issue could put most of your pages in supplemental. <<
We already know that it can and it does.
The Supplemental Index is a repository for several types of URLs.
Firstly it contains URLs that return Duplicate Content compared to other URLs - www vs. non-www, multiple domains, variable dynamic parameters, capitalisation issues (IIS only), http vs. https, etc (with most or all of the duplicates in Supplemental, URL-only, or removed from the index).
Next, it contains URLs that are now redirects or are recently 404, and they hang about in the index for a year.
The Supplemental Index also contains the previous version of the content for normally indexed pages. You see this when the exact same URL can be a normal result for some search queries, and a Supplemental Result for other search queries (Supplemental when you use words that were on the old version of the page and are NOT on the new version of the page - you can see that old content in the snippet, even though it is no longer in the cache or on the real page).
Webmaster Tools at Google has a tool to merge the two. It will affect your supplementals, but IMHO, and from experience, in a positive way. It merges more than just your supplemental results.
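Besides setting the preferred domain in Webmaster Tools, you can verify the www vs. non-www fix yourself by checking that the bare hostname answers with a permanent redirect onto the www host. A sketch, with example.com standing in for your own domain:

```python
# Sketch: check that the bare hostname 301-redirects to www.
from http.client import HTTPConnection

def fetch_redirect(host, path="/"):
    """Return (status, Location header) for a HEAD request to host."""
    conn = HTTPConnection(host, timeout=10)
    conn.request("HEAD", path)
    resp = conn.getresponse()
    return resp.status, resp.getheader("Location")

def is_canonical_redirect(status, location, www_host):
    """True if the response is a 301 onto the canonical www host."""
    return status == 301 and bool(location) and \
        location.startswith(("http://" + www_host,
                             "https://" + www_host))

# Usage sketch (needs network access):
# status, loc = fetch_redirect("example.com")
# print(is_canonical_redirect(status, loc, "www.example.com"))
```

If the bare host returns 200 with the same content instead of a 301, both versions can be indexed and one ends up duplicate or supplemental, exactly the case quoted above.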
Just because you are a "Google engineer" or "Google employee" doesn't mean you know what is going on, or are able to share with us what is really going on. (Corporations do have secrets.)
While they have shared a lot of information with us, it could be mostly speculation or "spin" on their part!