I think I would look at this as a possible issue.
|I also added a reverse DNS lookup to verify Googlebot, MSNBot and Slurp, and serve meta noindex or index depending on whether they are the right bots. |
Have you tested this to make sure you're sending the correct meta tag?
When was the dns reverse lookup added?
I added the reverse DNS lookup code after I found out that no traffic was coming in...
and I confirmed with the Google cache that it indexed the page with the meta INDEX tag, while all of my site's pages accessed by other user agents show noindex, nofollow.
If you're going to change something for spoofed user-agents why not just serve them up a 403 with no content at all?
I thought about that... it's a good point... I probably should...
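The verify-then-403 approach discussed above can be sketched as follows. The hostname suffixes are the ones the engines publish for their crawlers; the function names and the injectable resolver parameters are my own (the injection just makes the check easy to test without live DNS), so treat this as a sketch rather than a drop-in implementation:

```python
import socket

# Documented reverse-DNS suffixes for the major crawlers.
CRAWLER_SUFFIXES = {
    "googlebot": (".googlebot.com", ".google.com"),
    "msnbot": (".search.msn.com",),
    "slurp": (".crawl.yahoo.net",),
}

def is_verified_bot(ip, bot, reverse=socket.gethostbyaddr,
                    forward=socket.gethostbyname):
    """Two-step check: reverse-DNS the IP, confirm the hostname
    suffix, then forward-resolve the hostname and require it to map
    back to the same IP. The forward step defeats spoofed PTR
    records. A spoofed user-agent that fails this check could then
    be served a 403 instead of alternate meta tags."""
    suffixes = CRAWLER_SUFFIXES.get(bot)
    if not suffixes:
        return False
    try:
        host = reverse(ip)[0]
    except OSError:
        return False
    if not host.lower().endswith(suffixes):
        return False
    try:
        return forward(host) == ip
    except OSError:
        return False
```

A request claiming to be Googlebot would pass only when its IP reverse-resolves into `googlebot.com`/`google.com` and that hostname forward-resolves back to the same IP.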
herculano, it looks to me like this could be a human-review type of filtering.
Have you been doing anything at all that would cause a negative review from Google, or have you been involved in some three-way linking lately?
what is the base tag that you enabled?
Is it important to have?
Check the home page for canonicalization. If you see multiple versions of the same page, then:
Apply an htaccess 301 redirect if possible; if that's not possible, use rel="canonical" to point Google to your best canonical version of the homepage.
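For the htaccess route, a minimal sketch of the 301 redirect, assuming Apache with mod_rewrite and www.example.com as the preferred host (swap in your own domain):

```apache
# Redirect the non-www host to the www version with a 301,
# so only one canonical homepage URL gets indexed.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```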
Copy text from your homepage, text you believe to be unique, and place in quotes in Google. What do you see?
@bwnbwn all of my links are one-way unique links, but some might not be as relevant as others, and none of them are paid.
@erku, that's to tell the browser (and, I believe, the search engines) to use the base URL for all relative URLs.
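In other words (using example.com as a stand-in domain), relative URLs resolve against the base rather than the current page's URL:

```html
<head>
  <base href="http://www.example.com/">
</head>
<!-- With the base above, this relative link resolves to
     http://www.example.com/widgets/green.html regardless of
     which page the link appears on. -->
<a href="widgets/green.html">green widgets</a>
```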
@cainIV I tried searching for some unique strings on my page, and I did see some index.php pages with URL parameters from previous PPC efforts showing up in the index. On January 10th I added entries in Google Webmaster Tools to ignore all possible parameters, so I'm guessing I may have duplicate pages because of the parameters. However, Google Webmaster Tools doesn't report any duplicate HTML pages; I do have 2 duplicate titles and duplicate meta descriptions, which were also caused by the URL parameters.
- Do you feel this is possibly the cause of the delisting of all pages across the site from Google? I'm trying to make sure I learn from this mistake so I don't commit it again in the future.
|@cainIV I tried searching some unique string on my page I did see some index.php pages with url parameters showing up in the index previous ppc efforts |
You should only have one page on your website show up for unique searches of that text. If any of that text matches previous ppc efforts, simply rewrite the text.
|I did see some index.php pages |
Please elaborate on this part for us.
The string is actually a disclaimer at the bottom of our pages, so we have it across the site; it's part of our site template. Is this a bad idea?
I saw a bunch of index.php?param=blah&param2=blah2... in Google.
For 2 days we saw Google serving our pages on the first page, and it gave us some traffic; today it disappeared again. We saw about 10% each day of what we usually get...
Is this teasing any indication of what they are trying to test? Is this a symptom of one of their filtering algorithms?
I have seen behavior where they only show one group of indexed keywords for your site. For example, I used to rank on all of these:
cheap green widget
affordable green widget
small green widget
cheap red widget
inexpensive red widget
unique red widget
holiday red widget
From time to time Google only shows our page for all the
"green widget" keywords,
and at other times Google shows our page for all the
"red widget" keywords.
I'm not sure what filtering these symptoms point to.
Check the unique content on the pages, not the disclaimer area. Google likely filters the disclaimer out of the equation as long as the pages have unique content elsewhere. I prefer to use an image for the disclaimer to reduce the amount of duplicate text, but it's likely not necessary.
The easiest way to glean insights in my opinion is to copy the unique text from your landing pages, and compare this to the actual pages returned from your website in google.com.
The pages that are returned should match exactly, without parameters.
Make sure you do some reading about canonicalization of the website's home page. I find far too often that this is still a problem for webmasters, even knowledgeable ones, and it is a good item to check off your list early rather than late in a ranking-drop assessment.
I just read up on the canonicalization of home page from google blogspot. This is what you were talking about right? I never knew about this. Thanks. I'll add this.
To consolidate these three links into one, add this to the head:
<link rel="canonical" href="http://www.example.com/product.php?item=swedish-fish" />
Let me look at the other stuff you wrote.
Are any links fishy from G's perspective? Forget your view; look at them as Google might. Who do they link to normally? Are they related to your site? How do they link to you: 'sitename.com' or 'keyword keyword2'? Do they look like bought links?
The proxy hijack in almost all cases occurs only when Google gives your site zero value in backlinks IMO.
walkman, they might seem paid, but none of them are, and we have over 11k backlinks to different pages of the site...
Anyway, our site came back January 20th, after about 11 days in the dark... I'm not really sure which fix caused it to come back, but I'm glad it's back. I'm implementing all the fixes across all our sites anyway; they're all good to have...
happy camper again...
During the time your site was suppressed in the index, where did you rank for searches on your company/site name? Was it completely gone? Or, in the -30 to -50 range?
It was hovering around 1 to 10... but still on the first page...
It's weird, I'm finding some of my subpages are no longer indexed in Google... but they still have PR 4 or something, and they don't have a cached version either...