301's were not a factor in my loss either. The only changes that have been done to our site were adding about 500 or so unique products and adding a bunch of new unique content pages.
Our plural "widgets" term jumped back last night, but our "holiday widget" and "widget" terms are still in the 100+ position after being in the 1st spot for years.
Fear the iron fist of G. -50 on 5 sites. Thanks Mr G.
I have an authority site (10+ years old) that has been totally knocked out of the SERPs for most holiday keyword phrases - traffic is down 60%. The only thing I did recently out of the ordinary was to 301 redirect /directory/index.html to /directory/ to help with duplicate content, as per this thread:
Most inbound links (thousands) were pointing to the /directory/index.html page, including all internal links to the homepage - www.example.com/index.html. When PageRank updated in October, I went from a PR6 to a PR5 on the homepage. I'm now questioning whether PageRank is passing through the redirect.
Should I remove the redirect?
I wouldn't remove the 301. I WOULD fix all those internal links.
301s are a good way to catch old link juice, not to mention help actual visitors find the page they were looking for. They are not an excuse to be haphazard with your internal navigation.
I completely agree. I've heard from a few people the past couple of days who panicked - thinking that somehow their 301's were causing ranking problems so they removed them. Now they have more ranking problems.
Don't remove your 301 redirects unless you KNOW you were doing something deceptive with them. Legitimate 301 redirects are a best practice.
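For anyone setting this up, here is a minimal sketch of the kind of legitimate 301 discussed in this thread (index.html collapsed onto the directory URL), assuming Apache with mod_rewrite enabled in an .htaccess file; treat it as an illustration, not a drop-in rule for your server:

```apache
# A minimal sketch, assuming Apache + mod_rewrite: permanently redirect
# any /.../index.html request to the bare directory URL.
RewriteEngine On
# Match the client's original request line so we don't loop when
# DirectoryIndex serves index.html internally.
RewriteCond %{THE_REQUEST} /index\.html
RewriteRule ^(.*/)?index\.html$ /$1 [R=301,L]
```

The RewriteCond on THE_REQUEST is the important part: without it, Apache's internal DirectoryIndex lookup of index.html would itself match the rule and cause a redirect loop.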
[edited by: tedster at 4:37 pm (utc) on Nov. 15, 2008]
Have people seen fluctuations in the site: command?
I've seen traffic up for sites which have decreased counts with the site: command, which is odd - you would think it would be the other way around.
Yes, the past week or so I've seen more fluctuation in the site: operator results than normal. Because of the way data is sharded at Google, the reported numbers are a challenge for Google to get accurate - and because they are not core search results, they are also not a top priority. I always suspect some kind of significant back end changes when the site: operator results start dancing.
>>Legitimate 301 redirects are a best practice.
Agree, this is not the problem. Google has no problem with 301s.
- Shop keeper ranks #1 in Google for "large widgets" for some time
- Shop keeper, thanks to the high traffic surge from being #1 in Google, stocks up on "large widgets" ready for the Christmas sales
- Shop keeper's website gets whacked by Google's algo dial twist just a couple of weeks before the main trading period, and now finds "large widgets" in position 50+ in Google, hence no traffic
- Shop keeper buys into AdWords for the term "large widgets" and gives back to Google a portion of his profits, moans like hell about it on WebmasterWorld, looks for tweaks and reasons why, hates Google for it, but buys AdWords all the same
- Google's final-quarter profits surge again as a result
Next year the cycle repeats with a different algo dial-knob twist that has the same end result, and another load of shopkeepers who ranked for something have the same experience!
Call me cynical, but haven't we all seen this before? By now the penny should drop - Google is in the business of selling traffic, not giving it away.
As such, Google can only ever be a part of a website's marketing reach; it's just that it has such a hold that it's often hard to see past this.
In the meantime, best practice is to leave things alone. Sure, if something needs fixing then fix it, but don't change your site to fit an ever-changing Google algo - it will just drive you crazy... and I should know, I'm still taking a dozen "calms tablets" a day and humming and rocking in my chair muttering "bloody Google..."
| Google is in the business of selling traffic, not giving it away |
Join the club Rich. Google really took away the Internet from everybody. Plus now there’s a price tag on everything because of them.
The only positive point I can tell you is I have one commerce site that goes untouched by Google year after year. It has absolutely no connections with any Google program such as Adsense, Adwords, WMT, or G-Mail. I use a few minor tricks on it to keep Google’s snoopy ___ in check. It hums merrily along in most engines. The caveat like you say is if I touch it or add to it Google will somehow and someway destroy the business. That’s why I relegate it to price and gif changes. I am careful to make it appear the same. None of what I say though is guaranteed to work because Google wants its money more than anybody does. They’ll find a crack eventually.
>> Legitimate 301 redirects are a best practice.
what are "non-legitimate" 301's?
"Legitimate 301 redirects are a best practice" means the 301 is a proper 301 that resolves correctly. Hosting companies will set up "301s" where the header actually shows a 302, not a 301. That is what he means by a legitimate 301.
Before I knew the difference, I asked my host to set up a 301. They did, but when checked it was really a 302. So if you don't set up the 301 yourself - if your host, programmer, or whoever sets it up - always check to make sure it throws a proper 301 and not a 302.
This may actually be one of the issues faced by some of the sites that have experienced problems after setting up 301s: if not confirmed, they may actually be 302s, and yes, that will cause serious problems.
After setting up a 301/302/404 or whatever status code, you should always do a header check. Always.
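The header check described above can be done with any HTTP client that does not follow redirects. Here is a small Python sketch; to keep it self-contained, a throwaway local server stands in for a host that was asked for a 301 but actually serves a 302 (the paths are invented for illustration):

```python
# Minimal sketch of a redirect header check: request the URL without
# following redirects and inspect the raw status code.
import http.client
import http.server
import threading

class MisconfiguredHost(http.server.BaseHTTPRequestHandler):
    def do_HEAD(self):
        # The host claims this is a "301", but the header says otherwise.
        self.send_response(302)
        self.send_header("Location", "/directory/")
        self.end_headers()

    def log_message(self, *args):
        pass  # silence request logging

server = http.server.HTTPServer(("127.0.0.1", 0), MisconfiguredHost)
threading.Thread(target=server.serve_forever, daemon=True).start()

# http.client never follows redirects, so we see the true status code.
conn = http.client.HTTPConnection("127.0.0.1", server.server_address[1])
conn.request("HEAD", "/directory/index.html")
status = conn.getresponse().status
print(status)  # prints 302 -- a "301" that is really a 302
server.shutdown()
```

Against a live site you would point HTTPConnection at the real host and path; any status other than 301 on a supposedly permanent redirect is the misconfiguration this post warns about.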
|what are "non-legitimate" 301's? |
Redirects that are in some way deceptive. I'm not going to give a tutorial on how to be deceptive, but one classic example would be a 301 redirect that is cloaked -- so that only search engines get that redirect, but regular browsers get something else.
|what are "non-legitimate" 301's? |
Also: webmasters in the past have bought high PR sites (domains, actually) based on their PR value; then they tried to pass that juice to other sites to boost SERPs via Redirects.
(Some domain drop sites will even tell you which expiring domains have many IBLs.)
Clever idea, but Google caught it, of course, as they usually do.
So what about whitenight's "Ghost DataSet", eh? I see many signs that Google has once again changed their back end infrastructure.
One tell that I see - the site: operator numbers are all over the map again. In the past, those numbers have been notoriously out of line many times because of the way Google shards data. They can only pull estimates, with no way to just get a straight count. But in recent months, Google was doing rather well with those number estimates.
Now we're all wonky again - I'd say something changed in the back-end set-up that made the old method less accurate. And beyond just the numbers, a number of URLs are missing from site: results that are still showing up in the SERPs and getting normal traffic.
(reference http://www.webmasterworld.com/google/3777510.htm [webmasterworld.com])
At pubCon, bwnbwn told me that he did ask Matt Cutts about the October 1 strangeness. Apparently Matt's answer was pretty much the same, with just a bit of new vocabulary. It seems Google had some trouble "polling one IP" and they only got partial data to integrate into their full results. That went live without the normal QA, and so we had some very funky SERPs.
It sure looks like the data that was missing was a "domain root" list for a subset of websites, eh? ...and it was the domain roots for some pretty good websites at that.
But the problem was fixed within a couple of days, so if you're still seeing something unhappy for your site, don't just keep on hoping it's still a mistake. It looks like this is what you get!
[edited by: tedster at 5:01 am (utc) on Nov. 17, 2008]
lol stop reading between the lines and then EXPLAINING in clear terms what I'm pointing at, tedster.
That info is only reserved for those who can get over my sarcasm, condescension, and/or bravado.
That's half my schtick [en.wikipedia.org]! and you know that! =P
As I intimated before, this was a huge update and change to the algo (although some may not notice it, and others surely are noticing it).
So one can expect the full implications to be seen after New Year's, when Goog will no longer have to worry about relevant holiday SERPs and can really let this new algo go.
The 301 issues may be resolved by then... or not. I haven't paid enough attention to them yet to tell whether it's a corollary issue or a unique one.
If it hits one of my sites, I'll be much more inclined to study it.
[edited by: whitenight at 4:24 am (utc) on Nov. 17, 2008]
Ted, glad we are talking about the site: changes, as I think this is something important to notice right now. Yes, site: commands have been a bit wonky in the past, but they are a total mess right now, especially when certain pages are sought, as in:
site:example.com "just bring me pages with this text"
One thing I suspect is this: Google has expanded the number of pages in their primary index, and they've also brought many pages over from the supplemental index into it. I do notice that some sites with less link strength are having their pages deindexed altogether (from the primary or the supplemental index - though this observation may just be the site: command acting up as well).
Here's something I like to think about. Google dropped the "supplemental results" tag [webmasterworld.com] over a year ago. At that time, there seemed to be just two partitions of their full data set, which we called the regular index and the supplemental index.
About the same time, Google applied for a patent on selectively searching partitions of a database [webmasterworld.com]. That's "partitions" with an "s".
I'm assuming that Google has moved beyond a two-partition infrastructure, and that whatever the /* hack shows us today, it isn't what we once saw.
I still find /* very useful. I find very nice correlations between the link strength of a site and the proportion of a site's pages that are shown when using /*.
Will add one thing: the proportion of pages shown when using /* with the site: command has also shown big fluctuations on some of the sites I monitor, and those fluctuations correspond to ranking changes.
|Here's something I like to think about. Google dropped the "supplemental results" tag over a year ago. |
OK, let's talk about it.
Did supplementals (a) go away completely, or (b) just move behind the curtain?
If (a), how does this explain the SERPs?
If (b), how does this explain the SERPs?
How would a setup with more than two partitions affect the SERPs?
How can we use this information to our advantage?
I am now seeing pharmacy auto-generated-page spam for very popular terms. Even worse, these sites redirect to another domain (a subdomain, even).
One example: 3 of the top 10 results are 3 different domains all with redirects to the spam auto-generated content site/page.
This has knocked out people like AOL from the top 10.
I need to research more terms to see how prevalent this is, but I haven't seen this kind of crap since I stopped doing it in 2005/6 (last used DSG) because Google was just too good at detecting it.
Something is definitely amiss..
I should add a note to my post above (Ted, please combine posts if you wish).
But the keyword terms are NOT pharmacy terms. They are in the health genre, but certainly not viagra/cialis junk like the results I am seeing.
So these spammers have broken through the barrier, at least for now.
302 internal redirects: are these now being considered suspicious?
I have an application sorting form set up: when a visitor answers the three pre-qualification steps/questions, they are sent (based on the criteria they enter) to a different internal application page.
I noticed in my web logs that these pages are being counted as 302 redirects...
When someone clicks my "apply now" button, the call to action goes from /appsorter.php to criteriabasedapp1.html or criteriabasedapp2.html.
/appsorter.php stays constant throughout all this, and shows up in the logs as being 302'd thousands of times internally...
Or, is all this a non-issue you think?
*I do have the folder (containing all the app-sorting files, PHP stuff/logic stuff) blocked by robots.txt, so there's no indexing, no following, etc... if that makes a difference.
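For what it's worth, a 302 is arguably the right status for this flow, since the destination varies per visitor and per answer set. Here is a hypothetical Python sketch of the sorting logic (the real sorter is PHP; the file names and criteria below are invented to mirror the post):

```python
# Hypothetical sketch of an "app sorter": pick a destination page from
# the visitor's pre-qualification answers, then issue a temporary 302.
# A 301 would wrongly tell crawlers the move is permanent.

def sort_application(answers):
    """Choose the application page based on the three pre-qualification answers."""
    if answers.get("step1") == "yes" and answers.get("step2") == "yes":
        return "/criteriabasedapp1.html"
    return "/criteriabasedapp2.html"

def appsorter_response(answers):
    """Build the status line and headers for the internal redirect."""
    return ("302 Found", [("Location", sort_application(answers))])

status, headers = appsorter_response({"step1": "yes", "step2": "yes"})
print(status, dict(headers)["Location"])
```

Since the sorter folder is blocked by robots.txt anyway, these internal 302s shouldn't look like anything other than normal form handling.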
Sorry to be so late joining this discussion. I am new to the webmasterworld board.
I do SEM/SEO work for local companies and have been involved in internet marketing since 1997.
One of my clients dropped off the map on October 31 and hasn't come back. I don't think this is a result of any nefarious or black-hat activity, as I have worked with this client for several years and they are very serious about staying on the up-and-up with the search engines. They have been in operation since 1997 and get over 100,000 unique visitors a day.
Using the "site:" command on Google shows 92,900 pages indexed, but we don't show in SERPs - not even for the company name. If anyone has any ideas, I'd appreciate them.
I filed a re-inclusion request on the 11th and a second after doing additional research on the 18th. This clearly happened on or about Oct 31, and I was wondering if others are still experiencing this problem?
Interesting observation ...
I just noticed traffic from Google 1st page results coming into my site for terms that disappeared Nov 2nd. When I clicked on the referral link - sure enough - I'm back in position 4 from this data center. Weird thing is that there is no "cached" version of my page. There is a "similar pages" link - but no cached version.
I tried looking info up on various data centers and this site doesn't show up on 1st page results anywhere else - but I may be looking at old data center lists.
If I do a "site:example.com" command the page that is now ranking (on just one datacenter) shows up with a cached version - just not on the search result page for that one datacenter. I'm confused.
nmjudy - I also noticed that same thing. The keywords that keep dropping and coming back have shown the homepage listing with no cache.
I'm seeing strong pickup for the Description or H2 text on one site. The phrase Keyword 1 Keyword 2 is neither in the domain name nor site title, nor anchor text (nor IBLs that I know of), but just the meta Description tag (once) H2 tag (once) and body text (once) on the home page.
If I can't reasonably fit text in a Title I bump it to the Description. I sometimes wonder if Google sees the meta Description as an extension of the Title, or Subtitle, which in a way it is. We know Google strongly weights the Title; maybe the Description gets more weight than we realized. (I doubt H2 > Description for ranking.)
I just checked, and the 90,000+ pages indicated by the 'site:' command have suddenly grown to 218,000, but 'cache:' shows no pages at all for this site.
I just saw something really bizarre in my raw logs.
Google, according to the log, brought a visitor to Site A that usually comes from Site B. Site B has top 10 ranking for the search phrase; whereas Site A has none of the keywords, and never brought any visitors before for those keywords.
Both sites are on the same server. Either the raw logs got their wires crossed (which I have never seen before, and I've looked at the raw logs for different sites on this hosting account many times in the past), or Google got its wires crossed.
I can't figure out another explanation.
If the latter situation is indeed the case, it would indicate Google's algo somehow looks at various websites on the same server and this glitch exposed it.
(The visitor came via EARTHLINK, INC in NYC. The timing, if anybody else happens to see the same anomaly, was 08/Nov/2008:21:49:35 -0500.)
I've made a mistake in the past putting the Google Analytics code for one site into another. But here we're talking about raw logs.
"GET / HTTP/1.1" 400 312 "http://www.google.com/search?hl=en&q=keyword1+ keyword2&aq=1&oq= keyword1+keyword3" "-"
Note: there weren't any 301 redirects or any other redirects involved in this (at least not at my end). Google, however, may have done some funky redirect, e.g. defaulting to a secondary site on the same server based on some matching data in its historical site profile (connecting one site to another).
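For anyone digging through raw logs for the same anomaly, the referring query can be pulled out of a combined-format entry like the one quoted above. A small Python sketch (the log line is slightly normalized here; the stray spaces in the quoted original look like forum line-wrapping, but that's an assumption):

```python
import re
from urllib.parse import urlparse, parse_qs

# Pull the Google search query out of a combined-format log entry
# like the one quoted above.
line = ('"GET / HTTP/1.1" 400 312 '
        '"http://www.google.com/search?hl=en&q=keyword1+keyword2'
        '&aq=1&oq=keyword1+keyword3" "-"')

# The referrer is the first quoted http(s) URL in the line.
referrer = re.search(r'"(https?://[^"]+)"', line).group(1)
params = parse_qs(urlparse(referrer).query)
query = params.get("q", [""])[0]  # '+' decodes to a space
print(query)  # keyword1 keyword2
```

Grepping the logs this way makes it easy to list every query Google claims sent a visitor to the "wrong" site and compare them against the site that actually ranks.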