That's the Ghost Dataset.
I don't see it as a bad thing those home pages went "missing". I see it as a good thing.
Would you be kind enough to explain your rationale for this please.
[edited by: tedster at 9:28 pm (utc) on Nov. 5, 2008]
Our plural "widgets" term jumped back last night, but our "holiday widget" and "widget" terms are still in the 100+ positions after being in the 1st spot for years.
Most inbound links (thousands of them) were pointing to the /directory/index.html page, and all internal links to the homepage pointed to www.example.com/index.html. When PageRank updated in October, my homepage went from a PR6 to a PR5. I'm now questioning whether PageRank is passing through.
Should I remove the redirect?
Don't remove your 301 redirects unless you KNOW you were doing something deceptive with them. Legitimate 301 redirects are a best practice.
[edited by: tedster at 4:37 pm (utc) on Nov. 15, 2008]
- Shopkeeper ranks #1 in Google for "large widgets" for some time
- Shopkeeper, thanks to the high traffic surge from being #1 in Google, stocks up on large widgets ready for Christmas sales
- Shopkeeper's website gets whacked by Google's algo dial twist just a couple of weeks before the main trading period, and now finds that "large widgets" is in position 50+ in Google, hence no traffic
- Shopkeeper buys into AdWords for the term "large widgets" and gives back to Google a portion of his profits, moans like hell about it on WebmasterWorld, looks for tweaks and reasons why, hates Google for it, but buys AdWords all the same
- Google's final-quarter profits surge again as a result
Next year the cycle repeats with a different algo dial-knob twist that has the same end result, and another load of shopkeepers that ranked for something have the same experience!
Call me cynical, but haven't we all seen this before? By now the penny should drop - Google is in the business of selling traffic, not giving it away.
As such, Google can only ever be part of a website's marketing reach; it's just that it has such a hold that it's often hard to see past this.
In the meantime, best practice is to leave things alone. Sure, if something needs fixing then fix it, but don't change your site to fit in with an ever-changing Google algo - it will just drive you crazy... and I should know, I'm still taking a dozen "calms tablets" a day, humming and rocking in my chair, muttering "bloody Google..."
Google is in the business of selling traffic, not giving it away
Join the club Rich. Google really took away the Internet from everybody. Plus now there’s a price tag on everything because of them.
The only positive point I can tell you is I have one commerce site that goes untouched by Google year after year. It has absolutely no connections with any Google program such as Adsense, Adwords, WMT, or G-Mail. I use a few minor tricks on it to keep Google’s snoopy ___ in check. It hums merrily along in most engines. The caveat like you say is if I touch it or add to it Google will somehow and someway destroy the business. That’s why I relegate it to price and gif changes. I am careful to make it appear the same. None of what I say though is guaranteed to work because Google wants its money more than anybody does. They’ll find a crack eventually.
Before I knew the difference, I asked my host to set up a 301. They did, but when checked it really was a 302. So if you don't set up the 301 yourself, then after your host, programmer, or whoever sets it up, always check to make sure it throws a proper 301 and not a 302.
This may actually explain some of the issues faced by sites that have experienced problems after setting up 301s: if not confirmed, they may actually be 302s, and yes, that will cause serious problems.
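One quick way to confirm what the server actually sends is to request the URL without following redirects and read the raw status code. Here's a minimal Python sketch of that check (the URL below is a placeholder); any tool that shows raw response headers works just as well.

```python
import http.client
from urllib.parse import urlsplit

def is_permanent(status):
    # 301 and 308 are permanent redirects; 302, 303 and 307 are temporary
    return status in (301, 308)

def redirect_status(url):
    """Request a URL WITHOUT following redirects; return the raw
    status code and the Location header the server sent."""
    parts = urlsplit(url)
    conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=10)
    try:
        conn.request("HEAD", parts.path or "/")
        resp = conn.getresponse()
        return resp.status, resp.getheader("Location")
    finally:
        conn.close()

# Usage (placeholder URL):
# status, location = redirect_status("http://www.example.com/index.html")
# if not is_permanent(status):
#     print("Warning: server answered with a %d, not a 301" % status)
```

Browsers follow redirects silently, which is exactly why a mislabelled 302 can sit unnoticed for months.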
what are "non-legitimate" 301's?
Also: webmasters in the past have bought high PR sites (domains, actually) based on their PR value; then they tried to pass that juice to other sites to boost SERPs via Redirects.
(Some domain drop sites will even tell you which expiring domains have many IBLs.)
Clever idea, but Google caught it, of course, as they usually do.
One tell that I see - the site: operator numbers are all over the map again. In the past, those numbers have been notoriously out of line many times because of the way Google shards data. They can only pull estimates, with no way to just get a straight count. But in recent months, Google was doing rather well with those number estimates.
Now we're all wonky again - I'd say something changed in the back-end set-up that made the old method less accurate. And beyond just the numbers, a number of URLs are missing from site: results that are still showing up in the SERPs and getting normal traffic.
(reference http://www.webmasterworld.com/google/3777510.htm [webmasterworld.com])
At pubCon, bwnbwn told me that he did ask Matt Cutts about the October 1 strangeness. Apparently Matt's answer was pretty much the same, with just a bit of new vocabulary. It seems Google had some trouble "polling one IP" and they only got partial data to integrate into their full results. That went live without the normal QA, and so we had some very funky SERPs.
It sure looks like the data that was missing was a "domain root" list for a subset of websites, eh? ...and it was the domain roots for some pretty good websites at that.
But the problem was fixed within a couple of days, so if you're still seeing something unhappy for your site, don't just keep on hoping it's still a mistake. It looks like this is what you get!
[edited by: tedster at 5:01 am (utc) on Nov. 17, 2008]
As I intimated before, this was a huge update and change to the algo (although some may not notice it, others surely are noticing it).
So one can expect the full implications to be seen after New Year's, when Google will no longer have to worry about relevant holiday SERPs and can really let this new algo go.
The 301 issues may be resolved by then... or not. I haven't paid enough attention to them yet to tell whether it's a corollary issue or a separate one.
If it hits one of my sites, I'll be much more inclined to study it.
[edited by: whitenight at 4:24 am (utc) on Nov. 17, 2008]
site:example.com "just bring me pages with this text"
One thing I suspect is this: Google has expanded the number of pages in their primary index and has also brought many new pages from the supplemental index into it. I do notice that some sites with less link strength are having their pages deindexed altogether (from the primary or supplemental index - though this observation may just be the site: command acting up as well).
About the same time, Google applied for a patent on selectively searching partitions of a database [webmasterworld.com]. That's "partitions" with an "s".
I'm assuming that Google has moved beyond a 2 partition infrastructure, and that whatever the /* hack shows us today, it isn't what we once saw.
Will add one thing. The proportion of pages shown when using /* on the site command has also shown big fluctuations in some of the sites I monitor which also correspond to ranking changes.
Here's something I like to think about. Google dropped the "supplemental results" tag over a year ago.
OK, let's talk about it.
Did supplementals (A) go away completely, or (B) just move behind the curtain?
If A, how does this explain the SERPs?
If B, how does this explain the SERPs?
How will a non-2-partition setup affect the SERPs?
How can we use this information to our advantage?
One example: 3 of the top 10 results are 3 different domains all with redirects to the spam auto-generated content site/page.
This has knocked out people like AOL from the top 10.
I need to research more terms to see how prevalent this is but I haven't seen this kind of crap since I stopped doing it in 2005/6 (last used DSG) because google was just too good at detecting it.
Something is definitely amiss..
I have an application-sorting form set up, so when a visitor answers the three pre-qualification steps/questions (based on the criteria they enter), they are sent to a different internal application page.
I noticed in my web logs that these pages are being counted as 302 redirects...
when someone clicks my apply now button the call to action goes as: /appsorter.php to criteriabasedapp1.html or criteriabasedapp2.html
/appsorter.php stays constant throughout all this, and shows up in logs as being 302'd thousands of times internally...
Or, is all this a non-issue you think?
*I do have the folder (containing all the app-sorting files, PHP stuff/logic stuff) blocked by robots.txt, so there's no indexing, no following, etc... if that makes a difference.
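For what it's worth, a 302 is arguably the right code here: the destination varies per visitor, so the redirect is temporary by definition, and with the folder blocked by robots.txt it shouldn't matter for indexing either way. A rough Python sketch of the same idea (the "amount" criterion is a guess at what appsorter.php checks; the page names are taken from the post):

```python
from http.server import BaseHTTPRequestHandler
from urllib.parse import urlsplit, parse_qs

def sort_application(criteria):
    """Pick a destination page from the visitor's answers.
    The 'amount' key is hypothetical - stand-in for whatever
    pre-qualification criteria appsorter.php really uses."""
    if criteria.get("amount", [""])[0] == "large":
        return "/criteriabasedapp1.html"
    return "/criteriabasedapp2.html"

class AppSorter(BaseHTTPRequestHandler):
    def do_GET(self):
        query = parse_qs(urlsplit(self.path).query)
        # 302 is deliberate: the answer depends on this visitor's
        # input, so it should NOT be cached as permanent (301)
        self.send_response(302)
        self.send_header("Location", sort_application(query))
        self.end_headers()
```

The key point is that the status code is sent explicitly, so the logs showing 302s reflect an intentional design choice rather than a misconfiguration.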
I do SEM/SEO work for local companies and have been involved in internet marketing since 1997.
One of my clients dropped off the map on October 31 and hasn't come back. I don't think this is a result of any nefarious or black-hat activities, as I have worked with this client for several years and they are very serious about staying on the up-and-up with the search engines. They have been in operation since 1997 and get over 100,000 unique visitors a day.
Using the "site:" command on Google shows 92,900 pages indexed, but we don't show in the SERPs - not even for the company name. If anyone has any ideas, I'd appreciate hearing them.
I filed a re-inclusion request on the 11th, and a second one on the 18th after doing additional research. This clearly happened on or about October 31, and I was wondering if others are still experiencing this problem?
I tried looking info up on various data centers and this site doesn't show up on 1st page results anywhere else - but I may be looking at old data center lists.
If I do a "site:example.com" command the page that is now ranking (on just one datacenter) shows up with a cached version - just not on the search result page for that one datacenter. I'm confused.
If I can't reasonably fit text in a Title I bump it to the Description. I sometimes wonder if Google sees the meta Description as an extension of the Title, or Subtitle, which in a way it is. We know Google strongly weights the Title; maybe the Description gets more weight than we realized. (I doubt H2 > Description for ranking.)