One of my pages, created in July 2012, ranks on page 1 in the foxstart results; it has always ranked on page 2/3 in mainstream Google.
However, a website I created about a week ago doesn't show up in foxstart results at all (0 results), yet it shows up in mainstream Google results.
The total number of results in the foxstart SERPs is much lower. About 20 times lower in my niche.
Interesting is that if you "view source" you can't see the SERP.... Hidden? Is that allowed for Google but not for us?
Sure you can "view source" on the SERP (and on the entire page): use Dragonfly or Firebug. "Drill down" with Dragonfly (Opera) and you have absolutely everything visible..;)
I wonder if something went through this afternoon. Our traffic was average all day...and then like a light switch it has been turned up significantly.
Definitely looks like pre-panda results for some of my searches in Foxstart - pandalized sites in their old positions
I just checked a page that had the title and meta description changed on 2 September. Both Foxstart and Google are showing the new version.
They are also showing ranking changes on both, though with a different top 10. I have to say I prefer Foxstart's; the spam that had cluttered page 1 for that term is gone.
Whether that means the algo is new or not, or if it will hit the main Google results is another thing altogether, but it doesn't look like old results.
Sorry - just to edit my previous post - it should have said "pre-penguin"
A number of searches on my keywords are definitely showing the types of results from before Penguin
Why, suddenly, is everyone referring to Foxstart?
The new buzzword or what? Google results are seen in Google; relying on a third-party interpretation might be a bit dodgy. What is the reason for this sudden reliance on Foxstart? I would love to know.
For me, I am seeing no signs of a Penguin 2.0 update.
|Why, suddenly, is everyone referring to Foxstart? |
You can check any other site which has custom Google search. Try blackle. Different results.
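One way to see how much two of these "custom search" front ends disagree is simply to diff the URLs they return for the same query. A minimal sketch: the two hard-coded result lists below are invented sample data in the shape Google's Custom Search JSON API uses (an `items` array where each item carries a `link` field); in practice you would fetch them live.

```python
# Sketch: compare the results two custom-search front ends return for the
# same query. The lists stand in for the "items" arrays of two API responses.

def result_links(items):
    """Extract the ranked list of URLs from a CSE-style 'items' array."""
    return [item["link"] for item in items]

def compare_serps(items_a, items_b):
    """Return (only_in_a, only_in_b, shared) as sets of URLs."""
    a, b = set(result_links(items_a)), set(result_links(items_b))
    return a - b, b - a, a & b

# Hypothetical sample data:
serp_one = [{"link": "http://example.com/page1"},
            {"link": "http://example.com/page2"}]
serp_two = [{"link": "http://example.com/page1"},
            {"link": "http://example.org/other"}]

only_one, only_two, shared = compare_serps(serp_one, serp_two)
print(sorted(shared))  # the pages both engines agree on
```

Run it against a handful of your keywords and the overlap (or lack of it) makes the "different results" point concrete.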
I can say for sure the foxstart results are not "old" at all. In fact I hope this is an update to the algorithm because our penguin penalties are all but gone in these SERPS.
I have made very strategic page changes after Penguin, waiting for these pages to come back to page one or number one, with no luck. On the foxstart SERPs I searched for a few of the long tail phrases targeted on these pages, and surprisingly they appeared back at #1, where they were pre-Penguin, except with the new title and meta information from a few weeks ago.
The results can't be real, I know from experience that I am not that lucky :(
I also made a new site 3-4 months ago; it doesn't show up either, and some old titles also show on other sites. But I too like those results better than what we see now on Google (AltaVista). Yes, it's that bad.
Zeus, I don't think many of these folks were around during the AltaVista black Monday debacle.
[edited by: textex at 11:46 pm (utc) on Sep 7, 2012]
Ah..the days when we had a netscape icon on our green desktops..
zeus, textex - that's a bit harsh; I was involved in the engineering dept at AltaVista at that time and I thought the results were rubbish then!
No excuse now that modern databases can store so much and access it so fast and crawlers can gather so much info.
A basic crawler and ranking engine now can pretty much do a good job of indexing enough of the web to give you a good search engine.
What seems to be happening is that Google is acting as if its index is much smaller, by giving brands higher rankings and host crowding.
That might be a good idea in theory, but what it actually produces looks like a "lazy" result set. For example, I could create a search engine that returns fewer results - "because less is more" - when the real reason is that I have crawled less than 1% of the web.
To me, I don't get it at all - host crowding and brand promotion just aren't good for searchers. The metrics might look good, but that's only because that's the only choice searchers have.
But take humans into consideration and it just doesn't stack up.
That is why Bing is so close now - it has a shallow index yet still competes so well.
I feel Google has been struggling with the massive volume it has been crawling - and is in effect cutting back and simplifying everything, both the algo and the depth of the index.
More pages, more spam, more tolerance, more data - more problems.
If you discount more data and only rank what you feel is "higher quality" - then you have an easier job yes?
Then you need an algo to determine what goes into the index when you have crawled it, because you need to discount loads and loads of noise (or judged noise).
Now there you have hit it, Swanson :) ..something I've noticed across the last 2 years is that whereas previously G would return a search for a KW as being, say, of 2 billion results, now the same KW search says 750 million results..
They do appear to be discarding a large number of "results"..and yet those pages still exist..so, upon what basis are they discarding so many results (this happens on searches where the figures are much lower too..such as "used to be" 4,000K results, now 280K results and so on) which are still in their index, from the "pool" of "results" from which they pick those which they display..
When we saw signs of them moving to 64-bit, most of us figured that gave them capacity to spare, but are they indeed feeling the strain, and "culling" results..in which case some may be "whitelisted" for inclusion, as you say, "simplifying the algo"..
The evidence that not all parts of the algos run at all times has always been there, even in as simple a matter as white-on-white text etc..if the filter for that runs..it must catch all of it..but we still see hidden text..so they don't run that filter all the time; logically they do the same with other algo elements..the "FUD" keeps webmasters never sure which may be running..so the Fear keeps them (for the most part) in line..and self policing..
Smaller results pool ( pre sorted behind the scenes prior to searches being run )..saves overheads and allows results to be returned ever faster or at least not be swamped by the increase in pages in the index..
Indeed they have made their job easier, but the searchers' true choice is lessened, as is the quality of the results, when taken from a pre-sorted subset of the actual number of pages crawled..
edit..you posted your second comment while I was typing ..yes Panda would fit..clumsy in many ways with possibility of errors, but a way of restricting what they have to search fast in response to any given query typed into the search box ..
pre-sorting also cannot be run continually , it must be cyclical..
Penguin is an attempt to run a concurrent and complementary sorting procedure (targeting elements which, if incorporated into Panda, would make Panda too unwieldy), but Penguin also requires a longer time frame to "pre-sort" the "full crawl index" and discard "surplus" and "not quality", so that what will be presented to the "typed in search" retrieval can be presented fast..
They do have a fixation upon "fast"..Amit is apparently near manic on delivering faster and faster..without pre-sorting, and with an ever increasing size of web..it cannot be done, tricks like pre-fetching pages from returned results notwithstanding..
At one time G boasted of the size of their "base"..now it appears they have been overwhelmed..
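The shrinking "About N results" estimates described above are easy to track over time. A minimal sketch, assuming you record the estimate shown on the SERP at each check; the counts below reuse the 2 billion / 750 million figures quoted earlier, and the observation dates are invented:

```python
# Track Google's "About N results" estimate for a keyword over time and
# compute how much of the earlier result pool apparently survived a cull.

def shrink_ratio(old_count, new_count):
    """Fraction of the earlier estimate that remains (1.0 = no shrinkage)."""
    return new_count / old_count

# Invented dates; counts are the figures quoted in the post above.
history = {
    "2010-09": 2_000_000_000,  # "about 2,000,000,000 results"
    "2012-09": 750_000_000,    # same keyword search two years later
}

ratio = shrink_ratio(history["2010-09"], history["2012-09"])
print(f"{ratio:.1%} of the earlier estimate remains")  # 37.5%
```

The same arithmetic on the 4,000K-to-280K example gives 7%, which is why the drops read as deliberate culling rather than noise.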
If you ask me, I do not think Google can create the results from billions of pages in 0.00023 seconds, so my guess is that they have everything cached from a small set of "starred pages" that changes in the background but is not "live" directly. They gather all the data they can from users searching, create cached SERPs and release them once an hour(?) or so.....
I have been reading most posts here and there seems to be a common thread with throttling, SERP limits and such. Now I need to go research :)
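The "cached SERPs released once an hour" hypothesis above is easy to picture as code: build the result set for a query once from the pre-sorted pool, serve it from cache, and only rebuild after the refresh interval expires. A purely speculative toy model, with every name invented:

```python
import time

CACHE_TTL = 3600  # refresh interval in seconds; the poster guesses ~1 hour

class SerpCache:
    """Serve SERPs from cache, rebuilding only when an entry goes stale."""

    def __init__(self, build_serp, ttl=CACHE_TTL, clock=time.time):
        self.build_serp = build_serp  # expensive ranking over the pre-sorted pool
        self.ttl = ttl
        self.clock = clock            # injectable for testing
        self._cache = {}              # query -> (timestamp, results)

    def search(self, query):
        now = self.clock()
        entry = self._cache.get(query)
        if entry and now - entry[0] < self.ttl:
            return entry[1]           # served instantly, no ranking run
        results = self.build_serp(query)
        self._cache[query] = (now, results)
        return results
```

With a fake clock you can check that a repeat query inside the TTL never re-runs the expensive ranking step, which is the whole point of the hypothesis: the 0.00023-second figure measures a cache lookup, not a live search.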
I'm already in this SEO game 10 years+ easy, but you guys make my stomach hurt. I have major issues on a site that has been up 10 years; we have been hit badly, and I still can't tell if we have any penalties or not. But I think these results and the recent changes from Google in 2012 are sick in a bad, bad way. I'm considering taking action, as it's killed us. I have been around long enough that I can talk to Matt C if needed, but I don't want to go there yet.
Not happy. And really, there needs to be more done to help people understand what is really happening. How would that hurt, really?
Tired of this B$
|How would that hurt, really? |
C'mon Hollywood, (I'm the last one to make apologies for G or to go to bat for them) that said, if they explain too much, it will be abused on a large scale..and the abuse will be very, very bad for both their image and their bottom line..
If too many people work out how to keep high spots in organics, the AdWords revenue stream will not increase at the speed that Wall Street insists upon..
In their position, I think most of us would do the same..anyone not using AdWords is a "potential revenue loss" for G..they cannot make it too easy not to use AdWords..FUD and obscurantism* and fear are about the only way they have to try to keep control of us webmasters as a whole, and they know that it won't work for all of us; the brightest minds are not all in the plex..
Collateral damage has always been part of their algos, one might even say it is necessary to allow the most adaptable to rise, ( fits very much with the Montessori and the Jesuitical way of thinking too ), pre-sorting the SERPS ( in separate cyclical ways with major algos that can cross over and will cross over ( side reference to "mathematical sets" is obligatory here ) but which essentially have different criteria ) in order to keep up the appearance of speed and being the fastest ..
If I were G ..I'd do it that way..knowing that some initial breaking of eggs to make the omelet is unavoidable..
The logic must be..."the smart observant ones will get around it, and live to keep their sites going"..or..if they can't.. they can pay..
Not very "cuddly"..but then if you really look at the Montessori way..and those who were / are raised in it..it never was designed to be "cuddly"..nor did it ever pretend to be so..its products are mostly hyper efficient, and to many would appear to be smilingly ruthless or dispassionate (below the surface at least)..Bezos is another example..
* my spell checker is bi-lingual ,and sometimes I can't trust it, obscurantism may well not be the way the word for that concept is actually written in English, but my system doesn't give me a red line under it ..so :)
I have no idea if the custom search results foretell anything about a pending update (penguin/panda/other), but I will say that when I search snippets from my homepage (in quotes) on foxstart (and others), my site is the only one being returned, whereas on google.com, there are dozens, even hundreds of scrapers suppressing me to the supplemental body. The index is much, much smaller on some of the terms I am watching. I see a few of you talking about index size, but I'm not sure if you're referring to those results or not.
Exciting like the old days this!
"Custom search" ( from what I observe, is IMO and IME ) SERP is a "mix of sub sets" of "subsets" of what we normally refer to as the "index"..and one "custom search" site's SERP .may very well not be the same as another "custom search" site's SERP..even for the same query , performed from the same machine or IP address, in the same GEO area..and it may vary on the same site over a short period of time..
And that is not even getting into personalised or not, signed in or not , cookies cleared or not..think of "custom search" sites as like AOL search or BT search ( *BT actually use a custom Yahoo ( ex Overture ) feed, now of Bing ..but the point is the same..it is a "custom search" which they get a percentage of the click revenue from )..as sort of walled garden, a window with a restricted, sometimes changing view, cherry picked from some of what some parts of the index can be or might be..
When I, at least, was talking about a smaller index size..or smaller result set numbers shown on G..I meant the direct search SERP, as accessed directly via Google's own page..not via "custom search" sites..
Those numbers have been reducing for the same searches for at least the two to two and a half years that I've been aware of..while the actual web and the number of pages dealing with the search terms have been growing..thus G is pre-processing the "set" used to provide search..Panda and Penguin would appear to be parts of that pre-processing or cull..and given the type of processing that would need, and the size of the main data set, and the smaller ones..that would explain why they must be cyclical..and with comparatively long gaps between runs..for now..
G hasn't reduced crawling..so it faces ever expanding and increasing amounts of data to sift, and apparently pre-sifting by concurrent separate algos, Panda and Penguin and one I call "zig zag" (the "catch the reactive SEOs" one that the patent from 2010 covers the basics of..it "zigs" and then watches, to see if you "zag") allows them to still return fast results..and to Amit (and Amit decides what happens re search) speed of returned SERPs and page load speeds/times (both Google's and ours) is "all"..
Bing crawls less frequently and less deeply, and consequently has a smaller amount of data to deal with..a different way of arriving at similar ends..
*BT..British Telecom..has a "custom search" for its customers..many ISPs around the world do..some use G, some Y (originally Overture "feeds") or, as now, Bing..
"Custom search" might "foretell" some things ..or give hints..but it is more likely useful as an indicator that sites or pages are "in there somewhere" and that were it not for ( at the moment black and white animals and birds..and "zig zag" <= "zebra" ? ;-) then they would not rank where they do in the results on Google's direct SERPS..but closer to where they may be glimpsed in through some "custom search" SERP windows as SERP built from a subset of a subset..kind of like looking though a crack in the side of the machine ..but what you see may not be what comes out the front ..and probably the order will have changed, when it does, because the "refining", isn't done in what you are seeing through the cracks..
I find "the why"..more interesting than "the what"..which ( "the why" ) is usually what interests me about Google anyway..;)..
Also bear in mind, as I mentioned earlier in this thread ..the foxstart site "custom search" sets a G history cookie ..even if you are signed out and have history turned off ..the more you search ..the more it is showing you what you appear to be looking for, or wanting to see..it will be showing you "tweaked results" based upon your "search history" performed on it ..and it is also watching your precise IP..
Other "Custom search" sites will be doing the same..
|when I search snippets from my homepage (in quotes) on foxstart (and others), my site is the only one being returned, whereas on google.com, there are dozens, even hundreds of scrapers suppressing me to the supplemental body. |
I wonder if G purposely makes its own organic results worse.
posting in this thread to be able to find the sage of Leosghost later on.
Caught this thread late...
I had a noticeable increase in traffic and conversions on Thursday the 6th, but then it quickly evaporated. Now it's Sunday and although there are plenty of people on the site all day, no conversions whatsoever. Bounce rate now up 7.5% from last week. As soon as it stabilizes you can always be sure they'll upset the apple cart again.
After further investigation I see in GA that on Wed & Thursday we had a huge spike in referrals from reddit dot com. The next day those referrals were gone.
|After further investigation I see in GA that on Wed & Thursday we had a huge spike in referrals from reddit dot com. The next day those referrals were gone. |
Reddit traffic is very sporadic; you rarely get more than a day's traffic from that particular site, and it has nothing to do with Google whatsoever.
Also, CSE searches are great for testing Google results in the USA if you don't happen to live in the USA, like me! There are a few good ones around, like startpage and whitesearch. All the CSE sites give you basically the same results, and they are never what a real Google search gives you. Why? I don't know; maybe Google just likes to keep the best results for itself.
|The total number of results in the foxstart SERPs is much lower. About 20 times lower in my niche. |
That's odd, as in one of my SERPs real Google has 2,030,000 results and Foxstart has 38,000,000.
In another of my SERPs real Google has 305,000,000 results and Foxstart 53,700,000.
I am still seeing good results in foxstart but I see little sign of them moving into mainstream.