| 10:25 am on Oct 1, 2007 (gmt 0)|
I’m seeing what looks to be a new index on these IPs.
Anyone else seeing these DCs populate?
| 10:57 am on Oct 1, 2007 (gmt 0)|
I think you got closest to the explanation when looking for magic words that transform the SERPs into trash in smaller niches.
Such as 'buy'...
... or other minor commercializing additions to just about anything that doesn't have mainstream (inter)national market players. Add BUY to any such query and up comes the usual nonsense, such as expired eBay pages on its n+1 subdomains. Remove BUY from your query to see somewhat OK results. ( Which, btw, are sometimes better for the BUY queries than what you got presented for them. )
I suppose you tracked these SERPs during the last few weeks.
Weren't these littered with spam lately?
That could also be a key. I mean if the only way to get rid of the spam ( until the infrastructure update ) was to raise the trust thresholds to a certain level, which only these... uh... monsters ( ebay, amazon, etc ) can clear. That'd explain where the others have gone. I mean ebay and amazon are... probably in the top 20 most linked sites from the Web 1.0 era, so if the trust dial is turned up, it wouldn't be surprising to see a blast from the past.
Which would be more or less in line with my theory.
Areas affected are very specific small niches, stuff not too many people look for.
- Google relied on relevancy alone to rank sites in a certain niche for...
- Niche doesn't have much academic references, not much of spotlight in the media either.
- Trust threshold was set low, no spam was expected.
- Then comes along a spam-network. Doesn't have trust, but it doesn't really need any.
- It cripples the rankings.
- Google tightens the trust filters.
- And removes about half of the niche; what remains are trusted sites that aren't as relevant.
- Remember. Trust is a parameter.
- Some small niches are simply not reached by this parameter or not in sufficient amounts.
Or in other words (?)
- For smaller, very specific niches which fall far from the network of trust ( low search volume, not enough generic authority sites as mediators ) and thus could be easily run over by the latest spam surge, Google sacrificed relevancy for trust in order to keep their users safe until they implement a proper solution... ( as others have said before, they try to show trusted players instead of those that are linked to a lot by relevant but less trusted sites. Resulting in some big sites with irrelevant content showing in the SERPs in these areas. )
- What many sites in these niches are experiencing is pretty much... the Sandbox effect.
- Their trust parameter isn't high enough. They're relevant, but don't show up for certain words / word orders.
- Like 'buy', or 'sell' or 'country' or 'city' names.
- It's the same thing that hit most industries in 2005 when trust was introduced, except that with smaller niches, there are very few/no sites with the required parameter... aka. references from trusted hubs, authorities. ( are there any at all? )
- This probably won't stay like this, for the trust thresholds right now are set unrealistically high.
- No or few legit sites can compete in an environment set as if it was some international mainstream topic.
Again remember, trust is a parameter, it's not always about real life or user trust. Of course it's aiming to be, but there are some blind spots with no hubs propagating the parameter to niche sites.
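To make the "trust is a parameter" theory above concrete, here's a toy sketch in Python. The sites, scores, and thresholds are entirely made up by me; this only illustrates the filtering behaviour described above, not Google's actual algorithm:

```python
def rank(candidates, trust_threshold):
    """Keep only pages above the trust threshold, then sort the
    survivors by relevance alone."""
    survivors = [p for p in candidates if p["trust"] >= trust_threshold]
    return sorted(survivors, key=lambda p: p["relevance"], reverse=True)

# Hypothetical niche: a relevant small site, a relevant spam network,
# and a barely relevant but heavily trusted giant.
niche = [
    {"site": "niche-expert.example", "relevance": 0.95, "trust": 0.20},
    {"site": "spam-network.example", "relevance": 0.90, "trust": 0.05},
    {"site": "bigretailer.example",  "relevance": 0.30, "trust": 0.90},
]

# Low threshold: relevancy decides, but the spam gets through too.
print([p["site"] for p in rank(niche, 0.0)])
# ['niche-expert.example', 'spam-network.example', 'bigretailer.example']

# Threshold raised to kill the spam: only the trusted giant survives,
# even though it's barely relevant -- the "blast from the past" effect.
print([p["site"] for p in rank(niche, 0.5)])
# ['bigretailer.example']
```

Raising the threshold removes the spam and the relevant small site in one stroke, which is exactly the collateral damage described above.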
What you can do...
- Wait... ( as if an SEO could do that )
- Get one or two ultra strong, trusted links to your site or your referrers ( regarding the latter: I'm not kidding )
- Alternatively send relevant gifts to major national and international newspapers and organizations, 'bringing the beauty of your industry' to their attention ( don't mention the internet ) and see if they'll write about you ( don't tell them I told you to do so ). This is called lobbying... oh, and... welcome to politics. *smirk*
| 12:14 pm on Oct 1, 2007 (gmt 0)|
Re: "I’m seeing what looks to be a new index on these IPs"
I see different results on these IPs also. The difference I can see is that the above IPs contain most of my pages that have been deindexed from the current results over the past week. I also see a larger result set on these IPs than what is currently showing. I hope this is a new set of results. If it is an old index, it isn't more than 14 days old, because a new subdomain of mine is showing and I only added it 14 days ago.
| 12:38 pm on Oct 1, 2007 (gmt 0)|
On these IP addresses I now see the "Related Searches" are back again.
Don't see my site back yet :(
| 12:47 pm on Oct 1, 2007 (gmt 0)|
nutrition ecommerce area.
All this trust linking, natural linking, new content stuff: does it really work, or are we just wasting our time? Hmmm, I wonder if it really matters.
Why I say this: in queries returning 60,000,000 results, I see sites (plural) with one single link in Google and 2,250 links in Yahoo on the first page, beating .gov sites with 2,300 links in Google and 1.1 million in Yahoo. These sites do not have articles or fresh content added.
I see sites using hidden text, which have been turned in via spam reports by different users, with one link, no articles, and no content other than products, outranking sites that have articles and new content. I see sites that haven't changed their front page in years outranking sites that do add content.
I see sites interlinking under the same owner ranking well; some are given TrustRank, with multiple URLs taking six of the top results. The whois information is the same on all 8 sites this company owns. They use the same content, same product pages, and same URLs, all doing very, very well in Google.
I have come to determine that white hat SEO is almost dead and that you have to begin looking at other ways to rank in Google.
With Yahoo copying the Google results, we have to assume that what works on Google will work just as well in Yahoo.
| 12:53 pm on Oct 1, 2007 (gmt 0)|
Miamacs - good post.
Another thing to consider is whether text URLs are now being given high importance ( which would be suggested by the fact that pages with both a backlink and a text URL are still showing up ).
Or is the backlink penalty so bad that the text links are the only thing left to display?
Or, not so much a penalty, but backlinks from non-authority-type sites are now simply ignored and given no weight whatsoever. This would also explain mentions of the URL in text featuring highly.
Think I will go with number 3.
The next thing to consider is what constitutes an authority-type site. How do we find them, and how do we become them? There would no doubt be hundreds of thousands, if not more, authority-type sites, so it seems highly unlikely that a list would be incorporated into the algo; the algo must be deciding what's an authority and what's not by other means. But what?
Also need to find the ratio of good to bad that sets off the trigger.
Sad day for SEO - Death of the non authority backlink.
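If number 3 is right, the mechanics would be trivial. A toy Python illustration; the authority list and field names here are pure invention on my part, not anything Google has published:

```python
# Hypothetical set of sites the algo treats as authorities.
AUTHORITY_SITES = {"example.gov", "bigpaper.example"}

def counted_backlinks(backlinks):
    """Return only the backlinks whose source is treated as an
    authority; everything else is ignored rather than penalized."""
    return [b for b in backlinks if b["source"] in AUTHORITY_SITES]

links = [
    {"source": "example.gov",        "anchor": "widgets"},
    {"source": "linkfarm.example",   "anchor": "cheap widgets"},
    {"source": "reciprocal.example", "anchor": "widgets"},
]

print(len(counted_backlinks(links)))  # only the .gov link survives: 1
```

Under this model a site with thousands of ordinary backlinks and zero authority backlinks counts as having no links at all, which matches what people are reporting.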
| 2:38 pm on Oct 1, 2007 (gmt 0)|
I have been tracking these datacenters for a few weeks now and what I find is that they were cleaner two days ago than they are now.
The regular results, for example, seem to have been intermixed with some other junk that you would typically not see in the same search.
I suspect G is just experimenting with these DCs.
The real question though is how your results vary between the DCs and your regular search results....
The real answer is in identifying the algo difference between the DCs and your regular results.
| 2:53 pm on Oct 1, 2007 (gmt 0)|
|The next thing to consider is what constitutes an authority-type site. How do we find them, and how do we become them? There would no doubt be hundreds of thousands, if not more, authority-type sites, so it seems highly unlikely that a list would be incorporated into the algo; the algo is deciding what's an authority and what's not by other means. But what? |
Wasn't there a Google patent filing a while back about using as few as 200 "seed sites" to convey quality status, authority, or whatever via a trickle-down effect?
|Sad day for SEO - Death of the non authority backlink. |
If that were the case, I wonder how many years it would take for the current flood of link-exchange spam to subside?
| 3:39 pm on Oct 1, 2007 (gmt 0)|
This looks good. Less supplementals, more indexed pages.
[edited by: SEOPTI at 3:39 pm (utc) on Oct. 1, 2007]
| 3:54 pm on Oct 1, 2007 (gmt 0)|
|Wasn't there a Google patent filing a while back about using as few as 200 "seed sites" to convey quality status, authority, or whatever via a trickle-down effect? |
It's called TrustRank.
However: TrustRank isn't a Google patent. It was filed by two Yahoo! employees and a researcher from Stanford, and later the word "TrustRank" was trademarked by Google. The patent has been modified not to include the word in its title, but the original copies still carry the name. Both Yahoo! and Google utilize a similar method, which was originally set up to battle spam. Google uses different hubs (seeds) for different regions. These are mostly the ones you'd choose yourself, btw.
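For those curious, the trickle-down idea can be sketched in a few lines of Python. The link graph, the seed set, and the damping value below are invented for illustration; the real systems are obviously far more elaborate:

```python
def trustrank(graph, seeds, damping=0.85, iterations=50):
    """graph maps each page to the pages it links to; trust starts at
    the seeds and is split across outlinks at every hop, attenuated
    by the damping factor."""
    pages = list(graph)
    seed_score = {p: 1.0 / len(seeds) if p in seeds else 0.0 for p in pages}
    trust = dict(seed_score)
    for _ in range(iterations):
        new = {}
        for p in pages:
            inherited = sum(trust[q] / len(graph[q])
                            for q in pages if p in graph[q])
            new[p] = damping * inherited + (1 - damping) * seed_score[p]
        trust = new
    return trust

# Invented four-page web: a seed links to a hub, the hub links to a
# niche site; the spam site has no in-links from the trusted region.
graph = {
    "seed.example":  ["hub.example"],
    "hub.example":   ["niche.example"],
    "niche.example": [],
    "spam.example":  ["spam.example"],
}
scores = trustrank(graph, seeds={"seed.example"})
# Trust decays with each hop (seed > hub > niche), and the spam site,
# unreachable from any seed, stays at exactly zero.
```

Note how a niche site two hops from the nearest seed ends up with very little trust; that's the "blind spot" effect Miamacs described.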
| 4:21 pm on Oct 1, 2007 (gmt 0)|
The term that is affected in my market is the biggest grossing Adwords term by miles. In fact it outperforms all other terms (put together) by a ratio of 3:1.
For every $10,000 spent in my market $7,500 are on the affected term.
In terms of impressions around 60% are for this two word term.
Must be just a coincidence.
| 4:43 pm on Oct 1, 2007 (gmt 0)|
I may have jumped the gun with my previous comments. Besides .govs and .edus, every single site on the first page of results for the industry I am dealing with, whether it be the Google directory, the Yahoo directory, PDFs, forums, blogs, newsletters, or mailing lists, ALL have the URL in text.
I was thinking before that the Yahoo directory, Google directory, etc. must have been authority sites, but the ones showing up for me seem to be there because they too have the URL written in text. Backlinks seem to mean nothing at the moment, even the good ones; it's all about the text URL.
| 4:57 pm on Oct 1, 2007 (gmt 0)|
What do you mean with "all have URL in text"?
| 5:03 pm on Oct 1, 2007 (gmt 0)|
As in www.widgets.com as text, as opposed to a hyperlink. Check the results in Google and look at the bold standing out in every listing.
As for widgets.com.cn, when viewing source they also have widgets.com. I don't think it's anything to do with geolocation; it's just that they also have widgets.com in the code where it's not a hyperlink.
| 5:42 pm on Oct 1, 2007 (gmt 0)|
Weird, on a first source view of the widgets.com.cn site I was getting <code>blahblahblah/widgets.com/+blahbla.cn</code> with Firefox,
and on a second view I'm not getting it. The site is a mess of iframes; anyway, I believe the above to be the reason for the .cn site coming up. Just not sure why it's intermittent.
| 6:09 pm on Oct 1, 2007 (gmt 0)|
Sorry, starting to make a fool of myself, but getting to the bottom of this nonetheless. Disregard the above .cn info (very sleepy; I was viewing the Google cache and not the site).
The reason the .cn sites are coming up is that where widgets.com is not in a text URL, it is in either the title or the URL.
I.e. a search on widgets.com will bring up mainly sites with widgets.com as a text URL, but it will also bring up sites with widgets.com in the title on occasion, and also in the URL.
So the new algo is favouring sites that have, coming from other sites:
1 widgets.com mentioned as text url
2 widgets.com mentioned in title
3 widgets.com mentioned in url, for instance widgets.com.cn
and it's paying no attention to backlinks.
Problem solved - going to bed :)
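The three-point hypothesis above can be sketched as a toy scorer. The weights and field names are invented; this is a guess at the observed behaviour, not a known Google formula:

```python
def mention_score(domain, page):
    """Score a page by mentions of the domain, per the hypothesis:
    text URL > title > inside another URL; backlinks ignored."""
    score = 0
    if domain in page.get("body_text", ""):
        score += 3          # 1) plain-text URL mention, weighted highest
    if domain in page.get("title", ""):
        score += 2          # 2) mention in the title
    if any(domain in u for u in page.get("urls", [])):
        score += 1          # 3) mention inside a URL, e.g. widgets.com.cn
    # note: page.get("backlinks") is deliberately never consulted
    return score

page = {
    "title": "Cheap widgets",
    "body_text": "Visit www.widgets.com for more",
    "urls": ["http://widgets.com.cn/"],
    "backlinks": 5000,      # ignored under this hypothesis
}

print(mention_score("widgets.com", page))  # 3 + 0 + 1 = 4
```

A page with 5,000 backlinks but no textual mention of the domain would score zero here, which is what people in this thread are reporting.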
| 7:29 pm on Oct 1, 2007 (gmt 0)|
Those are some damn spooky results. It's the first time my site has ever dropped out of Google. I'm getting a “ton” of sites that I have never seen before duplicating my pages and displacing me in the results. This really blindsided me, because I know how to find duplicate content. Triple WOW, because these junk blogs, RSS feeds, forums, cloakers, and directories are coming out of nowhere with duplications and scrapes of my pages. Just on a quick view I’ve found a dozen blogs that made site maps of my pages, then replaced the URLs with their own. It's mind-blowing, some of the schemes I’m seeing. It could take months to undo the damage.
Also, it pretty well confirms to me that who is linking to you “now” may actually damage you in the results.
Universal search pretty well embraces the underbelly of the Internet giving exposure to every huckster and schemer on the planet. Google needs to bury this idea that it has a better way immediately and learn how to run a search engine first.
| 11:45 pm on Oct 1, 2007 (gmt 0)|
yeh, I agree.
But this current algo change is so ridiculous, and still in place, that there must be something they are trying to achieve with it. Surely they could not get it this wrong by accident.
| 4:49 am on Oct 2, 2007 (gmt 0)|
Those DCs are working well for me with an honest-to-goodness, all-original-content site. No scraping, no feeds, just good old-fashioned content.
| 7:03 am on Oct 2, 2007 (gmt 0)|
Surely those DCs bring back up more legitimate sites, like a breath of fresh air.
The worst of the worst appears much less often, but it's not very good either; let's say not as bad.
However as Tedster mentioned earlier in this thread, querying a DC does not mean much these days.
|Surely they could not get it this wrong by accident. |
Your confidence in the workings of large corporations is impeccable. :)
I'm not sure myself how this can happen to that degree. It's less visible on high-volume keywords, but Google's strength used to be those thousands of weird keywords typed in and returning not-so-bad results.
Currently you are lucky if you even find what you are looking for in the top 20, in your own language and without virus warnings.
At this point I see again more and more US specific businesses ranking on Google TLD's other than .com.
At the same time Google.com keeps showing these sites from abroad or locally specific while my queries are NOT locally specific.
Concerning the URL mentioned in backlinks, I don't really see much difference now from the past 2-3 months: with no domain in the URL, pure paid text links from just about anywhere rank just about anything on Google.com - fast. It wouldn't be such a bad measure to use this as a factor, but before that Google should be able to weight incoming links as closely as possible to what a human could do... and they are showing right now that they sure as hell can NOT do it.
I also think that Google has been able to parse domain names for some time now. They can understand widget1widget2 as widget1-widget2.
They started adding weight to some domains based on their names a few months back, when they implemented their "anti-bowling" algo.
I thought it was a huge mistake at the time, and the consequences are right in our SERPs now; it's a nightmare of cheesy:
[any number here]widget.com
Amazing how cheap mortgage, adult, and similar so-called blogs get indexed more frequently than some other pages that used to be valued, and which sometimes are not even indexed anymore. Ahhhh yeah, that's right, I got it: scraping, semi-automated copyright infringement, and paraphrasing are done from sites that often refresh content... fresh content is good, fresh content is relevant, or is it not?
It really, really feels like they've turned off everything that used to filter the difference between good and bad, on-theme and off-theme. Same as if their algorithm were reduced to very few factors.
| 7:15 am on Oct 2, 2007 (gmt 0)|
|Check the results in google and look at the bold standing out in every listing. |
The bold standout means nothing and never has. It simply shows you where your query may have been found in the particular listing. It has nothing to do with where that website ranks.
For all of the genres I see, I see no example where keyword-driven links from high-quality websites don't win out.
| 7:18 am on Oct 2, 2007 (gmt 0)|
Junk for me:
But then again... the spammers and people that have come from nowhere will love this and the people who have been displaced will hate it.
That's the problem with this datacenter watching. If someone goes down, someone takes their place. You will always have winners and losers.
Right now, though, I am finding my search terms to be pretty much junk. Yes, there are some valid results in the mix, probably enough to placate the Joe Public searcher, but as an internet pro I know junk when I see it.
Ask.com is pretty cool at the moment, and the layout and organisation is nowhere near as cluttered as what's currently being offered by Google.
I found the results at ask.com to be a lot better than they have been for a long time too.
| 8:17 am on Oct 2, 2007 (gmt 0)|
"The bold standout means nothing and never has. It simply shows you where your query may have been found in the particular listing. It has nothing to do with where that website ranks."
Then why was this not the case before?
My point is that they are ALL text mentions, not hyperlinks, with an absence of any hyperlinks except from .govs and .edus.
"For all of the genres I see, I see no example where keyword driven links from high quality websites doesn't win out."
For the industries I am looking at, I'm seeing none of that.
Why do you think there are so many spam sites coming up? It's because this algo is full of exploits, and they're the ones taking advantage of it.
| 4:29 pm on Oct 2, 2007 (gmt 0)|
|It really, really feels like they've turned off everything that used to filter the difference between Good and Bad, on theme and off-theme. Same if their algorithm was reduced to very few factors. |
If that's the case, what you're seeing is probably the storm before the calm. :-)
| 10:25 pm on Oct 2, 2007 (gmt 0)|
We have seen no changes. Although, this is the time of year Google typically kicks up ecom sites for the holiday shopping season.
| 10:28 pm on Oct 2, 2007 (gmt 0)|
-- If that's the case, what you're seeing is probably the storm before the calm --
I sure hope so, 'cause it looks like a spamdex, not an index.
| 10:32 pm on Oct 2, 2007 (gmt 0)|
I have to agree with trinorthlighting; I have seen a 250% increase in traffic on one e-commerce site of mine (not selling directly).
But a major decrease, of about 30% at least, on the rest of my content-based sites/forums.
| 10:42 pm on Oct 2, 2007 (gmt 0)|
This is typically the time when they test their holiday SERPs, so it will be up and down until November 1st. Different data centers are different tests. Typically October produces three different sets of results in data center groups.
| 2:44 am on Oct 3, 2007 (gmt 0)|
This is not conclusive by any means, but those three DCs do not seem to be updating. I changed some titles and meta tags the same day I saw them, and all other DCs appear to have picked up the change apart from those three, so it could well be a test between the two sets of data, and the storm is brewing....