I received a communication last night about a buzz going around that Google is going to shake up how it deals with backlinks, specifically that it is going to give less weight to less relevant linkage. The same source also heard, from another direction, that a big change was coming in the next few days.
Anyone know anything?
brinked, there may be something in what you say, but I think there is more to it than that. There's one page in the SERPs of a term I watch that has simply bought sitewide links on a few blogs. In some it is in the blogroll; in others there is a paragraph of text in the left or right-hand column with links to the site. In each case they use the exact two-word term.
If these paid links are relevant, Google seems happy about it. When I look at some sectors, the entire top 1000 is spammed with sites buying links big time. The ones at the top just buy on-topic links; the sites below buy any links they can get.
Now I see that the typical grey/black hat approach is working way better than honest marketing within the Google guidelines.
It's like anything works right now.
For some of the keywords our site sat on page 1 for years, we now see Squidoo pages and thin affiliate sites (with the affiliate links hidden), with absolutely no content or updates but a very high density of keywords and their variations.
I am appalled when I see a site with one backlink from one Squidoo page, and not only does the Squidoo page rank on page 1 but the site does as well.
I also see websites with heavy on-site SEO, stuffed with every keyword variation but not much other content, ranking really high and steadily for a few days. They actually started doing well a few weeks ago if I remember correctly, but never this well.
Lastly, I'd say that all penalties on link sculpting have been removed; from what I see, they must have been.
And the actual theme of the website barely matters now. My god, you can have a website about cars set up an SEO landing page for pet food right now and it works.
Sitewide links seem to do well again.
And some sites that were penalized somehow approximately two years ago are back in the game. They are still built for SEO, they are still spamming in my opinion, but now it's working again for them.
Not sure what Google's cooking but it looks like any bold SEO trick is working WAY better than it did in the past 3 or 4 years.
Well guys, black hat and bold cheap SEO tricks are back big time in my industry.
I actually am livid looking at some SERPs. I would have bet all I have that some of the pages currently ranking so well would never get anywhere close to the top 200. But they do now. Wow!
They all look like disposable websites to me.
Not sure what can be done on our end. For years, users found what they were looking for and liked it when they found it.
I don't even think it's anything we're doing wrong; I think cheap sites just got a boost, and only so-called authorities keep their rankings high right now.
By the way, I was on the phone with a competitor yesterday. They got demoted thanks to another competitor who tried, and succeeded, to harm their site. Well, Google, don't tell me it's almost impossible to harm a site. Now that I've seen it, it's in fact very easy: anyone can demote any site with little effort, as long as you're not trying to demote a super-authority type of competitor, a Wikipedia or a .gov!
I myself don't see the polluted SERPs you're describing, but then again I am not in every sector and can only look at the niches I am working in.
It has been my observation, followgreg, that when the SERPs get like what you describe above, it is what G wants to happen, so the review team and Matt's team can put the necessary data in place to deal with what you're describing. It is easier to review a site when it is on page 1 versus page 200, and G knows which filters were relaxed to allow the "new" first-page rankings to pop up.
Good point. As Yogi Berra once said, it's "deja vu all over again." If history is any guide, we'll be hearing furious complaints from a different set of members when the new, improved filters are in place.
This is the perfect time of year for major tweaks, and one should never assume that what happens today is a new standard for tomorrow or next week.
"This is the perfect time of year for major tweaks, and one should never assume that what happens today is a new standard for tomorrow or next week."
In the past, that probably was a good strategy for the goog, because there really wasn't an alternative. Too bad this time they did it right around the time there actually is an alternative engine providing a better quality search experience. It's peeled off quite a few supporters.
These newcomers seem oddly random -- some are quite good, some are complete crap with no content at all, some are subsections of larger, unrelated sites. Some appear to be heavily SEO'd, others don't have unique titles for their subpages. They seem to be keyword-specific -- that is, I'm not seeing the same new site across all keywords, but rather different new sites for each keyword.
Quite strange. I doubt this will last.
Give Bing a try; it is returning far better results than Google.
Perhaps they are a bit more natural because very few have bothered to optimize for Bing. Their algo looks even easier to spam than Google's, though.
Back to Google. There have been some major experiments this year that have been short lived and I think (really am hoping) what we are seeing now is another short term test and we will go back to something more palatable in the last week of the month.
Meanwhile I'm finding ways to pull run-of-site links out of template hot spots (like the footer) and into content. Those Textpattern custom fields will be finding a very useful purpose. This could be a good thing and make us a bit more creative and on-topic with each page.
I have seen sites lose natural search rank but keep allinanchor rank. Not sure what that means, if anything. Do penalties usually affect allinanchor rank?
I guess at the beginning of this thread tedster said a change in evaluating links occurred.
Do people think it's going after paid links in an algorithmic way, or is it penalizing sites for lots of small, crappy links? Or is it ignoring links from sites that are off-topic?
Like maybe a site about widgets links to a site about some other unrelated topic. And that link would be devalued.
Shouldn't the whole site be getting kicked off?
Maybe. It could be that trust is now variable by phrase. Possibly less competitive (or less monetised, if there's any difference) KW phrases require lower trust standards.
Whatever criteria are used for variable trust levels, this differential would have two effects, which could easily be confused (see the sketch after this list):
1) High trust sites get swamped as low trust sites get propelled above them on "low trust allowed" phrases.
2) Lower trust sites lose rankings as they no longer make the grade on "high trust required" phrases.
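Purely as a thought experiment to show how both effects could come from one mechanism, here's a rough Python sketch. The trust scores, phrase thresholds, site names, and the eligible_sites function are all made up for illustration; nothing here is confirmed Google behaviour.

```python
# Thought experiment only: per-phrase trust thresholds.
# All numbers and names below are invented for illustration.

# Hypothetical trust scores for a handful of sites (0..1)
site_trust = {
    "old-authority.example": 0.9,
    "decent-site.example": 0.6,
    "thin-affiliate.example": 0.2,
}

# Hypothetical minimum trust required for each keyword phrase
phrase_threshold = {
    "cheap blue widgets": 0.1,   # "low trust allowed" phrase
    "widget insurance": 0.7,     # "high trust required" phrase
}

def eligible_sites(phrase):
    """Return the sites whose trust clears the bar for this phrase."""
    bar = phrase_threshold[phrase]
    return [site for site, trust in site_trust.items() if trust >= bar]

# Effect 1: on the low-bar phrase, low-trust sites flood in alongside
# the authorities, so the high-trust sites get swamped.
print(eligible_sites("cheap blue widgets"))
# ['old-authority.example', 'decent-site.example', 'thin-affiliate.example']

# Effect 2: on the high-bar phrase, mid-trust sites no longer make the
# grade and drop out of the results entirely.
print(eligible_sites("widget insurance"))
# ['old-authority.example']
```

Same data, one knob (the per-phrase bar), and you get both of the confusing symptoms above at once.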
On the crappy SERPs, are you looking at vanity or long tail phrases?
How would you gauge that? If your link building strategies have always been the same, basic stuff like some reciprocals, related sites only, article marketing, etc., and they were okay to rank with for months, is it possible that suddenly, a year later, trust was lost and the site got banished?
Also, can a site be sandboxed if it is over 1 1/2 years old? Is that considered new enough to work into the sandbox theory?
As far as I'm concerned trying to discuss "the sandbox" blurs the discussion rather than focusing it. I don't know if trust is a direct factor in the yo-yo or not, but I re-framed the question that way because it seemed a better focus.
What is trust - in Google terms? That's a thread on its own, or several threads.
Put simply, my understanding is that trust calculation begins with a hand-picked group of seed sites -- and it then measures the link distance from those seed sites. If a new set of seed sites gets chosen, then the trust of any linked domains can shift. And if a site is removed from the seed set, then trust may fall away from those sites it links to, with further repercussions for the second- and third-generation sites in the overall web graph.
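To make that concrete, here is a minimal sketch of how a seed-based trust calculation might work, in the spirit of the published TrustRank idea. Everything in it is an assumption for illustration: the toy graph, the seed set, the decay factor, and the propagate_trust function are mine, not anything Google has published.

```python
# Minimal sketch of seed-based trust propagation (TrustRank-style).
# The toy graph, seed set, decay factor, and iteration count are all
# illustrative assumptions; nothing here reflects Google's real system.

DECAY = 0.85       # how much trust survives each hop away from a seed
ITERATIONS = 20    # enough passes for scores to settle on a tiny graph

# Toy web graph: each page maps to the pages it links to
graph = {
    "seed-a.example": ["good-site.example", "blog.example"],
    "good-site.example": ["blog.example", "spammy.example"],
    "blog.example": ["spammy.example"],
    "spammy.example": [],
}

seeds = {"seed-a.example"}  # hand-picked, fully trusted pages

def propagate_trust(graph, seeds, decay=DECAY, iterations=ITERATIONS):
    # Seeds start with full trust; everything else starts untrusted.
    trust = {page: (1.0 if page in seeds else 0.0) for page in graph}
    for _ in range(iterations):
        incoming = {page: 0.0 for page in graph}
        for page, outlinks in graph.items():
            if not outlinks:
                continue
            share = trust[page] / len(outlinks)  # trust splits across outlinks
            for target in outlinks:
                incoming[target] += decay * share
        # Seeds keep their hand-assigned trust; others take what flowed in.
        trust = {page: (1.0 if page in seeds else incoming[page])
                 for page in graph}
    return trust

if __name__ == "__main__":
    for page, score in sorted(propagate_trust(graph, seeds).items(),
                              key=lambda item: -item[1]):
        print(f"{page}: {score:.3f}")
```

Pull a page out of the seeds set, or re-point its outlinks, and every downstream score drops on the next run. That is the kind of reshuffle a change in the seed set would produce.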
It does seem to me that a yo-yo ranking (especially on and off the first page) could be a way of saying "we are not sure whether to trust this url for this keyword". So it gets shown just some of the time until the data makes things clear.
Now just because that makes sense to me in an abstract way does not mean that Google is really doing this, but the idea does have a kind of logic. Deep changes in content might upset a trust value. Suddenly getting impressions for a search term where a site never ranked before also might raise questions about why that happened.
If the company is "big", does that mean they are always "trusted"? Clearly not. But might such a company get at least a little benefit of the doubt - enough to yo-yo in and out, just to see what the users think?
I agree with tedster. There is no such thing as a sandbox. If only it were that simple. There are many reasons and factors why a page will not rank for its targeted keyword(s): either there is some kind of penalty (over-optimization, keyword stuffing, etc.) or your site simply does not have the content and backlink authority it needs to be competitive in your industry.
People who fail at something and don't want to try again to succeed will give up and blame it on something other than themselves.