CNN: Growing Backlash to AdSense Farm Update

   
3:17 pm on Feb 26, 2011 (gmt 0)

WebmasterWorld Administrator brett_tabke is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month Best Post Of The Month



[money.cnn.com...]

Google made one of the biggest changes ever to its search results this week, which immediately had a noticeable effect on many Web properties that rely on the world's biggest search engine to drive traffic to their sites.

The major tweak aims to move better quality content to the top of Google's search rankings. The changes will affect 12% of Google's results, the company said in a blog post late Thursday.

Comments from site operators lit up on the WebmasterWorld.com forum starting on Wednesday. Many webmasters complained that traffic to their sites dropped dramatically overnight, and others expressed concern that they can't adapt quickly enough to Google's changes to its algorithm.
7:36 pm on Feb 28, 2011 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



@TheMadScientist
Someone asked in a thread (I don't remember exactly which one) why people 'hate' eHow, but I'll tell you why I despise eHow; it's simple: jsNoFollow in the source code instead of proper, linked attribution like there should be. Makes Me Livid!


It's funny that you mention them. I just got two Google notifications of our name being used on their site. Yup, they quoted us word for word on their stupid site... gave us "bogus" link credit with their nofollow... but our site has an explicit copyright notice and they copied us anyway. GRRrr.
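
For anyone who wants to verify this kind of "credit", here's a minimal sketch (not from the thread; the domain and URL are placeholders) that fetches a quoting page and reports whether any link back to your domain carries rel="nofollow". Note that it only sees links present in the raw HTML, so attribution written in by JavaScript, like the jsNoFollow approach complained about above, won't show up at all.

```python
# Hedged sketch: check how a quoting page links back to you.
# MY_DOMAIN and the commented-out URL are placeholders, not real sites.
from html.parser import HTMLParser
from urllib.request import urlopen

MY_DOMAIN = "example.com"  # your site (placeholder)

class AttributionChecker(HTMLParser):
    """Collects (href, rel) pairs for anchors that point at MY_DOMAIN."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        if MY_DOMAIN in href:
            self.links.append((href, attrs.get("rel") or ""))

def check_page(url):
    html = urlopen(url).read().decode("utf-8", errors="replace")
    parser = AttributionChecker()
    parser.feed(html)
    for href, rel in parser.links:
        status = "nofollow (no link credit)" if "nofollow" in rel else "followed"
        print(f"{href}: {status}")
    if not parser.links:
        print("No HTML link back at all -- possibly JavaScript-only attribution.")

# check_page("http://quoting-site.example/page.html")
```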
7:40 pm on Feb 28, 2011 (gmt 0)

WebmasterWorld Senior Member themadscientist is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month



@Bewenched

That sucks, but it sounds like the perfect opportunity to maybe raise a stink with a DMCA (or 3) and even possibly the media since there's so much talk about this one ... Might be the only way to get anything accomplished.
7:45 pm on Feb 28, 2011 (gmt 0)

WebmasterWorld Senior Member themadscientist is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month



I've actually just started adding 'original archives' of my pages so I will have solid proof of copying based on last modified dates of the files on my server and hard drive ... Others may want to do the same.

[edited by: TheMadScientist at 7:46 pm (utc) on Feb 28, 2011]
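
A minimal sketch of that 'original archive' idea, assuming Python and placeholder paths: snapshot each page alongside a SHA-256 hash and a UTC timestamp. Since file modification dates alone are easy to dispute, a sidecar record (ideally backed by an external dated copy, such as an email to yourself) makes the paper trail sturdier.

```python
# Hedged sketch: archive an original page with a content hash and UTC
# timestamp for a DMCA paper trail. Paths and the example URL are placeholders.
import hashlib
import json
import time
from pathlib import Path
from urllib.request import urlopen

ARCHIVE_DIR = Path("original_archives")  # placeholder location

def archive_page(url):
    ARCHIVE_DIR.mkdir(exist_ok=True)
    body = urlopen(url).read()
    digest = hashlib.sha256(body).hexdigest()
    stamp = time.strftime("%Y%m%dT%H%M%SZ", time.gmtime())
    name = f"{stamp}_{digest[:12]}"
    # Save the raw page exactly as served.
    (ARCHIVE_DIR / f"{name}.html").write_bytes(body)
    # Sidecar record: URL, hash, and capture time, for later evidence.
    (ARCHIVE_DIR / f"{name}.json").write_text(json.dumps(
        {"url": url, "sha256": digest, "archived_utc": stamp}, indent=2))
    return digest

# archive_page("http://www.example.com/my-article.html")
```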

7:46 pm on Feb 28, 2011 (gmt 0)



I've already contacted my local media folks. I can point out high profile blogs that got hit and are already raising a fuss. I've shot out at least a dozen DMCA forms to Google already. We shall see what happens.

On a good note, some of my key terms are recovering. Overall traffic still down, but there is recovery on some things that were hard hit post algo.
7:56 pm on Feb 28, 2011 (gmt 0)

5+ Year Member



12% hit is a giant step to protect hard-working content companies.
8:11 pm on Feb 28, 2011 (gmt 0)

WebmasterWorld Administrator buckworks is a WebmasterWorld Top Contributor of All Time 10+ Year Member



gave us "bogus" link credit with their no follow... but our site has a explicit copyright notice and they copied us anyway


It sounds like it's time for more webmasters to get more aggressive about DMCA complaints.
9:08 pm on Feb 28, 2011 (gmt 0)

WebmasterWorld Senior Member netmeg is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



It sounds like it's time for more webmasters to get more aggressive about DMCA complaints.


Yea, that's true, but geezopete, there are only so many hours in a day, and we end up going from scrapers stealing content to scrapers stealing content AND time. I mean, I know it's the solution we have now, and back when you might find something to DMCA once a month or even once a week, it might have been manageable, but now, if I really went at it for my own sites and my clients - I'd be doing nothing else 20 hours a day.
9:14 pm on Feb 28, 2011 (gmt 0)

WebmasterWorld Senior Member themadscientist is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month



I remember reading about a guy who started suing companies for sending spam email to his account and making all kinds of money (over $1 mil so far, I think). I would very seriously consider looking into doing something in the same direction with the DMCA if I were in your spot, netmeg.

I've really cut down on the number of sites I'm involved with lately, but if I was running a bunch and DMCA was an issue, I would think very seriously about compensation for my time, if not profitability.
9:41 pm on Feb 28, 2011 (gmt 0)

5+ Year Member



12% hit is a giant step to protect hard-working content companies.


Unless your site happened to be one of the hard-working content companies that got hit.
9:51 pm on Feb 28, 2011 (gmt 0)

5+ Year Member



guy who started suing companies for sending spam email to his account and making all kinds of money


You know, if there aren't already any, there's potentially a huge DMCA lawyer market... like the ubiquitous personal injury or DUI lawyers.

"STOLEN CONTENT? YOU HAVE RIGHTS! 1-800-DMCA-NOW!"

:-)
9:56 pm on Feb 28, 2011 (gmt 0)

5+ Year Member



Can anyone confirm if this has been rolled out on Google.ca yet?
11:10 pm on Feb 28, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>> I think G would be looking for a 'dead head' search (where the searcher stops searching for a result) as higher ranking.

Wasn't that the old DirectHit algorithm?
11:15 pm on Feb 28, 2011 (gmt 0)

WebmasterWorld Senior Member lame_wolf is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month



It sounds like it's time for more webmasters to get more aggressive about DMCA complaints

You try raising a DMCA with Care2 or Pizco. The former ignores them, and the latter ends up with a Mailer-Daemon email. They both state they take copyright infringements seriously. Yeah right.
12:09 am on Mar 1, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



For several years, Matt Cutts, along with other Google spokespeople, have been talking about the fact that "bounce" is a noisy signal. Yea, they measure it but apparently they can't use it for any heavy lifting in the algo.


This only makes sense. In many industries, what defines a bounce is purely subjective, and often a bounce means success, as in the case of a telephone call or a sale delivered via a different domain.
1:57 am on Mar 1, 2011 (gmt 0)

WebmasterWorld Senior Member whitey is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



a bounce means success, in the case of a telephone call or sale delivered via a different domain


I couldn't think of a scenario where Google would welcome a high bounce rate where a referral is involved. Doesn't mean there aren't any; I just couldn't think of any.

BTW, I'm suspicious that Google Chrome has ramped up a lot of extra intelligence involving usability that could be used in the new algo.
1:58 am on Mar 1, 2011 (gmt 0)

5+ Year Member



I couldn't think of a scenario where Google would welcome a high bounce rate where a referral is involved. Doesn't mean there aren't any; I just couldn't think of any.


How about a landing page with a "call us" call to action?
2:19 am on Mar 1, 2011 (gmt 0)

10+ Year Member



Our large enterprise client (B2B) is seeing a significant improvement in organic traffic, because the site has unique content and a huge number of quality backlinks. Put another way, I can see that Google is filtering out low-quality sites and making branded sites perform better.
2:26 am on Mar 1, 2011 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Makes sense, Nuttakorn. So if you want to compete with the big sites, you at least need to bring BIG quality, even if your checking account is a lot smaller.
2:34 am on Mar 1, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I couldn't think of a scenario where Google would welcome a high bounce rate where a referral is involved. Doesn't mean there aren't any; I just couldn't think of any.


Not necessarily a referral - even separate domains to process orders for the same companies would cause huge potential problems in that data.
2:43 am on Mar 1, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member




Not necessarily a referral - even separate domains to process orders for the same companies would cause huge potential problems in that data.


If I remember correctly, DirectHit used to cookie users when they clicked on a result. If you came back and clicked on another result, some process would then determine that the initial site clicked on was low quality (simplifying this... I'm sure they factored in time and a few other factors).

Here's a link to consider.

[webmasterworld.com...]
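
A toy illustration of the click-back idea described above; this is not DirectHit's actual algorithm, and the 60-second threshold is invented. A result the searcher abandons quickly in favor of another result collects a 'bad' mark, while a result that ends the search session (the 'dead head' case) or holds the searcher for a while collects a 'good' one.

```python
# Hedged sketch of click-back ("pogo-sticking") scoring. The threshold
# and log format are invented for illustration.
SATISFIED_SECONDS = 60  # invented threshold

def score_clicks(click_log):
    """click_log: list of (timestamp_seconds, url) tuples in click order."""
    scores = {}
    for (t, url), nxt in zip(click_log, click_log[1:] + [None]):
        scores.setdefault(url, {"good": 0, "bad": 0})
        if nxt is None or nxt[0] - t >= SATISFIED_SECONDS:
            scores[url]["good"] += 1  # session ended here, or searcher lingered
        else:
            scores[url]["bad"] += 1   # quick bounce back to another result
    return scores

log = [(0, "site-a.example"), (8, "site-b.example"), (300, "site-c.example")]
print(score_clicks(log))
# site-a gets a 'bad' mark (next click only 8s later); site-b and site-c 'good'.
```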
3:07 am on Mar 1, 2011 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



A common reason for a quick bounce back to the SERP for me is looking for a specific tidbit, finding it, and then wanting to see corroboration from other sites. So I click a new result, spot some reinforcing or conflicting information, click back to the SERP. Rinse and repeat until I have an idea of the consensus.
3:51 am on Mar 1, 2011 (gmt 0)

WebmasterWorld Administrator 5+ Year Member Top Contributors Of The Month



I may look at SERPs and open a few results in new tabs, one after another. I have FF set not to switch automatically to a new tab when it opens. Then, once I have scanned the SERPs and am happy there is nothing else I would like to check, I go through the tabs and look at the sites I selected from the SERPs.

I do pretty much the same when reading a web page. If a (usually in-content) link sounds interesting, I may open it in a new tab but continue to read the current page (so as not to interrupt my train of thought). Then, once finished with the current page, I go through these newly opened tabs. Some I may close straight away if, when I scan their content, it's not what I thought it would be.

So yes, bounce rate would be a very noisy signal in this case.
4:39 am on Mar 1, 2011 (gmt 0)

10+ Year Member



I have a handful of quick reference pages that are very useful for my readers/visitors (on one particular site), but the search traffic hits it and is gone within 20, 25 seconds (since I have the info laid out step-by-step on one page without any bs as filler). I have no doubt that the visitors get exactly what they need.

Now some of these pages have thousands of search visitors a day (they rank high and have for a few years), and because the solutions on these pages can be looked up by a wide group of people, only a fraction of them would ever be interested in browsing around this particular site.

I question the wisdom of allowing these reference pages to sit in google's index. They're very well done and complete, exactly the page you would want to reach when looking for this info. They're referred to regularly by my readers. However, the traffic to these pages is so high that it does affect my overall time on site and pageviews/visitor stats.

I've paused developing these types of pages because the more successful they are, the more they bring down my overall stats. What I could do is no-index them so at least my regulars will have access to them, but then the competitors and content farms who cherry-pick my content to mix-master for their sites will get the content ideas, links and traffic.

So I'm in a quandary. Can pages be TOO GOOD? Can some topics be TOO POPULAR for too wide a group and ding and dent your site in the end?
4:47 am on Mar 1, 2011 (gmt 0)

WebmasterWorld Senior Member themadscientist is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month



I don't understand why you care about the stats beyond:
The stats tell me the visitors found what they were looking for.

What did you build the site for?
To fill a purpose or to have 'bragging' stats...
4:55 am on Mar 1, 2011 (gmt 0)

10+ Year Member



TheMadScientist, if Google puts a competitor ahead of you because his page or overall site stats show a higher time on page or more pageviews per visitor (say he spread the info across 2 or 3 pages so people have to click around to find the solution, or beefed up the content with a lot of bs so time on site is longer)... wouldn't that/shouldn't that be a legitimate concern?

If I'm way out in left field on this, I'm definitely open to hearing why.
5:11 am on Mar 1, 2011 (gmt 0)

WebmasterWorld Senior Member themadscientist is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month



IMO it's too noisy a metric, for exactly the reason you say: a competitor could do that, just like a site could hit you with 3 pop-ups to change your time on the page before a bounce back.

Bounce rate, page views and time on site factors are really something for the site owner to gauge and try to draw conclusions from, but using it externally doesn't work very well for exactly the reasons you are saying ... And these:

For several years, Matt Cutts, along with other Google spokespeople, have been talking about the fact that "bounce" is a noisy signal. Yea, they measure it but apparently they can't use it for any heavy lifting in the algo.

tedster

This only makes sense. In many industries, what defines a bounce is purely subjective, and often a bounce means success, as in the case of a telephone call or a sale delivered via a different domain.

CainIV

A common reason for a quick bounce back to the SERP for me is looking for a specific tidbit, finding it, and then wanting to see corroboration from other sites. So I click a new result, spot some reinforcing or conflicting information, click back to the SERP. Rinse and repeat until I have an idea of the consensus.

tedster

I may look at SERPs and open a few results in new tabs, one after another. I have FF set not to switch automatically to a new tab when it opens.

...

I do pretty much the same when reading a web page. If a (usually in-content) link sounds interesting, I may open it in a new tab but continue to read the current page (so as not to interrupt my train of thought). Then, once finished with the current page, I go through these newly opened tabs.

aakk9999


These types of behaviors mean that, as far as search results go, the signals have too much 'noise' (in other words, 'no telling why a visit went the way it did or what it means' scenarios) to use reliably. (Or so Google's engineers and others keep saying.)

IMO they're something to worry about internally, but not as ranking signals...
5:24 am on Mar 1, 2011 (gmt 0)

WebmasterWorld Senior Member themadscientist is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month



And, if your site is the right answer, why would you change it to fit the pattern of the wrong answer? Doesn't make sense...

If they're smart enough to figure out what behavior signals mean at a microgranular level, then you should probably make sure you keep sending the 'right answer' signal instead of changing your site to fit the 'wrong answer' profile, I would think anyway ... I could be totally wrong of course.

Also, remember: They don't often look at sites to rank them... They rank pages, so it shouldn't matter what your overall site stats are, what they would most likely use is page-by-page behavior, just like they do for almost everything else. They're still trying to rank the right page for the query, not sites afaik.

You might have one page that's the right answer for one query and another that's not for another, and it would be silly of them to not rank the right answer because you had another page on your site that didn't fit another query right, wouldn't it?

Sorry for all the edits and additions ... I find it tough to just say what I'm thinking in English sometimes. lol

[edited by: TheMadScientist at 5:34 am (utc) on Mar 1, 2011]

5:33 am on Mar 1, 2011 (gmt 0)

10+ Year Member



Bounce rate, page views and time on site factors are really something for the site owner to gauge, but using it externally doesn't work very well for exactly the reasons you are saying ... Using those factors doesn't really make for better results.


ITA with everything you're saying; there are too many variables to make it a consistent signal of quality, and I see it with my own eyes on my sites. But it doesn't matter what we think; what matters is what Google actually puts stock in (not what they SAY but what they DO). I get a bit freaked out every algo change, and the advice is stuff like:

--what are your bounce rates like
--are people hitting the back button quickly
--is the page "thin" (some reference material can be too concise or low on word counts unless it's beefed up with bs)

Only a small fraction of the pages on this site are of this type, but they generate enough traffic that they do bring down overall numbers (a bit).

No, I'm not interested in changing these pages so they're less than they are (to increase time on site, etc.), just on pause right now for creating new content of this type.
5:47 am on Mar 1, 2011 (gmt 0)

WebmasterWorld Senior Member themadscientist is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month



--what are your bounce rates like
Mine on one site are close to 100%. Is that bad? (I think it's good. It tells me people found what they needed without looking all over the place. That's how it should be, isn't it? Maybe a high bounce rate isn't always such a bad thing?)

--are people hitting the back button quickly
Yeah, this could be an issue if it's constant, but we're talking about constant < 2 or 3 second visits for it to send a 'more reliable' signal (IMO). And yours aren't, are they?

--is the page "thin"
You said the answer is no already.

I added and edited a bunch, so we could have been posting at the same time, so let me repeat this: They rank pages for queries, not entire sites.

If you want to look at behavior data wrt rankings, look at the page stats, not the site stats as a whole... They're going to rank a page for the query, so the stats related to that page for the specific query would be the ones that matter if they use it, not the site as a whole, imo.
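
A minimal sketch of that page-level view, with an invented log format: compute bounce rate per landing page instead of one site-wide number, so a deliberately concise reference page can be judged on its own.

```python
# Hedged sketch: per-page bounce rate from a simple visit log.
# The log structure and field names are invented for illustration.
from collections import defaultdict

def bounce_by_page(visits):
    """visits: list of dicts like {"landing_page": str, "pageviews": int}."""
    totals = defaultdict(lambda: {"visits": 0, "bounces": 0})
    for v in visits:
        t = totals[v["landing_page"]]
        t["visits"] += 1
        if v["pageviews"] == 1:  # a one-page visit counts as a bounce
            t["bounces"] += 1
    return {page: t["bounces"] / t["visits"] for page, t in totals.items()}

log = [
    {"landing_page": "/reference/widget-fix", "pageviews": 1},
    {"landing_page": "/reference/widget-fix", "pageviews": 1},
    {"landing_page": "/articles/widget-history", "pageviews": 4},
]
print(bounce_by_page(log))
# {'/reference/widget-fix': 1.0, '/articles/widget-history': 0.0}
```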

BTW: I read somewhere that Matt Cutts said the 'over-optimization penalty' was what people got from spending too much time on SEO message boards, or something to that effect ... I really think you're in danger of giving yourself one, because those are definitely self-induced, and it sounds like you're talking about fixing something that's not broken, imo.
7:52 am on Mar 1, 2011 (gmt 0)

10+ Year Member



I have another finding here. My site that got hit has a lot of links to other site sections. For example, if my site is about Widgets: a page about Hungary has about 15 links like Hungary Widget 1, Hungary Widget 2, etc. It also has about 5-15 links like Hungary Region 1 Widget, Hungary Region 2 Widget, etc., and about 20 links like Hungary City 1 Widget, Hungary City 2 Widget, etc. These are all relevant to visitors, since they help them browse to more relevant pages. Plus, in the footer we have about 20 other Main City Widget links and 10 other Main Country Widget links.

So, yes, it might be over-optimized, although I find these links valuable for visitors.

Can this be a problem? Too many on-site links? Can anyone confirm whether they have something similar and were hit by this algo change?
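
For anyone auditing this, a small sketch (standard library only; the URL and domain are placeholders) that counts how many internal versus external links a page template actually emits:

```python
# Hedged sketch: tally internal vs. external links on one page.
from html.parser import HTMLParser
from urllib.parse import urlparse
from urllib.request import urlopen

class LinkCounter(HTMLParser):
    def __init__(self, domain):
        super().__init__()
        self.domain = domain
        self.internal = 0
        self.external = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        host = urlparse(href).netloc
        # Relative links and links to our own host count as internal.
        if not host or self.domain in host:
            self.internal += 1
        else:
            self.external += 1

def count_links(url, domain):
    counter = LinkCounter(domain)
    counter.feed(urlopen(url).read().decode("utf-8", errors="replace"))
    print(f"{url}: {counter.internal} internal, {counter.external} external")

# count_links("http://www.example.com/hungary-widgets.html", "example.com")
```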