| 3:08 pm on May 25, 2012 (gmt 0)|
I can't believe even Google has any control over what happens ON my sites (conversions). What happens before users get there? Sure. Once they're there? I just don't see it.
| 3:17 pm on May 25, 2012 (gmt 0)|
The theory that Google is getting better at profiling users, and therefore providing results tailored to them, is probably the most exciting thing going on on the web today. The reality that our conversion rate has been slowly ticking up, with much less traffic, may be a sign of that theory coming to life.
| 3:23 pm on May 25, 2012 (gmt 0)|
I agree with you, netmeg, but if Google can *detect* what's happening on your site after their traffic arrives, couldn't they manipulate future traffic according to their own goals if they wanted? I.e., if they can detect that a certain number of people have hit a "thank you for your order" page, and for whatever reason that's all the sales they want you to get today, then they could stop sending you any traffic, or only send you irrelevant traffic after that point.
Although, even if this is not only possible but actually happening, it shows a couple of ways we webmasters can stop it: dump Analytics, and keep trying to diversify our traffic so it's not mostly from Google. Because without Analytics, I don't see how they can detect what's happening on your site, and if you have other healthy traffic streams, Google can't possibly manipulate your *overall* conversions, just their slice of them.
I'm not an expert on algos or computer systems or anything, so if someone knows better, I will stand corrected. I'm just a layperson trying to make sense of this stuff.
| 4:01 pm on May 25, 2012 (gmt 0)|
|if Google can *detect* what's happening on your site after their traffic arrives, couldn't they manipulate future traffic according to their own goals if they wanted |
Throttling traffic + goal rate = throttling sales.
Just because you don't have goals set for your site in GA or any other software doesn't mean Google hasn't calculated your conversion rate / goals. They can easily estimate what your goal pages are via thank-you pages, sign-up pages, AdSense clicks, etc.
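The arithmetic behind "throttling traffic + goal rate = throttling sales" is simple. A purely hypothetical sketch (the numbers are made up, and whether Google actually does any of this is speculation, not established fact): if the conversion rate stays roughly constant, capping referral traffic caps sales in direct proportion.

```python
# Hypothetical illustration: with a constant conversion (goal) rate,
# expected sales scale linearly with however many visits a search
# engine chooses to send.

def expected_sales(visits: int, goal_rate: float) -> float:
    """Expected sales = visits x conversion rate."""
    return visits * goal_rate

normal = expected_sales(1000, 0.02)    # 1000 visits -> 20.0 expected sales
throttled = expected_sales(400, 0.02)  # 400 visits  ->  8.0 expected sales
```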
| 4:17 pm on May 25, 2012 (gmt 0)|
|The reality that our conversion rate has been slowly ticking up... With much less traffic, may be a sign of that theory coming to life. |
I am starting to see this too.
My conversion rate has improved from "dismal" to just "really bad" and I think I have Panda / Penguin to thank for that.
(I wish I were joking.)
| 4:23 pm on May 25, 2012 (gmt 0)|
Well first of all, I've yet to hear a reason WHY they would do it.
Second of all, if we're talking users who come to the site and immediately sod off, that's one thing. My bounce rates have never reflected that kind of traffic (except for bots).
But if we're talking users who come in on normal keywords (to the extent we can discern from (not provided) + landing page) and actually browse the site, with multiple pageviews, maybe put one or two items in the shopping cart or maybe they don't - if they don't complete a transaction, that's on *me*, not Google.
| 4:24 pm on May 25, 2012 (gmt 0)|
Please excuse me if this has already been noted,
After reading the post by themaninthejar I carried out a site search, but instead I put site:domain.com keywords. I was astonished at the pages being listed for the keyword; most of the time they were only partially related, the relationship being that they were from the same folder.
If this is the case on everyone's domain, there is sure to be a pattern that can be looked at.
I have already seen something of a pattern. The pattern I see is:
1. I have external links to these pages.
2. They are all linked from my home page.
3. Other pages that have external links to them but are more than one click deep appear to be unaffected, and appear #1 on site: search for the keyword.
There is a definite pattern here: either it's because they are linked from my home page, or because they have external links to them and are linked from my home page. Try it and see if you find the same.
| 5:16 pm on May 25, 2012 (gmt 0)|
"Well first of all, I've yet to hear a reason WHY they would do it."
@netmeg, Timwilliams gave a why on the previous page (you probably cross-posted with it): "Now for the WHY. I know we are talking about the serps but this makes sense for both the serps and the ads. Google needs multiple advertisers for adwords to work in their benefit. If there was only 1 dominate advertiser in a niche that advertiser would not have to bid up their ad. Same goes for the serps, if a serp is dominating the sales then the advertisers go away. It's in google's best interest for everyone to make a little money and no one to dominate. Keep us all hungry but nourished..."
There IS an inherent conflict between Google selling adspace at the top of the SERPs and having control over who naturally lands on the top of the SERPs. As long as they're just letting the algo manage the SERPs through math without bias, everything's fine. But Yelp and others have alleged that they manipulate the SERPs to benefit Google, and that would be in violation of US law, which is why Eric Schmidt won a fun-filled visit to Congress last year.
What timwilliams describes would benefit Google. And it would be very hard to prove, so in terms of calculated risk, I would expect they might go for it: proving it to the Department of Justice's satisfaction would be next to impossible, and no one else could force Google to stop.
| 5:59 pm on May 25, 2012 (gmt 0)|
Not buying it. It would be far too much of a risk if it got out - and it would get out. That's a short con, and Google plays long.
And that still doesn't address the fact that whatever happens on my site after a user arrives there is up to ME. I have control of the user experience on my own site, not Google.
| 6:24 pm on May 25, 2012 (gmt 0)|
|but instead I put site:domain.com keywords. I was astonished at the pages that were being listed for the keyword... |
Oddly enough, I see the opposite for my site.
Without the site:domainname.com operator, my "on topic" page is nowhere to be found for the keyword, and the home page shows up instead, where the on-topic page used to rank.
With the site:domainname.com operator, my on-topic page ranks first, while the home page ranks second.
| 7:04 pm on May 25, 2012 (gmt 0)|
Penguin Recovery Tips - a think tank thread
| 7:43 pm on May 25, 2012 (gmt 0)|
"That's a short con, and Google plays long."
Not since Schmidt left, IMO.
| 8:37 pm on May 25, 2012 (gmt 0)|
|With the site:domainname.com operator, my on topic page ranks first, while the home page ranks second |
Is this page linked from your home page, as in my example?
| 8:49 pm on May 25, 2012 (gmt 0)|
|Is this page linked from your home page as in my example |
yes, it was (and still is).
| 4:02 am on May 26, 2012 (gmt 0)|
@backdraft7 sorry if I offended you, but I honestly think you need to take a break if you're starting to believe that Google is somehow controlling the conversions on your site. That's not a wise-crack at all, but a very serious suggestion.
For anyone who thinks this is even slightly plausible, please riddle me this: given the computational requirements to track and analyse conversions on sites in real time, why do they have to run Panda and Penguin as batch jobs? Having spoken directly with Googlers close to this about how the process works, it's fairly clear to me that these run as batches because of the time it takes to collate and analyse the data required. Now consider what they would need in order to do what some here are suggesting. Add to that the need to justify the spend based on return.
Traffic throttling is trivial to them, and in that I am far more comfortable agreeing that it can and does happen.
| 4:41 am on May 26, 2012 (gmt 0)|
RC, why would it take so much computing power? It seems to me Google could simply add a "goal" to your Analytics on their end, that's not visible to you, if they wanted to track landings to a certain page on your site. I would think this actually takes LESS work than traffic throttling, but again I'm no expert.
| 7:30 am on May 26, 2012 (gmt 0)|
@diberry couple of points in reply:
1. How would they simply add a goal? Would someone manually visit your site, purchase something, and then note the URL? How would they do this algorithmically? And how would they then test that their algorithm was actually identifying the correct pages?
2. GA is updated in batches, not in real time, so how would they use GA data to throttle your site accurately by goal URL hits?
3. How would they integrate search with GA? Do you think they would use the API to see how many hits a page had at a given time and then throttle based on the response?
Honestly, I feel there's some irrationality creeping into this thread. I understand real people and businesses are hurt by Penguin, but a rational thinker would look at some of the ideas in this thread and simply ask "why?". I've not seen one good reason mentioned here as to why Google might do what some are suggesting they're doing.
If Google doesn't like your site it's far easier for them to:
a. adjust your rankings
b. remove your site altogether
c. throttle the referrals they send you
I've yet to hear one good reason why they would throttle based on what actually happens on your site, and anything that correlates to this is IMO pure coincidence. Again all opinion, and again apologies if anyone was offended by my earlier comments.
| 9:21 am on May 26, 2012 (gmt 0)|
Getting back to the topic.
MC posted on April, 24th:
|We want people doing white hat search engine optimization (or even no search engine optimization at all) |
Has anyone tried to minimize SEO? Like removing the keywords or description meta tags from the HTML header?
| 9:35 am on May 26, 2012 (gmt 0)|
Google just pushed data refresh for Penguin: [twitter.com...]
|Minor weather report: We pushed 1st Penguin algo data refresh an hour ago. Affects <0.1% of English searches. |
| 9:43 am on May 26, 2012 (gmt 0)|
Well, no change here so far :( Don't know how long any changes might take to propagate, but absolutely no change so far.
| 10:43 am on May 26, 2012 (gmt 0)|
Tedster wrote: "Remember also that Penguin (just like Panda) is not a real-time algorithm for now - it will refresh only periodically. So you won't know if your changes helped or hurt until Penguin re-runs."
Tedster, I am seeing almost daily changes in the SERPs in my niche since early May. The top 4 spots seem to be rotating around. Is this a result of different servers or some kind of new algo? It makes it difficult to decide whether or not to make changes.
| 1:36 pm on May 26, 2012 (gmt 0)|
I suggest the non-believers in throttling move over to the Zombie / Traffic Shaping topic: [webmasterworld.com...]
| 2:57 pm on May 26, 2012 (gmt 0)|
If you have made on-page changes, would a Penguin data refresh look at what is currently on the site, or at the latest cached version, which may not include the changes?
| 3:01 pm on May 26, 2012 (gmt 0)|
Re: the shift. I'm up on some terms, down on others. But "my domain" now puts me at #1 again instead of #4. My sitelinks don't show up on that search, but they do show up on "mydomain.com."
It's hard for me to evaluate Penguin on the basis of my site. While Google does a pretty good job sending relevant traffic to most of my sites, it's always been odd with this one. This site has some great pages (IMO) that have been largely ignored by Google in favor of lesser quality sites that Google loved. The traffic Google sent me has never done much but look at the page in question and move on - they don't subscribe/bookmark, don't leave comments, only about half of them try other pages of the site, etc. So while I wish Google would recognize my higher quality pages, I don't really take issue with them finally waking up to the fact that they were ranking some of my less stellar pages way too high, LOL.
But as for the Post-Penguin SERPs in general, I'm still not loving them. Certain brands and sites like About.com seem to be ranking high whether or not they have a really good page for the query. As a user, nothing I'm seeing compels me to switch back from Bing.
| 4:44 pm on May 26, 2012 (gmt 0)|
I'm curious: how many of the sites that dropped were hand-crafted HTML vs. CMS-based, like WordPress, Joomla or Drupal? The hand-crafted sites usually house the most valuable content, while many of the CMS-based sites are the quick, easy-to-set-up scraper or thin MFA sites. Those sites, with their chameleon-like ability to write different title tags for the same few paragraphs of content, seem to be ranking highest in my niche.
"Penguin" may have addressed web spam, but now we need an "Orca" update to devour the thin content that has been left behind and is not of much value to most end users. I see that as their next move... but again, this is just my guess.
| 10:38 am on May 27, 2012 (gmt 0)|
|how many of the sites that dropped were hand crafted html |
If it adds anything to the debate, my site is handwritten HTML, and it has suffered.
| 12:53 pm on May 27, 2012 (gmt 0)|
For the main UK two-word term I track, if I add UK to the end of that term, the SERPs I see are what I used to see for the two-word term before Penguin.
This really does seem to be based on a target term list. Other terms I rank for have not been affected at all; only the big $ terms have.
I've just tweaked the on-page SEO for the page that ranks best for this term: reduced density and repeats, and reduced usage of the term in anchor text on that page. I'll be interested to see if it improves.
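For anyone wanting to measure "density" before and after such a tweak, here's a rough sketch of one way to compute it. The definition (occurrences of the phrase over total words) is my own simplistic one, not anything Google has published:

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Rough on-page keyword density: words covered by occurrences of
    the phrase, divided by total word count. A crude DIY metric only."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Count whole-phrase matches by sliding a window over the word list.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return hits * n / len(words)

sample = "blue widgets are great. buy blue widgets here. widgets galore."
density = keyword_density(sample, "blue widgets")  # 4 of 10 words -> 0.4
```

Running it before and after an edit gives a repeatable number to compare, even if the "right" target density is anyone's guess.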
| 2:42 pm on May 27, 2012 (gmt 0)|
My site that suffered was Wordpress. But so are some of my sites that came through just fine. Then again, I haven't had it confirmed by Google that there's a manual penalty on my domain - when/if I do, I'll post that in here.
| 3:40 pm on May 27, 2012 (gmt 0)|
Have any of you seen multiple sites near each other, on the same server, or with the same hosting company, that were hit by Penguin? I'm trying to think outside the link box (I've done what I can to clean up backlinks) by utilizing some potential spam indicators mentioned in previous patents. I have seen a handful of sites that are with the same hosting company and were all Penguinized. They utilize different backlinks, and some have good authority/trust.
| 4:01 pm on May 27, 2012 (gmt 0)|
|I'm trying to think outside the link box |
I'm not sure if that will help. A friend who has been helping me with the problem put it like this:
"But the question is: why does Penguin run intermittently if Google are convinced it yields better results? Presumably it must be because it is expensive or protracted to run. If this is so it should give us a strong clue as to its operation. I think this strongly points to Penguin performing a recursive algorithm on linking structures rather than on content. This may explain the strange disconnect between content and ranking."
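For a sense of what a "recursive algorithm on linking structures" can look like in miniature, here is a toy PageRank-style iteration. This is purely illustrative of why link-graph computations favour batch runs; it is in no way Penguin's actual algorithm, and the graph, damping factor, and iteration count are all my own assumptions:

```python
# Toy recursive link-score computation (PageRank-flavoured sketch).
# NOT Penguin -- nobody outside Google knows what Penguin computes.

def link_scores(graph: dict, damping: float = 0.85, iters: int = 50) -> dict:
    """graph maps each page to the list of pages it links to."""
    pages = list(graph)
    score = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        # Each pass redistributes every page's score across its outlinks.
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outlinks in graph.items():
            if outlinks:
                share = damping * score[p] / len(outlinks)
                for q in outlinks:
                    if q in new:
                        new[q] += share
        score = new
    return score

toy = {"home": ["a", "b"], "a": ["home"], "b": ["home", "a"]}
scores = link_scores(toy)  # "home" ends up with the highest score
```

Even this toy version must touch every link on every pass; repeated over billions of pages, that cost is one plausible reason such jobs run as periodic batches rather than in real time, as the quote suggests.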