|This 146 message thread spans 5 pages|
|Google Updates and SERP Changes - April 2011|
< continued from: [webmasterworld.com...] >
Here's another fail:
One of the major manufacturers in my field launched a new product today that generated quite a media frenzy. The product's name is a made-up word, but G treats it like a misspelling of some other word: when searching 'brand product' you get the 'did you mean' prompt. I know of more than 20 big media outlets that reported on this new product. Unfortunately, G has only three and a half pages of results on it, half of which are unrelated. All the other reports about this product, all legitimate, are nowhere to be found in the SERPs.
Big fail. I guess their machines are down at the pub, instead of learning...
[edited by: tedster at 4:42 pm (utc) on Apr 1, 2011]
Google is gyrating or yo-yoing the results almost daily, even in the deeper results. An increase or decrease of ten positions is not unusual from what I have seen.
|-50 for my last site today. Now all of my sites are -50. Congratulations Google! |
Wow that is really harsh. I feel for ya. So 100% of your sites are now -50? Do you suspect the 301 for this most recent site, as you did for the others or a different cause?
Thanks crobb, yes, they killed my adsense business for good.
Now only Yahoo and Bing traffic. Yes, I think there were too many sitewide 301 redirects, which raised a flag. But what I don't get is why they crash all domains, even those without any redirects.
|When my domain didn't even rank for its own name. Do you see your domains ranking at all? |
|They were not even ranking for their own domain name so it was the usual -50 box. I checked all of them and not a single site was ranking for their domain name. |
hi @crobb305 and @SEOPTI,
Something changed on Friday: my site is actually on top now for its own domain name. It is no longer behind my Twitter page.
I made the following changes
1. Removed G adsense
2. Removed G Analytics
3. Removed most of the thin pages
4. noindex/follow many thin pages
5. Added 80% images, 20% videos
Right now I have only quality content pages with more than 500 words each. I gained around 15-20% over the last couple of weeks, but today lost another 20%.
One thing I did not change is the site design. I heard that dark colors at the top (example: hubpages) may cause some issues. I know it is weird, but we are talking about the G search algo; it is already effed up, so we may need to play by their rules, including design and colors, or just play the waiting game.
In the end the Goog Illuminati Billionaires will always win.
I believe there are two penalty layers.
1. You will not find your website keyword (or domain name) at the top of the results (example: mahalo)
2. You will lose SERP positions; your backlinks are not valuable anymore.
I am in the second layer now.
Has anybody tried really thinning a domain down to just a couple of key pages? I have just been hit hard by the UK update and I'm seriously considering 404ing 30,000 of my 31,400 indexed pages.
Has anybody recovered by doing this?
|Has anybody tried really thinning a domain down to just a couple of key pages? I have just been hit hard by the UK update and I'm seriously considering 404ing 30,000 of my 31,400 indexed pages. |
I'd be worried about losing traffic from other sources like Bing/Yahoo, if those are good entry pages. If you're wanting to do this, I'd do a handful at a time, basing your choices on some real data (like pages that rarely get search traffic, have few to no inbound links, or saw very large drops in position according to WebmasterTools).
|Has anybody recovered by doing this? |
I haven't heard of many official recoveries yet after 6 weeks here in the U.S., at least no one talking about it.
crobb305 is right. Before you do this, look to see if you are getting any traffic at all from these pages. I had lots of these pages that were created by a dynamic datafeed, and I didn't know they were cluttering the index; Googlebot was happily crawling and indexing them despite the index,nofollow I put on the first page. Somehow the bot got onto the 2nd, 3rd... 99th pages and was indexing them. I've changed that now to noindex, follow for page 2+. As a result, I have reduced my site by 43%, got rid of all the thin content (lots of blank pages), and seem to be recovering from being Panda-whacked.
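The page-2+ change described above can be sketched in a paginated listing template like this (a minimal sketch only; the `page` query parameter name is a made-up example, so adapt it to however your listing identifies the current page):

```php
<?php
// Sketch of noindex,follow on page 2+ of a paginated listing.
// The "page" query parameter is a hypothetical example.
$page = isset($_GET['page']) ? (int) $_GET['page'] : 1;
?>
<html>
<head>
  <title>Product listing</title>
<?php if ($page > 1): ?>
  <!-- keep the deep pages out of the index, but let their links pass -->
  <meta name="robots" content="noindex, follow">
<?php endif; ?>
</head>
<body>
  <!-- ...listing markup... -->
</body>
</html>
```

Page 1 carries no robots tag at all, so it stays indexable by default.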
Thanks for the feedback mslina2002 & crobb305.
I know what you guys mean. I feel like I'm throwing away thousands of useful pages to get others to recover.
The pages I'm thinking of dropping are pretty thin product pages with no unique or in-depth content; they basically carry content from affiliate product feeds. They do rank pretty well in Bing, but the traffic they bring in is very small. If dropping them helped the other pages recover, it would be a small price to pay.
So is it everyone's feeling that we should noindex all thinner pages that have minimal or no clicks on them? How can you noindex entire directories? Will this remove pages that Google has already indexed? What is the code you would have to add to a flat HTML file for noindex? How about WordPress: are tag pages OK to leave in (I used to get a lot of clicks on these)? It seems that trimming the site will help you recover from Panda, if I am reading this right.
|It seems that trimming the site will help you recover from Panda if I am reading this right. |
So far that's the consensus of opinion, nothing more. To the best of my knowledge, nobody here has shown proof that trimming their site brought it back. In fact, I don't know of anyone whose site was really affected by Panda that has come back.
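On the flat-HTML question a few posts up: the standard approach is a robots meta tag in the head of each page you want out of the index. Note the page has to stay crawlable (not blocked in robots.txt), or Googlebot never sees the tag:

```html
<head>
  <!-- drop this page from the index, but still follow its links -->
  <meta name="robots" content="noindex, follow">
</head>
```

For whole directories, the tag has to go on every page (or be sent server-side as an X-Robots-Tag HTTP header); robots.txt alone blocks crawling but does not remove URLs that are already indexed.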
|1. Removed G adsense |
2. Removed G Analytics
3. Removed most of the thin pages
4. noindex/follow many thin pages
5. Added 80% images, 20% videos
Very smart, judging by what I've seen in forums and on demoted sites. We've also heard the same from Google.
Too many ads is an obvious one to me, so I would cut down drastically, and if AdSense only makes a little, remove it totally. Why take the chance?
Looks like we're in for a 60-90 day penalty. Google is showing us who the boss is. They are so sure that their alpha algorithm is right that they are keeping us out of business for at least 2 months.
I'd be VERY surprised if the Panda changes act like a penalty, with some set time after which rankings return. Why would Google do that? It would be like throwing away their whole project.
@walkman, website design (with respect to human behavior) is also a big factor here. I've added detail analysis here.
Yeah, I think this is about as much of a penalty as Florida [webmasterworld.com] was.
Did they get the name right or what? Seems to have retired a bunch of people, from what I've read.
Hmm... You know how Google said they've been working on the Panda update for about a year? Well, I just came across this:
|Amit Singhal: Well, we named it internally after an engineer, and his name is Panda. So internally we called it big Panda. He was one of the key guys. He basically came up with the breakthrough a few months back that made it possible. |
Source: TED 2011: The ‘Panda’ That Hates Farms: A Q&A With Google’s Top Search Engineers [wired.com] (This has probably been posted already, but I likely missed it.)
The article is dated March 3, 2011. Note that this update's namesake only had the breakthrough "a few months back". Is it possible that a significant factor in this update didn't get quite the rigorous testing and tweaking that we had assumed from Google's "a year in the works" comment?
My best guess, with a lot of help from Bill Slawski [seobythesea.com], we're talking about Biswanath Panda and his expertise in large scale machine learning.
What should we do with PHP pages where the main purpose of these are to redirect to another site depending on the parameter passed. Should these links be put in a directory that is blocked by robots? I can see that the spiders are indexing all the variations of these pages. Several hundred to be exact. Do these count as thin pages?
I had the same question for John Mu a year ago. He advised to not only robots.txt out that directory, but also add a noindex,nofollow tag to the php file. It worked like a charm, although I was still struck by Panda.
Is anybody yet seeing signs of the algo changes Matt Cutts said were coming including page load speed (slow reduces rankings) and domain name keyword value (less than before)?
I still can't tell if recent rankings drops are because of Panda 2.0 or the changes Cutts promised. Three of the sites in question were perfect .com domains for the main keyword search phrase. They were built years ago and were at the top of rankings for years, too. One page on another site that had been top 5 until recently is one of the slowest-loading pages on the site. It's now on page 4.
Today I saw a big change in a non-English SERP, especially in the number of first-page results from the top domain. For example, if you search for skype, the first 3 results are now from skype.com; before there were 4.
I am seeing competitors who are #1, #2, and #3 for widgets get a big boost for not only related searches red widgets, white widgets, and blue widgets, but also getting an extra one or two search results.
For example, whereas previously site A had the #1 and #2 positions, my site was #3, and site B had #4 and #5; site A now has #1, #2, and #3, while I've been bumped to #10, and site B now has #4, #5, and #6.
On other searches, sites A and B are suddenly outranking other sites and getting top positions.
We are seeing the centralization of power. More power to fewer sites.
Perhaps connected to the related searches...
I've recently been seeing, beginning roughly Thursday (April 14), that, for a given query, Google appears to be shifting the entry page it returns for a site to pages that it's deciding are more useful. It could have been happening sooner.
In one case, Google dropped an inner page that was optimized for the query and returns instead the home page, much less traditionally optimized for the query, but which is a better entry page to find the related pages.
On another site, which ranks number one for the single word "widgets" (competitive, with 26-million pages returned for the word), I'm observing a shift in the other direction... ie, from the homepage to an inner page for this term. The home page is loosely Brandname Widgets, and Google continues to return the homepage for [brandname] searches, but for the single word [widgets], it's now returning the main catalog page, which links to the main widget subcategories.
These pages may have been returned as clustered or "more results" pages from time to time, but this is the first time I've seen them ranking alone, replacing the expected page.
|I've recently been seeing, beginning roughly Thursday (April 14), that, for a given query, Google appears to be shifting the entry page it returns for a site to pages that it's deciding are more useful. It could have been happening sooner. |
I have observed this on my site. Hard-hit by Panda overall (-60% traffic), I have one inner page that improved in ranking, and consistently shows up in the top 10 for some queries. Occasionally, Google will display my homepage there instead of that inner page -- in the exact same ranking. I've seen this happen a few times in the past 2 weeks.
|What should we do with PHP pages where the main purpose of these are to redirect to another site depending on the parameter passed. Should these links be put in a directory that is blocked by robots? I can see that the spiders are indexing all the variations of these pages. Several hundred to be exact. Do these count as thin pages? |
This has been one of my biggest questions. For years, I have housed my affiliate links in a php redirect file. I have always blocked that file via robots.txt, but because Google couldn't follow the links, it was indexing them (with no description). My guess is that they are/were being classified as "thin".
I have often changed out affiliate links. When I deleted links out of the redirect file, Google wasn't able to discover the 404, and continued to index them. Over the years, dozens were accumulating. At the time of Panda, 30 redirect links were indexed in Google and 90% of them were 404 (and have been for over a year, but Google didn't know it). I have since removed the robots.txt denial and the dead redirect links have been deleted from the index. Only 2 valid links remain, but I am trying to figure out the best way to handle this in the future. If I were to add the deny back into robots.txt, those links would simply reappear in the index because Googlebot keeps requesting them from memory...if it can't encounter a 404, then it will reindex them.
Can a noindex, nofollow be added into a php redirect file above the script?
Google has completed a really deep crawl of my site over these two days. It looked for pre-2004 file names too; presumably some obscure page still links to them.
|I have often changed out affiliate links. When I deleted links out of the redirect file, Google wasn't able to discover the 404, and continued to index them. Over the years, dozens were accumulating. At the time of Panda, 30 redirect links were indexed in Google and 90% of them were 404 (and have been for over a year, but Google didn't know it). I have since removed the robots.txt denial and the dead redirect links have been deleted from the index. Only 2 valid links remain, but I am trying to figure out the best way to handle this in the future. If I were to add the deny back into robots.txt, those links would simply reappear in the index because Googlebot keeps requesting them from memory...if it can't encounter a 404, then it will reindex them. |
It may be best to create a new folder, block it in robots.txt first, then change your code to look in this new folder for your redirect file. Then delete your old files and let the search engines crawl the old folder (where the files will 404 and be removed from the index). Would this work?
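As a sketch, the folder swap described above would look something like this in robots.txt (the folder names here are made-up examples):

```
User-agent: *
# block the NEW redirect folder so its URLs never get indexed
Disallow: /redirect-new/
# the old /redirect/ folder is deliberately left crawlable: its
# deleted files now return 404, so those URLs drop out of the index
```

The point of leaving the old folder unblocked is exactly the problem described earlier in the thread: a robots.txt block stops Googlebot from ever discovering that a URL has gone 404.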
|Can a noindex, nofollow be added into a php redirect file above the script? |
I tried it, but it did not work; the redirect stopped working after I added noindex, nofollow at the top of the file. Does anyone know how to do this?
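One approach that should work is to send the robots directive as an X-Robots-Tag HTTP header, which Google has supported since 2007, rather than printing a meta tag. Echoing a meta tag before `header('Location: ...')` sends output first and breaks the redirect with a "headers already sent" error. A minimal sketch (the filename, the `id` parameter, and the URL map are all hypothetical examples):

```php
<?php
// redirect.php?id=widget1 -- sketch only; the "id" parameter and the
// $destinations map are made-up examples.
$destinations = array(
    'widget1' => 'http://affiliate.example.com/widget1',
    'widget2' => 'http://affiliate.example.com/widget2',
);

$id = isset($_GET['id']) ? $_GET['id'] : '';

if (isset($destinations[$id])) {
    // Header version of noindex,nofollow: no body output is produced,
    // so the Location header still goes out and the redirect works.
    header('X-Robots-Tag: noindex, nofollow');
    header('Location: ' . $destinations[$id], true, 301);
} else {
    // Unknown or removed link: return a real 404 so it drops
    // out of the index instead of lingering as a thin page.
    header('HTTP/1.1 404 Not Found');
}
exit;
```

This also addresses the stale-links problem above: deleted affiliate IDs fall through to the 404 branch instead of being re-indexed.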
I haven't seen anyone post this, but did anyone see a huge spike in traffic before it dropped? On a few of my sites I saw a 30% increase in traffic for 2 days: April 5th on one site and April 12th on another. I was excited and thought, yes, finally Google is recognizing the hard work I've put into unique content. Now, this past week, those same sites have dropped 30% below my original traffic levels!
One site is a comparison site with some duplicate content but plenty of unique articles. Another is ecommerce with hundreds of unique images and descriptions, no duplicate content, and plenty of social links and activity (Facebook likes and comments) too, I might add: proof that humans do like my content! Good work, Google!
I'm hoping this will just blow over and the traffic will come back. Anyone else see a spike in traffic before the drop?
The Panda update is causing the rich to get richer and the poor to get poorer.
For the first time just now I saw one site with pages in the top FOUR positions! And I thought three was extreme. It's a lame site with stolen content.
What's next? At this time next year will one site get the entire first page?
It appears quite clear to me now that the ads on your site are what Google is going after, downgrading you for having them. While searching, I'm clicking on preview, and you may notice that the ads are now white space. This doesn't just affect Google ads but also other CPM ad spaces.
Publishers will need to totally revamp their revenue streams and provide an in-house ad system.
stevelibby, there are ads and then there are ads. The graphic, in-house ones get a pass IMO, since you need to have 'good' content to sell those. The Google (and other text) ones, on the other hand, are open to everyone, including the cr@ppiest 'content'.
On another note: Googlebot has so far today crawled 50% of my pages. Indexing has not slowed at all during these times.